WorldWideScience

Sample records for reasonable computation time

  1. Computer Security: SAHARA - Security As High As Reasonably Achievable

    CERN Multimedia

    Stefan Lueders, Computer Security Team

    2015-01-01

    History has shown us time and again that our computer systems, computing services and control systems have digital security deficiencies. Too often we deploy stop-gap solutions and improvised hacks, or we just accept that it is too late to change things. In my opinion, this blatantly contradicts the professionalism we show in our daily work. Other priorities and time pressure force us to ignore security or to consider it too late to do anything… but we can do better. Just look at how “safety” is dealt with at CERN! “ALARA” (As Low As Reasonably Achievable) is the objective set by the CERN HSE group when considering our individual radiological exposure. Following this paradigm, and shifting it from CERN safety to CERN computer security, would give us “SAHARA”: “Security As High As Reasonably Achievable”. In other words, all possible computer security measures must be applied, so long as ...

  2. Computational approaches to analogical reasoning current trends

    CERN Document Server

    Richard, Gilles

    2014-01-01

    Analogical reasoning is known as a powerful mode for drawing plausible conclusions and solving problems. It has been the topic of a huge number of works by philosophers, anthropologists, linguists, psychologists, and computer scientists. As such, it has been early studied in artificial intelligence, with a particular renewal of interest in the last decade. The present volume provides a structured view of current research trends on computational approaches to analogical reasoning. It starts with an overview of the field, with an extensive bibliography. The 14 collected contributions cover a large scope of issues. First, the use of analogical proportions and analogies is explained and discussed in various natural language processing problems, as well as in automated deduction. Then, different formal frameworks for handling analogies are presented, dealing with case-based reasoning, heuristic-driven theory projection, commonsense reasoning about incomplete rule bases, logical proportions induced by similarity an...

  3. Open Graphs and Computational Reasoning

    Directory of Open Access Journals (Sweden)

    Lucas Dixon

    2010-06-01

    We present a form of algebraic reasoning for computational objects which are expressed as graphs. Edges describe the flow of data between primitive operations which are represented by vertices. These graphs have an interface made of half-edges (edges which are drawn with an unconnected end) and enjoy rich compositional principles by connecting graphs along these half-edges. In particular, this allows equations and rewrite rules to be specified between graphs. Particular computational models can then be encoded as an axiomatic set of such rules. Further rules can be derived graphically and rewriting can be used to simulate the dynamics of a computational system, e.g. evaluating a program on an input. Examples of models which can be formalised in this way include traditional electronic circuits as well as recent categorical accounts of quantum information.

  4. Using Computer Simulations for Promoting Model-based Reasoning. Epistemological and Educational Dimensions

    Science.gov (United States)

    Develaki, Maria

    2017-11-01

    Scientific reasoning is particularly pertinent to science education since it is closely related to the content and methodologies of science and contributes to scientific literacy. Much of the research in science education investigates the appropriate framework and teaching methods and tools needed to promote students' ability to reason and evaluate in a scientific way. This paper aims (a) to contribute to an extended understanding of the nature and pedagogical importance of model-based reasoning and (b) to exemplify how using computer simulations can support students' model-based reasoning. We provide first a background for both scientific reasoning and computer simulations, based on the relevant philosophical views and the related educational discussion. This background suggests that the model-based framework provides an epistemologically valid and pedagogically appropriate basis for teaching scientific reasoning and for helping students develop sounder reasoning and decision-taking abilities and explains how using computer simulations can foster these abilities. We then provide some examples illustrating the use of computer simulations to support model-based reasoning and evaluation activities in the classroom. The examples reflect the procedure and criteria for evaluating models in science and demonstrate the educational advantages of their application in classroom reasoning activities.

  5. Time-Predictable Computer Architecture

    Directory of Open Access Journals (Sweden)

    Schoeberl Martin

    2009-01-01

    Today's general-purpose processors are optimized for maximum throughput. Real-time systems need a processor with both a reasonable and a known worst-case execution time (WCET). Features such as pipelines with instruction dependencies, caches, branch prediction, and out-of-order execution complicate WCET analysis and lead to very conservative estimates. In this paper, we evaluate the issues of current architectures with respect to WCET analysis. Then, we propose solutions for a time-predictable computer architecture. The proposed architecture is evaluated with implementation of some features in a Java processor. The resulting processor is a good target for WCET analysis and still performs well in the average case.
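
    To make the point above concrete, here is a minimal sketch (ours, not the paper's analysis) of how a conservative WCET bound can be computed as the longest path through an acyclic control-flow graph of basic blocks, each with an assumed worst-case cycle count. Block names and cycle counts are invented, and the toy ignores exactly the features the abstract warns about (pipelines, caches, branch prediction).

    ```python
    # Sketch: a conservative WCET bound as the longest path through a DAG of
    # basic blocks. All block names and cycle counts are hypothetical.
    from functools import lru_cache

    # Hypothetical control-flow graph: block -> (worst-case cycles, successors).
    CFG = {
        "entry": (2, ["test"]),
        "test":  (3, ["then", "else"]),
        "then":  (8, ["join"]),
        "else":  (5, ["join"]),
        "join":  (2, []),
    }

    @lru_cache(maxsize=None)
    def wcet(block: str) -> int:
        """Worst-case cycles from `block` to the end of the (acyclic) CFG."""
        cycles, succs = CFG[block]
        return cycles + max((wcet(s) for s in succs), default=0)

    print(wcet("entry"))  # 15 cycles, via the more expensive "then" branch
    ```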

  6. [Application of computer-assisted technology in analysis of revision reasons of unicompartmental knee arthroplasty].

    Science.gov (United States)

    Jia, Di; Li, Yanlin; Wang, Guoliang; Gao, Huanyu; Yu, Yang

    2016-01-01

    To summarize the reasons for revision of unicompartmental knee arthroplasty (UKA) identified using computer-assisted technology, so as to provide a reference for reducing the incidence of revision and improving surgical technique and rehabilitation. The relevant literature of recent years on analyzing UKA revision reasons using computer-assisted technology was extensively reviewed. The revision reasons identified by computer-assisted technology are fracture of the medial tibial plateau, progressive osteoarthritis of the preserved compartment, dislocation of the mobile bearing, prosthesis loosening, polyethylene wear, and unexplained persistent pain. Computer-assisted technology can be used to analyze the reasons for UKA revision and to guide the choice of operative method and rehabilitation scheme by simulating the operative process and knee joint activities.

  7. Reasoning about real-time systems with temporal interval logic constraints on multi-state automata

    Science.gov (United States)

    Gabrielian, Armen

    1991-01-01

    Models of real-time systems using a single paradigm often turn out to be inadequate, whether the paradigm is based on states, rules, event sequences, or logic. A model-based approach to reasoning about real-time systems is presented in which a temporal interval logic called TIL is employed to define constraints on a new type of high level automata. The combination, called hierarchical multi-state (HMS) machines, can be used to model formally a real-time system, a dynamic set of requirements, the environment, heuristic knowledge about planning-related problem solving, and the computational states of the reasoning mechanism. In this framework, mathematical techniques were developed for: (1) proving the correctness of a representation; (2) planning of concurrent tasks to achieve goals; and (3) scheduling of plans to satisfy complex temporal constraints. HMS machines allow reasoning about a real-time system from a model of how truth arises instead of merely depending on what is true in a system.

  8. Developing Computer Model-Based Assessment of Chemical Reasoning: A Feasibility Study

    Science.gov (United States)

    Liu, Xiufeng; Waight, Noemi; Gregorius, Roberto; Smith, Erica; Park, Mihwa

    2012-01-01

    This paper reports a feasibility study on developing computer model-based assessments of chemical reasoning at the high school level. Computer models are Flash and NetLogo environments that make three domains in chemistry simultaneously available: macroscopic, submicroscopic, and symbolic. Students interact with computer models to answer assessment…

  9. Data science in R a case studies approach to computational reasoning and problem solving

    CERN Document Server

    Nolan, Deborah

    2015-01-01

    Effectively Access, Transform, Manipulate, Visualize, and Reason about Data and Computation. Data Science in R: A Case Studies Approach to Computational Reasoning and Problem Solving illustrates the details involved in solving real computational problems encountered in data analysis. It reveals the dynamic and iterative process by which data analysts approach a problem and reason about different ways of implementing solutions. The book's collection of projects, comprehensive sample solutions, and follow-up exercises encompass practical topics pertaining to data processing, including: Non-standar

  10. Psychological Trauma as a Reason for Computer Game Addiction among Adolescents

    Science.gov (United States)

    Oskenbay, Fariza; Tolegenova, Aliya; Kalymbetova, Elmira; Chung, Man Cheung; Faizullina, Aida; Jakupov, Maksat

    2016-01-01

    This study explores psychological trauma as a reason for computer game addiction among adolescents. The findings of this study show that there is a connection between psychological trauma and computer game addiction. Some psychologists note that the main cause of any type of addiction derives from psychological trauma, and that finding such…

  11. Reflexive reasoning for distributed real-time systems

    Science.gov (United States)

    Goldstein, David

    1994-01-01

    This paper discusses the implementation and use of reflexive reasoning in real-time, distributed knowledge-based applications. Recently there has been a great deal of interest in agent-oriented systems. Implementing such systems implies a mechanism for sharing knowledge, goals and other state information among the agents. Our techniques facilitate an agent examining both state information about other agents and the parameters of the knowledge-based system shell implementing its reasoning algorithms. The shell implementing the reasoning is the Distributed Artificial Intelligence Toolkit, which is a derivative of CLIPS.

  12. Improving the learning of clinical reasoning through computer-based cognitive representation.

    Science.gov (United States)

    Wu, Bian; Wang, Minhong; Johnson, Janice M; Grotzer, Tina A

    2014-01-01

    Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. A significant improvement was found in students' learning products from the beginning to the end of the study, consistent with students' report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge construction.

  13. A fascinating country in the world of computing: your guide to automated reasoning

    CERN Document Server

    Wos, Larry

    1999-01-01

    This book shows you - through examples and puzzles and intriguing questions - how to make your computer reason logically. To help you, the book includes a CD-ROM with OTTER, the world's most powerful general-purpose reasoning program. The automation of reasoning has advanced markedly in the past few decades, and this book discusses some of the remarkable successes that automated reasoning programs have had in tackling challenging problems in mathematics, logic, program verification, and circuit design. Because the intended audience includes students and teachers, the book provides many exercis

  14. Analogical Reasoning and Computer Programming.

    Science.gov (United States)

    Clement, Catherine A.; And Others

    1986-01-01

    A study of correlations between analogical reasoning and Logo programming mastery among female high school students related the results of pretests of analogical reasoning to posttests of programming mastery. A significant correlation was found between analogical reasoning and the ability to write subprocedures for use in several different…

  15. Improving the learning of clinical reasoning through computer-based cognitive representation

    Directory of Open Access Journals (Sweden)

    Bian Wu

    2014-12-01

    Objective: Clinical reasoning is usually taught using a problem-solving approach, which is widely adopted in medical education. However, learning through problem solving is difficult as a result of the contextualization and dynamic aspects of actual problems. Moreover, knowledge acquired from problem-solving practice tends to be inert and fragmented. This study proposed a computer-based cognitive representation approach that externalizes and facilitates the complex processes in learning clinical reasoning. The approach is operationalized in a computer-based cognitive representation tool that involves argument mapping to externalize the problem-solving process and concept mapping to reveal the knowledge constructed from the problems. Methods: Twenty-nine Year 3 or higher students from a medical school in east China participated in the study. Participants used the proposed approach implemented in an e-learning system to complete four learning cases in 4 weeks on an individual basis. For each case, students interacted with the problem to capture critical data, generate and justify hypotheses, make a diagnosis, recall relevant knowledge, and update their conceptual understanding of the problem domain. Meanwhile, students used the computer-based cognitive representation tool to articulate and represent the key elements and their interactions in the learning process. Results: A significant improvement was found in students’ learning products from the beginning to the end of the study, consistent with students’ report of close-to-moderate progress in developing problem-solving and knowledge-construction abilities. No significant differences were found between the pretest and posttest scores within the 4-week period. The cognitive representation approach was found to provide more formative assessment. Conclusions: The computer-based cognitive representation approach improved the learning of clinical reasoning in both problem solving and knowledge

  16. Visual Reasoning in Computational Environment: A Case of Graph Sketching

    Science.gov (United States)

    Leung, Allen; Chan, King Wah

    2004-01-01

    This paper reports the case of a form six (grade 12) Hong Kong student's exploration of graph sketching in a computational environment. In particular, the student summarized his discovery in the form of two empirical laws. The student was interviewed and the interview data were used to map out a possible path of his visual reasoning. Critical…

  17. A Computational Account of Children's Analogical Reasoning: Balancing Inhibitory Control in Working Memory and Relational Representation

    Science.gov (United States)

    Morrison, Robert G.; Doumas, Leonidas A. A.; Richland, Lindsey E.

    2011-01-01

    Theories accounting for the development of analogical reasoning tend to emphasize either the centrality of relational knowledge accretion or changes in information processing capability. Simulations in LISA (Hummel & Holyoak, 1997, 2003), a neurally inspired computer model of analogical reasoning, allow us to explore how these factors may…

  18. Foundations for Reasoning in Cognition-Based Computational Representations of Human Decision Making

    International Nuclear Information System (INIS)

    Senglaub, Michael E.; Harris, David L.; Raybourn, Elaine M.

    2001-01-01

    In exploring the question of how humans reason in ambiguous situations or in the absence of complete information, we stumbled onto a body of knowledge that addresses issues beyond the original scope of our effort. We have begun to understand the importance that philosophy, in particular the work of C. S. Peirce, plays in developing models of human cognition and of information theory in general. We have a foundation that can serve as a basis for further studies in cognition and decision making. Peircean philosophy provides a foundation for understanding human reasoning and capturing behavioral characteristics of decision makers due to cultural, physiological, and psychological effects. The present paper describes this philosophical approach to understanding the underpinnings of human reasoning. We present the work of C. S. Peirce, and define sets of fundamental reasoning behavior that would be captured in the mathematical constructs of these newer technologies and would be able to interact in an agent type framework. Further, we propose the adoption of a hybrid reasoning model based on his work for future computational representations or emulations of human cognition.

  19. Real-time capture of student reasoning while writing

    Science.gov (United States)

    Franklin, Scott V.; Hermsen, Lisa M.

    2014-12-01

    We present a new approach to investigating student reasoning while writing: real-time capture of the dynamics of the writing process. Key-capture or video software is used to record the entire writing episode, including all pauses, deletions, insertions, and revisions. A succinct shorthand, "S notation," is used to highlight significant moments in the episode that may be indicative of shifts in understanding and can be used in followup interviews for triangulation. The methodology allows one to test the widespread belief that writing is a valuable pedagogical technique, which currently has little directly supportive research. To demonstrate the method, we present a case study of a writing episode. The data reveal an evolution of expression and articulation, discontinuous in both time and space. Distinct shifts in the tone and topic that follow long pauses and revisions are not restricted to the most recently written text. Real-time writing analysis, with its study of the temporal breaks and revision locations, can serve as a complementary tool to more traditional research methods (e.g., speak-aloud interviews) into student reasoning during the writing process.

  20. Reasoning about repairability of workflows at design time

    NARCIS (Netherlands)

    Tagni, Gaston; Ten Teije, Annette; Van Harmelen, Frank

    2009-01-01

    This paper describes an approach for reasoning about the repairability of workflows at design time. We propose a heuristic-based analysis of a workflow that aims at evaluating its definition, considering different design aspects and characteristics that affect its repairability (called repairability

  1. Neural correlates of belief-bias reasoning under time pressure: a near-infrared spectroscopy study.

    Science.gov (United States)

    Tsujii, Takeo; Watanabe, Shigeru

    2010-04-15

    The dual-process theory of reasoning explained the belief-bias effect, the tendency for human reasoning to be erroneously biased when logical conclusions are incongruent with beliefs about the world, by proposing a belief-based fast heuristic system and a logic-based slow analytic system. Although the claims were supported by behavioral findings that the belief-bias effect was enhanced when subjects were not given sufficient time for reasoning, the neural correlates were still unknown. The present study therefore examined the relationship between the time-pressure effect and activity in the inferior frontal cortex (IFC) during belief-bias reasoning using near-infrared spectroscopy (NIRS). Forty-eight subjects performed congruent and incongruent reasoning tasks, involving long-span (20 s) and short-span trials (10 s). Behavioral analysis found that only incongruent reasoning performance was impaired by the time pressure of short-span trials. NIRS analysis found that the time pressure decreased right IFC activity during incongruent trials. Correlation analysis showed that subjects with enhanced right IFC activity could perform better in incongruent trials, while subjects whose right IFC activity was impaired by the time pressure could not maintain better reasoning performance. These findings suggest that the right IFC may be responsible for the time-pressure effect in conflicting reasoning processes. When right IFC activity was impaired in the short-span trials, in which subjects were not given sufficient time for reasoning, the subjects may have relied on the fast heuristic system, which results in belief-bias responses. We therefore offer the first demonstration of neural correlates of the time-pressure effect on IFC activity in belief-bias reasoning.

  2. Real-time capture of student reasoning while writing

    Directory of Open Access Journals (Sweden)

    Scott V. Franklin

    2014-09-01

    We present a new approach to investigating student reasoning while writing: real-time capture of the dynamics of the writing process. Key-capture or video software is used to record the entire writing episode, including all pauses, deletions, insertions, and revisions. A succinct shorthand, “S notation,” is used to highlight significant moments in the episode that may be indicative of shifts in understanding and can be used in followup interviews for triangulation. The methodology allows one to test the widespread belief that writing is a valuable pedagogical technique, which currently has little directly supportive research. To demonstrate the method, we present a case study of a writing episode. The data reveal an evolution of expression and articulation, discontinuous in both time and space. Distinct shifts in the tone and topic that follow long pauses and revisions are not restricted to the most recently written text. Real-time writing analysis, with its study of the temporal breaks and revision locations, can serve as a complementary tool to more traditional research methods (e.g., speak-aloud interviews) into student reasoning during the writing process.

  3. The Theory of Reasoned Action as Parallel Constraint Satisfaction: Towards a Dynamic Computational Model of Health Behavior

    OpenAIRE

    Orr, Mark G.; Thrush, Roxanne; Plaut, David C.

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constrain...

  4. COMPUTER-BASED REASONING SYSTEMS: AN OVERVIEW

    Directory of Open Access Journals (Sweden)

    CIPRIAN CUCU

    2012-12-01

    Argumentation is nowadays seen both as a skill that people use in various aspects of their lives and as an educational technique that can support the transfer or creation of knowledge, thus aiding in the development of other skills (e.g., communication, critical thinking, or attitudes). However, teaching argumentation and teaching with argumentation is still a rare practice, mostly due to the lack of available resources such as time or expert human tutors that are specialized in argumentation. Intelligent computer systems (i.e., systems that implement an inner representation of particular knowledge and try to emulate the behavior of humans) could allow more people to understand the purpose, techniques and benefits of argumentation. The proposed paper investigates the state-of-the-art concepts of computer-based argumentation used in education and tries to develop a conceptual map, showing benefits, limitations and relations between various concepts, focusing on the duality “learning to argue – arguing to learn”.

  5. 12 CFR 1102.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 1102.27 Computing time. (a) General rule. In computing any period of time prescribed... time begins to run is not included. The last day so computed is included, unless it is a Saturday...
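
    The counting rule quoted in this and the following snippets is mechanical enough to state in code. The sketch below is an illustration of the general rule, not legal advice: the day of the triggering act is excluded, the last day is included, and a last day that falls on a Saturday, Sunday, or holiday rolls forward to the next business day. The holiday set is a placeholder.

    ```python
    # Illustrative deadline computation under the quoted counting rule.
    from datetime import date, timedelta

    HOLIDAYS: set[date] = set()  # placeholder; a real list would hold Federal holidays

    def deadline(act_date: date, period_days: int) -> date:
        """End of the period: the day of the act is excluded, the last day is
        included, but a weekend/holiday last day rolls to the next business day."""
        end = act_date + timedelta(days=period_days)  # excludes the act's own day
        while end.weekday() >= 5 or end in HOLIDAYS:  # 5 = Saturday, 6 = Sunday
            end += timedelta(days=1)
        return end

    # A 10-day period triggered on Friday 2010-01-01 ends Monday 2010-01-11.
    print(deadline(date(2010, 1, 1), 10))
    ```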

  6. 12 CFR 622.21 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 622.21 Computing time. (a) General rule. In computing any period of time prescribed or... run is not to be included. The last day so computed shall be included, unless it is a Saturday, Sunday...

  7. Moral and Property Harm as Consequence of Violation of Reasonable Time of Preliminary Investigation

    Directory of Open Access Journals (Sweden)

    Ibiamin N. Nuriev

    2017-08-01

    The relevance of the article is that it examines the provisions of criminal procedure law on the reasonable time of preliminary investigation and the consequences of violating that reasonable time. A formula is proposed that gives a clearer idea of reasonable time as the border, within criminal procedural time, between the conditionally permitted and the absolutely unacceptable. In the author's opinion, the consequence of a violation of the reasonable period is moral and (or) property harm to the person concerned. The author recommends that, in Federal Law No. 68-FZ of April 30, 2010, the concept of “significance of consequences” be replaced with the notion of “harm”.

  8. Reliability Assessment of Cloud Computing Platform Based on Semiquantitative Information and Evidential Reasoning

    Directory of Open Access Journals (Sweden)

    Hang Wei

    2016-01-01

    A reliability assessment method based on the evidential reasoning (ER) rule and semiquantitative information is proposed in this paper, in which a new reliability assessment architecture covering four aspects, with both quantitative data and qualitative knowledge, is established. The assessment architecture describes the complex, dynamic cloud computing environment more objectively than traditional methods. In addition, the ER rule, which performs well on multiple-attribute decision-making problems, is employed to integrate the different types of attributes in the assessment architecture, which can yield more accurate assessment results. The assessment results of the case study on an actual cloud computing platform verify the effectiveness and the advantage of the proposed method.
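
    For a sense of the machinery, the sketch below shows a Dempster-style conjunctive combination of two belief distributions over reliability grades; the ER rule cited above generalizes this kind of combination by weighting each source with reliability and importance factors. The masses are invented and only singleton grades are handled.

    ```python
    # Minimal Dempster-style combination over singleton grades (hypothetical data).
    def combine(m1: dict, m2: dict) -> dict:
        """Combine two basic probability assignments over the same grades."""
        grades = set(m1) | set(m2)
        joint = {g: m1.get(g, 0.0) * m2.get(g, 0.0) for g in grades}
        conflict = 1.0 - sum(joint.values())  # mass assigned to disagreeing pairs
        if conflict >= 1.0:
            raise ValueError("totally conflicting evidence")
        return {g: v / (1.0 - conflict) for g, v in joint.items()}

    # Two assessments of a platform's reliability grade:
    monitor = {"good": 0.7, "average": 0.2, "poor": 0.1}
    expert = {"good": 0.6, "average": 0.3, "poor": 0.1}
    print(combine(monitor, expert))  # combined mass concentrates on "good"
    ```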

  9. 12 CFR 908.27 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 908.27 Computing time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event...

  10. High-school students' reasoning while constructing plant growth models in a computer-supported educational environment

    Science.gov (United States)

    Ergazaki, Marida; Komis, Vassilis; Zogza, Vassiliki

    2005-08-01

    This paper highlights specific aspects of high-school students’ reasoning while coping with a modeling task of plant growth in a computer-supported educational environment. It is particularly concerned with the modeling levels (‘macro-phenomenological’ and ‘micro-conceptual’ level) activated by peers while exploring plant growth and with their ability to shift between or within these levels. The focus is on the types of reasoning developed in the modeling process, as well as on the reasoning coherence around the central concept of plant growth. The findings of the study show that a significant proportion of the 18 participating dyads perform modeling on both levels, while their ability to shift between them as well as between the various elements of the ‘micro-conceptual’ level is rather constrained. Furthermore, the reasoning types identified in peers’ modeling process are ‘convergent’, ‘serial’, ‘linked’ and ‘convergent attached’, with the first type being the most frequent. Finally, a significant part of the participating dyads display a satisfactory degree of reasoning ‘coherence’, performing their task committed to the main objective of exploring plant growth. Teaching implications of the findings are also discussed.

  11. Walking for Transportation: What do U.S. Adults Think is a Reasonable Distance and Time?

    Science.gov (United States)

    Watson, Kathleen B; Carlson, Susan A; Humbert-Rico, Tiffany; Carroll, Dianna D; Fulton, Janet E

    2015-06-16

    Less than one-third of U.S. adults walk for transportation. Public health strategies to increase transportation walking would benefit from knowing what adults think is a reasonable distance to walk. Our purpose was to determine 1) what adults think is a reasonable distance and amount of time to walk and 2) whether there were differences in minutes spent transportation walking by what adults think is reasonable. Analyses used a cross-sectional nationwide adult sample (n = 3653) participating in the 2010 Summer ConsumerStyles mail survey. Most adults (> 90%) think transportation walking is reasonable. However, less than half (43%) think walking a mile or more or for 20 minutes or more is reasonable. What adults think is reasonable is similar across most demographic subgroups, except for older adults (≥ 65 years), who think shorter distances and times are reasonable. Trend analyses that adjust for demographic characteristics indicate that adults who think longer distances and times are reasonable walk more. Walking for short distances is acceptable to most U.S. adults. Public health programs designed to encourage longer distance trips may wish to improve supports for transportation walking to make walking longer distances seem easier and more acceptable to most U.S. adults.

  12. 12 CFR 1780.11 - Computing time.

    Science.gov (United States)

    2010-01-01

    § 1780.11 Computing time. (a) General rule. In computing any period of time prescribed or allowed by this subpart, the date of the act or event that commences the designated period of time is not included. The last day so...

  13. 6 CFR 13.27 - Computation of time.

    Science.gov (United States)

    2010-01-01

    § 13.27 Computation of time. (a) In computing any period of time under this part or in an order issued...

  14. Speed of reasoning and its relation to reasoning ability

    NARCIS (Netherlands)

    Goldhammer, F.; Klein Entink, R.H.

    2011-01-01

    The study investigates empirical properties of reasoning speed which is conceived as the fluency of solving reasoning problems. Responses and response times in reasoning tasks are modeled jointly to clarify the covariance structure of reasoning speed and reasoning ability. To determine underlying

  15. Computer-Based Assessment of School Readiness and Early Reasoning

    Science.gov (United States)

    Csapó, Benő; Molnár, Gyöngyvér; Nagy, József

    2014-01-01

    This study explores the potential of using online tests for the assessment of school readiness and for monitoring early reasoning. Four tests of a face-to-face-administered school readiness test battery (speech sound discrimination, relational reasoning, counting and basic numeracy, and deductive reasoning) and a paper-and-pencil inductive…

  16. Noise-constrained switching times for heteroclinic computing

    Science.gov (United States)

    Neves, Fabio Schittler; Voit, Maximilian; Timme, Marc

    2017-03-01

    Heteroclinic computing offers a novel paradigm for universal computation by collective system dynamics. In such a paradigm, input signals are encoded as complex periodic orbits approaching specific sequences of saddle states. Without inputs, the relevant states together with the heteroclinic connections between them form a network of states—the heteroclinic network. Systems of pulse-coupled oscillators or spiking neurons naturally exhibit such heteroclinic networks of saddles, thereby providing a substrate for general analog computations. Several challenges need to be resolved before it becomes possible to effectively realize heteroclinic computing in hardware. The time scales on which computations are performed crucially depend on the switching times between saddles, which in turn are jointly controlled by the system's intrinsic dynamics and the level of external and measurement noise. The nonlinear dynamics of pulse-coupled systems often strongly deviate from that of time-continuously coupled (e.g., phase-coupled) systems. The factors impacting switching times in pulse-coupled systems are still not well understood. Here we systematically investigate switching times in dependence of the levels of noise and intrinsic dissipation in the system. We specifically reveal how local responses to pulses coact with external noise. Our findings confirm that, like in time-continuous phase-coupled systems, piecewise-continuous pulse-coupled systems exhibit switching times that transiently increase exponentially with the number of switches up to some order of magnitude set by the noise level. Complementarily, we show that switching times may constitute a good predictor for the computation reliability, indicating how often an input signal must be reiterated. By characterizing switching times between two saddles in conjunction with the reliability of a computation, our results provide a first step beyond the coding of input signal identities toward a complementary coding for
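
    As a rough numerical companion to the abstract, the sketch below simulates a noisy May-Leonard system, a standard three-saddle heteroclinic cycle, rather than the authors' pulse-coupled model; all parameters and noise levels are invented. It illustrates the qualitative point that stronger noise shortens the time spent near each saddle.

    ```python
    # Noisy May-Leonard system: mean time between saddle-to-saddle switches.
    import numpy as np

    def mean_switching_time(noise: float, T: float = 2000.0, dt: float = 0.01) -> float:
        rng = np.random.default_rng(42)
        a, b = 1.5, 0.8                  # b < 1 < a and a + b > 2: attracting cycle
        x = np.array([0.9, 0.05, 0.05])
        leader, last, intervals = 0, 0.0, []
        for k in range(int(T / dt)):
            growth = x * (1.0 - x - a * np.roll(x, 1) - b * np.roll(x, -1))
            x = x + dt * growth + noise * np.sqrt(dt) * rng.normal(size=3)
            x = np.clip(x, 1e-12, None)  # stay in the positive octant
            lead = int(np.argmax(x))
            if lead != leader:           # the trajectory moved on to a new saddle
                intervals.append(k * dt - last)
                leader, last = lead, k * dt
        return float(np.mean(intervals)) if intervals else float("inf")

    for eps in (1e-8, 1e-6, 1e-4):
        print(eps, mean_switching_time(eps))  # stronger noise -> faster switching
    ```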

  17. The theory of reasoned action as parallel constraint satisfaction: towards a dynamic computational model of health behavior.

    Directory of Open Access Journals (Sweden)

    Mark G Orr

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual's pre-existing belief structure and the beliefs of others in the individual's social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics.

  18. The theory of reasoned action as parallel constraint satisfaction: towards a dynamic computational model of health behavior.

    Science.gov (United States)

    Orr, Mark G; Thrush, Roxanne; Plaut, David C

    2013-01-01

    The reasoned action approach, although ubiquitous in health behavior theory (e.g., Theory of Reasoned Action/Planned Behavior), does not adequately address two key dynamical aspects of health behavior: learning and the effect of immediate social context (i.e., social influence). To remedy this, we put forth a computational implementation of the Theory of Reasoned Action (TRA) using artificial-neural networks. Our model re-conceptualized behavioral intention as arising from a dynamic constraint satisfaction mechanism among a set of beliefs. In two simulations, we show that constraint satisfaction can simultaneously incorporate the effects of past experience (via learning) with the effects of immediate social context to yield behavioral intention, i.e., intention is dynamically constructed from both an individual's pre-existing belief structure and the beliefs of others in the individual's social context. In a third simulation, we illustrate the predictive ability of the model with respect to empirically derived behavioral intention. As the first known computational model of health behavior, it represents a significant advance in theory towards understanding the dynamics of health behavior. Furthermore, our approach may inform the development of population-level agent-based models of health behavior that aim to incorporate psychological theory into models of population dynamics.
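
    An illustrative sketch of the parallel-constraint-satisfaction idea, not the authors' implementation: belief units and a social-context unit connect to an intention unit through signed weights, and the network is relaxed to a stable state. All units, weights, and inputs below are invented.

    ```python
    # Intention as constraint satisfaction over beliefs (hypothetical network).
    import numpy as np

    # Units 0-2: beliefs, unit 3: immediate social context, unit 4: intention.
    W = np.zeros((5, 5))
    W[4, 0], W[4, 1] = 0.8, 0.6  # supportive beliefs excite intention
    W[4, 2] = -0.9               # a barrier belief inhibits it
    W[4, 3] = 0.7                # social influence
    W = W + W.T                  # constraints act in both directions

    def settle(clamped: dict, steps: int = 50) -> np.ndarray:
        """Relax the network with some units clamped to external evidence."""
        a = np.zeros(5)
        for i, v in clamped.items():
            a[i] = v
        for _ in range(steps):
            a = np.tanh(W @ a)            # parallel update with squashing
            for i, v in clamped.items():  # re-impose the clamped evidence
                a[i] = v
        return a

    # Identical beliefs, different social context, different intention:
    print(settle({0: 1.0, 1: 1.0, 2: 1.0, 3: 1.0})[4])   # supportive peers
    print(settle({0: 1.0, 1: 1.0, 2: 1.0, 3: -1.0})[4])  # discouraging peers
    ```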

  19. Effects of computing time delay on real-time control systems

    Science.gov (United States)

    Shin, Kang G.; Cui, Xianzhong

    1988-01-01

    The reliability of a real-time digital control system depends not only on the reliability of the hardware and software used, but also on the speed in executing control algorithms. The latter is due to the negative effects of computing time delay on control system performance. For a given sampling interval, the effects of computing time delay are classified into the delay problem and the loss problem. Analysis of these two problems is presented as a means of evaluating real-time control systems. As an example, both the self-tuning predicted (STP) control and Proportional-Integral-Derivative (PID) control are applied to the problem of tracking robot trajectories, and their respective effects of computing time delay on control performance are comparatively evaluated. For this example, the STP (PID) controller is shown to outperform the PID (STP) controller in coping with the delay (loss) problem.
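
    A toy illustration of the delay problem described above, not the paper's STP/PID setup: a discrete PI controller drives a first-order plant, and each computed control value is applied several samples late to mimic computing time delay. The plant, gains, and delay are invented; the integrated tracking error grows as the delay lengthens.

    ```python
    # Effect of computing time delay on a simple sampled control loop.
    def run(delay_steps: int, n: int = 600, dt: float = 0.01) -> float:
        kp, ki = 6.0, 40.0                   # hypothetical PI gains
        x, integ, err_sum = 0.0, 0.0, 0.0
        pending = [0.0] * (delay_steps + 1)  # controls computed, not yet applied
        for _ in range(n):
            err = 1.0 - x                    # track a unit step
            integ += err * dt
            pending.append(kp * err + ki * integ)  # control computed this sample
            u = pending.pop(0)               # control actually applied now
            x += dt * (-x + u)               # plant: dx/dt = -x + u
            err_sum += abs(err) * dt
        return err_sum

    print(run(0))  # baseline: only the usual one-sample hold
    print(run(5))  # five extra samples of computing delay: larger error
    ```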

  20. Job resignation after cancer diagnosis among working survivors in Japan: timing, reasons and change of information needs over time.

    Science.gov (United States)

    Takahashi, Miyako; Tsuchiya, Miyako; Horio, Yoshitsugu; Funazaki, Hatsumi; Aogi, Kenjiro; Miyauchi, Kazue; Arai, Yasuaki

    2018-01-01

    Despite advances in work-related policies for cancer survivors, support systems for working survivors in healthcare settings in Japan remain underdeveloped. We aimed to reveal (i) the present situation of cancer survivors' job resignation, the timing of resignation, and reasons for resignation; (ii) healthcare providers' screening behaviors regarding cancer survivors' work-related difficulties and (iii) changes to cancer survivors' information/support needs over time since diagnosis. We conducted an anonymous, cross-sectional survey using a convenience sample of re-visiting outpatients at three cancer centers in Japan in 2015. The questionnaire covered participants' demographic and clinical characteristics, change to job status, timing of and reasons for job resignation, screening experience regarding work-related difficulties by healthcare providers, and information/support needs at four distinct timings (at diagnosis, between diagnosis and initial treatment, between initial treatment and return-to-work, and after return-to-work). Responses from 950 participants were eligible for statistical analysis. Only 23.5% of participants were screened about work-related issues by healthcare providers, despite 21.3% of participants reporting having resigned at least once. Among participants who resigned, 40.2% decided to do so before initial treatment began. Regarding reasons for resignation, self-regulating and pessimistic reasons were ranked highly. Respondents' work-related information and support needs were observed to change over time. While treatment-related information (schedule and cost) was ranked highly at diagnosis, the need for more individually tailored information and support on work increased after treatment began. This study provides important basic data for developing effective support systems for working survivors of cancer in hospital settings.

  1. Do medical students’ scores using different assessment instruments predict their scores in clinical reasoning using a computer-based simulation?

    Directory of Open Access Journals (Sweden)

    Fida, Mariam; Kassab, Salah Eldin

    2015-02-01

    Purpose: The development of clinical problem-solving skills evolves over time and requires structured training and background knowledge. Computer-based case simulations (CCS) have been used for teaching and assessment of clinical reasoning skills. However, previous studies examining the psychometric properties of CCS as an assessment tool have been controversial. Furthermore, studies reporting the integration of CCS into problem-based medical curricula have been limited. Methods: This study examined the psychometric properties of using CCS software (DxR Clinician) for assessment of medical students (n=130) studying in a problem-based, integrated multisystem module (Unit IX) during the academic year 2011–2012. Internal consistency reliability of CCS scores was calculated using Cronbach's alpha statistics. The relationships between students' scores in CCS components (clinical reasoning, diagnostic performance, and patient management) and their scores in other examination tools at the end of the unit, including multiple-choice questions, short-answer questions, objective structured clinical examination (OSCE), and real patient encounters, were analyzed using stepwise hierarchical linear regression. Results: Internal consistency reliability of CCS scores was high (α=0.862). Inter-item correlations between students' scores in different CCS components and their scores in CCS and other test items were statistically significant. Regression analysis indicated that OSCE scores predicted 32.7% and 35.1% of the variance in clinical reasoning and patient management scores, respectively (P<0.01). Multiple-choice question scores, however, predicted only 15.4% of the variance in diagnostic performance scores (P<0.01), while ...
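
    Cronbach's alpha, used above as the internal-consistency estimate, is straightforward to compute from a students-by-components score matrix. The sketch below uses randomly generated stand-in data, not the study's scores.

    ```python
    # Cronbach's alpha from a score matrix (rows: students, columns: components).
    import numpy as np

    def cronbach_alpha(scores: np.ndarray) -> float:
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()
        total_var = scores.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_vars / total_var)

    rng = np.random.default_rng(1)
    ability = rng.normal(size=(130, 1))                # shared trait
    items = ability + 0.5 * rng.normal(size=(130, 3))  # three correlated components
    print(round(cronbach_alpha(items), 3))             # high alpha, around 0.9
    ```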

  2. Cluster Computing for Embedded/Real-Time Systems

    Science.gov (United States)

    Katz, D.; Kepner, J.

    1999-01-01

    Embedded and real-time systems, like other computing systems, seek to maximize computing power for a given price, and thus can significantly benefit from the advancing capabilities of cluster computing.

  3. Fast algorithms for computing phylogenetic divergence time.

    Science.gov (United States)

    Crosby, Ralph W; Williams, Tiffani L

    2017-12-06

    The inference of species divergence time is a key step in most phylogenetic studies. Methods have been available for the last ten years to perform the inference, but the performance of the methods does not yet scale well to studies with hundreds of taxa and thousands of DNA base pairs. For example, a study of 349 primate taxa was estimated to require over 9 months of processing time. In this work, we present a new algorithm, AncestralAge, that significantly improves the performance of the divergence time process. As part of AncestralAge, we demonstrate a new method for the computation of phylogenetic likelihood and our experiments show a 90% improvement in likelihood computation time on the aforementioned dataset of 349 primate taxa with over 60,000 DNA base pairs. Additionally, we show that our new method for the computation of the Bayesian prior on node ages reduces the running time for this computation on the 349 taxa dataset by 99%. Through the use of these new algorithms we open up the ability to perform divergence time inference on large phylogenetic studies.
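
    The likelihood computation that dominates such analyses is typically Felsenstein's pruning algorithm. Below is a minimal single-site sketch of pruning under the Jukes-Cantor model; the tree shape, branch lengths, and observed bases are invented, and this is not the AncestralAge implementation.

    ```python
    # Felsenstein pruning for one site under Jukes-Cantor (hypothetical tree).
    import numpy as np

    def jc_prob(t: float) -> np.ndarray:
        """4x4 Jukes-Cantor transition-probability matrix for branch length t."""
        stay = 0.25 + 0.75 * np.exp(-4.0 * t / 3.0)
        move = 0.25 - 0.25 * np.exp(-4.0 * t / 3.0)
        return np.where(np.eye(4, dtype=bool), stay, move)

    BASE = {"A": 0, "C": 1, "G": 2, "T": 3}

    def cond_lik(node) -> np.ndarray:
        """Conditional likelihoods at the top of a node's branch.
        A node is (base, t) at a leaf or ((child, child), t) internally."""
        sub, t = node
        if isinstance(sub, str):   # leaf: observed base
            L = np.zeros(4)
            L[BASE[sub]] = 1.0
        else:                      # internal node: product over the two children
            L = cond_lik(sub[0]) * cond_lik(sub[1])
        return jc_prob(t) @ L

    # Site likelihood for ((A:0.1, C:0.1):0.05, A:0.2) with a uniform root:
    left = ((("A", 0.1), ("C", 0.1)), 0.05)
    right = ("A", 0.2)
    print(0.25 * (cond_lik(left) * cond_lik(right)).sum())
    ```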

  4. Measuring scientific reasoning through behavioral analysis in a computer-based problem solving exercise

    Science.gov (United States)

    Mead, C.; Horodyskyj, L.; Buxner, S.; Semken, S. C.; Anbar, A. D.

    2016-12-01

    Developing scientific reasoning skills is a common learning objective for general-education science courses. However, effective assessments for such skills typically involve open-ended questions or tasks, which must be hand-scored and may not be usable online. Using computer-based learning environments, reasoning can be assessed automatically by analyzing student actions within the learning environment. We describe such an assessment under development and present pilot results. In our content-neutral instrument, students solve a problem by collecting and interpreting data in a logical, systematic manner. We then infer reasoning skill automatically based on student actions. Specifically, students investigate why Earth has seasons, a scientifically simple but commonly misunderstood topic. Students are given three possible explanations and asked to select a set of locations on a world map from which to collect temperature data. They then explain how the data support or refute each explanation. The best approaches will use locations in both the Northern and Southern hemispheres to argue that the contrasting seasonality of the hemispheres supports only the correct explanation. We administered a pilot version to students at the beginning of an online, introductory science course (n = 223) as an optional extra credit exercise. We were able to categorize students' data collection decisions as more and less logically sound. Students who choose the most logical measurement locations earned higher course grades, but not significantly higher. This result is encouraging, but not definitive. In the future, we will clarify our results in two ways. First, we plan to incorporate more open-ended interactions into the assessment to improve the resolving power of this tool. Second, to avoid relying on course grades, we will independently measure reasoning skill with one of the existing hand-scored assessments (e.g., Critical Thinking Assessment Test) to cross-validate our new

  5. Computer network time synchronization the network time protocol

    CERN Document Server

    Mills, David L

    2006-01-01

    What started with the sundial has, thus far, been refined to a level of precision based on atomic resonance: Time. Our obsession with time is evident in this continued scaling down to nanosecond resolution and beyond. But this obsession is not without warrant. Precision and time synchronization are critical in many applications, such as air traffic control and stock trading, and pose complex and important challenges in modern information networks.Penned by David L. Mills, the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol
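
    The core of the on-wire exchange the book describes fits in a few lines: four timestamps give the client's clock offset and the round-trip delay, under the standard assumption of a symmetric network path. The timestamp values below are invented.

    ```python
    # NTP offset/delay from the four timestamps of one client-server exchange.
    def ntp_offset_delay(t1: float, t2: float, t3: float, t4: float):
        """t1: client send, t2: server receive, t3: server send, t4: client receive."""
        offset = ((t2 - t1) + (t3 - t4)) / 2.0  # how far the client clock lags
        delay = (t4 - t1) - (t3 - t2)           # network round-trip time
        return offset, delay

    # Client 0.150 s behind the server, 40 ms round trip, 10 ms server turnaround:
    print(ntp_offset_delay(t1=10.000, t2=10.170, t3=10.180, t4=10.050))
    # -> (0.15, 0.04)
    ```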

  6. Belief–logic conflict resolution in syllogistic reasoning: Inspection-time evidence for a parallel process model

    OpenAIRE

    Stupple, Edward J. N.; Ball, Linden

    2008-01-01

    An experiment is reported examining dual-process models of belief bias in syllogistic reasoning using a problem complexity manipulation and an inspection-time method to monitor processing latencies for premises and conclusions. Endorsement rates indicated increased belief bias on complex problems, a finding that runs counter to the “belief-first” selective scrutiny model, but which is consistent with other theories, including “reasoning-first” and “parallel-process” models. Inspection-time da...

  7. Geometric Reasoning for Automated Planning

    Science.gov (United States)

    Clement, Bradley J.; Knight, Russell L.; Broderick, Daniel

    2012-01-01

    An important aspect of mission planning for NASA's operation of the International Space Station is the allocation and management of space for supplies and equipment. The Stowage, Configuration Analysis, and Operations Planning teams collaborate to perform the bulk of that planning. A Geometric Reasoning Engine is developed in a way that can be shared by the teams to optimize item placement in the context of crew planning. The ISS crew spends (at the time of this writing) a third or more of their time moving supplies and equipment around. Better logistical support and optimized packing could make a significant impact on operational efficiency of the ISS. Currently, computational geometry and motion planning do not focus specifically on the optimized orientation and placement of 3D objects based on multiple distance and containment preferences and constraints. The software performs reasoning about the manipulation of 3D solid models in order to maximize an objective function based on distance. It optimizes for 3D orientation and placement. Spatial placement optimization is a general problem and can be applied to object packing or asset relocation.

  8. 5 CFR 890.101 - Definitions; time computations.

    Science.gov (United States)

    2010-01-01

    § 890.101 Definitions; time computations. (a) In this part, the terms annuitant, carrier, employee, employee... in section 8901 of title 5, United States Code, and supplement the following definitions: Appropriate...

  9. Competent Reasoning with Rational Numbers.

    Science.gov (United States)

    Smith, John P. III

    1995-01-01

    Analyzed students' reasoning with fractions. Found that skilled students applied strategies specifically tailored to restricted classes of fractions and produced reliable solutions with a minimum of computation effort. Results suggest that competent reasoning depends on a knowledge base that includes numerically specific and invented strategies,…

  10. Development of real-time visualization system for Computational Fluid Dynamics on parallel computers

    International Nuclear Information System (INIS)

    Muramatsu, Kazuhiro; Otani, Takayuki; Matsumoto, Hideki; Takei, Toshifumi; Doi, Shun

    1998-03-01

    A real-time visualization system for computational fluid dynamics, operating over a network connecting a parallel computing server and a client terminal, was developed. Using the system, a user at a client terminal can visualize the results of a CFD (Computational Fluid Dynamics) simulation while it is actually running on the parallel server. Through a GUI (Graphical User Interface) on the client terminal, the user can also change parameters of the analysis and visualization in real time during the calculation. The system carries out both the CFD simulation and the generation of pixel image data on the parallel computer, and compresses the data. The amount of data sent from the parallel computer to the client is therefore so much smaller than without compression that images appear swiftly and the user can work comfortably. Parallelization of image data generation is based on the Owner Computation Rule. The GUI on the client is built on a Java applet, so real-time visualization is possible on any client PC on which a Web browser is implemented.

  11. 29 CFR 1921.22 - Computation of time.

    Science.gov (United States)

    2010-07-01

    § 1921.22 Computation of time. Sundays and holidays shall be...

  12. Changes from 1986 to 2006 in reasons for liking leisure-time physical activity among adolescents.

    Science.gov (United States)

    Wold, B; Littlecott, H; Tynjälä, J; Samdal, O; Moore, L; Roberts, C; Kannas, L; Villberg, J; Aarø, L E

    2016-08-01

    Reasons for participating in physical activity (PA) may have changed in accordance with the general modernization of society. The aim is to examine changes in self-reported reasons for liking leisure-time physical activity (LTPA) and their association with self-reported LTPA over a 20-year period. Data were collected among nationally representative samples of 13-year-olds in Finland, Norway, and Wales in 1986 and 2006 (N = 9252) as part of the WHO cross-national Health Behaviour in School-aged Children (HBSC) study. Univariate ANOVAs to establish differences according to gender, year, and country were conducted. In all countries, 13-year-olds in 2006 tended to report higher importance in terms of achievement and social reasons than their counterparts in 1986, while changes in health reasons were minor. These reasons were associated with LTPA in a similar way at both time points. Health reasons for liking LTPA were considered most important, and were the strongest predictor of LTPA. The findings seem robust as they were consistent across countries and genders. Health education constitutes the most viable strategy for promoting adolescents' motivation for PA, and interventions and educational efforts could be improved by an increased focus on LTPA and sport as a social activity.

  13. General purpose computers in real time

    International Nuclear Information System (INIS)

    Biel, J.R.

    1989-01-01

    I see three main trends in the use of general purpose computers in real time. The first is more processing power. The second is the use of higher speed interconnects between computers (allowing more data to be delivered to the processors). The third is the use of larger programs running in the computers. Although there is still work that needs to be done, I believe that all indications are that the online need for general purpose computers should be available for the SSC and LHC machines.

  14. Modeling mental spatial reasoning about cardinal directions.

    Science.gov (United States)

    Schultheis, Holger; Bertel, Sven; Barkowsky, Thomas

    2014-01-01

    This article presents research into human mental spatial reasoning with orientation knowledge. In particular, we look at reasoning problems about cardinal directions that possess multiple valid solutions (i.e., are spatially underdetermined), at human preferences for some of these solutions, and at representational and procedural factors that lead to such preferences. The article presents, first, a discussion of existing, related conceptual and computational approaches; second, results of empirical research into the solution preferences that human reasoners actually have; and, third, a novel computational model that relies on a parsimonious and flexible spatio-analogical knowledge representation structure to robustly reproduce the behavior observed with human reasoners.
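
    For a concrete taste of reasoning with cardinal directions, the sketch below composes two relations by treating each as a prototype vector and returning the prototypical (arguably preferred) resulting direction, or reporting the underdetermined case. This is a common QSR-style illustration, not the paper's model; the vector encoding is an assumption.

    ```python
    # Composing cardinal-direction relations via prototype vectors (illustrative).
    DIRS = {"N": (0, 1), "NE": (1, 1), "E": (1, 0), "SE": (1, -1),
            "S": (0, -1), "SW": (-1, -1), "W": (-1, 0), "NW": (-1, 1)}
    NAMES = {v: k for k, v in DIRS.items()}

    def sign(x: int) -> int:
        return (x > 0) - (x < 0)

    def compose(r1: str, r2: str) -> str:
        """A is r1 of B and B is r2 of C: prototypical direction of A from C."""
        dx = DIRS[r1][0] + DIRS[r2][0]
        dy = DIRS[r1][1] + DIRS[r2][1]
        if dx == 0 and dy == 0:
            return "underdetermined"  # e.g. N then S admits many solutions
        return NAMES[(sign(dx), sign(dy))]

    print(compose("N", "E"))  # NE
    print(compose("N", "S"))  # underdetermined
    ```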

  15. Giving Devices the Ability to Exercise Reason

    Directory of Open Access Journals (Sweden)

    Thomas Keeley

    2008-10-01

One of the capabilities that separates humans from computers has been the ability to exercise "reason / judgment". Computers and computerized devices have provided excellent platforms for following rules, with computer programs providing the scripts for processing those rules. The exercise of reason, however, is more of an image-processing function than a function composed of a series of rules; it is more right brain than left brain, involving the interpretation of information and the balancing of inter-related alternatives. This paper discusses a new way to define and process information that gives devices the ability to exercise human-like reasoning and judgment. It discusses the characteristics of a "dynamic graphical language" in the context of addressing judgment, since judgment is often required to adjust rules when operating in a dynamic environment, and it touches on architecture issues and on how judgment is integrated with rule processing.

  16. Reasoning Abilities in Primary School: A Pilot Study on Poor Achievers vs. Normal Achievers in Computer Game Tasks

    Science.gov (United States)

    Dagnino, Francesca Maria; Ballauri, Margherita; Benigno, Vincenza; Caponetto, Ilaria; Pesenti, Elia

    2013-01-01

    This paper presents the results of preliminary research on the assessment of reasoning abilities in primary school poor achievers vs. normal achievers using computer game tasks. Subjects were evaluated by means of cognitive assessment on logical abilities and academic skills. The aim of this study is to better understand the relationship between…

  17. 37 CFR 350.5 - Time.

    Science.gov (United States)

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Time. 350.5 Section 350.5... RULES AND PROCEDURES GENERAL ADMINISTRATIVE PROVISIONS § 350.5 Time. (a) Computation. To compute the due... reasons why there is good cause for the delay; (5) The justification for the amount of additional time...

  18. Integration of domain and resource-based reasoning for real-time control in dynamic environments

    Science.gov (United States)

    Morgan, Keith; Whitebread, Kenneth R.; Kendus, Michael; Cromarty, Andrew S.

    1993-01-01

    A real-time software controller that successfully integrates domain-based and resource-based control reasoning to perform task execution in a dynamically changing environment is described. The design of the controller is based on the concept of partitioning the process to be controlled into a set of tasks, each of which achieves some process goal. It is assumed that, in general, there are multiple ways (tasks) to achieve a goal. The controller dynamically determines current goals and their current criticality, choosing and scheduling tasks to achieve those goals in the time available. It incorporates rule-based goal reasoning, a TMS-based criticality propagation mechanism, and a real-time scheduler. The controller has been used to build a knowledge-based situation assessment system that formed a major component of a real-time, distributed, cooperative problem solving system built under DARPA contract. It is also being employed in other applications now in progress.

  19. Qualitative Reasoning about Relative Directions : Computational Complexity and Practical Algorithm

    OpenAIRE

    Lee, Jae Hee

    2013-01-01

    Qualitative spatial reasoning (QSR) enables cognitive agents to reason about space using abstract symbols. Among several aspects of space (e.g., topology, direction, distance) directional information is useful for agents navigating in space. Observers typically describe their environment by specifying the relative directions in which they see other objects or other people from their point of view. As such, qualitative reasoning about relative directions, i.e., determining whether a given stat...

  20. Evidence in clinical reasoning: a computational linguistics analysis of 789,712 medical case summaries 1983-2012.

    Science.gov (United States)

    Seidel, Bastian M; Campbell, Steven; Bell, Erica

    2015-03-21

    Better understanding of clinical reasoning could reduce diagnostic error linked to 8% of adverse medical events and 30% of malpractice cases. To a greater extent than the evidence-based movement, the clinical reasoning literature asserts the importance of practitioner intuition—unconscious elements of diagnostic reasoning. The study aimed to analyse the content of case report summaries in ways that explored the importance of an evidence concept, not only in relation to research literature but also intuition. The study sample comprised all 789,712 abstracts in English for case reports contained in the database PUBMED for the period 1 January 1983 to 31 December 2012. It was hypothesised that, if evidence and intuition concepts were viewed by these clinical authors as essential to understanding their case reports, they would be more likely to be found in the abstracts. Computational linguistics software was used in 1) concept mapping of 21,631,481 instances of 201 concepts, and 2) specific concept analyses examining 200 paired co-occurrences for 'evidence' and research 'literature' concepts. 'Evidence' is a fundamentally patient-centred, intuitive concept linked to less common concepts about underlying processes, suspected disease mechanisms and diagnostic hunches. In contrast, the use of research literature in clinical reasoning is linked to more common reasoning concepts about specific knowledge and descriptions or presenting features of cases. 'Literature' is by far the most dominant concept, increasing in relevance since 2003, with an overall relevance of 13% versus 5% for 'evidence' which has remained static. The fact that the least present types of reasoning concepts relate to diagnostic hunches to do with underlying processes, such as what is suspected, raises questions about whether intuitive practitioner evidence-making, found in a constellation of dynamic, process concepts, has become less important. The study adds support to the existing corpus of

  1. Design of an EEG-based brain-computer interface (BCI) from standard components running in real-time under Windows.

    Science.gov (United States)

    Guger, C; Schlögl, A; Walterspacher, D; Pfurtscheller, G

    1999-01-01

    An EEG-based brain-computer interface (BCI) is a direct connection between the human brain and the computer. Such a communication system is needed by patients with severe motor impairments (e.g. late stage of Amyotrophic Lateral Sclerosis) and has to operate in real-time. This paper describes the selection of the appropriate components to construct such a BCI and focuses also on the selection of a suitable programming language and operating system. The multichannel system runs under Windows 95, equipped with a real-time Kernel expansion to obtain reasonable real-time operations on a standard PC. Matlab controls the data acquisition and the presentation of the experimental paradigm, while Simulink is used to calculate the recursive least square (RLS) algorithm that describes the current state of the EEG in real-time. First results of the new low-cost BCI show that the accuracy of differentiating imagination of left and right hand movement is around 95%.
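
    A minimal sketch of the recursive least squares (RLS) update named in this abstract, assuming a generic linear model; the variable names, forgetting factor, and toy data are illustrative and are not the authors' Simulink implementation:

```python
import numpy as np

def rls_update(w, P, x, d, lam=0.99):
    # One RLS step: update weights w and inverse-correlation matrix P
    # from input vector x and desired response d (forgetting factor lam).
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = d - w @ x                    # a-priori prediction error
    w = w + k * e                    # weight update
    P = (P - np.outer(k, Px)) / lam  # inverse-correlation update
    return w, P

# Toy usage: track a 2-tap model from noisy samples.
rng = np.random.default_rng(0)
w_true = np.array([0.5, -0.3])
w, P = np.zeros(2), 100.0 * np.eye(2)
for _ in range(500):
    x = rng.standard_normal(2)
    d = w_true @ x + 0.01 * rng.standard_normal()
    w, P = rls_update(w, P, x, d)
print(w)  # converges toward w_true
```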

  2. Recent achievements in real-time computational seismology in Taiwan

    Science.gov (United States)

    Lee, S.; Liang, W.; Huang, B.

    2012-12-01

Real-time computational seismology is now achievable, but it requires tight coupling between seismic databases and high-performance computing. We have developed a real-time moment tensor monitoring system (RMT) using continuous BATS records and a moment tensor inversion (CMT) technique. A real-time online earthquake simulation service (ROS) is also ready to open to researchers and for public earthquake science education. By combining RMT with ROS, an earthquake report based on computational seismology can be provided within 5 minutes of an earthquake occurring: RMT obtains the point-source information and ROS completes a 3D simulation, both in real time. For more information, visit the real-time computational seismology (RCS) earthquake report webpage.

  3. Adversarial reasoning: challenges and approaches

    Science.gov (United States)

    Kott, Alexander; Ownby, Michael

    2005-05-01

    This paper defines adversarial reasoning as computational approaches to inferring and anticipating an enemy's perceptions, intents and actions. It argues that adversarial reasoning transcends the boundaries of game theory and must also leverage such disciplines as cognitive modeling, control theory, AI planning and others. To illustrate the challenges of applying adversarial reasoning to real-world problems, the paper explores the lessons learned in the CADET -- a battle planning system that focuses on brigade-level ground operations and involves adversarial reasoning. From this example of current capabilities, the paper proceeds to describe RAID -- a DARPA program that aims to build capabilities in adversarial reasoning, and how such capabilities would address practical requirements in Defense and other application areas.

  4. 7 CFR 1.603 - How are time periods computed?

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 1 2010-01-01 2010-01-01 false How are time periods computed? 1.603 Section 1.603... Licenses General Provisions § 1.603 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2...
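
    The regulatory time-computation rules excerpted here and in the similar records in this collection share one convention: the day of the triggering act is excluded, the last day is included, and a deadline falling on a Saturday, Sunday, or holiday rolls forward. A minimal sketch of that convention, with a placeholder holiday list (an illustration, not an authoritative reading of any of these rules):

```python
from datetime import date, timedelta

HOLIDAYS = {date(2010, 7, 5)}  # illustrative only

def deadline(act_day: date, period_days: int) -> date:
    # Day of the act is not counted; the last day of the period is.
    due = act_day + timedelta(days=period_days)
    # Roll past weekends and holidays (weekday 5 = Sat, 6 = Sun).
    while due.weekday() >= 5 or due in HOLIDAYS:
        due += timedelta(days=1)
    return due

print(deadline(date(2010, 7, 1), 10))  # -> 2010-07-12 (the 11th is a Sunday)
```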

  5. Instruction timing for the CDC 7600 computer

    International Nuclear Information System (INIS)

    Lipps, H.

    1975-01-01

    This report provides timing information for all instructions of the Control Data 7600 computer, except for instructions of type 01X, to enable the optimization of 7600 programs. The timing rules serve as background information for timing charts which are produced by a program (TIME76) of the CERN Program Library. The rules that co-ordinate the different sections of the CPU are stated in as much detail as is necessary to time the flow of instructions for a given sequence of code. Instruction fetch, instruction issue, and access to small core memory are treated at length, since details are not available from the computer manuals. Annotated timing charts are given for 24 examples, chosen to display the full range of timing considerations. (Author)

  6. Exploring the Deep-Level Reasoning Questions Effect during Vicarious Learning among Eighth to Eleventh Graders in the Domains of Computer Literacy and Newtonian Physics

    Science.gov (United States)

    Gholson, Barry; Witherspoon, Amy; Morgan, Brent; Brittingham, Joshua K.; Coles, Robert; Graesser, Arthur C.; Sullins, Jeremiah; Craig, Scotty D.

    2009-01-01

    This paper tested the deep-level reasoning questions effect in the domains of computer literacy between eighth and tenth graders and Newtonian physics for ninth and eleventh graders. This effect claims that learning is facilitated when the materials are organized around questions that invite deep-reasoning. The literature indicates that vicarious…

  7. 50 CFR 221.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 50 Wildlife and Fisheries 7 2010-10-01 2010-10-01 false How are time periods computed? 221.3... Provisions § 221.3 How are time periods computed? (a) General. Time periods are computed as follows: (1) The day of the act or event from which the period begins to run is not included. (2) The last day of the...

  8. The Comparison of Inductive Reasoning under Risk Conditions between Chinese and Japanese Based on Computational Models: Toward the Application to CAE for Foreign Language

    Science.gov (United States)

    Zhang, Yujie; Terai, Asuka; Nakagawa, Masanori

    2013-01-01

    Inductive reasoning under risk conditions is an important thinking process not only for sciences but also in our daily life. From this viewpoint, it is very useful for language learning to construct computational models of inductive reasoning which realize the CAE for foreign languages. This study proposes the comparison of inductive reasoning…

  9. Closed to reason: time for accountability for the International Narcotic Control Board

    Directory of Open Access Journals (Sweden)

    Small Dan

    2007-05-01

International Harm Reduction Development Program (IHRD), joined by the former United Nations Special Envoy for HIV/AIDS in Africa, the respected Canadian statesman Stephen Lewis. The full report, "Closed to Reason: The International Narcotics Control Board and HIV/AIDS" (J. Csete and D. Wolfe, Canadian HIV/AIDS Legal Network; International Harm Reduction Development Program (IHRD); Open Society Institute (OSI); 2007:1-32), is attached [see Additional file 1], along with a Russian translation of the authors' key findings [see Additional file 2] and Russian and Chinese translations of this abstract [see Additional files 3 and 4]. As the report makes very clear, the time to inject some accountability and reason into the INCB is now. "How many times must a man look up / Before he can see the sky? / Yes, and how many ears must one man have / Before he can hear people cry? / Yes, and how many deaths will it take till he knows / That too many people have died?" (Bob Dylan)

  10. The Effects of Computer Programming on High School Students' Reasoning Skills and Mathematical Self-Efficacy and Problem Solving

    Science.gov (United States)

    Psycharis, Sarantos; Kallia, Maria

    2017-01-01

    In this paper we investigate whether computer programming has an impact on high school student's reasoning skills, problem solving and self-efficacy in Mathematics. The quasi-experimental design was adopted to implement the study. The sample of the research comprised 66 high school students separated into two groups, the experimental and the…

  11. Model Based Temporal Reasoning

    Science.gov (United States)

    Rabin, Marla J.; Spinrad, Paul R.; Fall, Thomas C.

    1988-03-01

Systems that assess the real world must cope with evidence that is uncertain, ambiguous, and spread over time. Typically, the most important function of an assessment system is to identify when activities are occurring that are unusual or unanticipated. Model based temporal reasoning addresses both of these requirements. The differences among temporal reasoning schemes lie in the methods used to avoid computational intractability. If we had n pieces of data and we wanted to examine how they were related, the worst case would be where we had to examine every subset of these points to see if that subset satisfied the relations. This would be 2^n, which is intractable. Models compress this: if several data points are all compatible with a model, then that model represents all those data points. Data points are then considered related if they lie within the same model or if they lie in models that are related. Models thus address the intractability problem. They also address the problem of determining unusual activities: if the data do not agree with models that are indicated by earlier data, then something out of the norm is taking place. The models can summarize what we know up to that time, so when they are not predicting correctly, either something unusual is happening or we need to revise our models. The model based reasoner developed at Advanced Decision Systems is thus both intuitive and powerful. It is currently being used on one operational system and several prototype systems. It has enough power to be used in domains spanning the spectrum from manufacturing engineering and project management to low-intensity conflict and strategic assessment.

  12. Real-time computational photon-counting LiDAR

    Science.gov (United States)

    Edgar, Matthew; Johnson, Steven; Phillips, David; Padgett, Miles

    2018-03-01

    The availability of compact, low-cost, and high-speed MEMS-based spatial light modulators has generated widespread interest in alternative sampling strategies for imaging systems utilizing single-pixel detectors. The development of compressed sensing schemes for real-time computational imaging may have promising commercial applications for high-performance detectors, where the availability of focal plane arrays is expensive or otherwise limited. We discuss the research and development of a prototype light detection and ranging (LiDAR) system via direct time of flight, which utilizes a single high-sensitivity photon-counting detector and fast-timing electronics to recover millimeter accuracy three-dimensional images in real time. The development of low-cost real time computational LiDAR systems could have importance for applications in security, defense, and autonomous vehicles.
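
    A minimal sketch of the direct time-of-flight range computation such a photon-counting LiDAR performs: a photon-arrival histogram is peak-detected and the bin delay converted to distance. The bin width and simulated histogram are illustrative assumptions, not the prototype's parameters:

```python
import numpy as np

C = 299_792_458.0    # speed of light, m/s
BIN_WIDTH = 50e-12   # 50 ps timing bins -> ~7.5 mm range resolution

def range_from_histogram(counts: np.ndarray) -> float:
    peak_bin = int(np.argmax(counts))  # strongest return
    t_flight = peak_bin * BIN_WIDTH    # round-trip time
    return C * t_flight / 2.0          # halve: light travels out and back

hist = np.zeros(4000)
hist[1333] = 120                       # simulated return at ~66.7 ns
print(f"{range_from_histogram(hist):.3f} m")  # ~9.99 m
```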

  13. The reason project

    International Nuclear Information System (INIS)

    Atwood, W.; Blankenbecler, R.; Kunz, P.F.; Mours, B.; Weir, A.; Word, G.

    1990-01-01

Reason is a software package that allows one to do physics analysis with the look and feel of the Apple Macintosh. It was implemented on a NeXT computer, which does not yet support the standard HEP packages for graphics and histogramming. This paper reviews our experiences with the program.

  14. TimeSet: A computer program that accesses five atomic time services on two continents

    Science.gov (United States)

    Petrakis, P. L.

    1993-01-01

TimeSet is a shareware program for accessing digital time services by telephone. At its initial release, it was capable of capturing time signals only from the U.S. Naval Observatory to set a computer's clock. Later the ability to synchronize with the National Institute of Standards and Technology was added. Now, in Version 7.10, TimeSet is able to access three additional telephone time services in Europe - in Sweden, Austria, and Italy - making a total of five official services addressable by the program. A companion program, TimeGen, allows yet another source of telephone time data strings for callers equipped with TimeSet version 7.10. TimeGen synthesizes UTC time data strings in the Naval Observatory's format from an accurately set and maintained DOS computer clock, and transmits them to callers. This allows an unlimited number of 'freelance' time generating stations to be created. Time setting from TimeGen is made feasible by the advent of Becker's RighTime, a shareware program that learns the drift characteristics of a computer's clock and continuously applies a correction to keep it accurate, and also brings 0.01-second resolution to the DOS clock. With clock regulation by RighTime and periodic update calls by the TimeGen station to an official time source via TimeSet, TimeGen offers the same degree of accuracy within the resolution of the computer clock as any official atomic time source.
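
    A minimal sketch of the clock-drift correction idea attributed to RighTime above: estimate the local clock's drift rate from two reference-time checks, then project the reference time forward between calls to a time service. The function names and numbers are illustrative assumptions, not RighTime's actual algorithm:

```python
def learn_drift(local_t1, ref_t1, local_t2, ref_t2):
    # Seconds of true (reference) time per second of local clock time.
    return (ref_t2 - ref_t1) / (local_t2 - local_t1)

def corrected(local_now, local_ref, ref_time, rate):
    # Project the last known reference time forward using the drift rate.
    return ref_time + rate * (local_now - local_ref)

# A local clock that gains ~2 s/day has a rate slightly below 1.
rate = learn_drift(0.0, 0.0, 86_400.0, 86_398.0)
# One local day after the second check, the corrected estimate:
print(corrected(172_800.0, 86_400.0, 86_398.0, rate))  # 172796.0
```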

  15. TV time but not computer time is associated with cardiometabolic risk in Dutch young adults.

    Science.gov (United States)

    Altenburg, Teatske M; de Kroon, Marlou L A; Renders, Carry M; Hirasing, Remy; Chinapaw, Mai J M

    2013-01-01

    TV time and total sedentary time have been positively related to biomarkers of cardiometabolic risk in adults. We aim to examine the association of TV time and computer time separately with cardiometabolic biomarkers in young adults. Additionally, the mediating role of waist circumference (WC) is studied. Data of 634 Dutch young adults (18-28 years; 39% male) were used. Cardiometabolic biomarkers included indicators of overweight, blood pressure, blood levels of fasting plasma insulin, cholesterol, glucose, triglycerides and a clustered cardiometabolic risk score. Linear regression analyses were used to assess the cross-sectional association of self-reported TV and computer time with cardiometabolic biomarkers, adjusting for demographic and lifestyle factors. Mediation by WC was checked using the product-of-coefficient method. TV time was significantly associated with triglycerides (B = 0.004; CI = [0.001;0.05]) and insulin (B = 0.10; CI = [0.01;0.20]). Computer time was not significantly associated with any of the cardiometabolic biomarkers. We found no evidence for WC to mediate the association of TV time or computer time with cardiometabolic biomarkers. We found a significantly positive association of TV time with cardiometabolic biomarkers. In addition, we found no evidence for WC as a mediator of this association. Our findings suggest a need to distinguish between TV time and computer time within future guidelines for screen time.
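
    A minimal sketch of the product-of-coefficient mediation check named in this abstract: path a (exposure to mediator) multiplied by path b (mediator to outcome, adjusted for exposure) estimates the indirect effect. The data below are synthetic, not the study's:

```python
import numpy as np

def ols(X, y):
    # Least-squares coefficients with an intercept column prepended.
    X1 = np.column_stack([np.ones(len(X)), X])
    return np.linalg.lstsq(X1, y, rcond=None)[0]

rng = np.random.default_rng(1)
n = 634
tv = rng.gamma(2.0, 1.0, n)                    # hours of TV (exposure)
wc = 80 + 1.5 * tv + rng.normal(0, 5, n)       # waist circumference (mediator)
tri = 1.0 + 0.02 * wc + rng.normal(0, 0.3, n)  # triglycerides (outcome)

a = ols(tv.reshape(-1, 1), wc)[1]              # exposure -> mediator
b = ols(np.column_stack([tv, wc]), tri)[2]     # mediator -> outcome, given exposure
print(f"indirect effect a*b = {a * b:.4f}")    # mediated part of the TV effect
```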

  16. Time-of-Flight Cameras in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2010-01-01

    Computer Graphics, Computer Vision and Human Machine Interaction (HMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real-time geometry...

  17. 29 CFR 4245.8 - Computation of time.

    Science.gov (United States)

    2010-07-01

    ... 29 Labor 9 2010-07-01 2010-07-01 false Computation of time. 4245.8 Section 4245.8 Labor Regulations Relating to Labor (Continued) PENSION BENEFIT GUARANTY CORPORATION INSOLVENCY, REORGANIZATION, TERMINATION, AND OTHER RULES APPLICABLE TO MULTIEMPLOYER PLANS NOTICE OF INSOLVENCY § 4245.8 Computation of...

  18. Computation Offloading for Frame-Based Real-Time Tasks under Given Server Response Time Guarantees

    Directory of Open Access Journals (Sweden)

    Anas S. M. Toma

    2014-11-01

Computation offloading has been adopted to improve the performance of embedded systems by offloading the computation of some tasks, especially computation-intensive tasks, to servers or clouds. This paper explores computation offloading for real-time tasks in embedded systems, provided given response time guarantees from the servers, to decide which tasks should be offloaded to get the results in time. We consider frame-based real-time tasks with the same period and relative deadline. When the execution order of the tasks is given, the problem can be solved in linear time. However, when the execution order is not specified, we prove that the problem is NP-complete. We develop a pseudo-polynomial-time algorithm for deriving feasible schedules, if they exist. An approximation scheme is also developed to trade off the approximation error against the complexity. Our algorithms are extended to minimize the period/relative deadline of the tasks for performance maximization. The algorithms are evaluated with a case study for a surveillance system and synthesized benchmarks.
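
    A minimal sketch of a feasibility check for this setting when the execution order and offload assignment are fixed: the local processor pays either a task's execution time or its upload time, and an offloaded task's result arrives a guaranteed response time after its upload completes. The task model is a simplifying assumption, not the paper's exact formulation:

```python
def feasible(tasks, deadline):
    # tasks: list of (local_time, upload_time, server_response_time, offload?)
    clock = 0.0          # local processor timeline
    latest_finish = 0.0
    for local, upload, response, offload in tasks:
        if offload:
            clock += upload              # local CPU only pays the upload
            finish = clock + response   # result due back from the server
        else:
            clock += local
            finish = clock
        latest_finish = max(latest_finish, finish)
    return latest_finish <= deadline

tasks = [(5.0, 1.0, 3.0, True), (4.0, 2.0, 2.0, False), (6.0, 1.5, 4.0, True)]
print(feasible(tasks, deadline=12.0))  # True: last result arrives at t = 10.5
```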

  19. Stochastic nonlinear time series forecasting using time-delay reservoir computers: performance and universality.

    Science.gov (United States)

    Grigoryeva, Lyudmila; Henriques, Julie; Larger, Laurent; Ortega, Juan-Pablo

    2014-07-01

Reservoir computing is a recently introduced machine learning paradigm that has already shown excellent performance in the processing of empirical data. We study a particular kind of reservoir computers called time-delay reservoirs that are constructed out of the sampling of the solution of a time-delay differential equation and show their good performance in the forecasting of the conditional covariances associated with multivariate discrete-time nonlinear stochastic processes of VEC-GARCH type as well as in the prediction of factual daily market realized volatilities computed with intraday quotes, using as training input daily log-return series of moderate size. We tackle some problems associated with the lack of task-universality for individually operating reservoirs and propose a solution based on the use of parallel arrays of time-delay reservoirs. Copyright © 2014 Elsevier Ltd. All rights reserved.
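
    A generic reservoir-computing sketch of the paradigm this paper builds on: a fixed random recurrent network is driven by the input series and only a linear ridge readout is trained. This is a software echo-state stand-in, not the authors' time-delay reservoir; sizes and scalings are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(42)
N, T = 200, 2000
u = np.sin(0.1 * np.arange(T + 1)) + 0.05 * rng.standard_normal(T + 1)

W_in = rng.uniform(-0.5, 0.5, N)
W = rng.standard_normal((N, N))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # spectral radius < 1 (echo state)

states = np.zeros((T, N))
r = np.zeros(N)
for t in range(T):
    r = np.tanh(W @ r + W_in * u[t])       # reservoir state update
    states[t] = r

# Ridge-regression readout: predict u[t+1] from the state at time t.
lam = 1e-6
A = states.T @ states + lam * np.eye(N)
w_out = np.linalg.solve(A, states.T @ u[1:T + 1])
pred = states @ w_out
print("train MSE:", np.mean((pred - u[1:T + 1]) ** 2))
```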

  20. Solving probability reasoning based on DNA strand displacement and probability modules.

    Science.gov (United States)

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind." It has been shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
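
    The arithmetic that such probability modules implement, shown here in ordinary code rather than strand-displacement chemistry: the law of total probability and the conditional (Bayes) inversion. The numbers are illustrative assumptions:

```python
p_A = [0.3, 0.5, 0.2]            # prior over hypotheses A1..A3
p_B_given_A = [0.9, 0.4, 0.1]    # P(B | Ai)

# Total probability: P(B) = sum_i P(B | Ai) * P(Ai).
p_B = sum(pa * pb for pa, pb in zip(p_A, p_B_given_A))

# Conditional inversion: P(Ai | B) = P(B | Ai) * P(Ai) / P(B).
posterior = [pa * pb / p_B for pa, pb in zip(p_A, p_B_given_A)]

print(f"P(B) = {p_B:.3f}")                               # 0.490
print("P(Ai | B) =", [round(p, 3) for p in posterior])   # [0.551, 0.408, 0.041]
```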

  1. Artificial intelligence approach to legal reasoning

    International Nuclear Information System (INIS)

    Gardner, A.V.D.L.

    1984-01-01

    For artificial intelligence, understanding the forms of human reasoning is a central goal. Legal reasoning is a form that makes a new set of demands on artificial intelligence methods. Most importantly, a computer program that reasons about legal problems must be able to distinguish between questions it is competent to answer and questions that human lawyers could seriously argue either way. In addition, a program for analyzing legal problems should be able to use both general legal rules and decisions in past cases; and it should be able to work with technical concepts that are only partly defined and subject to shifts of meaning. Each of these requirements has wider applications in artificial intelligence, beyond the legal domain. This dissertation presents a computational framework for legal reasoning, within which such requirements can be accommodated. The development of the framework draws significantly on the philosophy of law, in which the elucidation of legal reasoning is an important topic. A key element of the framework is the legal distinction between hard cases and clear cases. In legal writing, this distinction has been taken for granted more often than it has been explored. Here, some initial heuristics are proposed by which a program might make the distinction

  2. Changes from 1986 to 2006 in reasons for liking leisure-time physical activity among adolescents

    OpenAIRE

    Wold, Bente; Littlecott, H.; Tynjala, J; Samdal, Oddrun; Moore, L; Roberts, C; Kannas, L; Villberg, J; Aarø, Leif Edvard

    2015-01-01

    Reasons for participating in physical activity (PA) may have changed in accordance with the general modernization of society. The aim is to examine changes in self-reported reasons for liking leisure-time physical activity (LTPA) and their association with self-reported LTPA over a 20-year period. Data were collected among nationally representative samples of 13-year-olds in Finland, Norway, and Wales in 1986 and 2006 (N = 9252) as part of the WHO cross-national Health Behaviour in School-age...

  3. Students' inductive reasoning skills and the relevance of prior knowledge: an exploratory study with a computer-based training course on the topic of acne vulgaris.

    Science.gov (United States)

    Horn-Ritzinger, Sabine; Bernhardt, Johannes; Horn, Michael; Smolle, Josef

    2011-04-01

The importance of inductive instruction in medical education is growing. Little is known about the relevance of prior knowledge to students' inductive reasoning abilities. The purpose is to evaluate this inductive teaching method as a means of fostering higher levels of learning and to explore how individual differences in prior knowledge (high [HPK] vs. low [LPK]) contribute to students' inductive reasoning skills. Twenty-six LPK and 18 HPK students could train twice with an interactive computer-based training object to discover the underlying concept before doing the final comprehension check. Students had a median of 76.9% of correct answers in the first training, 90.9% in the second, and answered 92% of the final assessment questions correctly. More importantly, 86% of all students succeeded with inductive learning, among them 83% of the HPK students and 89% of the LPK students. Prior knowledge did not predict performance on overall comprehension. This inductive instructional strategy fostered students' deep approaches to learning in a time-effective way.

  4. 26 CFR 301.7508-1 - Time for performing certain acts postponed by reason of service in a combat zone.

    Science.gov (United States)

    2010-04-01

    ... postponed by reason of service in a combat zone. (a) General rule. The period of time that may be... 26 Internal Revenue 18 2010-04-01 2010-04-01 false Time for performing certain acts postponed by reason of service in a combat zone. 301.7508-1 Section 301.7508-1 Internal Revenue INTERNAL REVENUE...

  5. Computation of a long-time evolution in a Schroedinger system

    International Nuclear Information System (INIS)

    Girard, R.; Kroeger, H.; Labelle, P.; Bajzer, Z.

    1988-01-01

    We compare different techniques for the computation of a long-time evolution and the S matrix in a Schroedinger system. As an application we consider a two-nucleon system interacting via the Yamaguchi potential. We suggest computation of the time evolution for a very short time using Pade approximants, the long-time evolution being obtained by iterative squaring. Within the technique of strong approximation of Moller wave operators (SAM) we compare our calculation with computation of the time evolution in the eigenrepresentation of the Hamiltonian and with the standard Lippmann-Schwinger solution for the S matrix. We find numerical agreement between these alternative methods for time-evolution computation up to half the number of digits of internal machine precision, and fairly rapid convergence of both techniques towards the Lippmann-Schwinger solution
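
    A minimal sketch of the scheme the abstract describes: build the propagator for a very short step with a low-order Pade (Cayley) approximant, then reach long times by iterative squaring, U(2t) = U(t)^2. The Hamiltonian below is an arbitrary small Hermitian matrix, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(7)
H = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
H = (H + H.conj().T) / 2                  # make it Hermitian

dt, n_squarings = 1e-3, 10                # total time = dt * 2**10
I = np.eye(4)
# (1,1) Pade approximant of exp(-iH dt): the unitary Cayley form.
U = np.linalg.solve(I + 0.5j * H * dt, I - 0.5j * H * dt)
for _ in range(n_squarings):
    U = U @ U                             # each squaring doubles the time

# Compare with direct diagonalization at the final time.
t = dt * 2 ** n_squarings
w, V = np.linalg.eigh(H)
U_exact = V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T
print("max deviation:", np.abs(U - U_exact).max())  # small
```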

  6. Imprecise results: Utilizing partial computations in real-time systems

    Science.gov (United States)

    Lin, Kwei-Jay; Natarajan, Swaminathan; Liu, Jane W.-S.

    1987-01-01

    In real-time systems, a computation may not have time to complete its execution because of deadline requirements. In such cases, no result except the approximate results produced by the computations up to that point will be available. It is desirable to utilize these imprecise results if possible. Two approaches are proposed to enable computations to return imprecise results when executions cannot be completed normally. The milestone approach records results periodically, and if a deadline is reached, returns the last recorded result. The sieve approach demarcates sections of code which can be skipped if the time available is insufficient. By using these approaches, the system is able to produce imprecise results when deadlines are reached. The design of the Concord project is described which supports imprecise computations using these techniques. Also presented is a general model of imprecise computations using these techniques, as well as one which takes into account the influence of the environment, showing where the latter approach fits into this model.
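
    A minimal sketch of the milestone approach described above: the computation records a usable partial result at regular checkpoints, and if the deadline arrives first, the last recorded (imprecise) result is returned. The Leibniz series for pi is a stand-in workload, not the paper's example:

```python
import time

def milestone_pi(deadline_s: float, checkpoint_every: int = 10_000):
    start = time.monotonic()
    total, sign, k = 0.0, 1.0, 0
    milestone = 0.0                      # last recorded imprecise result
    while True:
        total += sign / (2 * k + 1)      # Leibniz series term
        sign = -sign
        k += 1
        if k % checkpoint_every == 0:
            milestone = 4 * total        # record a usable partial result
            if time.monotonic() - start > deadline_s:
                return milestone, k      # deadline: return imprecise result

approx, terms = milestone_pi(deadline_s=0.05)
print(f"pi ~ {approx} after {terms} terms (imprecise, deadline-bounded)")
```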

  7. Real-Time Thevenin Impedance Computation

    DEFF Research Database (Denmark)

    Sommer, Stefan Horst; Jóhannsson, Hjörtur

    2013-01-01

operating state, and strict time constraints are difficult to adhere to as the complexity of the grid increases. Several suggested approaches for real-time stability assessment require Thevenin impedances to be determined for the observed system conditions. By combining matrix factorization, graph reduction, and parallelization, we develop an algorithm for computing Thevenin impedances an order of magnitude faster than previous approaches. We test the factor-and-solve algorithm with data from several power grids of varying complexity, and we show how the algorithm allows real-time stability assessment of complex power...
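
    A minimal sketch of the underlying computation: the Thevenin impedance seen from bus k is the k-th diagonal entry of the inverse nodal admittance matrix, obtainable from a single LU factorization plus one solve per bus of interest. The 3-bus admittance matrix below (with small shunt terms so it is nonsingular) is an illustrative assumption, not the paper's factor-and-solve algorithm:

```python
import numpy as np
from scipy.linalg import lu_factor, lu_solve

Y = np.array([[ 11-32j,  -5+15j,  -5+15j],
              [ -5+15j,  16-46j, -10+30j],
              [ -5+15j, -10+30j,  16-46j]])  # nodal admittance matrix (S)

lu, piv = lu_factor(Y)                       # factor once, reuse per bus
for k in range(3):
    e = np.zeros(3, dtype=complex)
    e[k] = 1.0                               # unit current injection at bus k
    z_th = lu_solve((lu, piv), e)[k]         # entry k of column k of Y^-1
    print(f"bus {k}: Z_th = {z_th:.4f} ohm")
```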

  8. Intuitionistic Fuzzy Time Series Forecasting Model Based on Intuitionistic Fuzzy Reasoning

    Directory of Open Access Journals (Sweden)

    Ya’nan Wang

    2016-01-01

Fuzzy set theory cannot describe data comprehensively, which has greatly limited the objectivity of fuzzy time series in uncertain data forecasting. In this regard, an intuitionistic fuzzy time series forecasting model is built. In the new model, a fuzzy clustering algorithm is used to divide the universe of discourse into unequal intervals, and a more objective technique for ascertaining the membership function and nonmembership function of the intuitionistic fuzzy set is proposed. On these bases, forecast rules based on intuitionistic fuzzy approximate reasoning are established. Finally, contrast experiments on the enrollments of the University of Alabama and the Taiwan Stock Exchange Capitalization Weighted Stock Index are carried out. The results show that the new model has a clear advantage in improving forecast accuracy.

  9. Reasoning about the past

    DEFF Research Database (Denmark)

    Nielsen, Mogens

    1998-01-01

    In this extended abstract, we briefly recall the abstract (categorical) notion of bisimulation from open morphisms, as introduced by Joyal, Nielsen and Winskel. The approach is applicable across a wide range of models of computation, and any such bisimulation comes automatically with characterist...... of reasoning about the past....

  10. Probabilistic Reasoning for Robustness in Automated Planning

    Science.gov (United States)

    Schaffer, Steven; Clement, Bradley; Chien, Steve

    2007-01-01

A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as times taken to perform tasks and amounts of resources to be consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain a probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive a search toward planning low-risk actions. An output plan provides a balance between the user's specified averseness to risk and other measures of optimality.
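
    A minimal sketch of the scoring idea described above: combine per-task Gaussian resource-use distributions and integrate the tail beyond capacity to obtain a conflict probability. Independence is assumed and the truncation of the bounded Gaussians is ignored; all numbers are illustrative:

```python
from math import erf, sqrt

def conflict_probability(tasks, capacity):
    # tasks: list of (mean_use, std_use); independent Gaussian resource use.
    mu = sum(m for m, s in tasks)
    var = sum(s * s for m, s in tasks)
    # P(total use > capacity) for a Normal(mu, var) total.
    z = (capacity - mu) / sqrt(var)
    return 0.5 * (1.0 - erf(z / sqrt(2.0)))

plan = [(10.0, 1.0), (7.0, 2.0), (12.0, 1.5)]  # e.g. watt-hours per task
print(f"P(conflict) = {conflict_probability(plan, capacity=35.0):.4f}")  # ~0.013
```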

  11. Different strategies in solving series completion inductive reasoning problems: an fMRI and computational study.

    Science.gov (United States)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Zhong, Ning; Li, Kuncheng

    2014-08-01

The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series and letter series completion tasks, even when the underlying rules are identical. In the present study, we examined cortical activation as a function of two different reasoning strategies for solving series completion tasks. The retrieval strategy, used in number series completion tasks, involves directly retrieving arithmetic knowledge to get the relations between items. The procedural strategy, used in letter series completion tasks, requires counting a certain number of times to detect the relations linking two items. The two strategies require essentially equivalent cognitive processes, but have different working memory demands (the procedural strategy incurs greater demands). The procedural strategy produced significantly greater activity in areas involved in memory retrieval (dorsolateral prefrontal cortex, DLPFC) and mental representation/maintenance (posterior parietal cortex, PPC). An ACT-R model of the tasks successfully predicted behavioral performance and BOLD responses. The present findings support a general-purpose dual-process theory of inductive reasoning regarding the cognitive architecture. Copyright © 2014 Elsevier B.V. All rights reserved.

  12. Spying on real-time computers to improve performance

    International Nuclear Information System (INIS)

    Taff, L.M.

    1975-01-01

    The sampled program-counter histogram, an established technique for shortening the execution times of programs, is described for a real-time computer. The use of a real-time clock allows particularly easy implementation. (Auth.)

  13. Residential normalcy and environmental experiences of very old people: changes in residential reasoning over time.

    Science.gov (United States)

    Granbom, Marianne; Himmelsbach, Ines; Haak, Maria; Löfqvist, Charlotte; Oswald, Frank; Iwarsson, Susanne

    2014-04-01

The decision to relocate in old age is intricately linked to thoughts and desires to stay put. However, most research focuses either on strategies that allow people to age in place or on their reasons for relocation. There is a need for more knowledge on very old people's residential reasoning, including thoughts about aging in place and thoughts about relocation as one intertwined process evolving in everyday life. The aim of this study was to explore what we refer to as the process of residential reasoning and how it changes over time among very old people, and to contribute to the theoretical development regarding aging in place and relocation. Taking a longitudinal perspective, data stem from the ENABLE-AGE In-depth Study, with interviews conducted in 2003 followed up in interviews in 2011. The 16 participants of the present study were 80-89 years at the time of the first interview. During analysis the Theoretical Model of Residential Normalcy by Golant and the Life Course Model of Environmental Experience by Rowles & Watkins were used as sensitizing concepts. The findings revealed changes in the process of residential reasoning that related to a wide variety of issues. Such issues included the way very old people use their environmental experience, their striving to build upon or dismiss attachment to place, and their attempts to maintain or regain residential normalcy during years of declining health and loss of independence. In addition, the changes in reasoning were related to end-of-life issues. The findings contribute to the theoretical discussion on aging in place, relocation as a coping strategy, and reattachment after moving in very old age. Copyright © 2014 Elsevier Inc. All rights reserved.

  14. A Distributed Computing Network for Real-Time Systems.

    Science.gov (United States)

    1980-11-03

[OCR-garbled report documentation page; recoverable information: Naval Underwater Systems Center, Newport, RI; technical document TD 5932, "A Distributed Computing Network for Real-Time Systems", November 1980; author field rendered as "Gordon E/Morson".]

  15. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.

  16. 43 CFR 45.3 - How are time periods computed?

    Science.gov (United States)

    2010-10-01

    ... 43 Public Lands: Interior 1 2010-10-01 2010-10-01 false How are time periods computed? 45.3... IN FERC HYDROPOWER LICENSES General Provisions § 45.3 How are time periods computed? (a) General... run is not included. (2) The last day of the period is included. (i) If that day is a Saturday, Sunday...

  17. Meta-Reasoning: Monitoring and Control of Thinking and Reasoning.

    Science.gov (United States)

    Ackerman, Rakefet; Thompson, Valerie A

    2017-08-01

    Meta-Reasoning refers to the processes that monitor the progress of our reasoning and problem-solving activities and regulate the time and effort devoted to them. Monitoring processes are usually experienced as feelings of certainty or uncertainty about how well a process has, or will, unfold. These feelings are based on heuristic cues, which are not necessarily reliable. Nevertheless, we rely on these feelings of (un)certainty to regulate our mental effort. Most metacognitive research has focused on memorization and knowledge retrieval, with little attention paid to more complex processes, such as reasoning and problem solving. In that context, we recently developed a Meta-Reasoning framework, used here to review existing findings, consider their consequences, and frame questions for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Context based support for Clinical Reasoning

    DEFF Research Database (Denmark)

    Vilstrup Pedersen, Klaus

    2004-01-01

In many areas of the medical domain, the decision process, i.e. reasoning, involving health care professionals is distributed, cooperative and complex. Computer-based decision support systems have usually focused on the outcome of the decision making and treated it as a single task. In this paper a framework for a Clinical Reasoning Knowledge Warehouse (CRKW) is presented, intended to support the reasoning process by providing the decision participants with an analysis platform that captures and enhances information and knowledge. The CRKW mixes theories and models from Artificial Intelligence, Knowledge Management Systems and Business Intelligence to make context-sensitive, patient-case-specific analysis and knowledge management. The knowledge base consists of patient health records, reasoning process information and clinical guidelines. Patient specific information and knowledge

  19. Relativistic Photoionization Computations with the Time Dependent Dirac Equation

    Science.gov (United States)

    2016-10-12

Naval Research Laboratory, Washington, DC 20375-5320, report NRL/MR/6795--16-9698: "Relativistic Photoionization Computations with the Time Dependent Dirac Equation", Daniel F. Gordon and Bahman Hafizi, Naval Research Laboratory, 4555 Overlook Avenue, SW. [Remainder of the report documentation page is extraction residue; recoverable keywords: tunneling, photoionization, ionization of inner shell electrons by laser.]

  20. A research on applications of qualitative reasoning techniques in Human Acts Simulation Program

    International Nuclear Information System (INIS)

    Far, B.H.

    1992-04-01

Human Acts Simulation Program (HASP) is a ten-year research project of the Computing and Information Systems Center of JAERI. In HASP the goal is developing programs for an advanced intelligent robot to accomplish multiple instructions (for instance, related to surveillance, inspection and maintenance) in nuclear power plants. Some recent artificial intelligence techniques can contribute to this project. This report introduces some original contributions concerning the application of Qualitative Reasoning (QR) techniques in HASP. The focus is on the knowledge-intensive tasks, including model-based reasoning, analytic learning, fault diagnosis and functional reasoning. Multi-level extended qualitative modeling for Skill-Rule-Knowledge (S-R-K) based reasoning, which includes the coordination and timing of events, Qualitative Sensitivity Analysis (QSA), Subjective Qualitative Fault Diagnosis (SQFD) and Qualitative Function Formation (QFF) techniques are introduced. (author) 123 refs

  1. Real-time qualitative reasoning for telerobotic systems

    Science.gov (United States)

Pin, Francois G.

    1993-01-01

This paper discusses the sensor-based telerobotic driving of a car in a-priori unknown environments using 'human-like' reasoning schemes implemented on custom-designed VLSI fuzzy inferencing boards. These boards use the Fuzzy Set theoretic framework to allow very fast (30 kHz) processing of full sets of information that are expressed in qualitative form using membership functions. The sensor-based and fuzzy inferencing system was incorporated on an outdoor test-bed platform to investigate two control modes for driving a car on the basis of very sparse and imprecise range data. In the first mode, the car navigates fully autonomously to a goal specified by the operator, while in the second mode, the system acts as a telerobotic driver's aid providing the driver with linguistic (fuzzy) commands to turn left or right, speed up, slow down, stop, or back up depending on the obstacles perceived by the sensors. Indoor and outdoor experiments with both modes of control are described in which the system uses only three acoustic range (sonar) sensor channels to perceive the environment. Sample results are presented that illustrate the feasibility of developing autonomous navigation modules and robust, safety-enhancing driver's aids for telerobotic systems using the new fuzzy inferencing VLSI hardware and 'human-like' reasoning schemes.

  2. Are there reasons to challenge a symbolic computationalist approach in explaining deductive reasoning?

    Science.gov (United States)

    Faiciuc, Lucia E

    2008-06-01

The majority of the existing theories explaining deductive reasoning could be included in a classic computationalist approach to cognitive processes. In fact, deductive reasoning could be seen as the pinnacle of symbolic computationalism, its last fortress to be defended in the face of new, dynamic, and ecological perspectives on cognition. But are there weak points in that position regarding deductive reasoning? What would be the reasons for which new perspectives could gain in credibility? What could be their most important tenets? The answers given to those questions in the paper include two main points. The first one is that the present empirical data could not sustain unambiguously one view over the other, that they are obtained in artificial experimental conditions, and that there are data that are not easily explainable using the traditional computationalist paradigm. The second one is that approaching deductive reasoning from dynamic and ecological perspectives could have significant advantages. The most obvious one is the possibility to integrate more easily the research regarding deductive reasoning with the results obtained in other domains of psychology (especially with respect to lower cognitive processes), in artificial intelligence, or in neurophysiology. The reasons for that would be that such perspectives, as sketched in the paper, would imply, essentially, processes of second-order pattern formation and recognition (as is the case for perception), embodied cognition, and dynamic processes like those of the brain.

  3. Ethical Guidelines for Computer Security Researchers: "Be Reasonable"

    Science.gov (United States)

    Sassaman, Len

    For most of its existence, the field of computer science has been lucky enough to avoid ethical dilemmas by virtue of its relatively benign nature. The subdisciplines of programming methodology research, microprocessor design, and so forth have little room for the greater questions of human harm. Other, more recently developed sub-disciplines, such as data mining, social network analysis, behavioral profiling, and general computer security, however, open the door to abuse of users by practitioners and researchers. It is therefore the duty of the men and women who chart the course of these fields to set rules for themselves regarding what sorts of actions on their part are to be considered acceptable and what should be avoided or handled with caution out of ethical concerns. This paper deals solely with the issues faced by computer security researchers, be they vulnerability analysts, privacy system designers, malware experts, or reverse engineers.

  4. Semantical Markov Logic Network for Distributed Reasoning in Cyber-Physical Systems

    Directory of Open Access Journals (Sweden)

    Abdul-Wahid Mohammed

    2017-01-01

The challenges associated with developing accurate models for cyber-physical systems are attributable to the intrinsic concurrent and heterogeneous computations of these systems. Even though reasoning based on interconnected domain specific ontologies shows promise in enhancing modularity and joint functionality modelling, it has become necessary to build interoperable cyber-physical systems due to the growing pervasiveness of these systems. In this paper, we propose a semantically oriented distributed reasoning architecture for cyber-physical systems. This model accomplishes reasoning through a combination of heterogeneous models of computation. Using the flexibility of semantic agents as a formal representation for heterogeneous computational platforms, we define an autonomous and intelligent agent-based reasoning procedure for distributed cyber-physical systems. Sensor networks underpin the semantic capabilities of this architecture, and semantic reasoning based on Markov logic networks is adopted to address uncertainty in modelling. To illustrate feasibility of this approach, we present a Markov logic based semantic event model for cyber-physical systems and discuss a case study of event handling and processing in a smart home.

  5. STICK: Spike Time Interval Computational Kernel, a Framework for General Purpose Computation Using Neurons, Precise Timing, Delays, and Synchrony.

    Science.gov (United States)

    Lagorce, Xavier; Benosman, Ryad

    2015-11-01

    There has been significant research over the past two decades in developing new platforms for spiking neural computation. Current neural computers are primarily developed to mimic biology. They use neural networks, which can be trained to perform specific tasks to mainly solve pattern recognition problems. These machines can do more than simulate biology; they allow us to rethink our current paradigm of computation. The ultimate goal is to develop brain-inspired general purpose computation architectures that can breach the current bottleneck introduced by the von Neumann architecture. This work proposes a new framework for such a machine. We show that the use of neuron-like units with precise timing representation, synaptic diversity, and temporal delays allows us to set a complete, scalable compact computation framework. The framework provides both linear and nonlinear operations, allowing us to represent and solve any function. We show usability in solving real use cases from simple differential equations to sets of nonlinear differential equations leading to chaotic attractors.

  6. Computer-controlled neutron time-of-flight spectrometer. Part II

    International Nuclear Information System (INIS)

    Merriman, S.H.

    1979-12-01

    A time-of-flight spectrometer for neutron inelastic scattering research has been interfaced to a PDP-15/30 computer. The computer is used for experimental data acquisition and analysis and for apparatus control. This report was prepared to summarize the functions of the computer and to act as a users' guide to the software system

  7. Time series modeling, computation, and inference

    CERN Document Server

    Prado, Raquel

    2010-01-01

    The authors systematically develop a state-of-the-art analysis and modeling of time series. … this book is well organized and well written. The authors present various statistical models for engineers to solve problems in time series analysis. Readers no doubt will learn state-of-the-art techniques from this book.-Hsun-Hsien Chang, Computing Reviews, March 2012My favorite chapters were on dynamic linear models and vector AR and vector ARMA models.-William Seaver, Technometrics, August 2011… a very modern entry to the field of time-series modelling, with a rich reference list of the current lit

  8. Heterogeneous real-time computing in radio astronomy

    Science.gov (United States)

    Ford, John M.; Demorest, Paul; Ransom, Scott

    2010-07-01

Modern computer architectures suited for general purpose computing are often not the best choice for either I/O-bound or compute-bound problems. Sometimes the best choice is not to choose a single architecture, but to take advantage of the best characteristics of different computer architectures to solve your problems. This paper examines the tradeoffs between using computer systems based on the ubiquitous X86 Central Processing Units (CPUs), Field Programmable Gate Array (FPGA) based signal processors, and Graphics Processing Units (GPUs). We will show how a heterogeneous system can be produced that blends the best of each of these technologies into a real-time signal processing system. FPGAs tightly coupled to analog-to-digital converters connect the instrument to the telescope and supply the first level of computing to the system. These FPGAs are coupled to other FPGAs to continue to provide highly efficient processing power. Data is then packaged up and shipped over fast networks to a cluster of general purpose computers equipped with GPUs, which are used for floating-point intensive computation. Finally, the data is handled by the CPU and written to disk, or further processed. Each of the elements in the system has been chosen for its specific characteristics and the role it can play in creating a system that does the most for the least, in terms of power, space, and money.

  9. Ubiquitous computing technology for just-in-time motivation of behavior change.

    Science.gov (United States)

    Intille, Stephen S

    2004-01-01

    This paper describes a vision of health care where "just-in-time" user interfaces are used to transform people from passive to active consumers of health care. Systems that use computational pattern recognition to detect points of decision, behavior, or consequences automatically can present motivational messages to encourage healthy behavior at just the right time. Further, new ubiquitous computing and mobile computing devices permit information to be conveyed to users at just the right place. In combination, computer systems that present messages at the right time and place can be developed to motivate physical activity and healthy eating. Computational sensing technologies can also be used to measure the impact of the motivational technology on behavior.

  10. A new system of computer-assisted navigation leading to reduction in operating time in uncemented total hip replacement in a matched population.

    Science.gov (United States)

    Chaudhry, Fouad A; Ismail, Sanaa Z; Davis, Edward T

    2018-05-01

    Computer-assisted navigation techniques are used to optimise component placement and alignment in total hip replacement. The technology has developed over the last 10 years, but despite its advantages only 0.3% of all total hip replacements in England and Wales are performed using computer navigation. One of the reasons for this is that computer-assisted technology increases operative time. A new method of pelvic registration has been developed without the need to register the anterior pelvic plane (BrainLab hip 6.0), which has been shown to improve the accuracy of THR. The purpose of this study was to find out whether the new method reduces the operating time. This was a retrospective analysis comparing operating time in computer-navigated primary uncemented total hip replacement using two methods of registration. Group 1 included 128 cases that were performed using BrainLab versions 2.1-5.1. These versions relied on the acquisition of the anterior pelvic plane for registration. Group 2 included 128 cases that were performed using the newest navigation software, BrainLab hip 6.0 (registration possible with the patient in the lateral decubitus position). The operating time was 65.79 (40-98) minutes using the old method of registration and 50.87 (33-74) minutes using the new method. This difference was statistically significant. The body mass index (BMI) was comparable in both groups. The study supports the use of the new method of registration for improving operating time in computer-navigated primary uncemented total hip replacements.

  11. Continuous-Time Symmetric Hopfield Nets are Computationally Universal

    Czech Academy of Sciences Publication Activity Database

    Šíma, Jiří; Orponen, P.

    2003-01-01

    Roč. 15, č. 3 (2003), s. 693-733 ISSN 0899-7667 R&D Projects: GA AV ČR IAB2030007; GA ČR GA201/02/1456 Institutional research plan: AV0Z1030915 Keywords : continuous-time Hopfield network * Liapunov function * analog computation * computational power * Turing universality Subject RIV: BA - General Mathematics Impact factor: 2.747, year: 2003

  12. Causal Reasoning in Medicine: Analysis of a Protocol.

    Science.gov (United States)

    Kuipers, Benjamin; Kassirer, Jerome P.

    1984-01-01

    Describes the construction of a knowledge representation from the identification of the problem (nephrotic syndrome) to a running computer simulation of causal reasoning to provide a vertical slice of the construction of a cognitive model. Interactions between textbook knowledge, observations of human experts, and computational requirements are…

  13. Multiscale Space-Time Computational Methods for Fluid-Structure Interactions

    Science.gov (United States)

    2015-09-13

    Report topics (recovered from a garbled abstract): space-time (ST) thermo-fluid analysis of a ground vehicle and its tires; ST-SI computational analysis of a vertical-axis wind turbine; multiscale compressible-flow computation with particle tracking; and space-time VMS computation of wind-turbine rotor and tower aerodynamics. Contributors named in the fragment include Tezduyar, Spenser McIntyre, Nikolay Kostov, Ryan Kolesar, and Casey Habluetzel.

  14. Computer generated movies to display biotelemetry data

    International Nuclear Information System (INIS)

    White, G.C.

    1979-01-01

    The three-dimensional nature of biotelemetry data (x, y, time) makes them difficult to comprehend. Graphic displays provide a means of extracting information and analyzing biotelemetry data. The extensive computer graphics facilities at Los Alamos Scientific Laboratory have been utilized to analyze elk biotelemetry data. Fixes have been taken weekly for 14 months on 14 animals. The inadequacy of still graphic displays to portray the time dimension of these data has led to the use of computer generated movies to help grasp time relationships. A computer movie of the data from one animal demonstrates habitat use as a function of time, while a movie of 2 or more animals illustrates the correlations between the animals' movements. About 2 hours of computer time were required to generate the movies for each animal for 1 year of data. The cost of the movies is quite small relative to the cost of collecting the data, so that computer generated movies are a reasonable method to depict biotelemetry data

  15. Cluster Computing For Real Time Seismic Array Analysis.

    Science.gov (United States)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years arrays have been widely used in different fields of seismological research. In particular they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the processing techniques, which are quite time consuming, have limited their potential for this application. In order to favour a direct application of array techniques to continuous volcano monitoring we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 Intel Pentium-III bi-processor PCs working at 550 MHz, and has 4 gigabytes of RAM memory. It runs under the Linux operating system. The developed analysis software package is based on the MUltiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the Internet and graphical applications for continuously displaying the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and the southeast flanks of this volcano. A real time continuous acquisition system has been simulated by
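
    The MUSIC step at the core of such a package is compact enough to sketch. The following Python fragment (our illustration, not the authors' Fortran/MPI code) estimates the back-azimuth of a single plane wave crossing a small array from the noise subspace of the sample covariance matrix; the array geometry, signal frequency, wave speed, and noise level are invented for the example.

        import numpy as np

        rng = np.random.default_rng(0)

        # Assumed setup: 8 sensors at random positions, one 5 Hz plane wave
        freq, c = 5.0, 2000.0                        # Hz, apparent speed (m/s)
        pos = rng.uniform(-500, 500, size=(8, 2))    # sensor x-y coordinates (m)
        true_az = np.deg2rad(60.0)

        def steering(az):
            # Phase delays of a plane wave with back-azimuth az across the array
            k = 2 * np.pi * freq / c * np.array([np.cos(az), np.sin(az)])
            return np.exp(-1j * (pos @ k))

        # Synthetic snapshots: source signal plus additive noise
        n_snap = 200
        s = rng.standard_normal(n_snap) + 1j * rng.standard_normal(n_snap)
        X = np.outer(steering(true_az), s)
        X += 0.1 * (rng.standard_normal(X.shape) + 1j * rng.standard_normal(X.shape))

        # MUSIC: eigendecompose the covariance, scan the noise-subspace spectrum
        R = X @ X.conj().T / n_snap
        _, V = np.linalg.eigh(R)                     # eigenvalues ascending
        En = V[:, :-1]                               # noise subspace (one source)
        az_grid = np.deg2rad(np.arange(0, 360, 0.5))
        P = [1.0 / np.linalg.norm(En.conj().T @ steering(az)) ** 2 for az in az_grid]
        print("estimated back-azimuth:", np.rad2deg(az_grid[int(np.argmax(P))]))

    In a cluster setting, such a spectrum could, for instance, be evaluated for different time windows on different MPI workers, which is what makes near real time processing plausible.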

  16. Applied time series analysis and innovative computing

    CERN Document Server

    Ao, Sio-Iong

    2010-01-01

    This text is a systematic, state-of-the-art introduction to the use of innovative computing paradigms as an investigative tool for applications in time series analysis. It includes frontier case studies based on recent research.

  17. 22 CFR 1429.21 - Computation of time for filing papers.

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Computation of time for filing papers. 1429.21... MISCELLANEOUS AND GENERAL REQUIREMENTS General Requirements § 1429.21 Computation of time for filing papers. In... subchapter requires the filing of any paper, such document must be received by the Board or the officer or...

  18. Highly reliable computer network for real time system

    International Nuclear Information System (INIS)

    Mohammed, F.A.; Omar, A.A.; Ayad, N.M.A.; Madkour, M.A.I.; Ibrahim, M.K.

    1988-01-01

    Many computer networks have been studied, with different trends regarding network architecture and the various protocols that govern data transfers and guarantee reliable communication among all nodes. A hierarchical network structure has been proposed to provide a simple and inexpensive way to realize a reliable real-time computer network. In such an architecture, all computers at the same level are connected to a common serial channel through intelligent nodes that collectively control data transfers over the serial channel. This level of computer network can be considered a local area computer network (LACN) that can be used in a nuclear power plant control system, since such a plant has geographically dispersed subsystems. Network expansion would be straightforward: each added computer (HOST) simply connects to the common channel. All the nodes are designed around a microprocessor chip to provide the required intelligence. The node can be divided into two sections, namely a common section that interfaces with the serial data channel and a private section that interfaces with the host computer. The private section would naturally tend to have some variations in hardware details to match the requirements of individual host computers. fig 7

  19. Computer simulations of long-time tails: what's new?

    NARCIS (Netherlands)

    Hoef, van der M.A.; Frenkel, D.

    1995-01-01

    Twenty-five years ago Alder and Wainwright discovered, by simulation, the 'long-time tails' in the velocity autocorrelation function of a single particle in a fluid [1]. Since then, few qualitatively new results on long-time tails have been obtained by computer simulations. However, within the

  20. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.

  1. Real-time Tsunami Inundation Prediction Using High Performance Computers

    Science.gov (United States)

    Oishi, Y.; Imamura, F.; Sugawara, D.

    2014-12-01

    Recently, off-shore tsunami observation stations based on cabled ocean bottom pressure gauges are actively being deployed, especially in Japan. These cabled systems are designed to provide real-time tsunami data before tsunamis reach coastlines, for disaster mitigation purposes. To receive real benefits from these observations, real-time analysis techniques that make effective use of these data are necessary. A representative study was made by Tsushima et al. (2009), who proposed a method to provide instant tsunami source prediction based on observed tsunami waveform data. As time passes, the prediction is improved by using updated waveform data. After a tsunami source is predicted, tsunami waveforms are synthesized from pre-computed tsunami Green functions of linear long wave equations. Tsushima et al. (2014) updated the method by combining the tsunami waveform inversion with an instant inversion of coseismic crustal deformation and improved the prediction accuracy and speed in the early stages. For disaster mitigation purposes, real-time predictions of tsunami inundation are also important. In this study, we discuss the possibility of real-time tsunami inundation predictions, which require faster-than-real-time tsunami inundation simulation in addition to instant tsunami source analysis. Although the computational load of solving the non-linear shallow water equations for inundation prediction is large, it has become tractable through recent developments in high performance computing technologies. We conducted parallel computations of tsunami inundation and achieved 6.0 TFLOPS by using 19,000 CPU cores. We employed a leap-frog finite difference method with nested staggered grids whose resolutions range from 405 m to 5 m. The resolution ratio of each nested domain was 1/3. The total number of grid points was 13 million, and the time step was 0.1 seconds. Tsunami sources of the 2011 Tohoku-oki earthquake were tested. The inundation prediction up to 2 hours after the
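
    To make the numerical core concrete, here is a minimal Python sketch of a staggered-grid leap-frog step for the 1-D linear shallow water equations, the type of scheme named in the abstract. The grid size, depth, and time step are invented values chosen to satisfy the CFL stability condition; the real simulation adds nonlinear terms, two dimensions, grid nesting, and wetting/drying for inundation.

        import numpy as np

        g, depth = 9.81, 4000.0        # gravity (m/s^2), uniform depth (m)
        dx, dt = 5000.0, 5.0           # grid spacing (m), time step (s)
        assert np.sqrt(g * depth) * dt / dx < 1.0   # CFL stability condition

        nx, nt = 400, 1000
        x = np.arange(nx) * dx
        eta = np.exp(-((x - x.mean()) / 5e4) ** 2)  # initial sea-surface hump (m)
        u = np.zeros(nx + 1)           # velocities staggered between eta points

        for _ in range(nt):
            # update velocity from the surface gradient (closed ends: u[0]=u[-1]=0)
            u[1:-1] -= g * dt / dx * (eta[1:] - eta[:-1])
            # then update the surface from the flux divergence
            eta -= depth * dt / dx * (u[1:] - u[:-1])

        print("max elevation after %.0f s: %.3f m" % (nt * dt, eta.max()))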

  2. Taxing Times: An Educational Intervention to Enhance Moral Reasoning in Tax

    Science.gov (United States)

    Doyle, Elaine

    2015-01-01

    This paper outlines the development and implementation of an online educational intervention designed to enhance moral reasoning in higher level tax students. Before decisions are made about how to behave ethically, cognitive moral reasoning takes place. The importance of education in developing morally sensitive individuals who use principled…

  3. 5 CFR 831.703 - Computation of annuities for part-time service.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Computation of annuities for part-time... part-time service. (a) Purpose. The computational method in this section shall be used to determine the annuity for an employee who has part-time service on or after April 7, 1986. (b) Definitions. In this...

  4. THE RELATIONSHIP BETWEEN FUZZY REASONING AND ITS TEMPORAL CHARACTERISTICS FOR KNOWLEDGE MANAGEMENT

    Directory of Open Access Journals (Sweden)

    Daniela SARPE

    2006-01-01

    The knowledge management systems based on artificial reasoning (KMAR) try to provide computers with the capabilities of performing various intelligent tasks for which their human users resort to their knowledge and collective intelligence. There is a need for incorporating aspects of time and imprecision into knowledge management systems, considering appropriate semantic foundations. The aim of this paper is to present FRTES, a real-time fuzzy expert system, embedded in a knowledge management system. Our expert system is a special possibilistic expert system, developed in order to focus on fuzzy knowledge.

  5. Real-time data acquisition and feedback control using Linux Intel computers

    International Nuclear Information System (INIS)

    Penaflor, B.G.; Ferron, J.R.; Piglowski, D.A.; Johnson, R.D.; Walker, M.L.

    2006-01-01

    This paper describes the experiences of the DIII-D programming staff in adapting Linux based Intel computing hardware for use in real-time data acquisition and feedback control systems. Due to the highly dynamic and unstable nature of magnetically confined plasmas in tokamak fusion experiments, real-time data acquisition and feedback control systems are in routine use with all major tokamaks. At DIII-D, plasmas are created and sustained using a real-time application known as the digital plasma control system (PCS). During each experiment, the PCS periodically samples data from hundreds of diagnostic signals and provides these data to control algorithms implemented in software. These algorithms compute the necessary commands to send to various actuators that affect plasma performance. The PCS consists of a group of rack mounted Intel Xeon computer systems running an in-house customized version of the Linux operating system tailored specifically to meet the real-time performance needs of the plasma experiments. This paper provides a more detailed description of the real-time computing hardware and custom developed software, including recent work to utilize dual Intel Xeon equipped computers within the PCS

  6. Qualitative Spatial Reasoning for Visual Grouping in Sketches

    National Research Council Canada - National Science Library

    Forbus, Kenneth D; Tomai, Emmett; Usher, Jeffrey

    2003-01-01

    We believe that qualitative spatial reasoning provides a bridge between perception and cognition, by using visual computations to construct structural descriptions that have functional significance...

  7. Darwin's "strange inversion of reasoning".

    Science.gov (United States)

    Dennett, Daniel

    2009-06-16

    Darwin's theory of evolution by natural selection unifies the world of physics with the world of meaning and purpose by proposing a deeply counterintuitive "inversion of reasoning" (according to a 19th century critic): "to make a perfect and beautiful machine, it is not requisite to know how to make it" [MacKenzie RB (1868) (Nisbet & Co., London)]. Turing proposed a similar inversion: to be a perfect and beautiful computing machine, it is not requisite to know what arithmetic is. Together, these ideas help to explain how we human intelligences came to be able to discern the reasons for all of the adaptations of life, including our own.

  8. Structure induction in diagnostic causal reasoning.

    Science.gov (United States)

    Meder, Björn; Mayrhofer, Ralf; Waldmann, Michael R

    2014-07-01

    Our research examines the normative and descriptive adequacy of alternative computational models of diagnostic reasoning from single effects to single causes. Many theories of diagnostic reasoning are based on the normative assumption that inferences from an effect to its cause should reflect solely the empirically observed conditional probability of cause given effect. We argue against this assumption, as it neglects alternative causal structures that may have generated the sample data. Our structure induction model of diagnostic reasoning takes into account the uncertainty regarding the underlying causal structure. A key prediction of the model is that diagnostic judgments should not only reflect the empirical probability of cause given effect but should also depend on the reasoner's beliefs about the existence and strength of the link between cause and effect. We confirmed this prediction in 2 studies and showed that our theory better accounts for human judgments than alternative theories of diagnostic reasoning. Overall, our findings support the view that in diagnostic reasoning people go "beyond the information given" and use the available data to make inferences on the (unobserved) causal rather than on the (observed) data level. (c) 2014 APA, all rights reserved.
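
    The contrast the authors draw can be made concrete with a small worked computation. The Python sketch below (hypothetical counts, not data from the paper) derives the empirical diagnostic probability P(cause | effect) from a 2x2 contingency table, then shows schematically how a residual belief that the cause-effect link might be absent pulls the judgment toward the base rate; the mixing weight is an invented stand-in for the model's posterior over causal structures.

        # Hypothetical 2x2 contingency counts: cause x effect
        n_ce, n_c_noe = 16, 4        # cause present: effect present / absent
        n_noc_e, n_noc_noe = 2, 18   # cause absent:  effect present / absent
        n = n_ce + n_c_noe + n_noc_e + n_noc_noe

        # Classical norm: the empirical probability of cause given effect
        p_c_given_e = n_ce / (n_ce + n_noc_e)
        print("empirical P(c|e) =", round(p_c_given_e, 3))     # 0.889

        # Structure-sensitive judgment (schematic): if the reasoner assigns
        # some posterior belief to a structure with no cause-effect link, the
        # diagnostic estimate is shaded toward the base rate P(c). The real
        # model integrates over structures and parameters; this is a caricature.
        p_c = (n_ce + n_c_noe) / n
        belief_no_link = 0.2          # invented illustrative weight
        p_diag = (1 - belief_no_link) * p_c_given_e + belief_no_link * p_c
        print("structure-weighted estimate ~", round(p_diag, 3))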

  9. Evidential reasoning research on intrusion detection

    Science.gov (United States)

    Wang, Xianpei; Xu, Hua; Zheng, Sheng; Cheng, Anyu

    2003-09-01

    This paper addresses two fields: the Dempster-Shafer (D-S) theory of evidence and network intrusion detection. It discusses how to apply this probabilistic reasoning, as an AI technology, to intrusion detection systems (IDS). The paper establishes the application model, describes the new mechanism of reasoning and decision-making, and analyses how to implement the model, based on detection of synscan activities on the network. The results suggest that, provided rational probability values are assigned at the beginning, the engine can, according to the rules of evidence combination and hierarchical reasoning, compute the values of belief and finally inform the administrators of the qualities of the traced activities: intrusions, normal activities or abnormal activities.
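
    The evidence-combination step at the heart of such an engine is Dempster's rule, which is short enough to show. Below is a minimal Python sketch with invented mass values over the frame {intrusion, normal}; it illustrates the rule itself, not the paper's engine.

        from itertools import product

        THETA = frozenset({"intrusion", "normal"})   # frame of discernment

        def combine(m1, m2):
            # Dempster's rule: normalized conjunctive combination of two
            # basic probability assignments (dicts: frozenset -> mass)
            out, conflict = {}, 0.0
            for (a, x), (b, y) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    out[inter] = out.get(inter, 0.0) + x * y
                else:
                    conflict += x * y
            return {k: v / (1.0 - conflict) for k, v in out.items()}

        # Two detectors' mass assignments (invented illustrative values)
        m1 = {frozenset({"intrusion"}): 0.6, THETA: 0.4}
        m2 = {frozenset({"intrusion"}): 0.7, frozenset({"normal"}): 0.1, THETA: 0.2}

        m = combine(m1, m2)
        print("belief(intrusion) =", round(m[frozenset({"intrusion"})], 3))  # ~0.872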

  10. New Flutter Analysis Technique for Time-Domain Computational Aeroelasticity

    Science.gov (United States)

    Pak, Chan-Gi; Lung, Shun-Fat

    2017-01-01

    A new time-domain approach for computing flutter speed is presented. Based on the time-history result of aeroelastic simulation, the unknown unsteady aerodynamics model is estimated using a system identification technique. The full aeroelastic model is generated via coupling the estimated unsteady aerodynamic model with the known linear structure model. The critical dynamic pressure is computed and used in the subsequent simulation until the convergence of the critical dynamic pressure is achieved. The proposed method is applied to a benchmark cantilevered rectangular wing.

  11. Relating Derived Relations as a Model of Analogical Reasoning: Reaction Times and Event-Related Potentials

    Science.gov (United States)

    Barnes-Holmes, Dermot; Regan, Donal; Barnes-Holmes, Yvonne; Commins, Sean; Walsh, Derek; Stewart, Ian; Smeets, Paul M.; Whelan, Robert; Dymond, Simon

    2005-01-01

    The current study aimed to test a Relational Frame Theory (RFT) model of analogical reasoning based on the relating of derived same and derived difference relations. Experiment 1 recorded reaction time measures of similar-similar (e.g., "apple is to orange as dog is to cat") versus different-different (e.g., "he is to his brother as…

  12. Proportional reasoning as a heuristic-based process: time constraint and dual task considerations.

    Science.gov (United States)

    Gillard, Ellen; Van Dooren, Wim; Schaeken, Walter; Verschaffel, Lieven

    2009-01-01

    The present study interprets the overuse of proportional solution methods from a dual process framework. Dual process theories claim that analytic operations involve time-consuming executive processing, whereas heuristic operations are fast and automatic. In two experiments to test whether proportional reasoning is heuristic-based, the participants solved "proportional" problems, for which proportional solution methods provide correct answers, and "nonproportional" problems known to elicit incorrect answers based on the assumption of proportionality. In Experiment 1, the available solution time was restricted. In Experiment 2, the executive resources were burdened with a secondary task. Both manipulations induced an increase in proportional answers and a decrease in correct answers to nonproportional problems. These results support the hypothesis that the choice for proportional methods is heuristic-based.

  13. Network Forensics Method Based on Evidence Graph and Vulnerability Reasoning

    Directory of Open Access Journals (Sweden)

    Jingsha He

    2016-11-01

    As the Internet becomes larger in scale, more complex in structure and more diversified in traffic, the number of crimes that utilize computer technologies is also increasing at a phenomenal rate. To react to the increasing number of computer crimes, the field of computer and network forensics has emerged. The general purpose of network forensics is to find malicious users or activities by gathering and dissecting firm evidence about computer crimes, e.g., hacking. However, due to the large volume of Internet traffic, not all the traffic captured and analyzed is valuable for investigation or confirmation. After analyzing some existing network forensics methods to identify common shortcomings, we propose in this paper a new network forensics method that uses a combination of network vulnerability information and a network evidence graph. In our proposed method, we use vulnerability evidence and a reasoning algorithm to reconstruct attack scenarios and then backtrack the network packets to find the original evidence. Our proposed method can reconstruct attack scenarios effectively and then identify multi-staged attacks through evidential reasoning. Results of experiments show that the evidence graph constructed using our method is more complete and credible while possessing reasoning capability.

  14. Registered nurses' clinical reasoning skills and reasoning process: A think-aloud study.

    Science.gov (United States)

    Lee, JuHee; Lee, Young Joo; Bae, JuYeon; Seo, Minjeong

    2016-11-01

    As complex chronic diseases are increasing, nurses' prompt and accurate clinical reasoning skills are essential. However, little is known about the reasoning skills of registered nurses. This study aimed to determine how registered nurses use their clinical reasoning skills and to identify how the reasoning process proceeds in the complex clinical situation of a hospital setting. A qualitative exploratory design was used with a think-aloud method. A total of 13 registered nurses (mean years of experience = 11.4) participated in the study, solving an ill-structured clinical problem based on complex chronic patient cases in a hospital setting. Data were analyzed using deductive content analysis. Findings showed that the registered nurses used a variety of clinical reasoning skills. The most commonly used skill was 'checking accuracy and reliability.' The reasoning process of registered nurses covered the assessment, analysis, diagnosis, planning/implementation, and evaluation phases. It is critical that registered nurses apply appropriate clinical reasoning skills in complex clinical practice. The main focus of registered nurses' reasoning in this study was assessing a patient's health problem, and their reasoning process was cyclic, rather than linear. There is a need for educational strategy development to enhance registered nurses' competency in determining appropriate interventions in a timely and accurate fashion. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Distributed computing for real-time petroleum reservoir monitoring

    Energy Technology Data Exchange (ETDEWEB)

    Ayodele, O. R. [University of Alberta, Edmonton, AB (Canada)

    2004-05-01

    Computer software architecture is presented to illustrate how the concept of distributed computing can be applied to real-time reservoir monitoring processes, permitting the continuous monitoring of the dynamic behaviour of petroleum reservoirs at much shorter intervals. The paper describes the fundamental technologies driving distributed computing, namely the Java 2 Platform, Enterprise Edition (J2EE) by Sun Microsystems, and the Microsoft Dot-Net (Microsoft.Net) initiative, and explains the challenges involved in distributed computing. These are: (1) availability of permanently placed downhole equipment to acquire and transmit seismic data; (2) availability of high bandwidth to transmit the data; (3) security considerations; (4) adaptation of existing legacy codes to run on networks as downloads on demand; and (5) credibility issues concerning data security over the Internet. Other applications of distributed computing in the petroleum industry are also considered, specifically MWD, LWD and SWD (measurement-while-drilling, logging-while-drilling, and simulation-while-drilling), and drill-string vibration monitoring. 23 refs., 1 fig.

  16. Real Time Animation of Trees Based on BBSC in Computer Games

    Directory of Open Access Journals (Sweden)

    Xuefeng Ao

    2009-01-01

    Researchers in the field of computer games usually find it difficult to simulate the motion of actual 3D model trees because the tree model itself has a very complicated structure and many sophisticated factors need to be considered during the simulation. Though there are some works on simulating 3D trees and their motion, few of them are used in computer games due to the high demand for real-time performance. In this paper, an approach to animating trees in computer games based on a novel tree model representation, Ball B-Spline Curves (BBSCs), is proposed. By taking advantage of the good features of the BBSC-based model, physical simulation of the motion of leafless trees under blowing wind becomes easier and more efficient. The method can generate realistic 3D tree animation in real time, which meets the high requirement for real time in computer games.

  17. Features of calculation of reasonable time of the trial in civil cases in the context of the practice of the European Court of Human Rights

    Directory of Open Access Journals (Sweden)

    Т. Цувіна

    2015-11-01

    Problem setting. The European Convention on Human Rights (ECHR) guarantees everyone the right to a fair trial within a reasonable time (par. 1, art. 6 ECHR). Reasonable time of the trial is an element of the right to a fair trial. One of the main directions for the development of civil procedure in Ukraine is the implementation of international standards of fair trial, in particular standards of reasonable time of the trial. Recent research and publications analysis. Foreign and Ukrainian scientists such as Komarov V. V., Neshataeva T. M., Sakara N. U. and others have paid attention in their works to different aspects of the problems connected with the right to a fair trial within a reasonable time, but a comprehensive study devoted to the features of calculating the reasonable time of the trial, taking into account the practice of the European Court of Human Rights on this issue, has not been conducted. Paper objective. The main objective of the article is to study decisions of the Court concerning the interpretation of par. 1, art. 6 ECHR and to analyse the features of calculating the reasonable time of the trial, in order to make recommendations on implementation at the national level. Paper main body. As a rule, according to the practice of the Court, the reasonable time of civil proceedings begins on the date on which the case is referred to a judicial authority, though the Court can take as the starting point the date of a preliminary application to an administrative authority, especially when this is a prerequisite for the commencement of proceedings. The end of the reasonable time of the trial is connected with the moment when the court decision becomes final or is executed. Conclusions of the research. The calculation of the reasonable time of the trial in civil cases has special features in circumstances where an application to the court was preceded by seeking protection from the authorities and public servants of the executive power. In such situations the calculation of the reasonable time of the trial does not begin from the moment of seeking for

  18. Predicting Reasoning from Memory

    Science.gov (United States)

    Heit, Evan; Hayes, Brett K.

    2011-01-01

    In an effort to assess the relations between reasoning and memory, in 8 experiments, the authors examined how well responses on an inductive reasoning task are predicted from responses on a recognition memory task for the same picture stimuli. Across several experimental manipulations, such as varying study time, presentation frequency, and the…

  19. Computer network time synchronization the network time protocol on earth and in space

    CERN Document Server

    Mills, David L

    2010-01-01

    Carefully coordinated, reliable, and accurate time synchronization is vital to a wide spectrum of fields: from air and ground traffic control, to buying and selling goods and services, to TV network programming. Ill-gotten time could even lead to the unimaginable and cause DNS caches to expire, leaving the entire Internet to implode on the root servers. Written by the original developer of the Network Time Protocol (NTP), Computer Network Time Synchronization: The Network Time Protocol on Earth and in Space, Second Edition addresses the technological infrastructure of time dissemination, distrib
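
    The heart of NTP is a small on-wire calculation: from the four timestamps of one request/response exchange, the client estimates its clock offset and the round-trip delay. A minimal Python sketch with invented timestamps (real NTP adds clock filtering, peer selection, and discipline algorithms on top of this):

        def ntp_offset_delay(t1, t2, t3, t4):
            # t1: client transmit, t2: server receive,
            # t3: server transmit, t4: client receive (seconds)
            offset = ((t2 - t1) + (t3 - t4)) / 2.0   # estimated clock error
            delay = (t4 - t1) - (t3 - t2)            # round-trip network delay
            return offset, delay

        # Example: client clock 50 ms behind the server, 10 ms delay each way
        t1, t2, t3, t4 = 100.000, 100.060, 100.061, 100.021
        offset, delay = ntp_offset_delay(t1, t2, t3, t4)
        print("offset = %+.3f s, delay = %.3f s" % (offset, delay))  # +0.050, 0.020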

  20. The reliable solution and computation time of variable parameters Logistic model

    OpenAIRE

    Pengfei, Wang; Xinnong, Pan

    2016-01-01

    The reliable computation time (RCT, denoted Tc) when applying a double-precision computation of a variable-parameters logistic map (VPLM) is studied. First, using the proposed method, reliable solutions for the logistic map are obtained. Second, for a time-dependent, non-stationary-parameters VPLM, 10000 samples of reliable experiments are constructed, and the mean Tc is then computed. The results indicate that for each different initial value, the Tcs of the VPLM are generally different...
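
    The idea of a reliable computation time can be reproduced in a few lines: iterate the logistic map in double precision alongside a higher-precision reference and record the step at which they separate. A Python sketch (our illustration; the divergence tolerance and parameter are invented):

        from decimal import Decimal, getcontext

        def reliable_steps(x0, r=4.0, tol=1e-6, n_max=200):
            # Steps until the float64 trajectory departs from a 60-digit
            # reference by more than tol (an illustrative criterion)
            getcontext().prec = 60
            x_f = x0
            x_d, r_d = Decimal(repr(x0)), Decimal(repr(r))
            for n in range(1, n_max + 1):
                x_f = r * x_f * (1.0 - x_f)          # double precision
                x_d = r_d * x_d * (1 - x_d)          # high-precision reference
                if abs(Decimal(repr(x_f)) - x_d) > Decimal(repr(tol)):
                    return n
            return n_max

        # Rounding error ~1e-16 grows roughly like 2^n for r = 4, so the
        # trajectory is typically reliable for only a few dozen steps
        print("reliable steps for x0 = 0.4:", reliable_steps(0.4))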

  1. Real-Time Control of an Articulatory-Based Speech Synthesizer for Brain Computer Interfaces.

    Directory of Open Access Journals (Sweden)

    Florent Bocquelet

    2016-11-01

    Restoring natural speech in paralyzed and aphasic people could be achieved using a Brain-Computer Interface (BCI) controlling a speech synthesizer in real-time. To reach this goal, a prerequisite is to develop a speech synthesizer producing intelligible speech in real-time with a reasonable number of control parameters. We present here an articulatory-based speech synthesizer that can be controlled in real-time for future BCI applications. This synthesizer converts movements of the main speech articulators (tongue, jaw, velum, and lips) into intelligible speech. The articulatory-to-acoustic mapping is performed using a deep neural network (DNN) trained on electromagnetic articulography (EMA) data recorded on a reference speaker synchronously with the produced speech signal. This DNN is then used in both offline and online modes to map the position of sensors glued on different speech articulators into acoustic parameters that are further converted into an audio signal using a vocoder. In offline mode, highly intelligible speech could be obtained, as assessed by perceptual evaluation performed by 12 listeners. Then, to anticipate future BCI applications, we further assessed the real-time control of the synthesizer by both the reference speaker and new speakers, in a closed-loop paradigm using EMA data recorded in real time. A short calibration period was used to compensate for differences in sensor positions and articulatory differences between new speakers and the reference speaker. We found that real-time synthesis of vowels and consonants was possible with good intelligibility. In conclusion, these results open the way to future speech BCI applications using such an articulatory-based speech synthesizer.

  2. Probabilistic reasoning in intelligent systems networks of plausible inference

    CERN Document Server

    Pearl, Judea

    1988-01-01

    Probabilistic Reasoning in Intelligent Systems is a complete and accessible account of the theoretical foundations and computational methods that underlie plausible reasoning under uncertainty. The author provides a coherent explication of probability as a language for reasoning with partial belief and offers a unifying perspective on other AI approaches to uncertainty, such as the Dempster-Shafer formalism, truth maintenance systems, and nonmonotonic logic. The author distinguishes syntactic and semantic approaches to uncertainty--and offers techniques, based on belief networks, that provid

  3. The Potentials of Using Cloud Computing in Schools: A Systematic Literature Review

    OpenAIRE

    Hartmann, Simon Birk; Nygaard, Lotte Qulleq Victhoria; Pedersen, Sine; Khalid, Md. Saifuddin

    2017-01-01

    Cloud Computing (CC) refers to the physical structure of a communications network, where data is saved and stored in large data centers and thus can be accessed anywhere, at any time and from different devices. It is evident that the integration and adoption of CC and discontinuation of an alternative ICT includes some underlying reasons. Optimistically, these reasons can be interpreted as the potentials of using cloud computing and as the functions or values that circumvent or solve some of ...

  4. Computer-Assisted Program Reasoning Based on a Relational Semantics of Programs

    Directory of Open Access Journals (Sweden)

    Wolfgang Schreiner

    2012-02-01

    We present an approach to program reasoning which inserts between a program and its verification conditions an additional layer, the denotation of the program expressed in a declarative form. The program is first translated into its denotation, from which subsequently the verification conditions are generated. However, even before (and independently of) any verification attempt, one may investigate the denotation itself to get insight into the "semantic essence" of the program, in particular to see whether the denotation indeed gives reason to believe that the program has the expected behavior. Errors in the program and in the meta-information may thus be detected and fixed prior to actually performing the formal verification. More concretely, following the relational approach to program semantics, we model the effect of a program as a binary relation on program states. A formal calculus is devised to derive from a program a logic formula that describes this relation and is subject to inspection and manipulation. We have implemented this idea in a comprehensive form in the RISC ProgramExplorer, a new program reasoning environment for educational purposes which encompasses the previously developed RISC ProofNavigator as an interactive proving assistant.
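
    As a small concrete illustration of the relational view (the example is ours, not taken from the paper), the denotation of the program x := x+1; y := y+x over states (x, y) is the binary relation

        ⟦ x := x+1; y := y+x ⟧  =  { ((x, y), (x', y'))  |  x' = x + 1  ∧  y' = y + x + 1 }

    Already at this level, before any verification condition is generated, one can check the "semantic essence", e.g. that y is increased by the incremented value of x.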

  5. Anthropomorphic reasoning about neuromorphic AGI safety

    Science.gov (United States)

    Jilk, David J.; Herd, Seth J.; Read, Stephen J.; O'Reilly, Randall C.

    2017-11-01

    One candidate approach to creating artificial general intelligence (AGI) is to imitate the essential computations of human cognition. This process is sometimes called 'reverse-engineering the brain' and the end product called 'neuromorphic.' We argue that, unlike with other approaches to AGI, anthropomorphic reasoning about behaviour and safety concerns is appropriate and crucial in a neuromorphic context. Using such reasoning, we offer some initial ideas to make neuromorphic AGI safer. In particular, we explore how basic drives that promote social interaction may be essential to the development of cognitive capabilities as well as serving as a focal point for human-friendly outcomes.

  6. Soft Real-Time PID Control on a VME Computer

    Science.gov (United States)

    Karayan, Vahag; Sander, Stanley; Cageao, Richard

    2007-01-01

    microPID (uPID) is a computer program for real-time proportional + integral + derivative (PID) control of a translation stage in a Fourier-transform ultraviolet spectrometer. microPID implements a PID control loop over a position profile at a sampling rate of 8 kHz (sampling period 125 microseconds). The software runs in a stripped-down Linux operating system on a VersaModule Eurocard (VME) computer operating at real-time priority, using an embedded controller, a 16-bit digital-to-analog converter (D/A) board, and a laser-positioning board (LPB). microPID consists of three main parts: (1) VME device-driver routines, (2) software that administers a custom protocol for serial communication with a control computer, and (3) a loop section that obtains the current position from an LPB-driver routine, calculates the ideal position from the profile, and calculates a new voltage command by use of an embedded PID routine, all within each sampling period. The voltage command is sent to the D/A board to control the stage. microPID uses special kernel headers to obtain microsecond timing resolution. Inasmuch as microPID implements a single-threaded process and all other processes are disabled, the Linux operating system acts as a soft real-time system.
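
    The loop section follows the textbook discrete PID form, which can be sketched as follows in Python (the actual program is C on VME hardware; the gains, the profile, and the I/O functions here are placeholders):

        DT = 125e-6                     # sampling period for the 8 kHz loop

        class PID:
            def __init__(self, kp, ki, kd):
                self.kp, self.ki, self.kd = kp, ki, kd
                self.integral = 0.0
                self.prev_err = 0.0

            def step(self, setpoint, measured):
                err = setpoint - measured
                self.integral += err * DT
                deriv = (err - self.prev_err) / DT
                self.prev_err = err
                return self.kp * err + self.ki * self.integral + self.kd * deriv

        # Placeholder I/O (the real code reads the LPB and writes the D/A
        # board through VME device drivers)
        def read_position(): return 0.0
        def ideal_position(t): return 1.0
        def write_voltage(v): pass

        pid = PID(kp=1.0, ki=100.0, kd=1e-4)   # invented, untuned gains
        t = 0.0
        for _ in range(8000):                  # one second of control at 8 kHz
            write_voltage(pid.step(ideal_position(t), read_position()))
            t += DT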

  7. Relational Reasoning for Markov Chains in a Probabilistic Guarded Lambda Calculus

    DEFF Research Database (Denmark)

    Aguirre, Alejandro; Barthe, Gilles; Birkedal, Lars

    2018-01-01

    We extend the simply-typed guarded λ-calculus with discrete probabilities and endow it with a program logic for reasoning about relational properties of guarded probabilistic computations. This provides a framework for programming and reasoning about infinite stochastic processes like...

  8. Experiences with Aber-OWL, an Ontology Repository with OWL EL Reasoning

    KAUST Repository

    Slater, Luke; Rodriguez-Garcia, Miguel Angel; O’ Shea, Keiron; Schofield, Paul N.; Gkoutos, Georgios V.; Hoehndorf, Robert

    2016-01-01

    expressed in the Web Ontology Language (OWL). Computational access to the knowledge contained within them relies on the use of automated reasoning. We have developed Aber-OWL, an ontology repository that provides OWL EL reasoning to answer queries and verify

  9. Increased Specificity of Wechsler Adult Intelligence Scale-Third Edition Matrix Reasoning Test Instructions and Time Limits

    Science.gov (United States)

    Callens, Andy M.; Atchison, Timothy B.; Engler, Rachel R.

    2009-01-01

    Instructions for the Matrix Reasoning Test (MRT) of the Wechsler Adult Intelligence Scale-Third Edition were modified by explicitly stating that the subtest was untimed or that a per-item time limit would be imposed. The MRT was administered within one of four conditions: with (a) standard administration instructions, (b) explicit instructions…

  10. Cone-beam computed tomography: Time to move from ALARA to ALADA

    Energy Technology Data Exchange (ETDEWEB)

    Jaju, Prashant P.; Jaju, Sushma P. [Rishiraj College of Dental Sciences and Research Centre, Bhopal (India)]

    2015-12-15

    Cone-beam computed tomography (CBCT) is routinely recommended for dental diagnosis and treatment planning. CBCT exposes patients to less radiation than does conventional CT. Still, lack of proper education among dentists and specialists is resulting in improper referrals for CBCT. In addition, aiming to generate high-quality images, operators may increase the radiation dose, which can expose the patient to unnecessary risk. This letter advocates appropriate radiation dosing during CBCT to the benefit of both patients and dentists, and supports moving from the concept of 'as low as reasonably achievable' (ALARA) to 'as low as diagnostically acceptable' (ALADA).

  11. Computing return times or return periods with rare event algorithms

    Science.gov (United States)

    Lestang, Thibault; Ragone, Francesco; Bréhier, Charles-Edouard; Herbert, Corentin; Bouchet, Freddy

    2018-04-01

    The average time between two occurrences of the same event, referred to as its return time (or return period), is a useful statistical concept for practical applications. For instance, insurers or public agencies may be interested in the return time of a 10 m flood of the Seine river in Paris. However, due to their scarcity, reliably estimating return times for rare events is very difficult using either observational data or direct numerical simulations. For rare events, an estimator for return times can be built from the extrema of the observable on trajectory blocks. Here, we show that this estimator can be improved to remain accurate for return times of the order of the block size. More importantly, we show that this approach can be generalised to estimate return times from numerical algorithms specifically designed to sample rare events. So far those algorithms have often computed probabilities rather than return times. The approach we propose provides a computationally extremely efficient way to estimate numerically the return times of rare events for a dynamical system, saving several orders of magnitude in computational cost. We illustrate the method on two kinds of observables, instantaneous and time-averaged, using two different rare event algorithms, for a simple stochastic process, the Ornstein-Uhlenbeck process. As an example of realistic applications to complex systems, we finally discuss extreme values of the drag on an object in a turbulent flow.
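
    The block-maximum idea fits in a few lines. Below is a Python sketch on an Ornstein-Uhlenbeck process; the parameter values are invented, and this is the plain estimator, not the paper's refined rare-event-algorithm version.

        import numpy as np

        rng = np.random.default_rng(1)

        # Euler-Maruyama simulation of dX = -X dt + dW
        dt, n_steps = 1e-2, 1_000_000
        x = np.empty(n_steps); x[0] = 0.0
        dW = rng.standard_normal(n_steps - 1) * np.sqrt(dt)
        for i in range(n_steps - 1):
            x[i + 1] = x[i] - x[i] * dt + dW[i]

        def return_time(traj, level, block_len, dt):
            # Split into blocks of duration T_b, estimate p = P(block max > a),
            # and use r(a) = -T_b / ln(1 - p), assuming block maxima are
            # roughly independent (Poisson statistics of rare exceedances)
            blocks = traj[: len(traj) // block_len * block_len].reshape(-1, block_len)
            p = np.mean(blocks.max(axis=1) > level)
            T_b = block_len * dt
            return np.inf if p == 0 else -T_b / np.log1p(-p)

        print("estimated return time of level 2.0:",
              round(return_time(x, 2.0, block_len=10_000, dt=dt), 1))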

  12. Modelling Chemical Reasoning to Predict and Invent Reactions.

    Science.gov (United States)

    Segler, Marwin H S; Waller, Mark P

    2017-05-02

    The ability to reason beyond established knowledge allows organic chemists to solve synthetic problems and invent novel transformations. Herein, we propose a model that mimics chemical reasoning, and formalises reaction prediction as finding missing links in a knowledge graph. We have constructed a knowledge graph containing 14.4 million molecules and 8.2 million binary reactions, which represents the bulk of all chemical reactions ever published in the scientific literature. Our model outperforms a rule-based expert system in the reaction prediction task for 180 000 randomly selected binary reactions. The data-driven model generalises even beyond known reaction types, and is thus capable of effectively (re-)discovering novel transformations (even including transition metal-catalysed reactions). Our model enables computers to infer hypotheses about reactivity and reactions by only considering the intrinsic local structure of the graph and because each single reaction prediction is typically achieved in a sub-second time frame, the model can be used as a high-throughput generator of reaction hypotheses for reaction discovery. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Near real-time digital holographic microscope based on GPU parallel computing

    Science.gov (United States)

    Zhu, Gang; Zhao, Zhixiong; Wang, Huarui; Yang, Yan

    2018-01-01

    A transmission near real-time digital holographic microscope with in-line and off-axis light paths is presented, in which parallel computing technology based on the compute unified device architecture (CUDA) and digital holographic microscopy are combined. Compared to other holographic microscopes, which have to implement reconstruction in multiple focal planes and are therefore time-consuming, the reconstruction speed of the near real-time digital holographic microscope can be greatly improved with parallel computing technology based on CUDA, so it is especially suitable for measurements of particle fields at micrometer and nanometer scales. Simulations and experiments show that the proposed transmission digital holographic microscope can accurately measure and display the velocity of a particle field at micrometer scale, and the average velocity error is lower than 10%. With the graphics processing unit (GPU), the computing time for 100 reconstruction planes (512×512 grids) is below 120 ms, while it is 4.9 s using the traditional CPU-based reconstruction method. The reconstruction speed has thus been raised by a factor of 40. In other words, the system can handle holograms at 8.3 frames per second, and near real-time measurement and display of the particle velocity field are realized. Real-time three-dimensional reconstruction of the particle velocity field is expected to be achieved by further optimization of software and hardware. Keywords: digital holographic microscope,
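
    A typical reconstruction kernel for such a microscope is angular spectrum propagation, two FFTs per plane; a GPU implementation evaluates many planes of this same pipeline in parallel. A minimal CPU sketch in Python with numpy (the wavelength, pixel pitch, and distance are invented, not the paper's setup):

        import numpy as np

        def angular_spectrum(field0, wavelength, dx, z):
            # Propagate a complex field by distance z (angular spectrum method)
            n = field0.shape[0]
            k = 2 * np.pi / wavelength
            fx = np.fft.fftfreq(n, d=dx)
            FX, FY = np.meshgrid(fx, fx)
            kz_sq = k**2 - (2 * np.pi * FX)**2 - (2 * np.pi * FY)**2
            kz = np.sqrt(np.maximum(kz_sq, 0.0))     # drop evanescent waves
            return np.fft.ifft2(np.fft.fft2(field0) * np.exp(1j * kz * z))

        # Invented example: 512x512 hologram, 532 nm laser, 3.45 um pixels,
        # reconstruction plane 5 mm behind the sensor
        holo = np.random.rand(512, 512)              # stand-in for recorded data
        field = angular_spectrum(holo, 532e-9, 3.45e-6, 5e-3)
        intensity = np.abs(field) ** 2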

  14. Real-time context aware reasoning in on-board intelligent traffic systems: An Architecture for Ontology-based Reasoning using Finite State Machines

    NARCIS (Netherlands)

    Stoter, Arjan; Dalmolen, Simon; Drenth, Eduard; Cornelisse, Erik; Mulder, Wico

    2011-01-01

    In-vehicle information management is vital in intelligent traffic systems. In this paper we motivate an architecture for ontology-based context-aware reasoning for in-vehicle information management. An ontology is essential for system standardization and communication, and ontology-based reasoning

  15. The Potentials of Using Cloud Computing in Schools

    DEFF Research Database (Denmark)

    Hartmann, Simon Birk; Nygaard, Lotte Qulleq Victhoria; Pedersen, Sine

    2017-01-01

    Cloud Computing (CC) refers to the physical structure of a communications network, where data is saved and stored in large data centers and thus can be accessed anywhere, at any time and from different devices. It is evident that the integration and adoption of CC and discontinuation of an alternative ICT includes some underlying reasons. Optimistically, these reasons can be interpreted as the potentials of using cloud computing and as the functions or values that circumvent or solve some of the existing challenges with other forms of educational technology. This systematic literature review … from ERIC, IEEE Xplore, Science Direct and Primo, and after screening and eligibility checking, 13 articles focusing on “cloud computing and school” were included for qualitative analysis and meta-analysis. The papers are coded, from which 31 themes are devised, and five categories are made to group…

  16. Time reversibility, computer simulation, algorithms, chaos

    CERN Document Server

    Hoover, William Graham

    2012-01-01

    A small army of physicists, chemists, mathematicians, and engineers has joined forces to attack a classic problem, the "reversibility paradox", with modern tools. This book describes their work from the perspective of computer simulation, emphasizing the author's approach to the problem of understanding the compatibility, and even inevitability, of the irreversible second law of thermodynamics with an underlying time-reversible mechanics. Computer simulation has made it possible to probe reversibility from a variety of directions and "chaos theory" or "nonlinear dynamics" has supplied a useful vocabulary and a set of concepts, which allow a fuller explanation of irreversibility than that available to Boltzmann or to Green, Kubo and Onsager. Clear illustration of concepts is emphasized throughout, and reinforced with a glossary of technical terms from the specialized fields which have been combined here to focus on a common theme. The book begins with a discussion, contrasting the idealized reversibility of ba...

  17. A prognostic model for temporal courses that combines temporal abstraction and case-based reasoning.

    Science.gov (United States)

    Schmidt, Rainer; Gierl, Lothar

    2005-03-01

    Since clinical management of patients and clinical research are essentially time-oriented endeavours, reasoning about time has become a hot topic in medical informatics. Here we present a method for prognosis of temporal courses which combines temporal abstraction with case-based reasoning. It is useful for application domains where neither well-known standards, nor known periodicity, nor a complete domain theory exist. We have used our method in two prognostic applications. The first one deals with prognosis of the kidney function for intensive care patients. The idea is to detect impairments in time, especially to warn against threatening kidney failure. Our second application deals with a completely different domain, namely geographical medicine. Its intention is to compute early warnings against approaching infectious diseases, which are characterised by irregular cyclic occurrences. So far, we have applied our program to influenza and bronchitis. In this paper, we focus on influenza forecast and show first experimental results.
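
    The combination of temporal abstraction with case-based retrieval can be sketched compactly: abstract a measured course into qualitative trend symbols, then retrieve the most similar stored course. The Python fragment below is our generic illustration of that idea, not the authors' system; the thresholds, cases, and similarity measure are invented.

        def abstract_trend(series, eps=0.05):
            # Temporal abstraction: numeric course -> trend symbols
            # ('U' up, 'D' down, 'S' steady), by relative change
            out = []
            for prev, cur in zip(series, series[1:]):
                change = (cur - prev) / max(abs(prev), 1e-9)
                out.append('U' if change > eps else 'D' if change < -eps else 'S')
            return ''.join(out)

        def retrieve(query, case_base):
            # Case-based step: longest common prefix of trend strings
            # as a (crude) similarity measure
            qa = abstract_trend(query)
            def prefix_len(a, b):
                n = 0
                for s, t in zip(a, b):
                    if s != t:
                        break
                    n += 1
                return n
            return max(case_base,
                       key=lambda c: prefix_len(qa, abstract_trend(c["course"])))

        cases = [  # invented mini case base of daily lab-value courses
            {"course": [1.0, 1.1, 1.3, 1.7, 2.2], "outcome": "impairment developing"},
            {"course": [1.0, 1.0, 0.9, 1.0, 1.0], "outcome": "stable"},
        ]
        print(retrieve([1.1, 1.2, 1.5, 1.9], cases)["outcome"])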

  18. Reasoning and change management in modular ontologies

    NARCIS (Netherlands)

    Stuckenschmidt, Heiner; Klein, Michel

    2007-01-01

    The benefits of modular representations are well known from many areas of computer science. While in software engineering modularization is mainly a vehicle for supporting distributed development and re-use, in knowledge representation, the main goal of modularization is efficiency of reasoning. In

  19. Imaging deductive reasoning and the new paradigm

    Science.gov (United States)

    Oaksford, Mike

    2015-01-01

    There has been a great expansion of research into human reasoning at all of Marr's explanatory levels. There is a tendency for this work to progress within a level largely ignoring the others which can lead to slippage between levels (Chater et al., 2003). It is argued that recent brain imaging research on deductive reasoning (implementational level) has largely ignored the new paradigm in reasoning (computational level) (Over, 2009). Consequently, recent imaging results are reviewed with the focus on how they relate to the new paradigm. The imaging results are drawn primarily from a recent meta-analysis by Prado et al. (2011) but further imaging results are also reviewed where relevant. Three main observations are made. First, the main function of the core brain region identified is most likely elaborative, defeasible reasoning not deductive reasoning. Second, the subtraction methodology and the meta-analytic approach may remove all traces of content specific System 1 processes thought to underpin much human reasoning. Third, interpreting the function of the brain regions activated by a task depends on theories of the function that a task engages. When there are multiple interpretations of that function, interpreting what an active brain region is doing is not clear cut. It is concluded that there is a need to more tightly connect brain activation to function, which could be achieved using formalized computational level models and a parametric variation approach. PMID:25774130

  20. Temporal Reasoning and Default Logics.

    Science.gov (United States)

    1985-10-01

    References (fragment recovered from OCR): "... Artificial Intelligence", Computer Science Research Report, Yale University, forthcoming (1985); "Axioms for Describing Persistences and Clipping ..."; McCarthy, John, "Circumscription - A Form of Non-Monotonic Reasoning", Artificial Intelligence, vol. 13 (1980), pp. 27-39; [13] McCarthy, John, "Applications of ..."; McCarthy, J. and P. J. Hayes, "Some philosophical problems from the standpoint of artificial intelligence", in: B. Meltzer and D. Michie (eds.), Machine Intelligence.

  1. Rational Thinking and Reasonable Thinking in Physics

    Directory of Open Access Journals (Sweden)

    Isaeva E. A.

    2008-04-01

    The usual concept of space and time, based on Aristotle's principle of contemplation of the world and of the absoluteness of time, is a product of rational thinking. At the same time, in philosophy, rational thinking differs from reasonable thinking; the aim of logic is to distinguish finite forms from infinite forms. Agreeing that space and time are things of infinity in this work, we shall show that, with regard to these two things, it is necessary to apply reasonable thinking. Spaces with non-Euclidean geometry, for example Riemannian and Finslerian spaces, in particular the space of the General Theory of Relativity (four-dimensional pseudo-Riemannian geometry) and also the concept of multi-dimensional space-time, are products of reasonable thinking. Consequently, modern physical experiment, not dealing with daily occurrences (speeds comparable to the velocity of light, strong fields, singularities, etc.), can be covered only by reasonable thinking.

  2. Computation of reactor control rod drop time under accident conditions

    International Nuclear Information System (INIS)

    Dou Yikang; Yao Weida; Yang Renan; Jiang Nanyan

    1998-01-01

    The computational method for reactor control rod drop time under accident conditions consists mainly in establishing forced vibration equations for the components of the control rod drive line under the action of outside forces, and a motion equation for the control rod moving in the vertical direction. The above two kinds of equations are connected by considering the impact effects between the control rod and its surrounding components. The finite difference method is adopted for discretization of the vibration equations, and the Wilson-θ method is applied to deal with the time-history problem. The non-linearity caused by impact is treated iteratively with a modified Newton method. Some experimental results are used to validate the validity and reliability of the computational method. Theoretical and experimental testing problems show that the computer program based on the computational method is applicable and reliable. The program can act as an effective tool for design-by-analysis and safety analysis of the relevant components.

  3. Integrated software health management for aerospace guidance, navigation, and control systems: A probabilistic reasoning approach

    Science.gov (United States)

    Mbaya, Timmy

    Embedded aerospace systems have to perform safety and mission critical operations in a real-time environment where timing and functional correctness are extremely important. Guidance, Navigation, and Control (GN&C) systems substantially rely on complex software interfacing with hardware in real time; any faults in software or hardware, or their interaction, could result in fatal consequences. Integrated Software Health Management (ISWHM) provides an approach for detection and diagnosis of software failures while the software is in operation. The ISWHM approach is based on probabilistic modeling of software and hardware sensors using a Bayesian network. To meet the memory and timing constraints of real-time embedded execution, the Bayesian network is compiled into an Arithmetic Circuit, which is used for on-line monitoring. This type of system monitoring, using an ISWHM, provides automated reasoning capabilities that compute diagnoses in a timely manner when failures occur. This reasoning capability enables time-critical mitigating decisions and relieves the human agent from the time-consuming and arduous task of foraging through a multitude of isolated (and often contradictory) diagnosis data. For the purpose of demonstrating the relevance of ISWHM, modeling and reasoning are performed on a simple simulated aerospace system running on a real-time operating system emulator, the OSEK/Trampoline platform. Models for a small satellite and an F-16 fighter jet GN&C system have been implemented. Analysis of the ISWHM is then performed by injecting faults and analyzing the ISWHM's diagnoses.
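
    The inference step at the heart of an ISWHM, computing a posterior over health states from sensor readings, can be illustrated with a toy two-node Bayesian network (an arithmetic circuit compiled from such a network evaluates exactly this kind of closed-form expression). All probabilities below are invented for illustration and are not values from the record.

        # Toy diagnosis: P(fault | sensor alarm) for one software component.
        p_fault = 0.01                 # prior probability of a software fault (assumed)
        p_alarm_given_fault = 0.95     # alarm fires when the fault is present (assumed)
        p_alarm_given_ok = 0.02        # false-alarm rate (assumed)

        def fault_posterior(alarm: bool) -> float:
            """Bayes' rule for the two-node network Fault -> SensorAlarm."""
            like_fault = p_alarm_given_fault if alarm else 1.0 - p_alarm_given_fault
            like_ok = p_alarm_given_ok if alarm else 1.0 - p_alarm_given_ok
            joint_fault = like_fault * p_fault
            return joint_fault / (joint_fault + like_ok * (1.0 - p_fault))

        print(fault_posterior(True))   # ~0.32: a single alarm raises suspicion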

  4. Real-Time Accumulative Computation Motion Detectors

    Directory of Open Access Journals (Sweden)

    Saturnino Maldonado-Bascón

    2009-12-01

    Full Text Available The neurally inspired accumulative computation (AC) method and its application to motion detection have been introduced in the past years. This paper revisits the fact that many researchers have explored the relationship between neural networks and finite state machines. Indeed, finite state machines constitute the best characterized computational model, whereas artificial neural networks have become a very successful tool for modeling and problem solving. The article shows how to reach real-time performance after using a model described as a finite state machine. This paper introduces two steps towards that direction: (a) a simplification of the general AC method, performed by formally transforming it into a finite state machine; (b) a hardware implementation in FPGA of such a designed AC module, as well as of an 8-AC motion detector, providing promising performance results. We also offer two case studies of the use of AC motion detectors in surveillance applications, namely infrared-based people segmentation and color-based people tracking, respectively.
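
    The charge/discharge rule that makes accumulative computation amenable to a finite state machine formulation can be sketched per pixel as below. The saturation levels, step sizes, and threshold are illustrative assumptions following the common description of the AC method (charge toward saturation on detected change, discharge otherwise), not the FPGA design of the paper.

        import numpy as np

        Q_MAX, Q_MIN = 255, 0          # saturation and rest charge levels (assumed)
        DELTA_C, DELTA_D = 64, 16      # charge / discharge steps (assumed)
        THRESHOLD = 20                 # frame-difference threshold (assumed)

        def ac_update(charge, frame, prev_frame):
            """One accumulative-computation step over a grayscale frame pair.

            Pixels whose temporal difference exceeds THRESHOLD charge toward
            Q_MAX; all others discharge toward Q_MIN, so lingering charge acts
            as a short-term motion memory."""
            motion = np.abs(frame.astype(int) - prev_frame.astype(int)) > THRESHOLD
            charged = np.minimum(charge + DELTA_C, Q_MAX)
            discharged = np.maximum(charge - DELTA_D, Q_MIN)
            return np.where(motion, charged, discharged)

        # Initialize once with: charge = np.zeros(frame.shape, dtype=int)

    Because each pixel's next charge depends only on its current charge and a binary motion input, the rule is exactly a small finite state machine, which is what makes a direct hardware mapping possible.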

  5. Elucidating reaction mechanisms on quantum computers

    Science.gov (United States)

    Reiher, Markus; Wiebe, Nathan; Svore, Krysta M.; Wecker, Dave; Troyer, Matthias

    2017-01-01

    With rapid recent advances in quantum technology, we are close to the threshold of quantum devices whose computational powers can exceed those of classical supercomputers. Here, we show that a quantum computer can be used to elucidate reaction mechanisms in complex chemical systems, using the open problem of biological nitrogen fixation in nitrogenase as an example. We discuss how quantum computers can augment classical computer simulations used to probe these reaction mechanisms, to significantly increase their accuracy and enable hitherto intractable simulations. Our resource estimates show that, even when taking into account the substantial overhead of quantum error correction, and the need to compile into discrete gate sets, the necessary computations can be performed in reasonable time on small quantum computers. Our results demonstrate that quantum computers will be able to tackle important problems in chemistry without requiring exorbitant resources. PMID:28674011

  6. Application study of non-linear reasoning for nuclear plant operation

    International Nuclear Information System (INIS)

    Yoshikawa, Shinji; Yonekawa, Tsuyoshi; Suda, Kazunori; Hasegawa, Makoto

    1999-03-01

    Achieving improved operational safety in complex nuclear power plants through decision-making functions based on information processing technologies requires three things: real-time inference that keeps pace with plant state evolution, transparency that lets human operators understand the derived conclusions, and robustness against the local defects inevitably hidden in the huge amount of referenced information. These requirements cannot be met by simple rule-based reasoning. The study introduced in this report focuses on the robustness requirement. It derives the required features of robust decision making, after considering the possible contributions of fast-emerging measurement technologies. Finally, two concrete algorithms based on mutual evaluation of relatively simple diagnostic modules are implemented on computer systems and evaluated for applicability. (author)

  7. Concepts and Relations in Neurally Inspired In Situ Concept-Based Computing.

    Science.gov (United States)

    van der Velde, Frank

    2016-01-01

    In situ concept-based computing is based on the notion that conceptual representations in the human brain are "in situ." In this way, they are grounded in perception and action. Examples are neuronal assemblies, whose connection structures develop over time and are distributed over different brain areas. In situ concept representations cannot be copied or duplicated, because copying would disrupt their connection structure and thus the meaning of these concepts. Higher-level cognitive processes, as found in language and reasoning, can be performed with in situ concepts by embedding them in specialized neurally inspired "blackboards." The interactions between the in situ concepts and the blackboards form the basis for in situ concept computing architectures. In these architectures, memory (concepts) and processing are interwoven, in contrast with the separation between memory and processing found in Von Neumann architectures. Because the further development of Von Neumann computing (more, faster, yet power limited) is questionable, in situ concept computing might be an alternative for concept-based computing. In situ concept computing is illustrated with the recently developed bAbI reasoning task. Neurorobotics can play an important role in the development of in situ concept computing because it develops in situ concept representations in the kinds of scenarios needed for reasoning tasks. Neurorobotics would also benefit from power-limited and in situ concept computing.

  8. Television viewing, computer use and total screen time in Canadian youth.

    Science.gov (United States)

    Mark, Amy E; Boyce, William F; Janssen, Ian

    2006-11-01

    Research has linked excessive television viewing and computer use in children and adolescents to a variety of health and social problems. Current recommendations are that screen time in children and adolescents should be limited to no more than 2 h per day. The aim of this study was to determine the percentage of Canadian youth meeting the screen time guideline recommendations. The representative study sample consisted of 6942 Canadian youth in grades 6 to 10 who participated in the 2001/2002 World Health Organization Health Behaviour in School-Aged Children survey. Only 41% of girls and 34% of boys in grades 6 to 10 watched 2 h or less of television per day. Once the time of leisure computer use was included and total daily screen time was examined, only 18% of girls and 14% of boys met the guidelines. The prevalence of meeting the screen time guidelines was higher in girls than in boys. Fewer than 20% of Canadian youth in grades 6 to 10 met the total screen time guidelines, suggesting that increased public health interventions are needed to reduce the number of leisure time hours that Canadian youth spend watching television and using the computer.

  9. Neural Computations in a Dynamical System with Multiple Time Scales.

    Science.gov (United States)

    Mi, Yuanyuan; Lin, Xiaohan; Wu, Si

    2016-01-01

    Neural systems display rich short-term dynamics at various levels, e.g., spike-frequency adaptation (SFA) at the single-neuron level, and short-term facilitation (STF) and depression (STD) at the synapse level. These dynamical features typically cover a broad range of time scales and exhibit large diversity in different brain regions. It remains unclear what computational benefit the brain gains from such variability in short-term dynamics. In this study, we propose that the brain can exploit such dynamical features to implement multiple seemingly contradictory computations in a single neural circuit. To demonstrate this idea, we use a continuous attractor neural network (CANN) as a working model and include STF, SFA and STD with increasing time constants in its dynamics. Three computational tasks are considered: persistent activity, adaptation, and anticipative tracking. These tasks require conflicting neural mechanisms, and hence cannot be implemented by a single dynamical feature or any combination with similar time constants. However, with properly coordinated STF, SFA and STD, we show that the network is able to implement the three computational tasks concurrently. We hope this study will shed light on the understanding of how the brain orchestrates its rich dynamics at various levels to realize diverse cognitive functions.

  10. 12 CFR 516.10 - How does OTS compute time periods under this part?

    Science.gov (United States)

    2010-01-01

    12 CFR 516.10, Banks and Banking; Office of Thrift Supervision, Department of the Treasury; Application Processing Procedures, § 516.10: How does OTS compute time periods under this part? In computing...

  11. Student Opinion Survey On Delivery Of ECE431: Computer ...

    African Journals Online (AJOL)

    2017-09-10

    [Garbled survey-table residue; recoverable content: reasons for interest or non-interest in computer programming, e.g., successfully solving a problem or creating something new, and potential applications.]

  12. Accessible high performance computing solutions for near real-time image processing for time critical applications

    Science.gov (United States)

    Bielski, Conrad; Lemoine, Guido; Syryczynski, Jacek

    2009-09-01

    High Performance Computing (HPC) hardware solutions such as grid computing and General-Purpose computing on a Graphics Processing Unit (GPGPU) are now accessible to users with general computing needs. Grid computing infrastructures in the form of computing clusters or blades are becoming commonplace, and GPGPU solutions that leverage the processing power of the video card are quickly being integrated into personal workstations. Our interest in these HPC technologies stems from the need to produce near real-time maps from a combination of pre- and post-event satellite imagery in support of post-disaster management. Faster processing provides a twofold gain in this situation: (1) critical information can be provided faster, and (2) more elaborate automated processing can be performed prior to providing the critical information. In our particular case, we test the use of the PANTEX index, which is based on analysis of image textural measures extracted using anisotropic, rotation-invariant GLCM statistics. The use of this index, applied in a moving window, has been shown to successfully identify built-up areas in remotely sensed imagery. Built-up index image masks are important input to the structuring of damage assessment interpretation because they help optimise the workload. The performance of computing the PANTEX workflow is compared on two different HPC hardware architectures: (1) a blade server with 4 blades, each having dual quad-core CPUs, and (2) a CUDA-enabled GPU workstation. The reference platform is a dual-CPU quad-core workstation, and the total computing time of the PANTEX workflow is measured. Furthermore, as part of a qualitative evaluation, the differences in setting up and configuring the various hardware solutions and the related software coding effort are presented.
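
    The GLCM texture measure underlying PANTEX can be sketched with scikit-image. The 32-level quantization and the use of contrast minimized over displacement directions are illustrative assumptions (the published PANTEX index combines rotation-invariant GLCM statistics in its own specific way); note also that recent scikit-image versions spell the functions graycomatrix/graycoprops, while older ones used greycomatrix/greycoprops.

        import numpy as np
        from skimage.feature import graycomatrix, graycoprops

        def pantex_like(window: np.ndarray) -> float:
            """Rotation-invariant GLCM contrast over a uint8 image window.

            Taking the minimum over displacement directions yields a
            rotation-invariant measure, as in PANTEX-style built-up indices."""
            levels = 32
            q = (window // (256 // levels)).astype(np.uint8)   # quantize to 32 levels
            angles = [0, np.pi / 4, np.pi / 2, 3 * np.pi / 4]
            glcm = graycomatrix(q, distances=[1], angles=angles,
                                levels=levels, symmetric=True, normed=True)
            contrast = graycoprops(glcm, 'contrast')   # shape (1, n_angles)
            return float(contrast.min())

    Applying such a function in a moving window over a large scene is exactly the embarrassingly parallel workload that both the blade cluster and the CUDA GPU in the comparison are suited to.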

  13. Effect of the MCNP model definition on the computation time

    International Nuclear Information System (INIS)

    Šunka, Michal

    2017-01-01

    The presented work studies how the method of defining the geometry in the MCNP transport code affects the computation time, including the difficulty of preparing an input file describing the given geometry. Cases using different geometric definitions, including the use of basic 2-dimensional and 3-dimensional objects and their combinations, were studied. The results indicate that an inappropriate definition can increase the computational time by up to 59% (a more realistic case indicates 37%) for the same results and the same statistical uncertainty. (orig.)

  14. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  15. NNSA's Computing Strategy, Acquisition Plan, and Basis for Computing Time Allocation

    Energy Technology Data Exchange (ETDEWEB)

    Nikkel, D J

    2009-07-21

    This report is in response to the Omnibus Appropriations Act, 2009 (H.R. 1105; Public Law 111-8) in its funding of the National Nuclear Security Administration's (NNSA) Advanced Simulation and Computing (ASC) Program. This bill called for a report on ASC's plans for computing and platform acquisition strategy in support of stockpile stewardship. Computer simulation is essential to the stewardship of the nation's nuclear stockpile. Annual certification of the country's stockpile systems, Significant Finding Investigations (SFIs), and execution of Life Extension Programs (LEPs) are dependent on simulations employing the advanced ASC tools developed over the past decade plus; indeed, without these tools, certification would not be possible without a return to nuclear testing. ASC is an integrated program involving investments in computer hardware (platforms and computing centers), software environments, integrated design codes and physical models for these codes, and validation methodologies. The significant progress ASC has made in the past derives from its focus on mission and from its strategy of balancing support across the key investment areas necessary for success. All these investment areas must be sustained for ASC to adequately support current stockpile stewardship mission needs and to meet ever more difficult challenges as the weapons continue to age or undergo refurbishment. The appropriations bill called for this report to address three specific issues, which are responded to briefly here but are expanded upon in the subsequent document: (1) Identify how computing capability at each of the labs will specifically contribute to stockpile stewardship goals, and on what basis computing time will be allocated to achieve the goal of a balanced program among the labs. (2) Explain the NNSA's acquisition strategy for capacity and capability of machines at each of the labs and how it will fit within the existing budget constraints. (3

  16. Computing with concepts, computing with numbers: Llull, Leibniz, and Boole

    NARCIS (Netherlands)

    Uckelman, S.L.

    2010-01-01

    We consider two ways to understand "reasoning as computation", one which focuses on the computation of concept symbols and the other on the computation of number symbols. We illustrate these two ways with Llull’s Ars Combinatoria and Leibniz’s attempts to arithmetize language, respectively. We then

  17. LHC Computing Grid Project Launches into Action with International Support. A thousand times more computing power by 2006

    CERN Multimedia

    2001-01-01

    The first phase of the LHC Computing Grid project was approved at an extraordinary meeting of the Council on 20 September 2001. CERN is preparing for the unprecedented avalanche of data that will be produced by the Large Hadron Collider experiments. A thousand times more computer power will be needed by 2006! CERN's need for a dramatic advance in computing capacity is urgent. As from 2006, the four giant detectors observing trillions of elementary particle collisions at the LHC will accumulate over ten million Gigabytes of data, equivalent to the contents of about 20 million CD-ROMs, each year of its operation. A thousand times more computing power will be needed than is available to CERN today. The strategy the collaborations have adopted to analyse and store this unprecedented amount of data is the coordinated deployment of Grid technologies at hundreds of institutes which will be able to search out and analyse information from an interconnected worldwide grid of tens of thousands of computers and storag...

  1. Real-time brain computer interface using imaginary movements

    DEFF Research Database (Denmark)

    El-Madani, Ahmad; Sørensen, Helge Bjarup Dissing; Kjær, Troels W.

    2015-01-01

    Background: Brain Computer Interface (BCI) is the method of transforming mental thoughts and imagination into actions. A real-time BCI system can improve the quality of life of patients with severe neuromuscular disorders by enabling them to communicate with the outside world. In this paper...

  2. Benchmark study of the I-DYNEV evacuation time estimate computer code

    International Nuclear Information System (INIS)

    Urbanik, T. II; Moeller, M.P.; Barnes, K.

    1988-06-01

    This report compares observed vehicle movement on a highway network during periods of peak commuter traffic with a simulation of the traffic flow made with the I-DYNEV computer model. The purpose of the comparison is to determine whether the model can accurately simulate the patterns of vehicular movement and delay during congested commuter traffic. The results indicate that the I-DYNEV model adequately simulates the patterns of vehicular movement and delay associated with an evacuation, provided that the model's capacity reduction factor is treated as an input parameter. The current I-DYNEV model automatically reduces capacity by 15% of input capacity to account for congestion-induced losses; because the data set studied showed no such loss, the model underestimated capacity during congestion. Therefore, the use of a capacity reduction factor should be a decision made by the analysts, not the model. When I-DYNEV was used with a capacity reduction factor appropriate to the data set used (i.e., no reduction in capacity), it produced reasonable results. 3 refs., 18 figs., 3 tabs

  3. Designing to support reasoned imagination through embodied metaphor

    NARCIS (Netherlands)

    Antle, A.N. (Alissa); Corness, G.; Bakker, S.; Droumeva, M.; Hoven, van den E.A.W.H.; Bevans, A.; Bryan-Kinns, N.

    2009-01-01

    Supporting users' reasoned imagination in sense making during interaction with tangible and embedded computation involves supporting the application of their existing mental schemata in understanding new forms of interaction. Recent studies that include an embodied metaphor in the interaction model,

  4. Sorting on STAR. [CDC computer algorithm timing comparison

    Science.gov (United States)

    Stone, H. S.

    1978-01-01

    Timing comparisons are given for three sorting algorithms written for the CDC STAR computer. One algorithm is Hoare's (1962) Quicksort, which is the fastest or nearly the fastest sorting algorithm for most computers. A second algorithm is a vector version of Quicksort that takes advantage of the STAR's vector operations. The third algorithm is an adaptation of Batcher's (1968) sorting algorithm, which makes especially good use of vector operations but has a complexity of N(log N)-squared as compared with a complexity of N log N for the Quicksort algorithms. In spite of its worse complexity, Batcher's sorting algorithm is competitive with the serial version of Quicksort for vectors up to the largest that can be treated by STAR. Vector Quicksort outperforms the other two algorithms and is generally preferred. These results indicate that unusual instruction sets can introduce biases in program execution time that counter results predicted by worst-case asymptotic complexity analysis.
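
    Batcher's method suits vector machines because every stage is a fixed, data-independent pattern of compare-exchange operations, at a total cost of O(N(log N)^2). The compact iterative sketch below (valid for power-of-two lengths) is purely illustrative and is not the STAR vector code from the paper.

        def batcher_sort(a):
            """Batcher's odd-even merge sort; len(a) must be a power of two.

            All compare-exchange positions are fixed in advance, so each
            inner batch could be executed as one vector (or GPU) operation."""
            n = len(a)
            assert n & (n - 1) == 0, "length must be a power of two"
            p = 1
            while p < n:
                k = p
                while k >= 1:
                    for j in range(k % p, n - k, 2 * k):
                        for i in range(min(k, n - j - k)):
                            if (i + j) // (2 * p) == (i + j + k) // (2 * p):
                                if a[i + j] > a[i + j + k]:
                                    a[i + j], a[i + j + k] = a[i + j + k], a[i + j]
                    k //= 2
                p *= 2
            return a

        print(batcher_sort([7, 3, 1, 5, 4, 8, 2, 6]))   # [1, 2, 3, 4, 5, 6, 7, 8]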

  5. Overview of the history of neutronics computations

    International Nuclear Information System (INIS)

    Gelbard, E.M.

    1992-01-01

    This paper provides some background and perspective on the history of neutronics computations. The author started his career at Bettis Atomic Power Laboratory (BAPL) in 1954. At that time, computers were primitive and unfamiliar to most of us. Bettis did own an IBM Card Programmed Computer, which, as its name suggested, had a punched-card memory, and some of the industry's first cross-section preparation codes (the early MUFTs) were written for this machine by Ben Mount. There was also at BAPL another digital machine, which spewed output everywhere on punched tape. But the laboratory's most powerful machine was an analog computer that solved two-group diffusion equations in two dimensions. An excellent physicist of that period argued that analog machines were the wave of the future because digital computers could never solve the diffusion equation in two dimensions. At the time, this seemed like a very reasonable prediction; yet it was not very long before Bettis was renting time on UNIVACs all over the country precisely to solve the two-dimensional diffusion equation. Since then, we have learned that it is always risky to predict future trends in the computer industry

  6. Variation in computer time with geometry prescription in monte carlo code KENO-IV

    International Nuclear Information System (INIS)

    Gopalakrishnan, C.R.

    1988-01-01

    In most studies, the Monte Carlo criticality code KENO-IV has been compared with other Monte Carlo codes, but evaluation of its performance with different box descriptions has not been done so far. In Monte Carlo computations, any fractional savings of computing time is highly desirable. Variation in computation time with box description in KENO for two different fast reactor fuel subassemblies of FBTR and PFBR is studied. The K eff of an infinite array of fuel subassemblies is calculated by modelling the subassemblies in two different ways (i) multi-region, (ii) multi-box. In addition to these two cases, excess reactivity calculations of FBTR are also performed in two ways to study this effect in a complex geometry. It is observed that the K eff values calculated by multi-region and multi-box models agree very well. However the increase in computation time from the multi-box to the multi-region is considerable, while the difference in computer storage requirements for the two models is negligible. This variation in computing time arises from the way the neutron is tracked in the two cases. (author)

  7. On the logos a naïve view on ordinary reasoning and fuzzy logic

    CERN Document Server

    Trillas, Enric

    2017-01-01

    This book offers an inspiring and naïve view on language and reasoning. It presents a new approach to ordinary reasoning that follows the author’s former work on fuzzy logic. Starting from a pragmatic scientific view on meaning as a quantity, and the common sense reasoning from a primitive notion of inference, which is shared by both laypeople and experts, the book shows how this can evolve, through the addition of more and more suppositions, into various formal and specialized modes of precise, imprecise, and approximate reasoning. The logos are intended here as a synonym for rationality, which is usually shown by the processes of questioning, guessing, telling, and computing. Written in a discursive style and without too many technicalities, the book presents a number of reflections on the study of reasoning, together with a new perspective on fuzzy logic and Zadeh’s “computing with words” grounded in both language and reasoning. It also highlights some mathematical developments supporting this vie...

  8. Exploring High School Students Beginning Reasoning about Significance Tests with Technology

    Science.gov (United States)

    García, Víctor N.; Sánchez, Ernesto

    2017-01-01

    In the present study we analyze how students reason about or make inferences given a particular hypothesis testing problem (without having studied formal methods of statistical inference) when using Fathom. They use Fathom to create an empirical sampling distribution through computer simulation. It is found that most students' reasoning relies on…
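
    The simulation approach the students use in Fathom, building an empirical sampling distribution under the null hypothesis and locating the observed statistic within it, can be sketched in a few lines. The coin-flipping scenario and all numbers below are illustrative and are not the task from the study.

        import random

        # Null hypothesis: a fair coin (p = 0.5). Observed: 16 heads in 20 flips.
        observed, n_flips, n_sims = 16, 20, 10_000

        # Empirical sampling distribution of the head count under the null.
        counts = [sum(random.random() < 0.5 for _ in range(n_flips))
                  for _ in range(n_sims)]
        p_value = sum(c >= observed for c in counts) / n_sims
        print(f"simulated one-sided p-value ~ {p_value:.4f}")   # ~0.006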

  9. SLMRACE: a noise-free RACE implementation with reduced computational time

    Science.gov (United States)

    Chauvin, Juliet; Provenzi, Edoardo

    2017-05-01

    We present a faster and noise-free implementation of the RACE algorithm. RACE has mixed characteristics between the famous Retinex model of Land and McCann and the automatic color equalization (ACE) color-correction algorithm. The original random spray-based RACE implementation suffers from two main problems: its computational time and the presence of noise. Here, we will show that it is possible to adapt two techniques recently proposed by Banić et al. to the RACE framework in order to drastically decrease the computational time and noise generation. The implementation will be called smart-light-memory-RACE (SLMRACE).

  10. Reason destroys itself

    CERN Multimedia

    Penrose, Roger

    2008-01-01

    "Do we know for certain that 2 lus 2 equals 4? Of course we don't. Maybe every time everybody in the whole world has ever done that calculation and reasoned it through, they've made a mistake." (1 page0

  11. Efficient Geo-Computational Algorithms for Constructing Space-Time Prisms in Road Networks

    Directory of Open Access Journals (Sweden)

    Hui-Ping Chen

    2016-11-01

    Full Text Available The space-time prism (STP) is a key concept in time geography for analyzing human activity-travel behavior under various space-time constraints. Most existing time-geographic studies use a straightforward algorithm to construct STPs in road networks by using two one-to-all shortest path searches. However, this straightforward algorithm can introduce considerable computational overhead, given the fact that accessible links in a STP are generally a small portion of the whole network. To address this issue, an efficient geo-computational algorithm, called NTP-A*, is proposed. The proposed NTP-A* algorithm employs the A* and branch-and-bound techniques to discard inaccessible links during two shortest path searches, and thereby improves the STP construction performance. Comprehensive computational experiments are carried out to demonstrate the computational advantage of the proposed algorithm. Several implementation techniques, including the label-correcting technique and the hybrid link-node labeling technique, are discussed and analyzed. Experimental results show that the proposed NTP-A* algorithm can significantly improve STP construction performance in large-scale road networks by a factor of 100, compared with existing algorithms.
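
    The straightforward construction that NTP-A* improves on can be made concrete: run one one-to-all shortest path search from the prism's origin anchor and one from its destination anchor, then keep the nodes whose combined travel time fits the time budget. The sketch below uses plain Dijkstra on a toy undirected network; names and numbers are illustrative, and the paper's A*/branch-and-bound pruning is not reproduced.

        import heapq

        def dijkstra(graph, source):
            """One-to-all shortest travel times; graph[u] = {v: travel_time}."""
            dist = {source: 0.0}
            heap = [(0.0, source)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, float("inf")):
                    continue
                for v, w in graph[u].items():
                    if d + w < dist.get(v, float("inf")):
                        dist[v] = d + w
                        heapq.heappush(heap, (d + w, v))
            return dist

        def stp_nodes(graph, origin, dest, budget):
            """Accessible nodes of a network-time prism: j is accessible iff
            t(origin, j) + t(j, dest) <= budget (undirected network assumed)."""
            from_o = dijkstra(graph, origin)
            to_d = dijkstra(graph, dest)
            inf = float("inf")
            return {j for j in graph
                    if from_o.get(j, inf) + to_d.get(j, inf) <= budget}

        # Toy network with travel times in minutes; illustrative only.
        g = {"A": {"B": 5, "C": 9}, "B": {"A": 5, "C": 3, "D": 7},
             "C": {"A": 9, "B": 3, "D": 4}, "D": {"B": 7, "C": 4}}
        print(stp_nodes(g, "A", "D", budget=13))   # all four nodes fit the budget

    Since accessible links are typically a small portion of the network, pruning the two searches (as NTP-A* does) avoids most of this work.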

  12. Computational Procedures for a Class of GI/D/k Systems in Discrete Time

    Directory of Open Access Journals (Sweden)

    Md. Mostafizur Rahman

    2009-01-01

    Full Text Available A class of discrete time GI/D/k systems is considered for which the interarrival times have finite support and customers are served in first-in first-out (FIFO) order. The system is formulated as a single server queue with new general independent interarrival times and constant service duration by assuming cyclic assignment of customers to the identical servers. Then the queue length is set up as a quasi-birth-death (QBD)-type Markov chain. It is shown that this transformed GI/D/1 system has special structures which make the computation of the matrix R simple and efficient, thereby reducing the number of multiplications in each iteration significantly. As a result we were able to keep the computation time very low. Moreover, use of the resulting structural properties makes the computation of the distribution of queue length of the transformed system efficient. The computation of the distribution of waiting time is also shown to be simple by exploiting the special structures.
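
    For a discrete-time QBD chain with level-transition blocks A0 (up one level), A1 (same level), and A2 (down one level), the matrix R mentioned above is the minimal nonnegative solution of R = A0 + R A1 + R^2 A2, and the stationary queue-length tail then satisfies pi_{n+1} = pi_n R. The generic successive-substitution sketch below uses invented 2x2 blocks and does not exploit the special structure that makes the paper's computation cheap.

        import numpy as np

        def qbd_rate_matrix(A0, A1, A2, tol=1e-12, max_iter=100_000):
            """Minimal nonnegative solution of R = A0 + R A1 + R^2 A2
            (discrete-time QBD), by successive substitution from R = 0."""
            R = np.zeros_like(A0)
            for _ in range(max_iter):
                R_next = A0 + R @ A1 + R @ R @ A2
                if np.max(np.abs(R_next - R)) < tol:
                    return R_next
                R = R_next
            raise RuntimeError("R iteration did not converge")

        # Invented phase blocks; each row of A0 + A1 + A2 sums to 1 (stable regime).
        A0 = np.array([[0.10, 0.05], [0.05, 0.10]])   # level up
        A1 = np.array([[0.30, 0.10], [0.10, 0.30]])   # same level
        A2 = np.array([[0.35, 0.10], [0.15, 0.30]])   # level down
        R = qbd_rate_matrix(A0, A1, A2)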

  13. In-Network Computation is a Dumb Idea Whose Time Has Come

    KAUST Repository

    Sapio, Amedeo; Abdelaziz, Ibrahim; Aldilaijan, Abdulla; Canini, Marco; Kalnis, Panos

    2017-01-01

    Programmable data plane hardware creates new opportunities for infusing intelligence into the network. This raises a fundamental question: what kinds of computation should be delegated to the network? In this paper, we discuss the opportunities and challenges for co-designing data center distributed systems with their network layer. We believe that the time has finally come for offloading part of their computation to execute in-network. However, in-network computation tasks must be judiciously crafted to match the limitations of the network machine architecture of programmable devices. With the help of our experiments on machine learning and graph analytics workloads, we identify that aggregation functions raise opportunities to exploit the limited computation power of networking hardware to lessen network congestion and improve the overall application performance. Moreover, as a proof-of-concept, we propose DAIET, a system that performs in-network data aggregation. Experimental results with an initial prototype show a large data reduction ratio (86.9%-89.3%) and a similar decrease in the workers' computation time.

  14. Icon arrays help younger children's proportional reasoning.

    Science.gov (United States)

    Ruggeri, Azzurra; Vagharchakian, Laurianne; Xu, Fei

    2018-06-01

    We investigated the effects of two context variables, presentation format (icon arrays or numerical frequencies) and time limitation (limited or unlimited time), on the proportional reasoning abilities of children aged 7 and 10 years, as well as adults. Participants had to select, between two sets of tokens, the one that offered the highest likelihood of drawing a gold token, that is, the set of elements with the greater proportion of gold tokens. Results show that participants performed better in the unlimited time condition. Moreover, besides a general developmental improvement in accuracy, our results show that younger children performed better when proportions were presented as icon arrays, whereas older children and adults were similarly accurate in the two presentation format conditions. Statement of contribution What is already known on this subject? There is a developmental improvement in proportional reasoning accuracy. Icon arrays facilitate reasoning in adults with low numeracy. What does this study add? Participants were more accurate when they were given more time to make the proportional judgement. Younger children's proportional reasoning was more accurate when they were presented with icon arrays. Proportional reasoning abilities correlate with working memory, approximate number system, and subitizing skills. © 2018 The British Psychological Society.

  15. Computational Fair Division

    DEFF Research Database (Denmark)

    Branzei, Simina

    Fair division is a fundamental problem in economic theory and one of the oldest questions faced through the history of human society. The high level scenario is that of several participants having to divide a collection of resources such that everyone is satisfied with their allocation, e.g. two heirs dividing a car, house, and piece of land inherited. The literature on fair division was developed in the 20th century in mathematics and economics, but computational work on fair division is still sparse. This thesis can be seen as an excursion in computational fair division divided in two parts. The first part tackles the cake cutting problem, where the cake is a metaphor for a heterogeneous divisible resource such as land, time, mineral deposits, and computer memory. We study the equilibria of classical protocols and design an algorithmic framework for reasoning about their game theoretic...

  16. Reasons Internalism and the function of normative reasons

    OpenAIRE

    Sinclair, Neil

    2017-01-01

    What is the connection between reasons and motives? According to Reasons Internalism there is a non-trivial conceptual connection between normative reasons and the possibility of rationally accessing relevant motivation. Reasons Internalism is attractive insofar as it captures the thought that reasons are for reasoning with and repulsive insofar as it fails to generate sufficient critical distance between reasons and motives. Rather than directly adjudicate this dispute, I extract from it two...

  17. Reduced computational cost in the calculation of worst case response time for real time systems

    OpenAIRE

    Urriza, José M.; Schorb, Lucas; Orozco, Javier D.; Cayssials, Ricardo

    2009-01-01

    Modern Real Time Operating Systems require reducing computational costs even though the microprocessors become more powerful each day. It is usual that Real Time Operating Systems for embedded systems have advanced features to administer the resources of the applications that they support. In order to guarantee either the schedulability of the system or the schedulability of a new task in a dynamic Real Time System, it is necessary to know the Worst Case Response Time of the Real Time tasks ...
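
    The quantity in question, the worst case response time under fixed-priority preemptive scheduling, is classically the least fixed point of R_i = C_i + sum over higher-priority tasks j of ceil(R_i / T_j) * C_j. The sketch below implements that classic iteration (not the reduced-cost method this paper proposes); the task set is illustrative, with deadlines equal to periods.

        import math

        def wcrt(tasks):
            """Worst case response times for tasks given as (C, T) pairs,
            ordered from highest to lowest priority (deadline = period)."""
            response = []
            for i, (c_i, t_i) in enumerate(tasks):
                r = c_i
                while True:
                    nxt = c_i + sum(math.ceil(r / t_j) * c_j
                                    for c_j, t_j in tasks[:i])
                    if nxt == r:
                        response.append(r)      # fixed point: task schedulable
                        break
                    if nxt > t_i:
                        response.append(None)   # task misses its deadline
                        break
                    r = nxt
            return response

        print(wcrt([(1, 4), (2, 6), (3, 12)]))   # [1, 3, 10]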

  18. Finite Difference Time Domain (FDTD) Simulations Using Graphics Processors

    National Research Council Canada - National Science Library

    Adams, Samuel; Payne, Jason; Boppana, Rajendra

    2007-01-01

    ... This paper shows how GPUs can be used to greatly speed up FDTD simulations. The main objective is to leverage GPU processing power for FDTD update calculations and complete computationally expensive simulations in reasonable time...
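
    The reason FDTD maps so well to GPUs is that its update is an embarrassingly parallel stencil. A minimal 1-D free-space sketch in vectorized NumPy is shown below (normalized units, illustrative sizes); the two update lines are exactly the per-cell work a CUDA kernel would perform.

        import numpy as np

        # 1-D FDTD in free space, normalized units (Courant number 0.5).
        nz, n_steps, courant = 400, 1000, 0.5
        ez = np.zeros(nz)          # electric field on the integer grid
        hy = np.zeros(nz - 1)      # magnetic field on the staggered half-grid

        for step in range(n_steps):
            # Update H from the curl of E, then E from the curl of H.
            hy += courant * (ez[1:] - ez[:-1])
            ez[1:-1] += courant * (hy[1:] - hy[:-1])
            # Soft Gaussian source injected at one cell.
            ez[nz // 4] += np.exp(-((step - 30) / 10.0) ** 2)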

  19. Point-of-views representation for hypothetical reasoning: application to decision-aid

    International Nuclear Information System (INIS)

    Diaz, Antoine

    1992-01-01

    Most knowledge-based Decision Support Systems must deal with two difficulties in representing problem solving: reasoning with incomplete knowledge and managing contradictory reasoning. We propose a method which answers the question of reasoning revision when a contradiction occurs, while preserving the functionalities of De Kleer's ATMS system for simulating hypothetical reasoning. As a matter of fact, these functionalities are particularly suitable for decision-aiding problems. In order to formalize the ATMS, we use a resolution method called Cat-resolution (Cayrol and Tayrac). This method allows the computation of ATMS functions relating to a set of propositional clauses by saturating this set. Owing to this choice, we can use the same principles as the ATMS on the saturation trace. Each clause in the saturated set can be linked to the sets of initial clauses justifying its derivation by Cat-resolution. Reasoning inconsistency is now managed: first, the user can identify the source of the inconsistency thanks to the empty clause explanation; then he can try to restore reasoning consistency by relaxing at least one of the initial clauses justifying the empty clause. The computation of 'partial' ATMS, representing a point of view in the decision-making problem, is more effective owing to the justifications of the derived clauses. (author) [fr]

  1. Multiscale Methods, Parallel Computation, and Neural Networks for Real-Time Computer Vision.

    Science.gov (United States)

    Battiti, Roberto

    1990-01-01

    This thesis presents new algorithms for low and intermediate level computer vision. The guiding ideas in the presented approach are those of hierarchical and adaptive processing, concurrent computation, and supervised learning. Processing of the visual data at different resolutions is used not only to reduce the amount of computation necessary to reach the fixed point, but also to produce a more accurate estimation of the desired parameters. The presented adaptive multiple scale technique is applied to the problem of motion field estimation. Different parts of the image are analyzed at a resolution that is chosen in order to minimize the error in the coefficients of the differential equations to be solved. Tests with video-acquired images show that velocity estimation is more accurate over a wide range of motion with respect to the homogeneous scheme. In some cases introduction of explicit discontinuities coupled to the continuous variables can be used to avoid propagation of visual information from areas corresponding to objects with different physical and/or kinematic properties. The human visual system uses concurrent computation in order to process the vast amount of visual data in "real-time." Although with different technological constraints, parallel computation can be used efficiently for computer vision. All the presented algorithms have been implemented on medium grain distributed memory multicomputers with a speed-up approximately proportional to the number of processors used. A simple two-dimensional domain decomposition assigns regions of the multiresolution pyramid to the different processors. The inter-processor communication needed during the solution process is proportional to the linear dimension of the assigned domain, so that efficiency is close to 100% if a large region is assigned to each processor. Finally, learning algorithms are shown to be a viable technique to engineer computer vision systems for different applications starting from

  2. Do players reason by forward induction in dynamic perfect information games?

    Directory of Open Access Journals (Sweden)

    Sujata Ghosh

    2016-06-01

    Full Text Available We conducted an experiment where participants played a perfect-information game against a computer, which was programmed to deviate often from its backward induction strategy right at the beginning of the game. Participants knew that in each game, the computer was nevertheless optimizing against some belief about the participant's future strategy. It turned out that in the aggregate, participants were likely to respond in a way which is optimal with respect to their best-rationalization extensive form rationalizability conjecture - namely the conjecture that the computer is after a larger prize than the one it has foregone, even when this necessarily meant that the computer has attributed future irrationality to the participant when the computer made the first move in the game. Thus, it appeared that participants applied forward induction. However, there exist alternative explanations for the choices of most participants; for example, choices could be based on the extent of risk aversion that participants attributed to the computer in the remainder of the game, rather than to the sunk outside option that the computer has already foregone at the beginning of the game. For this reason, the results of the experiment do not yet provide conclusive evidence for Forward Induction reasoning on the part of the participants.

  3. Supporting Students' Learning and Socioscientific Reasoning About Climate Change—the Effect of Computer-Based Concept Mapping Scaffolds

    Science.gov (United States)

    Eggert, Sabina; Nitsch, Anne; Boone, William J.; Nückles, Matthias; Bögeholz, Susanne

    2017-02-01

    Climate change is one of the most challenging problems facing today's global society (e.g., IPCC 2013). While climate change is a widely covered topic in the media, and abundant information is made available through the internet, the causes and consequences of climate change in its full complexity are difficult for individuals, especially non-scientists, to grasp. Science education is a field which can play a crucial role in fostering meaningful education of students to become climate literate citizens (e.g., NOAA 2009; Schreiner et al., 41, 3-50, 2005). If students are, at some point, to participate in societal discussions about the sustainable development of our planet, their learning with respect to such issues needs to be supported. This includes the ability to think critically, to cope with complex scientific evidence, which is often subject to ongoing inquiry, and to reach informed decisions on the basis of factual information as well as values-based considerations. The study presented in this paper focused on efforts to advance students in (1) their conceptual understanding about climate change and (2) their socioscientific reasoning and decision making regarding socioscientific issues in general. Although there is evidence that "knowledge" does not guarantee pro-environmental behavior (e.g. Schreiner et al., 41, 3-50, 2005; Skamp et al., 97(2), 191-217, 2013), conceptual, interdisciplinary understanding of climate change is an important prerequisite to change individuals' attitudes towards climate change and thus to eventually foster climate literate citizens (e.g., Clark et al. 2013). In order to foster conceptual understanding and socioscientific reasoning, a computer-based learning environment with an embedded concept mapping tool was utilized to support senior high school students' learning about climate change and possible solution strategies. The evaluation of the effect of different concept mapping scaffolds focused on the quality of student

  4. Relational Reasoning for Markov Chains in a Probabilistic Guarded Lambda Calculus

    DEFF Research Database (Denmark)

    Aguirre, Alejandro; Barthe, Gilles; Birkedal, Lars

    2018-01-01

    We extend the simply-typed guarded λ-calculus with discrete probabilities and endow it with a program logic for reasoning about relational properties of guarded probabilistic computations. This provides a framework for programming and reasoning about infinite stochastic processes like Markov chains. ... literature to justify better proof rules for relational reasoning about probabilistic expressions. We illustrate these benefits with a broad range of examples that were beyond the scope of previous systems, including shift couplings and lump couplings between random walks.

  5. A strategy for reducing turnaround time in design optimization using a distributed computer system

    Science.gov (United States)

    Young, Katherine C.; Padula, Sharon L.; Rogers, James L.

    1988-01-01

    There is a need to explore methods for reducing lengthy computer turnaround or clock time associated with engineering design problems. Different strategies can be employed to reduce this turnaround time. One strategy is to run validated analysis software on a network of existing smaller computers so that portions of the computation can be done in parallel. This paper focuses on the implementation of this method using two types of problems. The first type is a traditional structural design optimization problem, which is characterized by a simple data flow and a complicated analysis. The second type of problem uses an existing computer program designed to study multilevel optimization techniques. This problem is characterized by complicated data flow and a simple analysis. The paper shows that distributed computing can be a viable means for reducing computational turnaround time for engineering design problems that lend themselves to decomposition. Parallel computing can be accomplished with a minimal cost in terms of hardware and software.

  6. Computing the Maximum Detour of a Plane Graph in Subquadratic Time

    DEFF Research Database (Denmark)

    Wulff-Nilsen, Christian

    Let G be a plane graph where each edge is a line segment. We consider the problem of computing the maximum detour of G, defined as the maximum over all pairs of distinct points p and q of G of the ratio between the distance between p and q in G and the distance |pq|. The fastest known algorithm for this problem has O(n^2) running time. We show how to obtain O(n^{3/2}*(log n)^3) expected running time. We also show that if G has bounded treewidth, its maximum detour can be computed in O(n*(log n)^3) expected time.
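
    The quantity can be made concrete with the naive quadratic algorithm the paper improves on: compare graph distance with Euclidean distance over all pairs. The sketch below restricts the maximum to vertex pairs for brevity (the true detour ranges over all points of the graph) and uses an illustrative toy graph.

        import heapq, math

        def shortest_dists(adj, s):
            """Dijkstra on a plane graph; adj[u] = {v: euclidean edge length}."""
            dist = {s: 0.0}
            heap = [(0.0, s)]
            while heap:
                d, u = heapq.heappop(heap)
                if d > dist.get(u, math.inf):
                    continue
                for v, w in adj[u].items():
                    if d + w < dist.get(v, math.inf):
                        dist[v] = d + w
                        heapq.heappush(heap, (d + w, v))
            return dist

        def max_detour_vertices(points, adj):
            """Naive detour over vertex pairs: max d_G(p, q) / |pq|."""
            best = 1.0
            for p in adj:
                dist = shortest_dists(adj, p)
                for q in adj:
                    if q != p:
                        best = max(best, dist[q] / math.dist(points[p], points[q]))
            return best

        # Unit square with no diagonals: opposite corners give detour 2/sqrt(2).
        pts = {0: (0, 0), 1: (1, 0), 2: (1, 1), 3: (0, 1)}
        adj = {0: {1: 1.0, 3: 1.0}, 1: {0: 1.0, 2: 1.0},
               2: {1: 1.0, 3: 1.0}, 3: {0: 1.0, 2: 1.0}}
        print(max_detour_vertices(pts, adj))   # ~1.414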

  7. Further improvement in ABWR (part-4) open distributed plant process computer system

    International Nuclear Information System (INIS)

    Makino, Shigenori; Hatori, Yoshinori

    1999-01-01

    In the Japanese nuclear industry, the electric power companies have promoted plant process computer (PPC) technology for nuclear power plants (NPPs). When the PPC was first introduced to NPPs, large-scale customized computers were applied because of very tight requirements such as high reliability and high-speed processing. In the recent computer field, the large commercial market has driven remarkable progress in engineering workstation (EWS) and personal computer (PC) technology. Moreover, because data transmission technology has been progressing at the same time, worldwide computer networks have been established. Thanks to the progress of both technologies, distributed computer systems have become available at reasonable prices. Tokyo Electric Power Company (TEPCO) is therefore trying to apply such a system to the PPC of NPPs. (author)

  8. Reasoning with probabilistic and deterministic graphical models exact algorithms

    CERN Document Server

    Dechter, Rina

    2013-01-01

    Graphical models (e.g., Bayesian and constraint networks, influence diagrams, and Markov decision processes) have become a central paradigm for knowledge representation and reasoning in both artificial intelligence and computer science in general. These models are used to perform many reasoning tasks, such as scheduling, planning and learning, diagnosis and prediction, design, hardware and software verification, and bioinformatics. These problems can be stated as the formal tasks of constraint satisfaction and satisfiability, combinatorial optimization, and probabilistic inference. It is well

  9. Applications of parallel computer architectures to the real-time simulation of nuclear power systems

    International Nuclear Information System (INIS)

    Doster, J.M.; Sills, E.D.

    1988-01-01

    In this paper the authors report on efforts to utilize parallel computer architectures for the thermal-hydraulic simulation of nuclear power systems and current research efforts toward the development of advanced reactor operator aids and control systems based on this new technology. Many aspects of reactor thermal-hydraulic calculations are inherently parallel, and the computationally intensive portions of these calculations can be effectively implemented on modern computers. Timing studies indicate faster-than-real-time, high-fidelity physics models can be developed when the computational algorithms are designed to take advantage of the computer's architecture. These capabilities allow for the development of novel control systems and advanced reactor operator aids. Coupled with an integral real-time data acquisition system, evolving parallel computer architectures can provide operators and control room designers improved control and protection capabilities. Research efforts are currently under way in this area.

  10. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Deboever, Jeremiah [Georgia Inst. of Technology, Atlanta, GA (United States); Zhang, Xiaochen [Georgia Inst. of Technology, Atlanta, GA (United States); Reno, Matthew J. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Broderick, Robert Joseph [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Grijalva, Santiago [Georgia Inst. of Technology, Atlanta, GA (United States); Therrien, Francis [CME International T&D, St. Bruno, QC (Canada)

    2017-06-01

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.
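
    The scale of the burden is easy to make concrete: a year at 1-second resolution means 365 x 24 x 3600 = 31,536,000 sequential power flows, so per-step solve times of a few milliseconds already imply tens of hours. The per-flow times below are assumed figures chosen to bracket the 10 to 120 hour range quoted above.

        # Back-of-envelope cost of a yearlong 1-second QSTS run.
        steps = 365 * 24 * 3600             # 31,536,000 sequential power flows
        for ms_per_flow in (1, 5, 14):      # assumed per-step solve times
            hours = steps * ms_per_flow / 1000 / 3600
            print(f"{ms_per_flow} ms per power flow -> {hours:.1f} h")
        # 1 ms -> 8.8 h, 5 ms -> 43.8 h, 14 ms -> 122.6 h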

  11. 8th International Workshop on Non-Monotonic Reasoning

    CERN Document Server

    Truszczynski, Mirek

    2000-01-01

    The papers gathered in this collection were presented at the 8th International Workshop on Nonmonotonic Reasoning, NMR2000. The series was started by John McCarthy in 1978. The first international NMR workshop was held at Mohonk Mountain House, New Paltz, New York in June, 1984, and was organized by Ray Reiter and Bonnie Webber. In the last 10 years the area of nonmonotonic reasoning has seen a number of important developments. Significant theoretical advances were made in the understanding of general abstract principles underlying nonmonotonicity. Key results on the expressibility and computational complexity of nonmonotonic logics were established. The role of nonmonotonic reasoning in belief revision, abduction, reasoning about action, planing and uncertainty was further clarified. Several successful NMR systems were built and used in applications such as planning, scheduling, logic programming and constraint satisfaction. The papers in the proceedings reflect these recent advances in the field. They are g...

  12. Soft Computing Techniques for the Protein Folding Problem on High Performance Computing Architectures.

    Science.gov (United States)

    Llanes, Antonio; Muñoz, Andrés; Bueno-Crespo, Andrés; García-Valverde, Teresa; Sánchez, Antonia; Arcas-Túnez, Francisco; Pérez-Sánchez, Horacio; Cecilia, José M

    2016-01-01

    The protein-folding problem has been extensively studied during the last fifty years. Understanding the dynamics of a protein's global shape and its influence on biological function can help us discover new and more effective drugs for diseases of pharmacological relevance. Different computational approaches have been developed to predict the three-dimensional arrangement of a protein's atoms from its sequence. However, the computational complexity of this problem makes mandatory the search for new models, novel algorithmic strategies and hardware platforms that provide solutions in a reasonable time frame. We present in this review the past and current trends in protein folding simulations from both perspectives: hardware and software. Of particular interest to us are both the use of inexact solutions to this computationally hard problem and the hardware platforms that have been used for running this kind of Soft Computing technique.

  13. Nonmonotonic Reasoning as a Temporal Activity

    OpenAIRE

    Schwartz, Daniel G.

    2014-01-01

    A dynamic reasoning system (DRS) is an adaptation of a conventional formal logical system that explicitly portrays reasoning as a temporal activity, with each extralogical input to the system and each inference rule application being viewed as occurring at a distinct time step. Every DRS incorporates some well-defined logic together with a controller that serves to guide the reasoning process in response to user inputs. Logics are generic, whereas controllers are application-specific. E...

  14. Computationally determining the salience of decision points for real-time wayfinding support

    Directory of Open Access Journals (Sweden)

    Makoto Takemiya

    2012-06-01

    This study introduces the concept of computational salience to explain the discriminatory efficacy of decision points, which in turn may have applications to providing real-time assistance to users of navigational aids. This research compared algorithms for calculating the computational salience of decision points and validated the results via three methods: high-salience decision points were used to classify wayfinders; salience scores were used to weight a conditional probabilistic scoring function for real-time wayfinder performance classification; and salience scores were correlated with wayfinding-performance metrics. As an exploratory step to linking computational and cognitive salience, a photograph-recognition experiment was conducted. Results reveal a distinction between algorithms useful for determining computational and cognitive saliences. For computational salience, information about the structural integration of decision points is effective, while information about the probability of decision-point traversal shows promise for determining cognitive salience. Limitations from only using structural information and motivations for future work that include non-structural information are elicited.

  15. Computation of transit times using the milestoning method with applications to polymer translocation

    Science.gov (United States)

    Hawk, Alexander T.; Konda, Sai Sriharsha M.; Makarov, Dmitrii E.

    2013-08-01

    Milestoning is an efficient approximation for computing long-time kinetics and thermodynamics of large molecular systems, which are inaccessible to brute-force molecular dynamics simulations. A common use of milestoning is to compute the mean first passage time (MFPT) for a conformational transition of interest. However, the MFPT is not always the experimentally observed timescale. In particular, the duration of the transition path, or the mean transit time, can be measured in single-molecule experiments, such as studies of polymers translocating through pores and fluorescence resonance energy transfer studies of protein folding. Here we show how to use milestoning to compute transit times and illustrate our approach by applying it to the translocation of a polymer through a narrow pore.
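
    For intuition about how milestoning turns local statistics into long-time kinetics, the sketch below computes a mean first passage time from a milestone transition kernel and mean milestone lifetimes via the standard linear-algebra identity tau = (I - K)^{-1} t, with the absorbing target milestone removed. The four-milestone kernel and lifetimes are invented for illustration; transit times require a further decomposition not shown here.

```python
import numpy as np

# K_full[i, j]: probability that a trajectory at milestone i next hits
# milestone j; milestone 3 is the absorbing target. Numbers are invented.
K_full = np.array([
    [0.0, 1.0, 0.0, 0.0],
    [0.6, 0.0, 0.4, 0.0],
    [0.0, 0.5, 0.0, 0.5],
    [0.0, 0.0, 0.0, 1.0],   # absorbing
])
t_local = np.array([2.0, 1.5, 1.0, 0.0])  # mean lifetimes (ps)

K = K_full[:3, :3]          # drop the absorbing milestone
t = t_local[:3]
tau = np.linalg.solve(np.eye(3) - K, t)   # MFPT from each milestone

print("MFPT from each starting milestone (ps):", tau)
```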

  16. A heterogeneous hierarchical architecture for real-time computing

    Energy Technology Data Exchange (ETDEWEB)

    Skroch, D.A.; Fornaro, R.J.

    1988-12-01

    The need for high-speed data acquisition and control algorithms has prompted continued research in the area of multiprocessor systems and related programming techniques. The result presented here is a unique hardware and software architecture for high-speed real-time computer systems. The implementation of a prototype of this architecture has required the integration of architecture, operating systems and programming languages into a cohesive unit. This report describes a Heterogeneous Hierarchical Architecture for Real-Time (H²ART) and system software for program loading and interprocessor communication.

  17. Modern EMC analysis I time-domain computational schemes

    CERN Document Server

    Kantartzis, Nikolaos V

    2008-01-01

    The objective of this two-volume book is the systematic and comprehensive description of the most competitive time-domain computational methods for the efficient modeling and accurate solution of contemporary real-world EMC problems. Intended to be self-contained, it gives a detailed presentation of all well-known algorithms, elucidating their merits and weaknesses, and accompanies the theoretical content with a variety of applications. Outlining the present volume, the analysis covers the theory of the finite-difference time-domain, the transmission-line matrix/modeling, and the finite i

  18. Individual differences in conflict detection during reasoning.

    Science.gov (United States)

    Frey, Darren; Johnson, Eric D; De Neys, Wim

    2018-05-01

    Decades of reasoning and decision-making research have established that human judgment is often biased by intuitive heuristics. Recent "error" or bias detection studies have focused on reasoners' abilities to detect whether their heuristic answer conflicts with logical or probabilistic principles. A key open question is whether there are individual differences in this bias detection efficiency. Here we present three studies in which co-registration of different error detection measures (confidence, response time and confidence response time) allowed us to assess bias detection sensitivity at the individual participant level in a range of reasoning tasks. The results indicate that although most individuals show robust bias detection, as indexed by increased latencies and decreased confidence, there is a subgroup of reasoners who consistently fail to do so. We discuss theoretical and practical implications for the field.

  19. Efficient quantum algorithm for computing n-time correlation functions.

    Science.gov (United States)

    Pedernales, J S; Di Candia, R; Egusquiza, I L; Casanova, J; Solano, E

    2014-07-11

    We propose a method for computing n-time correlation functions of arbitrary spinorial, fermionic, and bosonic operators, consisting of an efficient quantum algorithm that encodes these correlations in an initially added ancillary qubit for probe and control tasks. For spinorial and fermionic systems, the reconstruction of arbitrary n-time correlation functions requires the measurement of two ancilla observables, while for bosonic variables time derivatives of the same observables are needed. Finally, we provide examples applicable to different quantum platforms in the frame of the linear response theory.

  20. Automatically ordering events and times in text

    CERN Document Server

    Derczynski, Leon R A

    2017-01-01

    The book offers a detailed guide to temporal ordering, exploring open problems in the field and providing solutions and extensive analysis. It addresses the challenge of automatically ordering events and times in text. Aided by TimeML, it also describes and presents concepts relating to time in easy-to-compute terms. Working out the order that events and times happen has proven difficult for computers, since the language used to discuss time can be vague and complex. Mapping out these concepts for a computational system, which does not have its own inherent idea of time, is, unsurprisingly, tough. Solving this problem enables powerful systems that can plan, reason about events, and construct stories of their own accord, as well as understand the complex narratives that humans express and comprehend so naturally. This book presents a theory and data-driven analysis of temporal ordering, leading to the identification of exactly what is difficult about the task. It then proposes and evaluates machine-learning so...

  1. A Modular Environment for Geophysical Inversion and Run-time Autotuning using Heterogeneous Computing Systems

    Science.gov (United States)

    Myre, Joseph M.

    Heterogeneous computing systems have recently come to the forefront of the High-Performance Computing (HPC) community's interest. HPC computer systems that incorporate special purpose accelerators, such as Graphics Processing Units (GPUs), are said to be heterogeneous. Large scale heterogeneous computing systems have consistently ranked highly on the Top500 list since the beginning of the heterogeneous computing trend. By using heterogeneous computing systems that consist of both general purpose processors and special-purpose accelerators, the speed and problem size of many simulations could be dramatically increased. Ultimately this results in enhanced simulation capabilities that allow, in some cases for the first time, the execution of parameter space and uncertainty analyses, model optimizations, and other inverse modeling techniques that are critical for scientific discovery and engineering analysis. However, simplifying the usage and optimization of codes for heterogeneous computing systems remains a challenge. This is particularly true for scientists and engineers for whom understanding HPC architectures and undertaking performance analysis may not be primary research objectives. To enable scientists and engineers to remain focused on their primary research objectives, a modular environment for geophysical inversion and run-time autotuning on heterogeneous computing systems is presented. This environment is composed of three major components: 1) CUSH---a framework for reducing the complexity of programming heterogeneous computer systems, 2) geophysical inversion routines which can be used to characterize physical systems, and 3) run-time autotuning routines designed to determine configurations of heterogeneous computing systems in an attempt to maximize the performance of scientific and engineering codes. Using three case studies, a lattice-Boltzmann method, a non-negative least squares inversion, and a finite-difference fluid flow method, it is shown that

  2. Time-Critical Reasoning: Representations and Application

    OpenAIRE

    Horvitz, Eric J.; Seiver, Adam

    2013-01-01

    We review the problem of time-critical action and discuss a reformulation that shifts knowledge acquisition from the assessment of complex temporal probabilistic dependencies to the direct assessment of time-dependent utilities over key outcomes of interest. We dwell on a class of decision problems characterized by the centrality of diagnosing and reacting in a timely manner to pathological processes. We motivate key ideas in the context of trauma-care triage and transportation decisions.

  3. A Non-Linear Digital Computer Model Requiring Short Computation Time for Studies Concerning the Hydrodynamics of the BWR

    Energy Technology Data Exchange (ETDEWEB)

    Reisch, F; Vayssier, G

    1969-05-15

    This non-linear model serves as one of the blocks in a series of codes to study the transient behaviour of BWR or PWR type reactors. This program is intended to be the hydrodynamic part of the BWR core representation or the hydrodynamic part of the PWR heat exchanger secondary side representation. The equations have been prepared for the CSMP digital simulation language. By using the most suitable integration routine available, the ratio of simulation time to real time is about one on an IBM 360/75 digital computer. Use of the slightly different language DSL/40 on an IBM 7044 computer takes about four times longer. The code has been tested against the Eindhoven loop with satisfactory agreement.

  4. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    Science.gov (United States)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions for these complex problems requires an integrated cross-disciplinary approach, providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Although seismic reflection profiling has a broad application range, spanning from shallow targets a few meters deep to targets at a depth of several kilometers, it is primarily used by the hydrocarbon industry and rarely for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands great experience from the user. In-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for this kind of study. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. The

  5. Time-domain numerical computations of electromagnetic fields in cylindrical co-ordinates using the transmission line matrix: evaluation of radiation losses from a charge bunch passing through a pill-box resonator

    International Nuclear Information System (INIS)

    Sarma, J.; Robson, P.N.

    1979-01-01

    The two dimensional transmission line matrix (TLM) numerical method has been adapted to compute electromagnetic field distributions in cylindrical co-ordinates and it is applied to evaluate the radiation loss from a charge bunch passing through a 'pill-box' resonator. The computer program has been developed to calculate not only the total energy loss to the resonator but also that component of it which exists in the TM₀₁₀ mode. The numerically computed results are shown to agree very well with the analytically derived values found in the literature, thereby establishing the degree of accuracy obtainable with the TLM method. The particular features of computational simplicity, numerical stability and the inherently time-domain solutions produced by the TLM method are cited as additional, attractive reasons for using this numerical procedure in solving such problems. (Auth.)

  6. Climate Data Provenance Tracking for Just-In-Time Computation

    Science.gov (United States)

    Fries, S.; Nadeau, D.; Doutriaux, C.; Williams, D. N.

    2016-12-01

    The "Climate Data Management System" (CDMS) was created in 1996 as part of the Climate Data Analysis Tools suite of software. It provides a simple interface into a wide variety of climate data formats, and creates NetCDF CF-Compliant files. It leverages the NumPy framework for high performance computation, and is an all-in-one IO and computation package. CDMS has been extended to track manipulations of data, and trace that data all the way to the original raw data. This extension tracks provenance about data, and enables just-in-time (JIT) computation. The provenance for each variable is packaged as part of the variable's metadata, and can be used to validate data processing and computations (by repeating the analysis on the original data). It also allows for an alternate solution for sharing analyzed data; if the bandwidth for a transfer is prohibitively expensive, the provenance serialization can be passed in a much more compact format and the analysis rerun on the input data. Data provenance tracking in CDMS enables far-reaching and impactful functionalities, permitting implementation of many analytical paradigms.

  7. Event Based Simulator for Parallel Computing over the Wide Area Network for Real Time Visualization

    Science.gov (United States)

    Sundararajan, Elankovan; Harwood, Aaron; Kotagiri, Ramamohanarao; Satria Prabuwono, Anton

    As the computational requirement of applications in computational science continues to grow tremendously, the use of computational resources distributed across the Wide Area Network (WAN) becomes advantageous. However, not all applications can be executed over the WAN due to communication overhead that can drastically slow down the computation. In this paper, we introduce an event based simulator to investigate the performance of parallel algorithms executed over the WAN. The event based simulator known as SIMPAR (SIMulator for PARallel computation) simulates the actual computations and communications involved in parallel computation over the WAN using time stamps. Visualization of real-time applications requires a steady stream of processed data for visualization purposes. Hence, SIMPAR may prove to be a valuable tool to investigate types of applications and computing resource requirements to provide an uninterrupted flow of processed data for real-time visualization purposes. The results obtained from the simulation show concurrence with the expected performance using the L-BSP model.
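
    A back-of-the-envelope version of the cost model helps explain why WAN execution can be so much slower. The snippet below evaluates the classic BSP cost formula, sum over supersteps of (w + h*g + L), for the same program under cluster-like and WAN-like parameters; the L-BSP model used in the paper refines this, and all numbers here are illustrative assumptions.

```python
def bsp_time(supersteps, g, L):
    """supersteps: list of (max local work, max messages per processor).
    g: per-message communication cost, L: barrier/latency cost."""
    return sum(w + h * g + L for w, h in supersteps)

# Same program, two environments: a local cluster vs. WAN-distributed nodes.
program = [(1e6, 200), (8e5, 500), (1.2e6, 100)]   # (work units, messages)
cluster = bsp_time(program, g=4.0, L=1e4)           # cheap communication
wan = bsp_time(program, g=400.0, L=5e7)             # expensive WAN links

print(f"cluster: {cluster:.3e} time units, WAN: {wan:.3e} time units")
```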

  8. An atomic orbital based real-time time-dependent density functional theory for computing electronic circular dichroism band spectra

    Energy Technology Data Exchange (ETDEWEB)

    Goings, Joshua J.; Li, Xiaosong, E-mail: xsli@uw.edu [Department of Chemistry, University of Washington, Seattle, Washington 98195 (United States)

    2016-06-21

    One of the challenges of interpreting electronic circular dichroism (ECD) band spectra is that different states may have different rotatory strength signs, determined by their absolute configuration. If the states are closely spaced and opposite in sign, observed transitions may be washed out by nearby states, unlike absorption spectra where transitions are always positive and additive. To accurately compute ECD bands, it is necessary to compute a large number of excited states, which may be prohibitively costly if one uses the linear-response time-dependent density functional theory (TDDFT) framework. Here we implement a real-time, atomic-orbital based TDDFT method for computing the entire ECD spectrum simultaneously. The method is advantageous for large systems with a high density of states. In contrast to previous implementations based on real-space grids, the method is variational, independent of nuclear orientation, and does not rely on pseudopotential approximations, making it suitable for computation of chiroptical properties well into the X-ray regime.

  9. Effects of mathematics computer games on special education students’ multiplicative reasoning ability

    NARCIS (Netherlands)

    Bakker, M.|info:eu-repo/dai/nl/355337770; Van den Heuvel-Panhuizen, M.|info:eu-repo/dai/nl/069266255; Robitzsch, Alexander

    2016-01-01

    This study examined the effects of a teacher-delivered intervention with online mathematics mini-games on special education students' multiplicative reasoning ability (multiplication and division). The games involved declarative, procedural, as well as conceptual knowledge of multiplicative relations.

  11. Dual Trajectories of Reactive and Proactive Aggression from Mid-childhood to Early Adolescence: Relations to Sensation Seeking, Risk Taking, and Moral Reasoning.

    Science.gov (United States)

    Cui, Lixian; Colasante, Tyler; Malti, Tina; Ribeaud, Denis; Eisner, Manuel P

    2016-05-01

    We examined the roles of sensation seeking, risk taking, and moral reasoning in the development of reactive and proactive aggression. Data were drawn from a multiethnic, longitudinal study of children from Switzerland (N = 1571; 52 % male; assessed annually over 6 years; 7 years old at Time 1). At all 6 time points, teachers reported children's reactive and proactive aggression via questionnaire. Children's sensation seeking (at Time 1) and risk taking (at Time 2) were assessed with two interactive computer tasks and their moral reasoning was assessed at Time 2 in response to four hypothetical vignettes depicting moral transgressions. Parallel process Latent Class Growth Analysis (PP-LCGA) identified six dual trajectories of reactive and proactive aggression. Children with either childhood-limited or adolescent-onset aggression showed high sensation seeking. Children with persistent, high levels of both reactive and proactive aggression across time showed high levels of sensation seeking and risk taking, as well as low levels of moral reasoning. Children with only high risk taking were more likely to display moderate levels of aggression across time. These findings highlight the shared and differential roles of sensation seeking, risk taking, and moral reasoning in the dual development of reactive and proactive aggression from mid-childhood to early adolescence. We discuss implications for common and tailored strategies to combat these aggression subtypes.

  12. CONFLICTING REASONS

    OpenAIRE

    Parfit, Derek

    2016-01-01

    Sidgwick believed that, when impartial reasons conflict with self-interested reasons, there are no truths about their relative strength. There are such truths, I claim, but these truths are imprecise. Many self-interested reasons are decisively outweighed by conflicting impartial moral reasons. But we often have sufficient self-interested reasons to do what would make things go worse, and we sometimes have sufficient self-interested reasons to act wrongly. If we reject Act Consequentialism, ...

  13. Different strategies in solving series completion inductive reasoning problems : An fMRI and computational study

    NARCIS (Netherlands)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A.; Zhong, Ning; Li, Kuncheng

    The neural correlates of the human inductive reasoning process are still unclear. Number series and letter series completion are two typical inductive reasoning tasks, with a common core component of rule induction. Previous studies have demonstrated that different strategies are adopted in number series

  14. Visualizing complex processes using a cognitive-mapping tool to support the learning of clinical reasoning.

    Science.gov (United States)

    Wu, Bian; Wang, Minhong; Grotzer, Tina A; Liu, Jun; Johnson, Janice M

    2016-08-22

    Practical experience with clinical cases has played an important role in supporting the learning of clinical reasoning. However, learning through practical experience involves complex processes that are difficult for students to capture. This study aimed to examine the effects of a computer-based cognitive-mapping approach that helps students to externalize the reasoning process and the knowledge underlying the reasoning process when they work with clinical cases. A comparison between the cognitive-mapping approach and the verbal-text approach was made by analyzing their effects on learning outcomes. Fifty-two third-year or higher students from two medical schools participated in the study. Students in the experimental group used the computer-based cognitive-mapping approach, while the control group used the verbal-text approach, to make sense of their thinking and actions when they worked with four simulated cases over 4 weeks. For each case, students in both groups reported their reasoning process (involving data capture, hypotheses formulation, and reasoning with justifications) and the underlying knowledge (involving identified concepts and the relationships between the concepts) using the given approach. The learning products (cognitive maps or verbal text) revealed that students in the cognitive-mapping group outperformed those in the verbal-text group in the reasoning process, but not in making sense of the knowledge underlying the reasoning process. No significant differences were found in a knowledge posttest between the two groups. The computer-based cognitive-mapping approach has shown a promising advantage over the verbal-text approach in improving students' reasoning performance. Further studies are needed to examine the effects of the cognitive-mapping approach in improving the construction of subject-matter knowledge on the basis of practical experience.

  15. Operation ARA: A Computerized Learning Game that Teaches Critical Thinking and Scientific Reasoning

    Science.gov (United States)

    Halpern, Diane F.; Millis, Keith; Graesser, Arthur C.; Butler, Heather; Forsyth, Carol; Cai, Zhiqiang

    2012-01-01

    Operation ARA (Acquiring Research Acumen) is a computerized learning game that teaches critical thinking and scientific reasoning. It is a valuable learning tool that utilizes principles from the science of learning and serious computer games. Students learn the skills of scientific reasoning by engaging in interactive dialogs with avatars. They…

  16. Formal Social Norms and their Enforcement in Computational MAS by Automated Reasoning

    Czech Academy of Sciences Publication Activity Database

    Neruda, Roman; Kazík, O.

    2012-01-01

    Vol. 39, No. 1 (2012), pp. 80-87, ISSN 1819-9224. Institutional support: RVO:67985807. Keywords: role model * description logic * integrity constraints * computational intelligence. Subject RIV: IN - Informatics, Computer Science

  17. Rule-Based Reasoning for Online and Real-Time Monitoring of Fuel Oil Distribution using Radio Frequency Identification

    Directory of Open Access Journals (Sweden)

    Mokhamad Iklil Mustofa

    2017-05-01

    Fuel oil scarcity in Indonesia often occurs due to delivery delays caused by natural factors or transportation constraints. The aim of this research is to develop a system for online, real-time monitoring of fuel oil distribution using the rule-based reasoning method and radio frequency identification technology. The rule-based reasoning method provides the reasoning model used to monitor distribution and to determine rule-based safety stock. The monitoring program runs as a web-based computer application. Radio frequency identification technology uses radio waves as the identification medium and serves as a system for tracking and gathering information from objects automatically. The research data consist of records of delayed fuel distribution from the fuel terminal to consumers. The monitoring technique uses the departure time, the estimated arrival time, and the route taken by a fuel tanker, which carries an attached radio frequency identification tag. Monitoring is carried out by radio frequency identification readers connected online at gas stations or other specified positions, with a case study in Semarang. The results cover the statuses reported by the rule-based reasoning component: on time and on route, on time but off route, late but on route, late and off route, and tanker lost. The monitoring system is also used to determine warehouse safety stock, with the safety-stock value set by rules based on warehouse stock conditions.

  18. Relations between Inductive Reasoning and Deductive Reasoning

    Science.gov (United States)

    Heit, Evan; Rotello, Caren M.

    2010-01-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments.…

  19. Hard Real-Time Task Scheduling in Cloud Computing Using an Adaptive Genetic Algorithm

    Directory of Open Access Journals (Sweden)

    Amjad Mahmood

    2017-04-01

    In the Infrastructure-as-a-Service cloud computing model, virtualized computing resources in the form of virtual machines are provided over the Internet. A user can rent an arbitrary number of computing resources to meet their requirements, making cloud computing an attractive choice for executing real-time tasks. Economical task allocation and scheduling on a set of leased virtual machines is an important problem in the cloud computing environment. This paper proposes a greedy algorithm and a genetic algorithm with adaptive selection of suitable crossover and mutation operations (named AGA) to allocate and schedule real-time tasks with precedence constraints on heterogeneous virtual machines. A comprehensive simulation study has been done to evaluate the performance of the proposed algorithms in terms of their solution quality and efficiency. The simulation results show that AGA outperforms the greedy algorithm and a non-adaptive genetic algorithm in terms of solution quality.
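
    As a rough illustration of the adaptive idea, the sketch below evolves task-to-VM assignments for independent tasks and raises the mutation rate once the population has nearly converged. It is a hedged, simplified sketch: the paper's AGA additionally handles precedence constraints and adapts the crossover operator as well, and all task/VM parameters here are invented.

```python
import random

random.seed(1)
N_TASKS, N_VMS, POP, GENS = 30, 4, 40, 200
work = [random.uniform(1, 10) for _ in range(N_TASKS)]   # task lengths
speed = [1.0, 1.5, 2.0, 2.5]                             # VM speeds

def makespan(chrom):
    # chrom[i] = VM assigned to task i; fitness = completion time.
    loads = [0.0] * N_VMS
    for task, vm in enumerate(chrom):
        loads[vm] += work[task] / speed[vm]
    return max(loads)

def crossover(a, b):
    cut = random.randrange(1, N_TASKS)
    return a[:cut] + b[cut:]

pop = [[random.randrange(N_VMS) for _ in range(N_TASKS)] for _ in range(POP)]
for gen in range(GENS):
    pop.sort(key=makespan)
    best, worst = makespan(pop[0]), makespan(pop[-1])
    # Adaptive step: mutate more aggressively when fitness has converged.
    p_mut = 0.02 if worst - best > 0.5 else 0.2
    elite = pop[: POP // 4]
    children = []
    while len(children) < POP - len(elite):
        child = crossover(*random.sample(elite, 2))
        child = [random.randrange(N_VMS) if random.random() < p_mut else g
                 for g in child]
        children.append(child)
    pop = elite + children

print(f"best makespan found: {makespan(min(pop, key=makespan)):.2f}")
```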

  20. Quantum computer science

    CERN Document Server

    Lanzagorta, Marco

    2009-01-01

    In this text we present a technical overview of the emerging field of quantum computation along with new research results by the authors. What distinguishes our presentation from that of others is our focus on the relationship between quantum computation and computer science. Specifically, our emphasis is on the computational model of quantum computing rather than on the engineering issues associated with its physical implementation. We adopt this approach for the same reason that a book on computer programming doesn't cover the theory and physical realization of semiconductors. Another distin

  1. A device for automatically recording information on the reasons for idling of stopes

    Energy Technology Data Exchange (ETDEWEB)

    Dergachev, L G; Kuzoyatov, G I; Tereshchenko, V N

    1979-01-01

    One substantial reserve for raising production efficiency in the coal industry is the reduction of nonproductive time costs. The state of the art of stoping work and the design features of stoping equipment and devices for obtaining information do not presently enable complete automation of the process of obtaining data on the reasons for down times. Therefore, together with automatic data formation, manual recording of information is required through remote control equipment to the controlling computer directly from the work place. The Donetsk department of the Giprougleavtomatizatsiy institute has developed the UKIP-1 device for automatic recording of information on the reasons for stope down times. The device is designed for use in an automatic process control system of coal mines. It provides coding of information and its conversion to a form suitable for transmission through remote control channels, further processing, and recording. The device enables recording of information on down times of eight objects of the stope. Up to eight down time reasons can be recorded for each object. The device has 2 contact outputs, on one of which is formed an informational sequential eight-bit code; on the other, 8 clock pulses. The device's code is generated automatically after it is activated by a switch. The length of the code packet results from the maximum possible information transmission rate of the existing mine remote control systems, and equals 12+2 sec. The clock pulse length equals half the length of the information pulse. The device has been tested at coal mines and recommended for industrial production. UKIP-1 devices are being used in a pilot model of an automatic production control system at the Sotsialisticheskiy Donbass newspaper mine of the Donetskugol' production association.

  2. Scaling predictive modeling in drug development with cloud computing.

    Science.gov (United States)

    Moghadam, Behrooz Torabi; Alvarsson, Jonathan; Holm, Marcus; Eklund, Martin; Carlsson, Lars; Spjuth, Ola

    2015-01-26

    Growing data sets and increasing analysis times are hampering predictive modeling in drug discovery. Model building can be carried out on high-performance computer clusters, but these can be expensive to purchase and maintain. We have evaluated ligand-based modeling on cloud computing resources where computations are parallelized and run on the Amazon Elastic Cloud. We trained models on open data sets of varying sizes for the end points logP and Ames mutagenicity and compared with model building parallelized on a traditional high-performance computing cluster. We show that while high-performance computing results in faster model building, the use of cloud computing resources is feasible for large data sets and scales well within cloud instances. An additional advantage of cloud computing is that the costs of predictive models can be easily quantified, and a choice can be made between speed and economy. The easy access to computational resources with no up-front investments makes cloud computing an attractive alternative for scientists, especially for those without access to a supercomputer, and our study shows that it enables cost-efficient modeling of large data sets on demand within reasonable time.

  3. A computer-based time study system for timber harvesting operations

    Science.gov (United States)

    Jingxin Wang; Joe McNeel; John Baumgras

    2003-01-01

    A computer-based time study system was developed for timber harvesting operations. Object-oriented techniques were used to model and design the system. The front-end of the time study system resides on MS Windows CE and the back-end is supported by MS Access. The system consists of three major components: a handheld system, data transfer interface, and data storage...

  4. Diagnostic reasoning using qualitative causal models

    International Nuclear Information System (INIS)

    Sudduth, A.L.

    1992-01-01

    The application of expert systems to reasoning problems involving real-time data from plant measurements has been a topic of much research, but few practical systems have been deployed. One obstacle to wider use of expert systems in applications involving real-time data is the lack of adequate knowledge representation methodologies for dynamic processes. Knowledge bases composed mainly of rules have disadvantages when applied to dynamic processes and real-time data. This paper describes a methodology for the development of qualitative causal models that can be used as knowledge bases for reasoning about process dynamic behavior. These models provide a systematic method for knowledge base construction, considerably reducing the engineering effort required. They also offer much better opportunities for verification and validation of the knowledge base, thus increasing the possibility of the application of expert systems to reasoning about mission critical systems. Starting with the Signed Directed Graph (SDG) method that has been successfully applied to describe the behavior of diverse dynamic processes, the paper shows how certain non-physical behaviors that result from abstraction may be eliminated by applying causal constraints to the models. The resulting Extended Signed Directed Graph (ESDG) may then be compiled to produce a model for use in process fault diagnosis. This model-based reasoning methodology is used in the MOBIAS system being developed by Duke Power Company under EPRI sponsorship. 15 refs., 4 figs
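
    To make the SDG idea concrete, here is a hedged toy sketch (an invented tank example, not the MOBIAS system): each signed edge states how a deviation in one variable drives another, a candidate root cause is propagated through the graph, and it is retained only if the predicted deviation pattern matches the observed alarms.

```python
# Edges (cause, effect, sign): sign +1 means the effect deviates in the
# same direction as the cause, -1 in the opposite direction.
EDGES = [
    ("inlet_valve", "flow_in", +1),
    ("flow_in", "level", +1),
    ("leak", "level", -1),
    ("level", "flow_out", +1),
]

def propagate(root, direction):
    """Predict variable deviations (+1/-1) given a root deviation."""
    dev = {root: direction}
    changed = True
    while changed:
        changed = False
        for u, v, s in EDGES:
            if u in dev and v not in dev:
                dev[v] = dev[u] * s
                changed = True
    return dev

observed = {"level": -1, "flow_out": -1}   # low level, low outflow

for root, d in [("inlet_valve", -1), ("leak", +1)]:
    pred = propagate(root, d)
    ok = all(pred.get(var) == s for var, s in observed.items())
    print(f"root cause {root!r} ({d:+d}): consistent = {ok}")
```

    Both candidates above explain the same observations, which illustrates the kind of ambiguity that motivates adding causal constraints in the ESDG step.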

  5. Theory of mind broad and narrow: reasoning about social exchange engages ToM areas, precautionary reasoning does not.

    Science.gov (United States)

    Ermer, Elsa; Guerin, Scott A; Cosmides, Leda; Tooby, John; Miller, Michael B

    2006-01-01

    Baron-Cohen (1995) proposed that the theory of mind (ToM) inference system evolved to promote strategic social interaction. Social exchange--a form of co-operation for mutual benefit--involves strategic social interaction and requires ToM inferences about the contents of other individuals' mental states, especially their desires, goals, and intentions. There are behavioral and neuropsychological dissociations between reasoning about social exchange and reasoning about equivalent problems tapping other, more general content domains. It has therefore been proposed that social exchange behavior is regulated by social contract algorithms: a domain-specific inference system that is functionally specialized for reasoning about social exchange. We report an fMRI study using the Wason selection task that provides further support for this hypothesis. Precautionary rules share so many properties with social exchange rules--they are conditional, deontic, and involve subjective utilities--that most reasoning theories claim they are processed by the same neurocomputational machinery. Nevertheless, neuroimaging shows that reasoning about social exchange activates brain areas not activated by reasoning about precautionary rules, and vice versa. As predicted, neural correlates of ToM (anterior and posterior temporal cortex) were activated when subjects interpreted social exchange rules, but not precautionary rules (where ToM inferences are unnecessary). We argue that the interaction between ToM and social contract algorithms can be reciprocal: social contract algorithms requires ToM inferences, but their functional logic also allows ToM inferences to be made. By considering interactions between ToM in the narrower sense (belief-desire reasoning) and all the social inference systems that create the logic of human social interaction--ones that enable as well as use inferences about the content of mental states--a broader conception of ToM may emerge: a computational model embodying

  6. Continuous-variable quantum computing in optical time-frequency modes using quantum memories.

    Science.gov (United States)

    Humphreys, Peter C; Kolthammer, W Steven; Nunn, Joshua; Barbieri, Marco; Datta, Animesh; Walmsley, Ian A

    2014-09-26

    We develop a scheme for time-frequency encoded continuous-variable cluster-state quantum computing using quantum memories. In particular, we propose a method to produce, manipulate, and measure two-dimensional cluster states in a single spatial mode by exploiting the intrinsic time-frequency selectivity of Raman quantum memories. Time-frequency encoding enables the scheme to be extremely compact, requiring a number of memories that are a linear function of only the number of different frequencies in which the computational state is encoded, independent of its temporal duration. We therefore show that quantum memories can be a powerful component for scalable photonic quantum information processing architectures.

  7. Effects of Mathematics Computer Games on Special Education Students' Multiplicative Reasoning Ability

    Science.gov (United States)

    Bakker, Marjoke; van den Heuvel-Panhuizen, Marja; Robitzsch, Alexander

    2016-01-01

    This study examined the effects of a teacher-delivered intervention with online mathematics mini-games on special education students' multiplicative reasoning ability (multiplication and division). The games involved declarative, procedural, as well as conceptual knowledge of multiplicative relations, and were accompanied with teacher-led lessons…

  8. Replacement strategy for obsolete plant computers

    International Nuclear Information System (INIS)

    Schaefer, J.P.

    1985-01-01

    The plant computers of the first generation of larger nuclear power plants are reaching the end of their useful lifetime with respect to the hardware. The software alone would be no reason for a system exchange, but new tasks for the supervisory computer system, availability questions concerning maintenance personnel and spare parts, and the demand for improved operating procedures for the computer users have stimulated considerations of how to exchange a computer system in a nuclear power plant without extending plant outage times due to exchange works. In the Federal Republic of Germany the planning phase of such backfitting projects is well under way, and some projects are about to be implemented. The basis for these backfitting projects is a modular supervisory computer concept which has been designated for the new line of KWU PWRs. The main characteristic of this computer system is the splitting of the system into a data acquisition level and a data processing level. This principle allows an extension of the processing level or even repeated replacements of the processing computers. With the existing computer system still in operation, the new system can be installed in a step-by-step procedure. As soon as the first of the redundant process computers of the data processing level is in operation and the data link to the data acquisition computers is established, the old computer system can be taken out of service. Then the back-up processing computer can be commissioned to complete the new system. (author)

  9. VNAP2: a computer program for computation of two-dimensional, time-dependent, compressible, turbulent flow

    Energy Technology Data Exchange (ETDEWEB)

    Cline, M.C.

    1981-08-01

    VNAP2 is a computer program for calculating turbulent (as well as laminar and inviscid), steady, and unsteady flow. VNAP2 solves the two-dimensional, time-dependent, compressible Navier-Stokes equations. The turbulence is modeled with either an algebraic mixing-length model, a one-equation model, or the Jones-Launder two-equation model. The geometry may be a single- or a dual-flowing stream. The interior grid points are computed using the unsplit MacCormack scheme. Two options to speed up the calculations for high Reynolds number flows are included. The boundary grid points are computed using a reference-plane-characteristic scheme with the viscous terms treated as source functions. An explicit artificial viscosity is included for shock computations. The fluid is assumed to be a perfect gas. The flow boundaries may be arbitrary curved solid walls, inflow/outflow boundaries, or free-jet envelopes. Typical problems that can be solved concern nozzles, inlets, jet-powered afterbodies, airfoils, and free-jet expansions. The accuracy and efficiency of the program are shown by calculations of several inviscid and turbulent flows. The program and its use are described completely, and six sample cases and a code listing are included.

  10. Causal reasoning in physics

    CERN Document Server

    Frisch, Mathias

    2014-01-01

    Much has been written on the role of causal notions and causal reasoning in the so-called 'special sciences' and in common sense. But does causal reasoning also play a role in physics? Mathias Frisch argues that, contrary to what influential philosophical arguments purport to show, the answer is yes. Time-asymmetric causal structures are as integral a part of the representational toolkit of physics as a theory's dynamical equations. Frisch develops his argument partly through a critique of anti-causal arguments and partly through a detailed examination of actual examples of causal notions in physics, including causal principles invoked in linear response theory and in representations of radiation phenomena. Offering a new perspective on the nature of scientific theories and causal reasoning, this book will be of interest to professional philosophers, graduate students, and anyone interested in the role of causal thinking in science.

  11. Reasoning about Grover's Quantum Search Algorithm using Probabilistic wp

    NARCIS (Netherlands)

    Butler, M.J.; Hartel, Pieter H.

    Grover's search algorithm is designed to be executed on a quantum mechanical computer. In this paper, the probabilistic wp-calculus is used to model and reason about Grover's algorithm. It is demonstrated that the calculus provides a rigorous programming notation for modelling this and other quantum
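
    The probabilistic behavior being reasoned about can be illustrated numerically. The snippet below evaluates the well-known success probability of Grover's algorithm after k iterations, sin²((2k+1)θ) with θ = arcsin(1/√N) for one marked item among N; it is a classical evaluation of the formula, not a model of the wp-calculus derivation itself.

```python
import math

N = 1024
theta = math.asin(1 / math.sqrt(N))
k_opt = math.floor(math.pi / (4 * theta))   # roughly (pi/4) * sqrt(N)

for k in (0, k_opt // 2, k_opt):
    p = math.sin((2 * k + 1) * theta) ** 2
    print(f"k = {k:3d} iterations: success probability = {p:.6f}")

# Iterating past k_opt makes the success probability fall again,
# which is exactly the kind of property a formal proof must capture.
```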

  12. A Matter of Computer Time

    Science.gov (United States)

    Celano, Donna; Neuman, Susan B.

    2010-01-01

    Many low-income children do not have the opportunity to develop the computer skills necessary to succeed in our technological economy. Their only access to computers and the Internet--school, afterschool programs, and community organizations--is woefully inadequate. Educators must work to close this knowledge gap and to ensure that low-income…

  13. A note on computing average state occupation times

    Directory of Open Access Journals (Sweden)

    Jan Beyersmann

    2014-05-01

    Objective: This review discusses how biometricians would probably compute or estimate expected waiting times, if they had the data. Methods: Our framework is a time-inhomogeneous Markov multistate model, where all transition hazards are allowed to be time-varying. We assume that the cumulative transition hazards are given. That is, they are either known, as in a simulation, determined by expert guesses, or obtained via some method of statistical estimation. Our basic tool is product integration, which transforms the transition hazards into the matrix of transition probabilities. Product integration enjoys a rich mathematical theory, which has successfully been used to study probabilistic and statistical aspects of multistate models. Our emphasis will be on practical implementation of product integration, which allows us to numerically approximate the transition probabilities. Average state occupation times and other quantities of interest may then be derived from the transition probabilities.
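
    A minimal numerical sketch of the product-integration step, under an invented two-state (alive/dead) example with a time-varying hazard: the transition matrix P(s,t) is approximated by the ordered product of (I + dA(u)) over a fine grid, the expected state occupation time is accumulated along the way, and the result is checked against the closed-form survival probability.

```python
import numpy as np

def dA(u, du):
    """Hazard increment matrix on [u, u+du); h01 is a made-up
    time-varying hazard for the alive -> dead transition."""
    h01 = 0.1 + 0.05 * u
    return np.array([[-h01 * du, h01 * du],
                     [0.0,       0.0]])

def product_integral(s, t, n=20_000):
    grid, du = np.linspace(s, t, n, retstep=True)
    P = np.eye(2)
    occupation = 0.0                 # expected time spent in state 0
    for u in grid[:-1]:
        occupation += P[0, 0] * du   # Riemann sum of P00(s, u)
        P = P @ (np.eye(2) + dA(u, du))
    return P, occupation

P, occ = product_integral(0.0, 2.0)
exact = np.exp(-(0.1 * 2.0 + 0.025 * 2.0 ** 2))   # closed-form survival
print(f"P(alive at t=2): product integral {P[0, 0]:.5f}, exact {exact:.5f}")
print(f"expected time alive on [0, 2]: {occ:.5f}")
```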

  14. Inductive reasoning about causally transmitted properties.

    Science.gov (United States)

    Shafto, Patrick; Kemp, Charles; Bonawitz, Elizabeth Baraff; Coley, John D; Tenenbaum, Joshua B

    2008-11-01

    Different intuitive theories constrain and guide inferences in different contexts. Formalizing simple intuitive theories as probabilistic processes operating over structured representations, we present a new computational model of category-based induction about causally transmitted properties. A first experiment demonstrates undergraduates' context-sensitive use of taxonomic and food web knowledge to guide reasoning about causal transmission and shows good qualitative agreement between model predictions and human inferences. A second experiment demonstrates strong quantitative and qualitative fits to inferences about a more complex artificial food web. A third experiment investigates human reasoning about complex novel food webs where species have known taxonomic relations. Results demonstrate a double-dissociation between the predictions of our causal model and a related taxonomic model [Kemp, C., & Tenenbaum, J. B. (2003). Learning domain structures. In Proceedings of the 25th annual conference of the cognitive science society]: the causal model predicts human inferences about diseases but not genes, while the taxonomic model predicts human inferences about genes but not diseases. We contrast our framework with previous models of category-based induction and previous formal instantiations of intuitive theories, and outline challenges in developing a complete model of context-sensitive reasoning.

  15. Computational electrodynamics the finite-difference time-domain method

    CERN Document Server

    Taflove, Allen

    2005-01-01

    This extensively revised and expanded third edition of the Artech House bestseller, Computational Electrodynamics: The Finite-Difference Time-Domain Method, offers engineers the most up-to-date and definitive resource on this critical method for solving Maxwell's equations. The method helps practitioners design antennas, wireless communications devices, high-speed digital and microwave circuits, and integrated optical devices with unsurpassed efficiency. There has been considerable advancement in FDTD computational technology over the past few years, and the third edition brings professionals the very latest details with entirely new chapters on important techniques, major updates on key topics, and new discussions on emerging areas such as nanophotonics. What's more, to supplement the third edition, the authors have created a Web site with solutions to problems, downloadable graphics and videos, and updates, making this new edition the ideal textbook on the subject as well.
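
    As a flavor of the method the book covers, here is a minimal one-dimensional FDTD (Yee) sketch in normalized units: electric and magnetic fields live on staggered grids and are updated in alternating half steps. The grid size, source, and Courant number are illustrative choices; real FDTD codes add absorbing boundaries, material models, and full 3-D updates.

```python
import numpy as np

nz, nt = 200, 400
Ex = np.zeros(nz)        # electric field on integer grid points
Hy = np.zeros(nz - 1)    # magnetic field on staggered half points
c = 0.5                  # Courant number S = c*dt/dz <= 1 for stability

for n in range(nt):
    Hy += c * np.diff(Ex)                    # H update (half step)
    Ex[1:-1] += c * np.diff(Hy)              # E update, interior points
    Ex[0] = np.exp(-((n - 30) / 10) ** 2)    # Gaussian hard source

print(f"peak |Ex| after {nt} steps: {np.abs(Ex).max():.3f}")
```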

  16. The death of argument fallacies in agent based reasoning

    CERN Document Server

    Woods, John

    2004-01-01

    This book is a sequel to the classic work, Fallacies: Selected Papers 1972-1982 (1989), coauthored with Douglas Walton, and is a further major contribution to the Woods-Walton Approach to the logic of fallacious reasoning. No one disputes the formidable accomplishments of modern mathematical logic; but equally no one seriously believes that classical logic is much good for the analysis of real-life argument and reasoning, or that it is the best place in which to transact the business of fallacy theory. One of the principal innovations of the book is its adaptation of systems of logic to the particular requirements of fallacy theory. The book develops logical analyses which take into account such features of real-life cognitive agency as resource availability and computational complexity. The book is also an invitation to interdisciplinary cooperation, linking the relevant branches of logic with computer science, cognitive psychology, neurobiology, forensic science, linguistics (including conversational analysi...

  17. Polynomial-Time Reasoning Support for Design and Maintenance of Large-Scale Biomedical Ontologies

    OpenAIRE

    Suntisrivaraporn, Boontawee

    2009-01-01

    Description Logics (DLs) belong to a successful family of knowledge representation formalisms with two key assets: formally well-defined semantics, which allows knowledge to be represented unambiguously, and automated reasoning, which allows implicit knowledge to be inferred from what is given explicitly. This thesis investigates various reasoning techniques for tractable DLs in the EL family which have been implemented in the CEL system. It suggests that the use of the lightweight DLs, in which re...

  18. Grid Computing Making the Global Infrastructure a Reality

    CERN Document Server

    Fox, Geoffrey C; Hey, Anthony J G

    2003-01-01

    Grid computing is applying the resources of many computers in a network to a single problem at the same time. Grid computing appears to be a promising trend for three reasons: (1) its ability to make more cost-effective use of a given amount of computer resources; (2) as a way to solve problems that can't be approached without an enormous amount of computing power; and (3) because it suggests that the resources of many computers can be cooperatively and perhaps synergistically harnessed and managed as a collaboration toward a common objective. A number of corporations, professional groups, university consortiums, and other groups have developed or are developing frameworks and software for managing grid computing projects. The European Community (EU) is sponsoring a project for a grid for high-energy physics, earth observation, and biology applications. In the United States, the National Technology Grid is prototyping a computational grid for infrastructure and an access grid for people. Sun Microsystems offers Gri...

  19. Real-time data-intensive computing

    Energy Technology Data Exchange (ETDEWEB)

    Parkinson, Dilworth Y., E-mail: dyparkinson@lbl.gov; Chen, Xian; Hexemer, Alexander; MacDowell, Alastair A.; Padmore, Howard A.; Shapiro, David; Tamura, Nobumichi [Advanced Light Source, Lawrence Berkeley National Laboratory, Berkeley, CA 94720 (United States); Beattie, Keith; Krishnan, Harinarayan; Patton, Simon J.; Perciano, Talita; Stromsness, Rune; Tull, Craig E.; Ushizima, Daniela [Computational Research Division, Lawrence Berkeley National Laboratory Berkeley CA 94720 (United States); Correa, Joaquin; Deslippe, Jack R. [National Energy Research Scientific Computing Center, Berkeley, CA 94720 (United States); Dart, Eli; Tierney, Brian L. [Energy Sciences Network, Berkeley, CA 94720 (United States); Daurer, Benedikt J.; Maia, Filipe R. N. C. [Uppsala University, Uppsala (Sweden); and others

    2016-07-27

    Today users visit synchrotrons as sources of understanding and discovery—not as sources of just light, and not as sources of data. To achieve this, the synchrotron facilities frequently provide not just light but often the entire end station and increasingly, advanced computational facilities that can reduce terabytes of data into a form that can reveal a new key insight. The Advanced Light Source (ALS) has partnered with high performance computing, fast networking, and applied mathematics groups to create a “super-facility”, giving users simultaneous access to the experimental, computational, and algorithmic resources to make this possible. This combination forms an efficient closed loop, where data—despite its high rate and volume—is transferred and processed immediately and automatically on appropriate computing resources, and results are extracted, visualized, and presented to users or to the experimental control system, both to provide immediate insight and to guide decisions about subsequent experiments during beamtime. We will describe our work at the ALS ptychography, scattering, micro-diffraction, and micro-tomography beamlines.

  20. GRAPHIC, time-sharing magnet design computer programs at Argonne

    International Nuclear Information System (INIS)

    Lari, R.J.

    1974-01-01

    This paper describes three magnet design computer programs in use at the Zero Gradient Synchrotron of Argonne National Laboratory. These programs are used in the time-sharing mode in conjunction with a Tektronix model 4012 graphic display terminal. The first program is called TRIM, the second MAGNET, and the third GFUN. (U.S.)

  1. Some aspects of analogical reasoning in mathematical creativity

    OpenAIRE

    Pease, Alison; Guhe, Markus; Smaill, Alan

    2010-01-01

    Analogical reasoning can shed light on both of the two key processes of creativity: generation and evaluation. Hence, it is a powerful tool for creativity. We illustrate this with three historical case studies of creative mathematical conjectures which were either found or evaluated via analogies. We conclude by describing our ongoing efforts to build computational realisations of these ideas.

  2. FRANTIC: a computer code for time dependent unavailability analysis

    International Nuclear Information System (INIS)

    Vesely, W.E.; Goldberg, F.F.

    1977-03-01

    The FRANTIC computer code evaluates the time dependent and average unavailability for any general system model. The code is written in FORTRAN IV for the IBM 370 computer. Non-repairable components, monitored components, and periodically tested components are handled. One unique feature of FRANTIC is the detailed, time dependent modeling of periodic testing which includes the effects of test downtimes, test overrides, detection inefficiencies, and test-caused failures. The exponential distribution is used for the component failure times and periodic equations are developed for the testing and repair contributions. Human errors and common mode failures can be included by assigning an appropriate constant probability for the contributors. The output from FRANTIC consists of tables and plots of the system unavailability along with a breakdown of the unavailability contributions. Sensitivity studies can be easily performed and a wide range of tables and plots can be obtained for reporting purposes. The FRANTIC code represents a first step in the development of an approach that can be of direct value in future system evaluations. Modifications resulting from use of the code, along with the development of reliability data based on operating reactor experience, can be expected to provide increased confidence in its use and potential application to the licensing process.
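
    A small sketch of the kind of time-dependent quantity FRANTIC computes: the unavailability of a periodically tested standby component with exponential failure times. Tests are assumed perfect and instantaneous here, so the test downtime, override, and inefficiency effects that FRANTIC models are deliberately omitted; the rate and interval are invented.

```python
import numpy as np

lam = 1e-4        # failure rate per hour (illustrative)
T = 720.0         # test interval in hours (illustrative)
t = np.linspace(0.0, 3 * T, 6000)

# Between (assumed perfect) tests, unavailability grows as
# q(t) = 1 - exp(-lam * tau), where tau is the time since the last test.
tau = t % T
q = 1.0 - np.exp(-lam * tau)

print(f"peak unavailability:    {q.max():.4f}")
print(f"average unavailability: {q.mean():.4f}  (~ lam*T/2 = {lam * T / 2:.4f})")
```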

  3. Computer-controlled radiation monitoring system

    International Nuclear Information System (INIS)

    Homann, S.G.

    1994-01-01

    A computer-controlled radiation monitoring system was designed and installed at the Lawrence Livermore National Laboratory's Multiuser Tandem Laboratory (10 MV tandem accelerator from High Voltage Engineering Corporation). The system continuously monitors the photon and neutron radiation environment associated with the facility and automatically suspends accelerator operation if preset radiation levels are exceeded. The system has provided reliable real-time radiation monitoring over the past five years, and has been a valuable tool for keeping personnel exposure as low as reasonably achievable.

  4. Synthesis Reasoning and Its Application in Chinese Calligraphy Generation

    Institute of Scientific and Technical Information of China (English)

    XU Song-Hua; PAN Yun-He; ZHUANG Yue-Ting; Francis C.M. Lau

    2005-01-01

    In this paper, we address the demanding task of developing intelligent systems equipped with machine creativity that can perform design tasks automatically. The main challenge is how to model human beings' creativity mathematically and mimic such creativity computationally. We propose a “synthesis reasoning model” as the underlying mechanism to simulate human beings’ creative thinking when they are handling design tasks. We present the theory of the synthesis reasoning model, and the detailed procedure of designing an intelligent system based on the model. We offer a case study of an intelligent Chinese calligraphy generation system which we have developed. Based on implementation experiences of the calligraphy generation system as well as a few other systems for solving real-world problems, we suggest a generic methodology for constructing intelligent systems using the synthesis reasoning model.

  5. A neuro-fuzzy computing technique for modeling hydrological time series

    Science.gov (United States)

    Nayak, P. C.; Sudheer, K. P.; Rangan, D. M.; Ramasastri, K. S.

    2004-05-01

    Intelligent computing tools such as artificial neural networks (ANNs) and fuzzy logic approaches have proven efficient when applied individually to a variety of problems. Recently there has been a growing interest in combining the two approaches, and as a result neuro-fuzzy computing techniques have evolved. This approach has been tested and evaluated in the field of signal processing and related areas, but researchers have only begun evaluating the potential of the neuro-fuzzy hybrid approach in hydrologic modeling studies. This paper presents the application of an adaptive neuro-fuzzy inference system (ANFIS) to hydrologic time series modeling, illustrated by modeling the river flow of the Baitarani River in Orissa state, India. An introduction to the ANFIS modeling approach is also presented. The advantage of the method is that it does not require the model structure to be known a priori, in contrast to most time series modeling techniques. The results showed that the ANFIS-forecasted flow series preserves the statistical properties of the original flow series. The model showed good performance in terms of various statistical indices. The results are highly promising, and a comparative analysis suggests that the proposed modeling approach outperforms ANNs and other traditional time series models in terms of computational speed, forecast errors, efficiency, peak flow estimation, etc. It was observed that the ANFIS model fully preserves the potential of the ANN approach and eases the model building process.
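    A minimal, self-contained sketch of a grid-partition ANFIS of the kind described above: two Gaussian membership functions per lagged input, first-order Sugeno consequents fitted by linear least squares, premise parameters kept fixed (a full ANFIS would also tune them), and a synthetic series standing in for the Baitarani flow data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for a river-flow series (the Baitarani data are not reproduced here).
n = 500
q = np.sin(np.arange(n) / 10.0) + 0.1 * rng.standard_normal(n) + 2.0

# Lagged inputs: predict q[t] from q[t-1], q[t-2]; no structural model assumed a priori.
X = np.column_stack([q[1:-1], q[:-2]])
y = q[2:]

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Grid partition: two Gaussian membership functions per input -> 4 rules.
centers = [np.quantile(X[:, j], [0.25, 0.75]) for j in range(2)]
sigma = X.std(axis=0)

def rule_weights(X):
    m = [[gauss(X[:, j], c, sigma[j]) for c in centers[j]] for j in range(2)]
    w = np.stack([m[0][a] * m[1][b] for a in range(2) for b in range(2)], axis=1)
    return w / w.sum(axis=1, keepdims=True)   # normalized firing strengths

# First-order Sugeno consequents fit by linear least squares:
# features are w_r * (x1, x2, 1) for each rule r.
W = rule_weights(X)
Phi = np.hstack([W * X[:, [0]], W * X[:, [1]], W])
theta, *_ = np.linalg.lstsq(Phi, y, rcond=None)
y_hat = Phi @ theta
print("RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```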

  6. Meta-analysis: how does posterior parietal cortex contribute to reasoning?

    Science.gov (United States)

    Wendelken, Carter

    2015-01-01

    Reasoning depends on the contribution of posterior parietal cortex (PPC). But PPC is involved in many basic operations—including spatial attention, mathematical cognition, working memory, long-term memory, and language—and the nature of its contribution to reasoning is unclear. Psychological theories of the processes underlying reasoning make divergent claims about the neural systems that are likely to be involved, and better understanding the specific contribution of PPC can help to inform these theories. We set out to address several competing hypotheses concerning the role of PPC in reasoning: (1) reasoning involves application of formal logic and is dependent on language, with PPC activation for reasoning mainly reflective of linguistic processing; (2) reasoning involves probabilistic computation and is thus dependent on numerical processing mechanisms in PPC; and (3) reasoning is built upon the representation and processing of spatial relations, and PPC activation associated with reasoning reflects spatial processing. We conducted two separate meta-analyses. First, we pooled data from our own studies of reasoning in adults, and examined activation in PPC regions of interest (ROI). Second, we conducted an automated meta-analysis using Neurosynth, in which we examined overlap between activation maps associated with reasoning and maps associated with other key functions of PPC. In both analyses, we observed reasoning-related activation concentrated in the left Inferior Parietal Lobe (IPL). Reasoning maps demonstrated the greatest overlap with mathematical cognition. Maintenance, visuospatial, and phonological processing also demonstrated some overlap with reasoning, but a large portion of the reasoning map did not overlap with the map for any other function. This evidence suggests that the PPC’s contribution to reasoning may be most closely related to its role in mathematical cognition, but that a core component of this contribution may be specific to reasoning.

  7. Meta-analysis: How does posterior parietal cortex contribute to reasoning?

    Directory of Open Access Journals (Sweden)

    Carter eWendelken

    2015-01-01

    Full Text Available Reasoning depends on the contribution of posterior parietal cortex (PPC). But PPC is involved in many basic operations -- including spatial attention, mathematical cognition, working memory, long-term memory, and language -- and the nature of its contribution to reasoning is unclear. Psychological theories of the processes underlying reasoning make divergent claims about the neural systems that are likely to be involved, and better understanding the specific contribution of PPC can help to inform these theories. We set out to address several competing hypotheses concerning the role of PPC in reasoning: (1) reasoning involves application of formal logic and is dependent on language, with PPC activation for reasoning mainly reflective of linguistic processing; (2) reasoning involves probabilistic computation and is thus dependent on numerical processing mechanisms in PPC; and (3) reasoning is built upon the representation and processing of spatial relations, and PPC activation associated with reasoning reflects spatial processing. We conducted two separate meta-analyses. First, we pooled data from our own studies of reasoning in adults, and examined activation in PPC regions of interest. Second, we conducted an automated meta-analysis using Neurosynth, in which we examined overlap between activation maps associated with reasoning and maps associated with other key functions of PPC. In both analyses, we observed reasoning-related activation concentrated in the left Inferior Parietal Lobe (IPL). Reasoning maps demonstrated the greatest overlap with mathematical cognition. Maintenance, visuospatial, and phonological processing also demonstrated some overlap with reasoning, but a large portion of the reasoning map did not overlap with the map for any other function. This evidence suggests that the PPC’s contribution to reasoning may be most closely related to its role in mathematical cognition, but that a core component of this contribution may be specific to reasoning.

  8. Computer Game Design Classes: The Students' and Professionals' Perspectives

    Science.gov (United States)

    Swacha, Jakub; Skrzyszewski, Adam; Syslo, Wojciech A.

    2010-01-01

    There are multiple reasons that justify teaching computer game design. Its multi-aspectual nature creates the opportunity to develop creativity, technical skills, and the ability to work in a team at the same time. When thinking of game design classes, one needs direction on what to focus on so that students can benefit the most. In this paper, we present…

  9. Relating derived relations as a model of analogical reasoning: reaction times and event-related potentials.

    Science.gov (United States)

    Barnes-Holmes, Dermot; Regan, Donal; Barnes-Holmes, Yvonne; Commins, Sean; Walsh, Derek; Stewart, Ian; Smeets, Paul M; Whelan, Robert; Dymond, Simon

    2005-11-01

    The current study aimed to test a Relational Frame Theory (RFT) model of analogical reasoning based on the relating of derived same and derived difference relations. Experiment 1 recorded reaction time measures of similar-similar (e.g., "apple is to orange as dog is to cat") versus different-different (e.g., "he is to his brother as chalk is to cheese") derived relational responding, in both speed-contingent and speed-noncontingent conditions. Experiment 2 examined the event-related potentials (ERPs) associated with these two response patterns. Both experiments showed similar-similar responding to be significantly faster than different-different responding. Experiment 2 revealed significant differences between the waveforms of the two response patterns in the left-hemispheric prefrontal regions; different-different waveforms were significantly more negative than similar-similar waveforms. The behavioral and neurophysiological data support the RFT prediction that, all things being equal, similar-similar responding is relationally "simpler" than, and functionally distinct from, different-different analogical responding. The ERP data were fully consistent with findings in the neurocognitive literature on analogy. These findings strengthen the validity of the RFT model of analogical reasoning and supplement the behavior-analytic approach to analogy based on the relating of derived relations.

  10. An integrated real-time diagnostic concept using expert systems, qualitative reasoning and quantitative analysis

    International Nuclear Information System (INIS)

    Edwards, R.M.; Lee, K.Y.; Kumara, S.; Levine, S.H.

    1989-01-01

    An approach for an integrated real-time diagnostic system is being developed for inclusion as an integral part of a power plant automatic control system. In order to participate in control decisions and automatic closed-loop operation, the diagnostic system must operate in real time. Thus far, an expert system with real-time capabilities has been developed and installed on a subsystem at the Experimental Breeder Reactor (EBR-II) in Idaho, USA. A real-time simulation capability for testing advanced power plant concepts has been developed at the Pennsylvania State University and was used to support the expert system development and installation at EBR-II. Recently, the US National Science Foundation (NSF) and the US Department of Energy (DOE) have funded a Penn State research program to further enhance the application of real-time diagnostic systems by pursuing implementation in a distributed power plant computer system including microprocessor-based controllers. This paper summarizes past, current, planned, and possible future approaches to power plant diagnostic systems research at Penn State. 34 refs., 9 figs

  11. "Defining Computer 'Speed': An Unsolved Challenge"

    CERN Multimedia

    CERN. Geneva

    2012-01-01

    Abstract: The reason we use computers is their speed, and the reason we use parallel computers is that they're faster than single-processor computers. Yet, after 70 years of electronic digital computing, we still do not have a solid definition of what computer 'speed' means, or even what it means to be 'faster'. Unlike measures in physics, where the definition of speed is rigorous and unequivocal, in computing there is no definition of speed that is universally accepted. As a result, computer customers have made purchases misguided by dubious information, computer designers have optimized their designs for the wrong goals, and computer programmers have chosen methods that optimize the wrong things. This talk describes why some of the obvious and historical ways of defining 'speed' haven't served us well, and the things we've learned in the struggle to find a definition that works. Biography: Dr. John Gustafson is a Director ...

  12. A neurocomputational system for relational reasoning.

    Science.gov (United States)

    Knowlton, Barbara J; Morrison, Robert G; Hummel, John E; Holyoak, Keith J

    2012-07-01

    The representation and manipulation of structured relations is central to human reasoning. Recent work in computational modeling and neuroscience has set the stage for developing more detailed neurocomputational models of these abilities. Several key neural findings appear to dovetail with computational constraints derived from a model of analogical processing, 'Learning and Inference with Schemas and Analogies' (LISA). These include evidence that (i) coherent oscillatory activity in the gamma and theta bands enables long-distance communication between the prefrontal cortex and posterior brain regions where information is stored; (ii) neurons in prefrontal cortex can rapidly learn to represent abstract concepts; (iii) a rostral-caudal abstraction gradient exists in the PFC; and (iv) the inferior frontal gyrus exerts inhibitory control over task-irrelevant information.

  13. Reason with me : 'Confabulation' and interpersonal moral reasoning

    NARCIS (Netherlands)

    Nyholm, S.R.

    2015-01-01

    According to Haidt’s ‘social intuitionist model’, empirical moral psychology supports the following conclusion: intuition comes first, strategic reasoning second. Critics have responded by arguing that intuitions can depend on non-conscious reasons, that not being able to articulate one’s reasons

  14. Image Analysis via Soft Computing: Prototype Applications at NASA KSC and Product Commercialization

    Science.gov (United States)

    Dominguez, Jesus A.; Klinko, Steve

    2011-01-01

    This slide presentation reviews the use of "soft computing", which differs from "hard computing" in that it is more tolerant of imprecision, partial truth, uncertainty, and approximation, and its use in image analysis. Soft computing provides flexible information processing to handle real-life ambiguous situations and achieve tractability, robustness, low solution cost, and a closer resemblance to human decision making. Several systems are or have been developed: Fuzzy Reasoning Edge Detection (FRED), Fuzzy Reasoning Adaptive Thresholding (FRAT), image enhancement techniques, and visual/pattern recognition. These systems are compared with examples that show the effectiveness of each. The NASA applications reviewed are: real-time (RT) anomaly detection, real-time (RT) moving debris detection, and the Columbia investigation. The RT anomaly detection reviewed the case of a damaged cable for the emergency egress system. The use of these techniques is further illustrated in the Columbia investigation with the location and detection of foam debris. There are several applications in commercial usage: image enhancement, human screening and privacy protection, visual inspection, 3D heart visualization, tumor detection, and X-ray image enhancement.

  15. Relations between inductive reasoning and deductive reasoning.

    Science.gov (United States)

    Heit, Evan; Rotello, Caren M

    2010-05-01

    One of the most important open questions in reasoning research is how inductive reasoning and deductive reasoning are related. In an effort to address this question, we applied methods and concepts from memory research. We used 2 experiments to examine the effects of logical validity and premise-conclusion similarity on evaluation of arguments. Experiment 1 showed 2 dissociations: For a common set of arguments, deduction judgments were more affected by validity, and induction judgments were more affected by similarity. Moreover, Experiment 2 showed that fast deduction judgments were like induction judgments -- in terms of being more influenced by similarity and less influenced by validity -- compared with slow deduction judgments. These novel results pose challenges for a 1-process account of reasoning and are interpreted in terms of a 2-process account of reasoning, which was implemented as a multidimensional signal detection model and applied to receiver operating characteristic data.

  16. Note on a reformulation of the strong cosmic censor conjecture based on computability

    Energy Technology Data Exchange (ETDEWEB)

    Etesi, Gabor

    2002-12-12

    In this Letter we provide a reformulation of the strong cosmic censor conjecture taking into account recent results on Malament-Hogarth space-times. We claim that the strong version of the cosmic censor conjecture can be formulated by postulating that a physically reasonable space-time is either globally hyperbolic or possesses the Malament-Hogarth property. But it is known that a Malament-Hogarth space-time is in principle capable of performing non-Turing computations such as checking the consistency of ZFC set theory. In this way we get an intimate conjectured link between the cosmic censorship scenario and computability theory.

  17. Learning to reason about speakers' alternatives in sentence comprehension : A computational account

    NARCIS (Netherlands)

    Hendriks, Petra; van Rijn, Hedderik; Valkenier, Bea

    We present a computational simulation study of the acquisition of pronouns and reflexives. The computational simulation is based on an Optimality Theory analysis, and is shown to account for the well-known observation that in English and many other languages the correct comprehension of pronouns

  18. Learning to reason about speakers' alternatives in sentence comprehension : A computational account

    NARCIS (Netherlands)

    Hendriks, Petra; van Rijn, Hedderik; Valkenier, Bea

    2007-01-01

    We present a computational simulation study of the acquisition of pronouns and reflexives. The computational simulation is based on an Optimality Theory analysis, and is shown to account for the well-known observation that in English and many other languages the correct comprehension of pronouns

  19. Computational time analysis of the numerical solution of 3D electrostatic Poisson's equation

    Science.gov (United States)

    Kamboh, Shakeel Ahmed; Labadin, Jane; Rigit, Andrew Ragai Henri; Ling, Tech Chaw; Amur, Khuda Bux; Chaudhary, Muhammad Tayyab

    2015-05-01

    3D Poisson's equation is solved numerically to simulate the electric potential in a prototype design of an electrohydrodynamic (EHD) ion-drag micropump. The finite difference method (FDM) is employed to discretize the governing equation. The system of linear equations resulting from FDM is solved iteratively by using the sequential Jacobi (SJ) and sequential Gauss-Seidel (SGS) methods, and the simulation results are compared to examine the difference between them. The main objective was to analyze the computational time required by both methods with respect to different grid sizes, and to parallelize the Jacobi method to reduce the computational time. In general, the SGS method is faster than the SJ method, but the data parallelism of the Jacobi method may produce a good speedup over the SGS method. In this study, the feasibility of using the parallel Jacobi (PJ) method is assessed relative to the SGS method. The MATLAB Parallel/Distributed computing environment is used and a parallel code for the SJ method is implemented. It was found that for small grid sizes the SGS method remains dominant over the SJ and PJ methods, while for large grid sizes both sequential methods may require prohibitively long processing times to converge. Yet, the PJ method reduces the computational time to some extent for large grid sizes.
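    To make the comparison concrete, here is a small sketch contrasting a vectorized Jacobi sweep with an in-place Gauss-Seidel sweep on a 3D Poisson problem; the grid size, boundary conditions, and source term are illustrative, not the micropump geometry of the paper.

```python
import numpy as np, time

# Discretized Poisson equation on a unit-spacing grid with Dirichlet boundaries:
# u_ijk = (sum of 6 neighbors - f_ijk) / 6.
N = 24
f = np.zeros((N, N, N))
f[N // 2, N // 2, N // 2] = 1.0   # point source

def jacobi(u, f, iters):
    # NumPy evaluates the whole right-hand side before assigning,
    # so this is a true Jacobi sweep (all updates use old values).
    for _ in range(iters):
        u[1:-1, 1:-1, 1:-1] = (u[2:, 1:-1, 1:-1] + u[:-2, 1:-1, 1:-1] +
                               u[1:-1, 2:, 1:-1] + u[1:-1, :-2, 1:-1] +
                               u[1:-1, 1:-1, 2:] + u[1:-1, 1:-1, :-2] -
                               f[1:-1, 1:-1, 1:-1]) / 6.0
    return u

def gauss_seidel(u, f, iters):
    # In-place sweep: each update immediately uses the newest neighbor values.
    for _ in range(iters):
        for i in range(1, N - 1):
            for j in range(1, N - 1):
                for k in range(1, N - 1):
                    u[i, j, k] = (u[i + 1, j, k] + u[i - 1, j, k] +
                                  u[i, j + 1, k] + u[i, j - 1, k] +
                                  u[i, j, k + 1] + u[i, j, k - 1] -
                                  f[i, j, k]) / 6.0
    return u

for solver in (jacobi, gauss_seidel):
    u = np.zeros((N, N, N))
    t0 = time.perf_counter()
    solver(u, f, iters=50)
    print(solver.__name__, time.perf_counter() - t0, "s")
```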

  20. New Research Perspectives in the Emerging Field of Computational Intelligence to Economic Modeling

    Directory of Open Access Journals (Sweden)

    Vasile MAZILESCU

    2009-01-01

    Full Text Available Computational Intelligence (CI) is a new development paradigm of intelligent systems which has resulted from a synergy between fuzzy sets, artificial neural networks, evolutionary computation, machine learning, etc., broadening computer science, physics, economics, engineering, mathematics, and statistics. It is imperative to know why these tools can be potentially relevant and effective for economic and financial modeling. This paper presents, after a synergic new paradigm of intelligent systems, as a practical case study the fuzzy and temporal properties of knowledge formalism embedded in an Intelligent Control System (ICS), based on the FT-algorithm. We are not dealing with high-level reasoning methods, because we think that real-time problems can only be solved by rather low-level reasoning. Most of the overall run-time of fuzzy expert systems is used in the match phase. To achieve fast reasoning the number of fuzzy set operations must be reduced. For this, we use a fuzzy compiled structure of knowledge, like Rete, because it is required for real-time responses. Solving the match-time predictability problem would allow us to build much more powerful reasoning techniques.

  1. To Reason or Not to Reason: Is Autobiographical Reasoning Always Beneficial?

    Science.gov (United States)

    McLean, Kate C.; Mansfield, Cade D.

    2011-01-01

    Autobiographical reasoning has been found to be a critical process in identity development; however, the authors suggest that existing research shows that such reasoning may not always be critical to another important outcome: well-being. The authors describe characteristics of people such as personality and age, contexts such as conversations,…

  2. Improving the Timed Automata Approach to Biological Pathway Dynamics

    NARCIS (Netherlands)

    Langerak, R.; Pol, Jaco van de; Post, Janine N.; Schivo, Stefano; Aceto, Luca; Bacci, Giorgio; Bacci, Giovanni; Ingólfsdóttir, Anna; Legay, Axel; Mardare, Radu

    2017-01-01

    Biological systems such as regulatory or gene networks can be seen as a particular type of distributed systems, and for this reason they can be modeled within the Timed Automata paradigm, which was developed in the computer science context. However, tools designed to model distributed systems often

  3. History Matters: Incremental Ontology Reasoning Using Modules

    Science.gov (United States)

    Cuenca Grau, Bernardo; Halaschek-Wiener, Christian; Kazakov, Yevgeny

    The development of ontologies involves continuous but relatively small modifications. Existing ontology reasoners, however, do not take advantage of the similarities between different versions of an ontology. In this paper, we propose a technique for incremental reasoning—that is, reasoning that reuses information obtained from previous versions of an ontology—based on the notion of a module. Our technique does not depend on a particular reasoning calculus and thus can be used in combination with any reasoner. We have applied our results to incremental classification of OWL DL ontologies and found significant improvement over regular classification time on a set of real-world ontologies.

  4. Polynomial-time computability of the edge-reliability of graphs using Gilbert's formula

    Directory of Open Access Journals (Sweden)

    Marlowe Thomas J.

    1998-01-01

    Full Text Available Reliability is an important consideration in analyzing computer and other communication networks, but current techniques are extremely limited in the classes of graphs which can be analyzed efficiently. While Gilbert's formula establishes a theoretically elegant recursive relationship between the edge reliability of a graph and the reliability of its subgraphs, naive evaluation requires consideration of all sequences of deletions of individual vertices, and for many graphs has time complexity essentially Θ(N!). We discuss a general approach which significantly reduces complexity, encoding subgraph isomorphism in a finer partition by invariants, and recursing through the set of invariants. We illustrate this approach using threshold graphs, and show that any computation of reliability using Gilbert's formula will be polynomial-time if and only if the number of invariants considered is polynomial; we then show families of graphs with polynomial-time and non-polynomial reliability computation, and show that these encompass most previously known results. We then codify our approach to indicate how it can be used for other classes of graphs, and suggest several classes to which the technique can be applied.
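    For contrast with the invariant-based approach, the following brute-force sketch computes the all-terminal edge reliability of a small graph by enumerating working-edge subsets: each edge works independently with probability p, and we sum the probability of every subset that keeps the graph connected. This exponential cost in the number of edges is exactly what polynomial evaluation of Gilbert's formula is designed to avoid.

```python
from itertools import combinations

def connected(n, edges):
    # depth-first search from vertex 0 over the surviving edges
    adj = {v: [] for v in range(n)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    seen, stack = {0}, [0]
    while stack:
        for w in adj[stack.pop()]:
            if w not in seen:
                seen.add(w)
                stack.append(w)
    return len(seen) == n

def reliability(n, edges, p):
    # sum over all 2^|E| subsets of working edges -- exponential baseline
    r = 0.0
    for k in range(len(edges) + 1):
        for sub in combinations(edges, k):
            if connected(n, sub):
                r += p**k * (1 - p) ** (len(edges) - k)
    return r

# 4-cycle: stays connected under any single edge failure.
print(reliability(4, [(0, 1), (1, 2), (2, 3), (3, 0)], p=0.9))  # 0.9477
```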

  5. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    Energy Technology Data Exchange (ETDEWEB)

    Costa, Anthony B., E-mail: acosta@northwestern.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Green, Jason R., E-mail: jason.green@umb.edu [Department of Chemistry, Northwestern University, Evanston, IL 60208 (United States); Department of Chemistry, University of Massachusetts Boston, Boston, MA 02125 (United States)

    2013-08-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N^2 (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using Scalapack. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra.
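    A small-scale sketch of the Gram-Schmidt (QR) procedure whose matrix products and factorizations dominate the cost at large N, here applied to the Henon map rather than a Lennard-Jones fluid: tangent vectors are evolved by the Jacobian, reorthonormalized by QR at each step, and the logs of the diagonal of R accumulate into the Lyapunov exponents.

```python
import numpy as np

# Henon map parameters (classical chaotic regime)
a, b = 1.4, 0.3

def step(x):
    return np.array([1.0 - a * x[0] ** 2 + x[1], b * x[0]])

def jacobian(x):
    return np.array([[-2.0 * a * x[0], 1.0],
                     [b, 0.0]])

x = np.array([0.1, 0.1])
Q = np.eye(2)                    # orthonormal tangent vectors
log_r = np.zeros(2)
steps = 100_000
for _ in range(steps):
    Q = jacobian(x) @ Q          # evolve tangent vectors
    Q, R = np.linalg.qr(Q)       # Gram-Schmidt reorthonormalization
    log_r += np.log(np.abs(np.diag(R)))
    x = step(x)

print("Lyapunov exponents:", log_r / steps)   # roughly (+0.42, -1.62) for Henon
```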

  6. Extending the length and time scales of Gram–Schmidt Lyapunov vector computations

    International Nuclear Information System (INIS)

    Costa, Anthony B.; Green, Jason R.

    2013-01-01

    Lyapunov vectors have found growing interest recently due to their ability to characterize systems out of thermodynamic equilibrium. The computation of orthogonal Gram–Schmidt vectors requires multiplication and QR decomposition of large matrices, which grow as N^2 (with the particle count). This expense has limited such calculations to relatively small systems and short time scales. Here, we detail two implementations of an algorithm for computing Gram–Schmidt vectors. The first is a distributed-memory message-passing method using Scalapack. The second uses the newly-released MAGMA library for GPUs. We compare the performance of both codes for Lennard–Jones fluids from N=100 to 1300 between Intel Nehalem/InfiniBand DDR and NVIDIA C2050 architectures. To our best knowledge, these are the largest systems for which the Gram–Schmidt Lyapunov vectors have been computed, and the first time their calculation has been GPU-accelerated. We conclude that Lyapunov vector calculations can be significantly extended in length and time by leveraging the power of GPU-accelerated linear algebra

  7. Reasons for Whistleblowing: A Qualitative Study

    Directory of Open Access Journals (Sweden)

    Ali BALTACI

    2017-04-01

    Full Text Available Whistleblowing has become a commonly encountered concept in recent times. Negative behaviors and actions can be experienced in any organization, and whistleblowing, as a communication process, is a kind of ethical behavior. Whistleblowing is the transmission of an unfavorable situation discovered in the organization to either internal or external authorities. An examination of employees' reasons for whistleblowing is important for a better understanding of this concept; hence, this research focuses on the reasons for whistleblowing. In addition, the reasons for avoiding whistleblowing were also investigated. This research, which is designed as a qualitative study, is based on the phenomenological approach. Interviews were conducted using an open-ended, semi-structured interview form. The research was conducted on 20 teachers, 12 administrators, and 7 inspectors. The data were analyzed using the content analysis method. As a result of the research, the individual, organizational, and social reasons for whistleblowing were differentiated. Among the individual reasons for whistleblowing are considerations of protecting and gaining interests. Organizational reasons include business ethics and the expectation of subsequent promotion. Social reasons encompass social benefits, social justice, and religious belief. Reasons for avoiding whistleblowing vary based on retaliation and worry. This research is considered important because it is believed to be the first qualitative study to approach the reasons for whistleblowing. The results of this research have revealed gaps in the understanding of this area for future studies.

  8. A concept analysis of abductive reasoning.

    Science.gov (United States)

    Mirza, Noeman A; Akhtar-Danesh, Noori; Noesgaard, Charlotte; Martin, Lynn; Staples, Eric

    2014-09-01

    To describe an analysis of the concept of abductive reasoning. In the discipline of nursing, abductive reasoning has received only philosophical attention and remains a vague concept. In addition to deductive and inductive reasoning, abductive reasoning is not recognized even in prominent nursing knowledge development literature. Therefore, what abductive reasoning is and how it can inform nursing practice and education was explored. Concept analysis. Combinations of specific keywords were searched in Web of Science, CINAHL, PsychINFO, PubMed, Medline and EMBASE. The analysis was conducted in June 2012 and only literature before this period was included. No time limits were set. Rodgers' evolutionary method for conducting concept analysis was used. Twelve records were included in the analysis. The most common surrogate term was retroduction, whereas related terms included intuition and pattern and similarity recognition. Antecedents consisted of a complex, puzzling situation and a clinician with creativity, experience and knowledge. Consequences included the formation of broad hypotheses that enhance understanding of care situations. Overall, abductive reasoning was described as the process of hypothesis or theory generation and evaluation. It was also viewed as inference to the best explanation. As a new approach, abductive reasoning could enhance reasoning abilities of novice clinicians. Not only can it incorporate various ways of knowing, but its holistic approach to learning also appears promising in problem-based learning. As nursing literature on abductive reasoning is predominantly philosophical, practical consequences of abductive reasoning warrant further research.

  9. An Integrated Software Framework to Support Semantic Modeling and Reasoning of Spatiotemporal Change of Geographical Objects: A Use Case of Land Use and Land Cover Change Study

    Directory of Open Access Journals (Sweden)

    Wenwen Li

    2016-09-01

    Full Text Available Evolving Earth observation and change detection techniques enable the automatic identification of Land Use and Land Cover Change (LULCC) over a large extent from massive amounts of remote sensing data. At the same time, this poses a major challenge in effectively organizing, representing, and modeling such information. This study proposes and implements an integrated computational framework to support the modeling and the semantic and spatial reasoning of change information with regard to space, time, and topology. We first proposed a conceptual model to formally represent the spatiotemporal variation of change data, which is essential knowledge for supporting various environmental and social studies, such as deforestation and urbanization studies. Then, a spatial ontology was created to encode these semantic spatiotemporal data in a machine-understandable format. Based on the knowledge defined in the ontology and related reasoning rules, a semantic platform was developed to support the semantic query and change trajectory reasoning of areas with LULCC. This semantic platform is innovative, as it integrates semantic and spatial reasoning into a coherent computational and operational software framework to support automated semantic analysis of time series data that can go beyond LULC datasets. In addition, this system scales well as the amount of data increases, as validated by a number of experimental results. This work contributes significantly to both the geospatial Semantic Web and GIScience communities in terms of the establishment of the (web-based) semantic platform for collaborative question answering and decision-making.

  10. Computing Refined Buneman Trees in Cubic Time

    DEFF Research Database (Denmark)

    Brodal, G.S.; Fagerberg, R.; Östlin, A.

    2003-01-01

    Reconstructing the evolutionary tree for a set of n species based on pairwise distances between the species is a fundamental problem in bioinformatics. Neighbor joining is a popular distance based tree reconstruction method. It always proposes fully resolved binary trees despite missing evidence in the underlying distance data. Distance based methods based on the theory of Buneman trees and refined Buneman trees avoid this problem by only proposing evolutionary trees whose edges satisfy a number of constraints. These trees might not be fully resolved but there is strong combinatorial evidence for each proposed edge. The currently best algorithm for computing the refined Buneman tree from a given distance measure has a running time of O(n^5) and a space consumption of O(n^4). In this paper, we present an algorithm with running time O(n^3) and space consumption O(n^2). The improved complexity of our...

  11. Accounting for dropout reason in longitudinal studies with nonignorable dropout.

    Science.gov (United States)

    Moore, Camille M; MaWhinney, Samantha; Forster, Jeri E; Carlson, Nichole E; Allshouse, Amanda; Wang, Xinshuo; Routy, Jean-Pierre; Conway, Brian; Connick, Elizabeth

    2017-08-01

    Dropout is a common problem in longitudinal cohort studies and clinical trials, often raising concerns of nonignorable dropout. Selection, frailty, and mixture models have been proposed to account for potentially nonignorable missingness by relating the longitudinal outcome to time of dropout. In addition, many longitudinal studies encounter multiple types of missing data or reasons for dropout, such as loss to follow-up, disease progression, treatment modifications and death. When clinically distinct dropout reasons are present, it may be preferable to control for both dropout reason and time to gain additional clinical insights. This may be especially interesting when the dropout reason and dropout times differ by the primary exposure variable. We extend a semi-parametric varying-coefficient method for nonignorable dropout to accommodate dropout reason. We apply our method to untreated HIV-infected subjects recruited to the Acute Infection and Early Disease Research Program HIV cohort and compare longitudinal CD4+ T cell count in injection drug users to nonusers with two dropout reasons: anti-retroviral treatment initiation and loss to follow-up.

  12. Episodic Reasoning for Vision-Based Human Action Recognition

    Directory of Open Access Journals (Sweden)

    Maria J. Santofimia

    2014-01-01

    Full Text Available Smart Spaces, Ambient Intelligence, and Ambient Assisted Living are environmental paradigms that strongly depend on their capability to recognize human actions. While most solutions rest on sensor value interpretations and video analysis applications, few have realized the importance of incorporating common-sense capabilities to support the recognition process. Unfortunately, human action recognition cannot be successfully accomplished by only analyzing body postures. On the contrary, this task should be supported by profound knowledge of the nature of human agency and its tight connection to the reasons and motivations that explain it. The combination of this knowledge and the knowledge about how the world works is essential for recognizing and understanding human actions without committing common-senseless mistakes. This work demonstrates the impact that episodic reasoning has in improving the accuracy of a computer vision system for human action recognition. This work also presents formalization, implementation, and evaluation details of the knowledge model that supports the episodic reasoning.

  13. Computational intelligence in time series forecasting theory and engineering applications

    CERN Document Server

    Palit, Ajoy K

    2005-01-01

    Foresight in an engineering enterprise can make the difference between success and failure, and can be vital to the effective control of industrial systems. Applying time series analysis in the on-line milieu of most industrial plants has been problematic owing to the time and computational effort required. The advent of soft computing tools offers a solution. The authors harness the power of intelligent technologies individually and in combination. Examples of the particular systems and processes susceptible to each technique are investigated, cultivating a comprehensive exposition of the improvements on offer in quality, model building and predictive control and the selection of appropriate tools from the plethora available. Application-oriented engineers in process control, manufacturing, production industry and research centres will find much to interest them in this book. It is suitable for industrial training purposes, as well as serving as valuable reference material for experimental researchers.

  14. Theory and computation of disturbance invariant sets for discrete-time linear systems

    Directory of Open Access Journals (Sweden)

    Kolmanovsky Ilya

    1998-01-01

    Full Text Available This paper considers the characterization and computation of invariant sets for discrete-time, time-invariant, linear systems with disturbance inputs whose values are confined to a specified compact set but are otherwise unknown. The emphasis is on determining maximal disturbance-invariant sets X that belong to a specified subset Γ of the state space. Such d-invariant sets have important applications in control problems where there are pointwise-in-time state constraints of the form x(t) ∈ Γ. One purpose of the paper is to unite and extend in a rigorous way disparate results from the prior literature. In addition there are entirely new results. Specific contributions include: exploitation of the Pontryagin set difference to clarify conceptual matters and simplify mathematical developments, special properties of maximal invariant sets and conditions for their finite determination, algorithms for generating concrete representations of maximal invariant sets, practical computational questions, extension of the main results to general Lyapunov stable systems, applications of the computational techniques to the bounding of state and output response. Results on Lyapunov stable systems are applied to the implementation of a logic-based, nonlinear multimode regulator. For plants with disturbance inputs and state-control constraints it enlarges the constraint-admissible domain of attraction. Numerical examples illustrate the various theoretical and computational results.
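    The fixed-point iteration behind maximal disturbance-invariant sets is easiest to see in the scalar case, where the Pontryagin set difference reduces to shrinking an interval by the disturbance bound. A minimal sketch for x+ = a*x + w with |w| <= w_max and constraint |x| <= gamma follows; all values are illustrative.

```python
def maximal_invariant(a, w_max, gamma, tol=1e-12):
    """Maximal d-invariant interval [-c, c] inside the constraint |x| <= gamma.

    Iterates Omega_{k+1} = Gamma  intersect  Pre(Omega_k), where for intervals
    the Pontryagin difference [-c, c] minus [-w_max, w_max] is
    [-(c - w_max), c - w_max], and Pre divides by |a|.
    """
    c = gamma                       # start from the constraint set
    while True:
        # states whose successor stays in [-c, c] for every admissible w:
        c_pre = (c - w_max) / abs(a)
        c_next = min(gamma, c_pre)  # intersect with the constraint set
        if c_next <= 0:
            return None             # no nonempty invariant set exists
        if abs(c_next - c) < tol:
            return c_next           # fixed point: maximal invariant interval
        c = c_next

print(maximal_invariant(a=0.5, w_max=0.2, gamma=1.0))  # 1.0: [-1, 1] is itself invariant
```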

  15. Using Relational Reasoning Strategies to Help Improve Clinical Reasoning Practice.

    Science.gov (United States)

    Dumas, Denis; Torre, Dario M; Durning, Steven J

    2018-05-01

    Clinical reasoning-the steps up to and including establishing a diagnosis and/or therapy-is a fundamentally important mental process for physicians. Unfortunately, mounting evidence suggests that errors in clinical reasoning lead to substantial problems for medical professionals and patients alike, including suboptimal care, malpractice claims, and rising health care costs. For this reason, cognitive strategies by which clinical reasoning may be improved-and that many expert clinicians are already using-are highly relevant for all medical professionals, educators, and learners.In this Perspective, the authors introduce one group of cognitive strategies-termed relational reasoning strategies-that have been empirically shown, through limited educational and psychological research, to improve the accuracy of learners' reasoning both within and outside of the medical disciplines. The authors contend that relational reasoning strategies may help clinicians to be metacognitive about their own clinical reasoning; such strategies may also be particularly well suited for explicitly organizing clinical reasoning instruction for learners. Because the particular curricular efforts that may improve the relational reasoning of medical students are not known at this point, the authors describe the nature of previous research on relational reasoning strategies to encourage the future design, implementation, and evaluation of instructional interventions for relational reasoning within the medical education literature. The authors also call for continued research on using relational reasoning strategies and their role in clinical practice and medical education, with the long-term goal of improving diagnostic accuracy.

  16. Extensive use of computational fluid dynamics in the upgrading of hydraulic turbines

    Energy Technology Data Exchange (ETDEWEB)

    Sabourin, M.; De Henau, V. [GEC Alsthom Electromechanical Inc., Tracy, PQ (Canada); Eremeef, R. [GEC Alsthom Neyrpic, Grenoble (France)

    1995-12-31

    The use of computational fluid dynamics (CFD) and the Navier-Stokes equations by GEC Alsthom for turbine rehabilitation was discussed. The process of runner rehabilitation was discussed from a fluid flow perspective, which accounts for the spiral case-distributor set and draft tube. The Kootenay turbine rehabilitation was described with regard to its spiral case and stay vane. The numerical analysis used to model upstream components was explained. The influence of draft tube effects was emphasized as an important efficiency factor. The differences between draft tubes at Sir Adam Beck 2 and La Grande 2 were discussed. Computational fluid flow modelling was claimed to have produced global performance enhancements in a reasonably short time, and at a reasonable cost. 6 refs., 6 figs., 4 tabs.

  17. Ultrasonic divergent-beam scanner for time-of-flight tomography with computer evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Glover, G H

    1978-03-02

    The rotatable ultrasonic divergent-beam scanner is designed for time-of-flight tomography with computer evaluation. It can measure parameters that are important for characterizing the structure of soft tissues, e.g. time of flight as a function of the velocity distribution along a given path (the method is analogous to transaxial X-ray tomography). Moreover, it permits quantitative measurement of two-dimensional velocity distributions and may therefore be applied to serial examinations for detecting cancer of the breast. Digital memories as well as analog-digital hybrid systems are suitable as computers.

  18. Computing moment to moment BOLD activation for real-time neurofeedback

    Science.gov (United States)

    Hinds, Oliver; Ghosh, Satrajit; Thompson, Todd W.; Yoo, Julie J.; Whitfield-Gabrieli, Susan; Triantafyllou, Christina; Gabrieli, John D.E.

    2013-01-01

    Estimating moment to moment changes in blood oxygenation level dependent (BOLD) activation levels from functional magnetic resonance imaging (fMRI) data has applications for learned regulation of regional activation, brain state monitoring, and brain-machine interfaces. In each of these contexts, accurate estimation of the BOLD signal in as little time as possible is desired. This is a challenging problem due to the low signal-to-noise ratio of fMRI data. Previous methods for real-time fMRI analysis have either sacrificed the ability to compute moment to moment activation changes by averaging several acquisitions into a single activation estimate or have sacrificed accuracy by failing to account for prominent sources of noise in the fMRI signal. Here we present a new method for computing the amount of activation present in a single fMRI acquisition that separates moment to moment changes in the fMRI signal intensity attributable to neural sources from those due to noise, resulting in a feedback signal more reflective of neural activation. This method computes an incremental general linear model fit to the fMRI timeseries, which is used to calculate the expected signal intensity at each new acquisition. The difference between the measured intensity and the expected intensity is scaled by the variance of the estimator in order to transform this residual difference into a statistic. Both synthetic and real data were used to validate this method and compare it to the only other published real-time fMRI method. PMID:20682350
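    A hedged sketch of the incremental idea: a recursive least-squares update of a small nuisance-regressor GLM, with each new residual scaled by its prediction variance to form the feedback statistic. The regressor design, noise model, and threshold below are illustrative, not the published pipeline.

```python
import numpy as np

rng = np.random.default_rng(1)
T, p = 200, 2
design = np.column_stack([np.ones(T), np.linspace(0, 1, T)])  # mean + linear drift
signal = design @ np.array([100.0, -3.0]) + rng.standard_normal(T)
signal[120:130] += 4.0                                        # injected "activation"

P = np.eye(p) * 1e6        # inverse Gram matrix (diffuse RLS initialization)
beta = np.zeros(p)
for t in range(T):
    x, y = design[t], signal[t]
    expected = x @ beta                  # predicted intensity from the GLM so far
    resid = y - expected
    s2 = 1.0 + x @ P @ x                 # prediction variance (unit noise assumed)
    z = resid / np.sqrt(s2)              # variance-scaled feedback statistic
    # standard recursive least-squares update of the fit:
    k = P @ x / s2
    beta = beta + k * resid
    P = P - np.outer(k, x @ P)
    if abs(z) > 3:
        print(f"t={t}: activation-like deviation, z={z:.1f}")
```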

  19. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to demonstrate key elements of feasibility for a high speed automated time domain terahertz computed axial tomography (TD-THz CT) non destructive...

  20. Mathematical reasoning analogies, metaphors, and images

    CERN Document Server

    English, Lyn D

    2013-01-01

    How we reason with mathematical ideas continues to be a fascinating and challenging topic of research--particularly with the rapid and diverse developments in the field of cognitive science that have taken place in recent years. Because it draws on multiple disciplines, including psychology, philosophy, computer science, linguistics, and anthropology, cognitive science provides rich scope for addressing issues that are at the core of mathematical learning. Drawing upon the interdisciplinary nature of cognitive science, this book presents a broadened perspective on mathematics and mat

  1. Improving statistical reasoning theoretical models and practical implications

    CERN Document Server

    Sedlmeier, Peter

    1999-01-01

    This book focuses on how statistical reasoning works and on training programs that can exploit people's natural cognitive capabilities to improve their statistical reasoning. Training programs that take into account findings from evolutionary psychology and instructional theory are shown to have substantially larger effects that are more stable over time than previous training regimens. The theoretical implications are traced in a neural network model of human performance on statistical reasoning problems. This book appeals to judgment and decision making researchers and other cognitive scientists, as well as to teachers of statistics and probabilistic reasoning.

  2. Viking Afterbody Heating Computations and Comparisons to Flight Data

    Science.gov (United States)

    Edquist, Karl T.; Wright, Michael J.; Allen, Gary A., Jr.

    2006-01-01

    Computational fluid dynamics predictions of Viking Lander 1 entry vehicle afterbody heating are compared to flight data. The analysis includes a derivation of heat flux from temperature data at two base cover locations, as well as a discussion of available reconstructed entry trajectories. Based on the raw temperature-time history data, convective heat flux is derived to be 0.63-1.10 W/cm2 for the aluminum base cover at the time of thermocouple failure. Peak heat flux at the fiberglass base cover thermocouple is estimated to be 0.54-0.76 W/cm2, occurring 16 seconds after peak stagnation point heat flux. Navier-Stokes computational solutions are obtained with two separate codes using an 8-species Mars gas model in chemical and thermal non-equilibrium. Flowfield solutions using local time-stepping did not result in converged heating at either thermocouple location. A global time-stepping approach improved the computational stability, but steady state heat flux was not reached for either base cover location. Both thermocouple locations lie within a separated flow region of the base cover that is likely unsteady. Heat flux computations averaged over the solution history are generally below the flight data and do not vary smoothly over time for both base cover locations. Possible reasons for the mismatch between flight data and flowfield solutions include underestimated conduction effects and limitations of the computational methods.
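    One standard way to derive heat flux from a thermocouple temperature-time history is the thin-skin calorimeter approximation q = rho * c_p * delta * dT/dt. The sketch below applies it with illustrative aluminum properties and a synthetic trace; it neglects the conduction effects the paper discusses, so it is an approximation of the derivation, not a reproduction of it.

```python
import numpy as np

# Thin-skin estimate: heat absorbed by a thin wall of thickness delta shows up
# directly as its temperature rise. Material values are illustrative for aluminum.
rho, c_p, delta = 2700.0, 900.0, 1.0e-3        # kg/m^3, J/(kg K), m

t = np.linspace(0.0, 40.0, 81)                 # s, synthetic time base
T_skin = 250.0 + 30.0 * np.tanh((t - 20.0) / 8.0)   # K, synthetic thermocouple trace

q = rho * c_p * delta * np.gradient(T_skin, t)      # W/m^2
print("peak heat flux: %.2f W/cm^2" % (q.max() / 1e4))   # ~0.9, same order as the flight data
```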

  3. Time Domain Terahertz Axial Computed Tomography Non Destructive Evaluation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — In this Phase 2 project, we propose to develop, construct, and deliver to NASA a computed axial tomography time-domain terahertz (CT TD-THz) non destructive...

  4. The reliable solution and computation time of variable parameters logistic model

    Science.gov (United States)

    Wang, Pengfei; Pan, Xinnong

    2018-05-01

    The study investigates the reliable computation time (RCT, termed T_c) by applying a double-precision computation of a variable parameters logistic map (VPLM). Firstly, by using the proposed method, we obtain the reliable solutions for the logistic map. Secondly, we construct 10,000 samples of reliable experiments from a time-dependent non-stationary parameters VPLM and then calculate the mean T_c. The results indicate that, for each different initial value, the T_c values of the VPLM are generally different. However, the mean T_c tends to a constant value when the sample number is large enough. The maximum, minimum, and probable distribution functions of T_c are also obtained, which can help us to identify the robustness of applying a nonlinear time series theory to forecasting by using the VPLM output. In addition, the T_c of the fixed-parameter experiments of the logistic map is obtained, and the results suggest that this T_c matches the theoretical formula-predicted value.
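    A minimal sketch of the reliable-computation-time idea: iterate the logistic map in double precision alongside a high-precision reference and report the first step at which they diverge beyond a tolerance. A fixed parameter r is used for brevity; the paper's variable-parameter map would make r a function of the step.

```python
from decimal import Decimal, getcontext

getcontext().prec = 100          # 100-digit reference trajectory

def t_c(x0, r=4.0, tol=1e-4, n_max=2000):
    """First step where the double-precision orbit departs from the reference."""
    xd = x0                                     # double precision orbit
    xr = Decimal(repr(x0))                      # high-precision reference orbit
    rr = Decimal(repr(r))
    for n in range(1, n_max + 1):
        xd = r * xd * (1.0 - xd)
        xr = rr * xr * (Decimal(1) - xr)
        if abs(Decimal(repr(xd)) - xr) > Decimal(repr(tol)):
            return n
    return n_max

# T_c differs between initial values, as the abstract notes:
for x0 in (0.1, 0.2, 0.3):
    print(x0, t_c(x0))
```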

  5. 78 FR 38949 - Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response

    Science.gov (United States)

    2013-06-28

    ... exposed to various forms of cyber attack. In some cases, attacks can be thwarted through the use of...-3383-01] Computer Security Incident Coordination (CSIC): Providing Timely Cyber Incident Response... systems will be successfully attacked. When a successful attack occurs, the job of a Computer Security...

  6. Approaching Gender Parity: Women in Computer Science at Afghanistan's Kabul University

    Science.gov (United States)

    Plane, Jandelyn

    2010-01-01

    This study explores the representation of women in computer science at the tertiary level through data collected about undergraduate computer science education at Kabul University in Afghanistan. Previous studies have theorized reasons for underrepresentation of women in computer science, and while many of these reasons are indeed present in…

  7. CROSAT: A digital computer program for statistical-spectral analysis of two discrete time series

    International Nuclear Information System (INIS)

    Antonopoulos Domis, M.

    1978-03-01

    The program CROSAT computes auto- and cross-spectra, transfer and coherence functions directly from two discrete time series, using a Fast Fourier Transform subroutine. Statistical analysis of the time series is optional. While of general use, the program is constructed to be immediately compatible with the ICL 4-70 and H316 computers at AEE Winfrith and, perhaps with minor modifications, with any other hardware system. (author)
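    The quantities CROSAT reports can be reproduced today with Welch-style FFT estimators; the sketch below uses scipy as a modern stand-in for the original FORTRAN FFT subroutine, on synthetic input/output series.

```python
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(2)
x = rng.standard_normal(t.size)                                  # "input" series
y = np.convolve(x, np.ones(5) / 5.0, mode="same") \
    + 0.3 * rng.standard_normal(t.size)                          # filtered + noisy "output"

f, Pxx = signal.welch(x, fs=fs, nperseg=512)     # auto-spectrum of x
_, Pyy = signal.welch(y, fs=fs, nperseg=512)     # auto-spectrum of y
_, Pxy = signal.csd(x, y, fs=fs, nperseg=512)    # cross-spectrum
H = Pxy / Pxx                                    # transfer function estimate (H1)
_, Cxy = signal.coherence(x, y, fs=fs, nperseg=512)
print("coherence at the lowest bins:", Cxy[:3])
```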

  8. On the Relationship between a Computational Natural Logic and Natural Language

    DEFF Research Database (Denmark)

    Andreasen, Troels; Bulskov, Henrik; Nilsson, Jørgen Fischer

    2016-01-01

    This paper makes a case for adopting appropriate forms of natural logic as target language for computational reasoning with descriptive natural language. Natural logics are stylized fragments of natural language where reasoning can be conducted directly by natural reasoning rules reflecting intuitive reasoning in natural language. The approach taken in this paper is to extend natural logic stepwise with a view to covering successively larger parts of natural language. We envisage applications for computational querying and reasoning, in particular within the life-sciences.

  9. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    OpenAIRE

    Schulz, S.; Romacker, M.; Hahn, U.

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach w...

  10. Model-based reasoning and the control of process plants

    International Nuclear Information System (INIS)

    Vaelisuo, Heikki

    1993-02-01

    In addition to feedback control, safe and economic operation of industrial process plants requires discrete-event type logic control, for example automatic control sequences, interlocks, etc. A lot of complex routine reasoning is involved in the design and verification and validation (V&V) of such automatics. Similar reasoning tasks are encountered during plant operation in action planning and fault diagnosis. The low-level part of the required problem solving is so straightforward that it could be accomplished by a computer if only there were plant models which allow versatile mechanised reasoning. Such plant models and corresponding inference algorithms are the main subject of this report. Deep knowledge and qualitative modelling play an essential role in this work. Deep knowledge refers to mechanised reasoning based on the first principles of the phenomena in the problem domain. Qualitative modelling refers to a knowledge representation formalism and related reasoning methods which allow solving problems on a higher abstraction level than, for example, traditional simulation and optimisation. Prolog is a commonly used platform for artificial intelligence (AI) applications. Constraint logic languages like CLP(R) and Prolog-III extend the scope of logic programming to numeric problem solving. In addition they allow a programming style which often reduces the computational complexity significantly. An approach to model-based reasoning implemented in the constraint logic programming language CLP(R) is presented. It is discussed how model-based reasoning can be applied in the design and V&V of plant automatics and in action planning during plant operation. A prototype tool called ISIR is discussed and some initial results obtained during the development of the tool are presented. The results presented originate from preliminary test results of the prototype obtained

  11. Effects of playing mathematics computer games on primary school students' multiplicative reasoning ability

    NARCIS (Netherlands)

    Bakker, Marjoke; Van den Heuvel-Panhuizen, M.; Robitzsch, Alexander

    2015-01-01

    This study used a large-scale cluster randomized longitudinal experiment (N=719; 35 schools) to investigate the effects of online mathematics mini-games on primary school students' multiplicative reasoning ability. The experiment included four conditions: playing at school, integrated in a lesson

  12. Why Don't All Professors Use Computers?

    Science.gov (United States)

    Drew, David Eli

    1989-01-01

    Discusses the adoption of computer technology at universities and examines reasons why some professors don't use computers. Topics discussed include computer applications, including artificial intelligence, social science research, statistical analysis, and cooperative research; appropriateness of the technology for the task; the Computer Aptitude…

  13. Computation of the target state and feedback controls for time optimal consensus in multi-agent systems

    Science.gov (United States)

    Mulla, Ameer K.; Patil, Deepak U.; Chakraborty, Debraj

    2018-02-01

    N identical agents with bounded inputs aim to reach a common target state (consensus) in the minimum possible time. Algorithms for computing this time-optimal consensus point, the control law to be used by each agent, and the time taken for the consensus to occur are proposed. Two types of multi-agent systems are considered, namely (1) coupled single-integrator agents on a plane and (2) double-integrator agents on a line. At the initial time instant, each agent is assumed to have access to the state information of all the other agents. An algorithm using convexity of attainable sets and Helly's theorem is proposed to compute the final consensus target state and the minimum time to achieve this consensus. Further, parts of the computation are parallelised amongst the agents such that each agent has to perform computations of O(N^2) run time complexity. Finally, local feedback time-optimal control laws are synthesised to drive each agent to the target point in minimum time. During this part of the operation, the controller for each agent uses measurements of only its own states and does not need to communicate with any neighbouring agents.
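    For intuition, consider the simpler decoupled single-integrator case with a common speed bound: the time for agent i to reach a point p is ||x_i - p|| / u_max, so the time-optimal consensus point minimizes the maximum distance, i.e. it is the center of the smallest ball enclosing the initial positions. The sketch below computes it numerically; it ignores the coupling and the double-integrator case treated in the paper.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
agents = rng.uniform(-5, 5, size=(6, 2))   # initial positions on the plane
u_max = 1.0                                # common speed bound

def worst_case_time(p):
    # slowest agent's travel time to candidate consensus point p
    return np.max(np.linalg.norm(agents - p, axis=1)) / u_max

# Minimize the max distance (nonsmooth), starting from the centroid.
res = minimize(worst_case_time, agents.mean(axis=0), method="Nelder-Mead")
print("consensus point:", res.x, " minimum time:", worst_case_time(res.x))
```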

  14. Part-whole reasoning in medical ontologies revisited--introducing SEP triplets into classification-based description logics.

    Science.gov (United States)

    Schulz, S; Romacker, M; Hahn, U

    1998-01-01

    The development of powerful and comprehensive medical ontologies that support formal reasoning on a large scale is one of the key requirements for clinical computing in the next millennium. Taxonomic medical knowledge, a major portion of these ontologies, is mainly characterized by generalization and part-whole relations between concepts. While reasoning in generalization hierarchies is quite well understood, no fully conclusive mechanism as yet exists for part-whole reasoning. The approach we take emulates part-whole reasoning via classification-based reasoning using SEP triplets, a special data structure for encoding part-whole relations that is fully embedded in the formal framework of standard description logics.
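
    The triplet encoding itself is simple enough to sketch. As described, each concept X is represented by a structure node S_X subsuming an entity node E_X ("X itself") and a part node P_X ("any part of X"); asserting that Y is a part of X becomes the subsumption S_Y is-a P_X, so transitive part-whole inferences fall out of ordinary classification. The hand/finger toy taxonomy below is an illustration, not the paper's formalism.

        # Hedged sketch of SEP triplets: part-whole emulated by is-a links.
        subclass_of = {
            "E_hand": {"S_hand"},     "P_hand": {"S_hand"},
            "E_finger": {"S_finger"}, "P_finger": {"S_finger"},
            "S_finger": {"P_hand"},   # "finger is a part of hand"
        }

        def ancestors(concept):
            seen, stack = set(), [concept]
            while stack:
                for parent in subclass_of.get(stack.pop(), ()):
                    if parent not in seen:
                        seen.add(parent)
                        stack.append(parent)
            return seen

        # A plain is-a classifier now derives the part-whole fact:
        print("P_hand" in ancestors("E_finger"))  # True: finger is a hand part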

  15. On some methods for improving time of reachability sets computation for the dynamic system control problem

    Science.gov (United States)

    Zimovets, Artem; Matviychuk, Alexander; Ushakov, Vladimir

    2016-12-01

    The paper presents two different approaches to reducing the computation time of reachability sets. The first approach uses different data structures for storing the reachability sets in computer memory during calculation in single-threaded mode. The second approach is based on parallel algorithms operating on the data structures from the first approach. Within the framework of this paper, a parallel algorithm for approximate reachability set calculation on a computer with SMP architecture is proposed. The results of numerical modelling are presented in the form of tables which demonstrate the high efficiency of parallel computing technology and also show how computing time depends on the data structure used.
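
    The data-structure trade-off studied in the paper can be illustrated on one step of a grid-based reachable-set expansion. The two representations below (a sparse set of occupied cells versus a dense boolean array whose OR-of-shifts parallelises naturally on an SMP machine) are illustrative stand-ins, not the authors' structures.

        import numpy as np

        MOVES = ((0, 0), (1, 0), (-1, 0), (0, 1), (0, -1))

        def step_sparse(cells):
            # set-of-tuples: memory-light while the reachable set is small
            return {(x + dx, y + dy) for (x, y) in cells for dx, dy in MOVES}

        def step_dense(grid):
            # boolean array: OR of shifted copies; the row blocks below
            # could be split across threads on an SMP machine
            out = grid.copy()
            out[1:, :] |= grid[:-1, :]
            out[:-1, :] |= grid[1:, :]
            out[:, 1:] |= grid[:, :-1]
            out[:, :-1] |= grid[:, 1:]
            return out

        grid = np.zeros((64, 64), dtype=bool)
        grid[32, 32] = True
        assert int(step_dense(grid).sum()) == len(step_sparse({(32, 32)}))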

  16. A computationally simple and robust method to detect determinism in a time series

    DEFF Research Database (Denmark)

    Lu, Sheng; Ju, Ki Hwan; Kanters, Jørgen K.

    2006-01-01

    We present a new, simple, and fast computational technique, termed the incremental slope (IS), that can accurately distinguish deterministic from stochastic systems even when the variance of the noise is as large as or greater than the signal, and remains robust for time-varying signals. The IS…

  17. Short-term electric load forecasting using computational intelligence methods

    OpenAIRE

    Jurado, Sergio; Peralta, J.; Nebot, Àngela; Mugica, Francisco; Cortez, Paulo

    2013-01-01

    Accurate time series forecasting is a key issue to support individual and organizational decision making. In this paper, we introduce several methods for short-term electric load forecasting. All the presented methods stem from computational intelligence techniques: Random Forest, Nonlinear Autoregressive Neural Networks, Evolutionary Support Vector Machines and Fuzzy Inductive Reasoning. The performance of the suggested methods is experimentally justified with several experiments carried out...
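
    Of the listed techniques, the Random Forest variant is easy to sketch: regress the load on its own lagged values and hold out the last day. The synthetic series, the lag choice and the availability of scikit-learn are assumptions for illustration, not the paper's setup.

        import numpy as np
        from sklearn.ensemble import RandomForestRegressor

        rng = np.random.default_rng(0)
        hours = np.arange(24 * 60)                       # 60 days, hourly
        load = (100 + 20 * np.sin(2 * np.pi * hours / 24)
                + rng.normal(0, 2, hours.size))          # toy load curve

        lags = [1, 2, 24, 48]                            # assumed lag structure
        X = np.column_stack([np.roll(load, l) for l in lags])[max(lags):]
        y = load[max(lags):]

        split = len(y) - 24                              # hold out the last day
        model = RandomForestRegressor(n_estimators=200, random_state=0)
        model.fit(X[:split], y[:split])
        mae = np.abs(model.predict(X[split:]) - y[split:]).mean()
        print(f"day-ahead MAE: {mae:.2f}")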

  18. Hardware architecture design of image restoration based on time-frequency domain computation

    Science.gov (United States)

    Wen, Bo; Zhang, Jing; Jiao, Zipeng

    2013-10-01

    Image restoration algorithms based on time-frequency domain computation (TFDC) are highly mature and widely applied in engineering. To support high-speed implementation of these algorithms, a TFDC hardware architecture is proposed. Firstly, the main module is designed by analyzing the processing and numerical calculation common to these algorithms. Then, to improve generality, an iteration control module is planned for iterative algorithms. In addition, to reduce the computational cost and memory requirements, the necessary optimizations are suggested for the time-consuming modules, which include the two-dimensional FFT/IFFT and the complex-number calculations. Eventually, the TFDC hardware architecture is adopted for the hardware design of a real-time image restoration system. The result proves that the TFDC hardware architecture and its optimizations can be applied to image restoration algorithms based on TFDC, with good algorithm commonality, hardware realizability and high efficiency.
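
    The computational pattern such an architecture accelerates (forward 2-D FFT, a per-frequency complex multiply, inverse FFT) can be sketched in software. The Wiener-style filter, the kernel and the noise constant below are assumptions for the example, not the paper's design.

        import numpy as np

        def wiener_restore(blurred, psf, k=0.01):
            H = np.fft.fft2(psf, s=blurred.shape)      # transfer function
            G = np.fft.fft2(blurred)
            W = np.conj(H) / (np.abs(H) ** 2 + k)      # regularised inverse
            return np.real(np.fft.ifft2(W * G))

        img = np.zeros((64, 64))
        img[24:40, 24:40] = 1.0
        psf = np.ones((5, 5)) / 25.0                   # uniform blur kernel
        blurred = np.real(np.fft.ifft2(np.fft.fft2(img)
                                       * np.fft.fft2(psf, s=img.shape)))
        # the uncentred kernel introduces a circular shift, ignored here
        restored = wiener_restore(blurred, psf)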

  19. Online Operation Guidance of Computer System Used in Real-Time Distance Education Environment

    Science.gov (United States)

    He, Aiguo

    2011-01-01

    Computer systems are useful for improving real-time and interactive distance education activities, especially when a large number of students participate in one distance lecture together and every student uses their own computer to share teaching materials or control discussions in the virtual classroom. The problem is that within…

  20. ERVING: A Program to Teach Sociological Reasoning from the Dramaturgical Perspective.

    Science.gov (United States)

    Brent, Edward E., Jr.; And Others

    1989-01-01

    Describes the computer program labeled ERVING which teaches students to reason sociologically using E. Goffman's classic dramaturgical perspective. Suggests that artificial intelligence programs offer a means for developing instructional models for the social sciences which are not amenable to a quantitative approach. (KO)

  1. A real-time computational model for estimating kinematics of ankle ligaments.

    Science.gov (United States)

    Zhang, Mingming; Davies, T Claire; Zhang, Yanxin; Xie, Sheng Quan

    2016-01-01

    An accurate assessment of ankle ligament kinematics is crucial in understanding the injury mechanisms and can help to improve the treatment of an injured ankle, especially when used in conjunction with robot-assisted therapy. A number of computational models have been developed and validated for assessing the kinematics of ankle ligaments. However, few of them can perform real-time assessment to allow for an input into robotic rehabilitation programs. An ankle computational model was proposed and validated to quantify the kinematics of ankle ligaments as the foot moves in real time. This model consists of three bone segments with three rotational degrees of freedom (DOFs) and 12 ankle ligaments. This model uses inputs for three position variables that can be measured from sensors in many ankle robotic devices that detect postures within the foot-ankle environment and outputs the kinematics of ankle ligaments. Validation of this model in terms of ligament length and strain was conducted by comparing it with published data on cadaver anatomy and magnetic resonance imaging. The ligament lengths and strains computed by the model are in concurrence with the published studies, but are sensitive to ligament attachment positions. This ankle computational model has the potential to be used in robot-assisted therapy for real-time assessment of ligament kinematics. The results provide information regarding the quantification of kinematics associated with ankle ligaments related to the disability level and can be used for optimizing the robotic training trajectory.
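
    The core computation such a model performs each cycle is small: rotate each ligament's moving attachment with the measured joint angles, then report length and strain. The single-axis rotation and the attachment coordinates below are invented for illustration; the paper's model uses three rotational DOFs and 12 ligaments.

        import numpy as np

        def rot_x(a):                       # rotation about the flexion axis
            c, s = np.cos(a), np.sin(a)
            return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

        prox = np.array([0.00, 0.02, 0.00])    # fixed attachment (m), assumed
        dist0 = np.array([0.01, 0.02, -0.03])  # moving attachment at neutral
        rest_len = np.linalg.norm(dist0 - prox)

        def ligament_state(flexion_rad):
            dist = rot_x(flexion_rad) @ dist0  # moves with the foot segment
            length = np.linalg.norm(dist - prox)
            return length, (length - rest_len) / rest_len  # length, strain

        print(ligament_state(np.deg2rad(10.0)))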

  2. Real time simulation of large systems on mini-computer

    International Nuclear Information System (INIS)

    Nakhle, Michel; Roux, Pierre.

    1979-01-01

    Most simulation languages will only accept an explicit formulation of differential equations, and logical variables hold no special status therein. The integration step of the usual methods is limited by the smallest time constant of the model submitted. The NEPTUNIX 2 simulation software has a language that will accept implicit equations and an integration method whose variable step is not limited by the time constants of the model. This, together with strong optimization of the execution time and memory use of the generated code, makes NEPTUNIX 2 a basic tool for simulation on mini-computers. Since the logical variables are specific entities under centralized control, correct processing of discontinuities and synchronization with a real process are feasible. NEPTUNIX 2 is the industrial version of NEPTUNIX 1. [fr]

  3. The Fundamental Reasons Why Laptop Computers should not be Used on Your Lap

    Directory of Open Access Journals (Sweden)

    Mortazavi S. A. R.

    2016-12-01

    Full Text Available With the growing tendency to use new technologies, gadgets such as laptop computers are becoming more popular among students, teachers, businessmen and office workers. Today laptops are a great tool for education and learning, work and personal multimedia. Millions of men, especially those of reproductive age, frequently use their laptop computers on the lap (thigh). Over the past several years, our lab has focused on the health effects of exposure to different sources of electromagnetic fields such as cellular phones, mobile base stations, mobile phone jammers, laptop computers, radars, dentistry cavitrons and magnetic resonance imaging (MRI). Our own studies as well as the studies performed by other researchers indicate that using laptop computers on the lap adversely affects male reproductive health. When a laptop is placed on the lap, not only can its heat warm men's scrotums, but the electromagnetic fields generated by the laptop's internal electronic circuits, as well as the Wi-Fi radiofrequency radiation (in a Wi-Fi-connected laptop), may decrease sperm quality. Furthermore, due to poor working posture, laptops should not be used on the lap for long hours.

  4. Real-time computing in environmental monitoring of a nuclear power plant

    International Nuclear Information System (INIS)

    Deme, S.; Lang, E.; Nagy, Gy.

    1987-06-01

    A real-time computing method is described for calculating the environmental radiation exposure due to a nuclear power plant, both in normal operation and in an accident. The effects of the Gaussian plume are recalculated every ten minutes, based on meteorological parameters measured at heights of 20 and 120 m as well as on emission data. In normal operation the quantity of radioactive materials released through the stacks is measured and registered, while in an accident the source strength is unknown and the calculated relative data are normalized to the values measured at the eight environmental monitoring stations. The doses due to noble gases and to dry and wet deposition, as well as the time integral of the ¹³¹I concentration, are calculated and stored by a professional personal computer for 720 points within an 11 km radius of the plant. (author)
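
    The plume term in such a computation is the standard Gaussian formula with ground reflection; a direct transcription follows, with stack height, wind speed and dispersion parameters chosen arbitrarily for the example rather than taken from the paper.

        import numpy as np

        def plume_concentration(Q, u, y, z, sy, sz, H):
            # C(x,y,z) = Q/(2 pi sy sz u) * exp(-y^2 / 2 sy^2)
            #            * [exp(-(z-H)^2 / 2 sz^2) + exp(-(z+H)^2 / 2 sz^2)]
            gy = np.exp(-y**2 / (2 * sy**2))
            gz = (np.exp(-(z - H)**2 / (2 * sz**2))
                  + np.exp(-(z + H)**2 / (2 * sz**2)))
            return Q / (2 * np.pi * sy * sz * u) * gy * gz

        # e.g. 1e9 Bq/s release, 5 m/s wind, receptor on the axis at ground level
        c = plume_concentration(Q=1e9, u=5.0, y=0.0, z=0.0,
                                sy=80.0, sz=40.0, H=120.0)
        print(f"{c:.1f} Bq/m^3")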

  5. Design and development of a run-time monitor for multi-core architectures in cloud computing.

    Science.gov (United States)

    Kang, Mikyung; Kang, Dong-In; Crago, Stephen P; Park, Gyung-Leen; Lee, Junghoon

    2011-01-01

    Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter optimizing its computing configuration based on the analyzed data.

  6. Design and Development of a Run-Time Monitor for Multi-Core Architectures in Cloud Computing

    Directory of Open Access Journals (Sweden)

    Junghoon Lee

    2011-03-01

    Full Text Available Cloud computing is a new information technology trend that moves computing and data away from desktops and portable PCs into large data centers. The basic principle of cloud computing is to deliver applications as services over the Internet as well as infrastructure. A cloud is a type of parallel and distributed system consisting of a collection of inter-connected and virtualized computers that are dynamically provisioned and presented as one or more unified computing resources. The large-scale distributed applications on a cloud require adaptive service-based software, which has the capability of monitoring system status changes, analyzing the monitored information, and adapting its service configuration while considering tradeoffs among multiple QoS features simultaneously. In this paper, we design and develop a Run-Time Monitor (RTM) which is a system software to monitor the application behavior at run-time, analyze the collected information, and optimize cloud computing resources for multi-core architectures. RTM monitors application software through library instrumentation as well as underlying hardware through a performance counter optimizing its computing configuration based on the analyzed data.
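
    The monitor-analyze-adapt loop both records describe can be approximated with ordinary OS counters. The sketch below uses the psutil library in place of the paper's library instrumentation and hardware performance counters, with an arbitrary threshold; it is an illustration of the loop, not the RTM itself.

        import psutil   # assumed available; any metrics source would do

        def run_time_monitor(cycles=5, cpu_high=80.0):
            for _ in range(cycles):
                cpu = psutil.cpu_percent(interval=1.0)     # monitor
                mem = psutil.virtual_memory().percent
                if cpu > cpu_high:                         # analyze + adapt
                    print(f"cpu={cpu}% mem={mem}% -> rebalance across cores")
                else:
                    print(f"cpu={cpu}% mem={mem}% -> keep configuration")

        run_time_monitor()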

  7. Computations of concentration of radon and its decay products against time. Computer program; Obliczanie koncentracji radonu i jego produktow rozpadu w funkcji czasu. Program komputerowy

    Energy Technology Data Exchange (ETDEWEB)

    Machaj, B. [Institute of Nuclear Chemistry and Technology, Warsaw (Poland)]

    1996-12-31

    This research aims to develop a device for continuous monitoring of radon in air by measuring the alpha activity of radon and its short-lived decay products. The influence of the variation of the alpha activity of radon and its daughters on the measured results is of importance and requires knowledge of this variation with time. Measurement of the alpha radiation of radon and of its short-lived decay products requires knowledge of the variation of the radon concentration and its decay products with time. A computer program in the Turbo Pascal language was therefore developed to perform the computations employing the known relations involved, the program being adapted for IBM PC computers. The presented program enables computation of the activity of ²²²Rn and its daughter products ²¹⁸Po, ²¹⁴Pb, ²¹⁴Bi and ²¹⁴Po every 1 min within the period of 0-255 min for any state of radiation equilibrium between the radon and its daughter products. The program also permits computation of the alpha activity of ²²²Rn + ²¹⁸Po + ²¹⁴Po against time and the total alpha activity in a selected interval of time. The results of the computations are stored on the computer hard disk in ASCII format and are used by a graphics program, e.g. DrawPerfect, to make diagrams. Equations employed for the computation of the alpha activity of radon and its decay products, as well as a description of the program functions, are given. (author). 2 refs, 4 figs.
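
    The known relations involved are the Bateman equations; a direct implementation for the first members of the chain is sketched below, with approximate literature half-lives. The program's actual constants and interface are not reproduced here.

        import numpy as np

        # 222Rn -> 218Po -> 214Pb -> 214Bi, half-lives in minutes (approx.)
        half_lives = np.array([3.8235 * 24 * 60, 3.05, 26.8, 19.7])
        lam = np.log(2) / half_lives

        def bateman(t, lam, n1_0=1.0):
            # atoms of each chain member at time t, starting from pure parent
            n = np.zeros(len(lam))
            for k in range(len(lam)):
                coeff = n1_0 * np.prod(lam[:k])
                s = sum(np.exp(-lam[i] * t)
                        / np.prod([lam[j] - lam[i]
                                   for j in range(k + 1) if j != i])
                        for i in range(k + 1))
                n[k] = coeff * s
            return n

        for t in (0, 30, 60, 120, 240):                # minutes
            print(t, (lam * bateman(t, lam)).round(8)) # activities A = lam * N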

  8. Bound on quantum computation time: Quantum error correction in a critical environment

    International Nuclear Information System (INIS)

    Novais, E.; Mucciolo, Eduardo R.; Baranger, Harold U.

    2010-01-01

    We obtain an upper bound on the time available for quantum computation for a given quantum computer and decohering environment with quantum error correction implemented. First, we derive an explicit quantum evolution operator for the logical qubits and show that it has the same form as that for the physical qubits but with a reduced coupling strength to the environment. Using this evolution operator, we find the trace distance between the real and ideal states of the logical qubits in two cases. For a super-Ohmic bath, the trace distance saturates, while for Ohmic or sub-Ohmic baths, there is a finite time before the trace distance exceeds a value set by the user.

  9. The Contribution of Reasoning to the Utilization of Feedback from Software When Solving Mathematical Problems

    Science.gov (United States)

    Olsson, Jan

    2018-01-01

    This study investigates how students' reasoning contributes to their utilization of computer-generated feedback. Sixteen 16-year-old students solved a linear function task designed to present a challenge to them using dynamic software, GeoGebra, for assistance. The data were analysed with respect both to character of reasoning and to the use of…

  10. Prospective elementary and secondary school mathematics teachers’ statistical reasoning

    Directory of Open Access Journals (Sweden)

    Rabia KARATOPRAK

    2015-04-01

    Full Text Available This study investigated prospective elementary (PEMTs) and secondary (PSMTs) school mathematics teachers' statistical reasoning. The study began with the adaptation of the Statistical Reasoning Assessment (Garfield, 2003) test. Then, the test was administered to 82 PEMTs and 91 PSMTs in a metropolitan city of Turkey. Results showed that both groups were equally successful in understanding independence and understanding the importance of large samples. However, results from selecting appropriate measures of center, together with the misconceptions assessing the same subscales, showed that both groups selected mode rather than mean as an appropriate average. This suggested their lack of attention to categorical and interval/ratio variables while examining data. Similarly, both groups were successful in interpreting and computing probability; however, they exhibited the equiprobability bias, law of small numbers and representativeness misconceptions. The results imply a change in some questions in the Statistical Reasoning Assessment test and that teacher training programs should include statistics courses focusing on studying the characteristics of samples.

  11. Activity in the fronto-parietal network indicates numerical inductive reasoning beyond calculation: An fMRI study combined with a cognitive model.

    Science.gov (United States)

    Liang, Peipeng; Jia, Xiuqin; Taatgen, Niels A; Borst, Jelmer P; Li, Kuncheng

    2016-05-19

    Numerical inductive reasoning refers to the process of identifying and extrapolating the rule involved in numeric materials. It is associated with calculation, and shares the common activation of the fronto-parietal regions with calculation, which suggests that numerical inductive reasoning may correspond to a general calculation process. However, compared with calculation, rule identification is critical and unique to reasoning. Previous studies have established the central role of the fronto-parietal network for relational integration during rule identification in numerical inductive reasoning. The current question of interest is whether numerical inductive reasoning exclusively corresponds to calculation or operates beyond calculation, and whether it is possible to distinguish between them based on the activity pattern in the fronto-parietal network. To directly address this issue, three types of problems were created: numerical inductive reasoning, calculation, and perceptual judgment. Our results showed that the fronto-parietal network was more active in numerical inductive reasoning which requires more exchanges between intermediate representations and long-term declarative knowledge during rule identification. These results survived even after controlling for the covariates of response time and error rate. A computational cognitive model was developed using the cognitive architecture ACT-R to account for the behavioral results and brain activity in the fronto-parietal network.

  12. The Educator´s Approach to Media Training and Computer Games within Leisure Time of School-children

    OpenAIRE

    MORAVCOVÁ, Dagmar

    2009-01-01

    The paper describes possible ways of approaching computer game playing as part of the leisure time of school-children and deals with the significance of media training in leisure time. At first it specifies the concept of leisure time and its functions, then shows some positive and negative effects of the media. It further describes classical computer games, the problem of excessive computer game playing and means of prevention. The paper deals with the educator's personality and the importance of ...

  13. Information processing systems, reasoning modules, and reasoning system design methods

    Science.gov (United States)

    Hohimer, Ryan E; Greitzer, Frank L; Hampton, Shawn D

    2014-03-04

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  14. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2016-08-23

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.

  15. Information processing systems, reasoning modules, and reasoning system design methods

    Energy Technology Data Exchange (ETDEWEB)

    Hohimer, Ryan E.; Greitzer, Frank L.; Hampton, Shawn D.

    2015-08-18

    Information processing systems, reasoning modules, and reasoning system design methods are described. According to one aspect, an information processing system includes working memory comprising a semantic graph which comprises a plurality of abstractions, wherein the abstractions individually include an individual which is defined according to an ontology and a reasoning system comprising a plurality of reasoning modules which are configured to process different abstractions of the semantic graph, wherein a first of the reasoning modules is configured to process a plurality of abstractions which include individuals of a first classification type of the ontology and a second of the reasoning modules is configured to process a plurality of abstractions which include individuals of a second classification type of the ontology, wherein the first and second classification types are different.
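
    The architecture recited in these three related patents can be paraphrased in a few lines of code: abstractions in a semantic graph carry an ontology classification type, and a registry dispatches each abstraction to the reasoning modules declared for that type. All names below are illustrative, not taken from the patents.

        from collections import defaultdict

        class ReasoningSystem:
            def __init__(self):
                self.modules = defaultdict(list)   # type -> reasoning modules

            def register(self, classification_type, module):
                self.modules[classification_type].append(module)

            def process(self, semantic_graph):     # graph = working memory
                for abstraction in semantic_graph:
                    for module in self.modules[abstraction["type"]]:
                        module(abstraction)

        rs = ReasoningSystem()
        rs.register("Person", lambda a: print("person module saw", a["id"]))
        rs.register("Event", lambda a: print("event module saw", a["id"]))
        rs.process([{"type": "Person", "id": "p1"},
                    {"type": "Event", "id": "e7"}])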

  16. Flexible structure control experiments using a real-time workstation for computer-aided control engineering

    Science.gov (United States)

    Stieber, Michael E.

    1989-01-01

    A Real-Time Workstation for Computer-Aided Control Engineering has been developed jointly by the Communications Research Centre (CRC) and Ruhr-Universitaet Bochum (RUB), West Germany. The system is presently used for the development and experimental verification of control techniques for large space systems with significant structural flexibility. The Real-Time Workstation essentially is an implementation of RUB's extensive Computer-Aided Control Engineering package KEDDC on an INTEL micro-computer running under the RMS real-time operating system. The portable system supports system identification, analysis, control design and simulation, as well as the immediate implementation and test of control systems. The Real-Time Workstation is currently being used by CRC to study control/structure interaction on a ground-based structure called DAISY, whose design was inspired by a reflector antenna. DAISY emulates the dynamics of a large flexible spacecraft with the following characteristics: rigid body modes, many clustered vibration modes with low frequencies and extremely low damping. The Real-Time Workstation was found to be a very powerful tool for experimental studies, supporting control design and simulation, and conducting and evaluating tests within one integrated environment.

  17. Applying spatial reasoning to topographical data with a grounded geographical ontology

    OpenAIRE

    Mallenby, D.; Bennett, B.

    2007-01-01

    Grounding an ontology upon geographical data has been proposed as a method of handling the vagueness in the domain more effectively. In order to do this, we require methods of reasoning about the spatial relations between the regions within the data. This stage can be computationally expensive, as we require information on the location of points in relation to each other. This paper illustrates how using knowledge about regions allows us to reduce the computation required in an effici...

  18. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…

  19. Instructional Advice, Time Advice and Learning Questions in Computer Simulations

    Science.gov (United States)

    Rey, Gunter Daniel

    2010-01-01

    Undergraduate students (N = 97) used an introductory text and a computer simulation to learn fundamental concepts about statistical analyses (e.g., analysis of variance, regression analysis and General Linear Model). Each learner was randomly assigned to one cell of a 2 (with or without instructional advice) x 2 (with or without time advice) x 2…

  20. How can we study reasoning in the brain?

    Science.gov (United States)

    Papo, David

    2015-01-01

    The brain did not develop a dedicated device for reasoning. This fact bears dramatic consequences. While for perceptuo-motor functions neural activity is shaped by the input's statistical properties, and processing is carried out at high speed in hardwired spatially segregated modules, in reasoning, neural activity is driven by internal dynamics and processing times, stages, and functional brain geometry are largely unconstrained a priori. Here, it is shown that the complex properties of spontaneous activity, which can be ignored in a short-lived event-related world, become prominent at the long time scales of certain forms of reasoning. It is argued that the neural correlates of reasoning should in fact be defined in terms of non-trivial generic properties of spontaneous brain activity, and that this implies resorting to concepts, analytical tools, and ways of designing experiments that are as yet non-standard in cognitive neuroscience. The implications in terms of models of brain activity, shape of the neural correlates, methods of data analysis, observability of the phenomenon, and experimental designs are discussed.

  1. Alternative majority-voting methods for real-time computing systems

    Science.gov (United States)

    Shin, Kang G.; Dolter, James W.

    1989-01-01

    Two techniques that provide a compromise between the high time overhead in maintaining synchronous voting and the difficulty of combining results in asynchronous voting are proposed. These techniques are specifically suited for real-time applications with a single-source/single-sink structure that need instantaneous error masking. They provide a compromise between a tightly synchronized system in which the synchronization overhead can be quite high, and an asynchronous system which lacks suitable algorithms for combining the output data. Both quorum-majority voting (QMV) and compare-majority voting (CMV) are most applicable to distributed real-time systems with single-source/single-sink tasks. All real-time systems eventually have to resolve their outputs into a single action at some stage. The development of the advanced information processing system (AIPS) and other similar systems serve to emphasize the importance of these techniques. Time bounds suggest that it is possible to reduce the overhead for quorum-majority voting to below that for synchronous voting. All the bounds assume that the computation phase is nonpreemptive and that there is no multitasking.

  2. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    Energy Technology Data Exchange (ETDEWEB)

    Lent, Wineke A.M. van, E-mail: w.v.lent@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands); Deetman, Joost W., E-mail: j.deetman@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Teertstra, H. Jelle, E-mail: h.teertstra@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Muller, Sara H., E-mail: s.muller@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); Hans, Erwin W., E-mail: e.w.hans@utwente.nl [University of Twente, School of Management and Governance, Dept. of Industrial Engineering and Business Intelligence Systems, Enschede (Netherlands); Harten, Wim H. van, E-mail: w.v.harten@nki.nl [Netherlands Cancer Institute - Antoni van Leeuwenhoek Hospital (NKI-AVL), P.O. Box 90203, 1006 BE Amsterdam (Netherlands); University of Twente, IGS Institute for Innovation and Governance Studies, Department of Health Technology Services Research (HTSR), Enschede (Netherlands)

    2012-11-15

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation on CT access time, overtime and idle time; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44% while the utilization remained equal at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  3. Reducing the throughput time of the diagnostic track involving CT scanning with computer simulation

    International Nuclear Information System (INIS)

    Lent, Wineke A.M. van; Deetman, Joost W.; Teertstra, H. Jelle; Muller, Sara H.; Hans, Erwin W.; Harten, Wim H. van

    2012-01-01

    Introduction: To examine the use of computer simulation to reduce the time between the CT request and the consult in which the CT report is discussed (diagnostic track) while restricting idle time and overtime. Methods: After a pre-implementation analysis in our case-study hospital, three scenarios were evaluated by computer simulation on CT access time, overtime and idle time; after implementation these same aspects were evaluated again. Effects on throughput time were measured for outpatient short-term and urgent requests only. Conclusion: The pre-implementation analysis showed an average CT access time of 9.8 operating days and an average diagnostic track of 14.5 operating days. Based on the outcomes of the simulation, management changed the capacity for the different patient groups to facilitate a diagnostic track of 10 operating days, with a CT access time of 7 days. After the implementation of changes, the average diagnostic track duration was 12.6 days with an average CT access time of 7.3 days. The fraction of patients with a total throughput time within 10 days increased from 29% to 44% while the utilization remained equal at 82%, the idle time increased by 11% and the overtime decreased by 82%. The fraction of patients that completed the diagnostic track within 10 days improved by 52%. Computer simulation proved useful for studying the effects of proposed scenarios in radiology management. Besides the tangible effects, the simulation increased the awareness that optimizing capacity allocation can reduce access times.

  4. Modeling of requirement specification for safety critical real time computer system using formal mathematical specifications

    International Nuclear Information System (INIS)

    Sankar, Bindu; Sasidhar Rao, B.; Ilango Sambasivam, S.; Swaminathan, P.

    2002-01-01

    Full text: Real time computer systems are increasingly used for safety critical supervision and control of nuclear reactors. Typical application areas are supervision of reactor core against coolant flow blockage, supervision of clad hot spot, supervision of undesirable power excursion, power control and control logic for fuel handling systems. The most frequent cause of fault in safety critical real time computer system is traced to fuzziness in requirement specification. To ensure the specified safety, it is necessary to model the requirement specification of safety critical real time computer systems using formal mathematical methods. Modeling eliminates the fuzziness in the requirement specification and also helps to prepare the verification and validation schemes. Test data can be easily designed from the model of the requirement specification. Z and B are the popular languages used for modeling the requirement specification. A typical safety critical real time computer system for supervising the reactor core of prototype fast breeder reactor (PFBR) against flow blockage is taken as case study. Modeling techniques and the actual model are explained in detail. The advantages of modeling for ensuring the safety are summarized

  5. Expert system reasoning techniques applicable to the electric power industry

    International Nuclear Information System (INIS)

    Touchton, R.A.

    1987-01-01

    This paper describes the applicability of three problem solving paradigms adopted from the artificial intelligence discipline of computer sciences, which have been used in developing nuclear plant expert systems. Each technique is briefly defined and an example is presented that shows how that technique was used in developing an expert system application prototype. The three paradigms and their associated example systems are: (1) rule-based reasoning: reactor emergency action level monitor (REALM) for the Electric Power Research Institute, (2) object-oriented programming: accident diagnosis and prognosis aid for the US Department of Energy, and (3) model-based reasoning: knowledge-based monitoring and control system for the Electric Power Research Institute

  6. Analytic and heuristic processing influences on adolescent reasoning and decision-making.

    Science.gov (United States)

    Klaczynski, P A

    2001-01-01

    The normative/descriptive gap is the discrepancy between actual reasoning and traditional standards for reasoning. The relationship between age and the normative/descriptive gap was examined by presenting adolescents with a battery of reasoning and decision-making tasks. Middle adolescents (N = 76) performed closer to normative ideals than early adolescents (N = 66), although the normative/descriptive gap was large for both groups. Correlational analyses revealed that (1) normative responses correlated positively with each other, (2) nonnormative responses were positively interrelated, and (3) normative and nonnormative responses were largely independent. Factor analyses suggested that performance was based on two processing systems. The "analytic" system operates on "decontextualized" task representations and underlies conscious, computational reasoning. The "heuristic" system operates on "contextualized," content-laden representations and produces "cognitively cheap" responses that sometimes conflict with traditional norms. Analytic processing was more clearly linked to age and to intelligence than heuristic processing. Implications for cognitive development, the competence/performance issue, and rationality are discussed.

  7. Seismic response of three-dimensional topographies using a time-domain boundary element method

    Science.gov (United States)

    Janod, François; Coutant, Olivier

    2000-08-01

    We present a time-domain implementation for a boundary element method (BEM) to compute the diffraction of seismic waves by 3-D topographies overlying a homogeneous half-space. This implementation is chosen to overcome the memory limitations arising when solving the boundary conditions with a frequency-domain approach. This formulation is flexible because it allows one to make an adaptive use of the Green's function time translation properties: the boundary conditions solving scheme can be chosen as a trade-off between memory and CPU requirements. We explore here an explicit method of solution that requires little memory but a high CPU cost in order to run on a workstation computer. We obtain good results with four points per minimum wavelength discretization for various topographies and plane wave excitations. This implementation can be used for two different aims: the time-domain approach allows an easier implementation of the BEM in hybrid methods (e.g. coupling with finite differences), and it also allows one to run simple BEM models with reasonable computer requirements. In order to keep reasonable computation times, we do not introduce any interface and we only consider homogeneous models. Results are shown for different configurations: an explosion near a flat free surface, a plane wave vertically incident on a Gaussian hill and on a hemispherical cavity, and an explosion point below the surface of a Gaussian hill. Comparison is made with other numerical methods, such as finite difference methods (FDMs) and spectral elements.

  8. Computer Profiling Based Model for Investigation

    OpenAIRE

    Neeraj Choudhary; Nikhil Kumar Singh; Parmalik Singh

    2011-01-01

    Computer profiling is used for computer forensic analysis; this paper proposes and elaborates on a novel model for use in computer profiling, the computer profiling object model. The computer profiling object model is an information model which models a computer as objects with various attributes and inter-relationships. These together provide the information necessary for a human investigator or an automated reasoning engine to make judgments as to the probable usage and evidentiary value of a comp...

  9. Image reconstruction of computed tomograms using functional algebra

    International Nuclear Information System (INIS)

    Bradaczek, M.; Bradaczek, H.

    1997-01-01

    A detailed presentation of the process for calculating computed tomograms from the measured data by means of functional algebra is given and an attempt is made to demonstrate the relationships to those inexperienced in mathematics. Suggestions are also made to the manufacturers for improving tomography software although the authors cannot exclude the possibility that some of the recommendations may have already been realized. An interpolation in Fourier space to right-angled coordinates was not employed so that additional computer time and errors resulting from the interpolation are avoided. The savings in calculation time can only be estimated but should amount to about 25%. The error-correction calculation is merely a suggestion since it depends considerably on the apparatus used. Functional algebra is introduced here because it is not so well known but does provide appreciable simplifications in comparison to an explicit presentation. Didactic reasons as well as the possibility for reducing calculation time provided the foundation for this work. (orig.) [de]

  10. Computing the universe: how large-scale simulations illuminate galaxies and dark energy

    Science.gov (United States)

    O'Shea, Brian

    2015-04-01

    High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defies analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.

  11. The relationship between TV/computer time and adolescents' health-promoting behavior: a secondary data analysis.

    Science.gov (United States)

    Chen, Mei-Yen; Liou, Yiing-Mei; Wu, Jen-Yee

    2008-03-01

    Television and computers provide significant benefits for learning about the world. Some studies have linked excessive television (TV) watching or computer game playing to poorer health status or unhealthy behaviors among adolescents. However, research on the relationship between watching TV or playing computer games and adolescents' adoption of health-promoting behavior is limited. This study aimed to discover the relationship between time spent watching TV and on leisure use of computers and adolescents' health-promoting behavior, and associated factors. This paper used secondary data analysis from part of a health promotion project in Taoyuan County, Taiwan. A cross-sectional design was used and purposive sampling was conducted among adolescents in the original project. A total of 660 participants answered the questions appropriately for this work between January and June 2004. Findings showed the mean age of the respondents was 15.0 +/- 1.7 years. The mean numbers of TV watching hours were 2.28 and 4.07 on weekdays and weekends, respectively. The mean hours of leisure (non-academic) computer use were 1.64 and 3.38 on weekdays and weekends, respectively. Results indicated that adolescents spent significant time watching TV and using the computer, which was negatively associated with adopting health-promoting behaviors such as life appreciation, health responsibility, social support and exercise behavior. Moreover, being boys, being overweight, living in a rural area, and being middle-school students were significantly associated with spending long periods watching TV and using the computer. Therefore, primary health care providers should record the TV and non-academic computer time of youths when conducting health promotion programs, and educate parents on how to become good and healthy electronic media users.

  12. An image-based approach to the rendering of crowds in real-time

    OpenAIRE

    Tecchia, Franco

    2007-01-01

    The wide use of computer graphics in games, entertainment, medical, architectural and cultural applications has led to it becoming a prevalent area of research. Games and entertainment in general have become one of the driving forces of the real-time computer graphics industry, bringing reasonably realistic, complex and appealing virtual worlds to the mass market. At the current stage of technology, a user can interactively navigate through complex, polygon-based scenes rendered with sophis...

  13. Joint Time-Frequency-Space Classification of EEG in a Brain-Computer Interface Application

    Directory of Open Access Journals (Sweden)

    Molina Gary N Garcia

    2003-01-01

    Full Text Available Brain-computer interface is a growing field of interest in human-computer interaction with diverse applications ranging from medicine to entertainment. In this paper, we present a system which allows for classification of mental tasks based on a joint time-frequency-space decorrelation, in which mental tasks are measured via electroencephalogram (EEG) signals. The efficiency of this approach was evaluated by means of real-time experiments on two subjects performing three different mental tasks. To do so, a number of protocols for visualization, as well as training with and without feedback, were also developed. Obtained results show that it is possible to obtain good classification of simple mental tasks, in view of command and control, after a relatively small amount of training, with accuracies around 80%, and in real time.

  14. Computational model for real-time determination of tritium inventory in a detritiation installation

    International Nuclear Information System (INIS)

    Bornea, Anisia; Stefanescu, Ioan; Zamfirache, Marius; Stefan, Iuliana; Sofalca, Nicolae; Bidica, Nicolae

    2008-01-01

    Full text: At ICIT Rm. Valcea an experimental pilot plant was built with the main objective of developing a technology for the detritiation of heavy water processed in the CANDU-type reactors of the nuclear power plant at Cernavoda, Romania. As the aspects related to safeguards and safety for such a detritiation installation are of great importance, a complex computational model has been developed. The model allows real-time calculation of the tritium inventory in a working installation. The detritiation technology applied is catalyzed isotopic exchange coupled with cryogenic distillation. Computational models for non-steady working conditions have been developed for each isotopic exchange process. By coupling these processes, the tritium inventory can be determined in real time. The computational model was developed based on the experience gained with the pilot installation. The model uses a set of parameters specific to the isotopic exchange processes. These parameters were experimentally determined in the pilot installation. The model is included in the monitoring system and uses as input data the parameters acquired in real time from the automation system of the pilot installation. A friendly interface has been created to visualize the final results as data or graphs. (authors)

  15. Modeling visual problem solving as analogical reasoning.

    Science.gov (United States)

    Lovett, Andrew; Forbus, Kenneth

    2017-01-01

    We present a computational model of visual problem solving, designed to solve problems from the Raven's Progressive Matrices intelligence test. The model builds on the claim that analogical reasoning lies at the heart of visual problem solving, and intelligence more broadly. Images are compared via structure mapping, aligning the common relational structure in 2 images to identify commonalities and differences. These commonalities or differences can themselves be reified and used as the input for future comparisons. When images fail to align, the model dynamically rerepresents them to facilitate the comparison. In our analysis, we find that the model matches adult human performance on the Standard Progressive Matrices test, and that problems which are difficult for the model are also difficult for people. Furthermore, we show that model operations involving abstraction and rerepresentation are particularly difficult for people, suggesting that these operations may be critical for performing visual problem solving, and reasoning more generally, at the highest level. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  16. Context-dependent Reasoning for the Semantic Web

    Directory of Open Access Journals (Sweden)

    Neli P. Zlatareva

    2011-08-01

    Full Text Available Ontologies are the backbone of the emerging Semantic Web, which is envisioned to dramatically improve current web services by extending them with intelligent capabilities such as reasoning and context-awareness. They define a shared vocabulary of common domains accessible to both humans and computers, and support various types of information management including storage and processing of data. Current ontology languages, which are designed to be decidable to allow for automatic data processing, target simple typed ontologies that are completely and consistently specified. As the size of ontologies and the complexity of web applications grow, the need for more flexible representation and reasoning schemes emerges. This article presents a logical framework utilizing context-dependent rules which are intended to support ontologies that are not fully and/or precisely specified. A hypothetical application scenario is described to illustrate the type of ontologies targeted and the type of queries that the presented logical framework is intended to address.

  17. A real-time computer simulation of nuclear simulator software using standard PC hardware and linux environments

    International Nuclear Information System (INIS)

    Cha, K. H.; Kweon, K. C.

    2001-01-01

    A feasibility study, in which standard PC hardware and Real-Time Linux are applied to the real-time computer simulation of software for a nuclear simulator, is presented in this paper. The feasibility prototype was established with the existing software in the Compact Nuclear Simulator (CNS). Throughout the real-time implementation in the feasibility prototype, we have identified that the approach can enable computer-based predictive simulation, due to both the remarkable improvement in real-time performance and the reduced effort for real-time implementation under standard PC hardware and Real-Time Linux environments.

  18. Reasoning by analogy as an aid to heuristic theorem proving.

    Science.gov (United States)

    Kling, R. E.

    1972-01-01

    When heuristic problem-solving programs are faced with large data bases that contain numbers of facts far in excess of those needed to solve any particular problem, their performance rapidly deteriorates. In this paper, the correspondence between a new unsolved problem and a previously solved analogous problem is computed and invoked to tailor large data bases to manageable sizes. This paper outlines the design of an algorithm for generating and exploiting analogies between theorems posed to a resolution-logic system. These algorithms are believed to be the first computationally feasible development of reasoning by analogy to be applied to heuristic theorem proving.

  19. Software Accelerates Computing Time for Complex Math

    Science.gov (United States)

    2014-01-01

    Ames Research Center awarded Newark, Delaware-based EM Photonics Inc. SBIR funding to utilize graphics processing unit (GPU) technology (traditionally used for computer video games) to develop high-performance computing software called CULA. The software gives users the ability to run complex algorithms on personal computers with greater speed. As a result of the NASA collaboration, the number of employees at the company has increased 10 percent.

  20. Computational imaging with multi-camera time-of-flight systems

    KAUST Repository

    Shrestha, Shikhar

    2016-07-11

    Depth cameras are a ubiquitous technology used in a wide range of applications, including robotic and machine vision, human computer interaction, autonomous vehicles as well as augmented and virtual reality. In this paper, we explore the design and applications of phased multi-camera time-of-flight (ToF) systems. We develop a reproducible hardware system that allows for the exposure times and waveforms of up to three cameras to be synchronized. Using this system, we analyze waveform interference between multiple light sources in ToF applications and propose simple solutions to this problem. Building on the concept of orthogonal frequency design, we demonstrate state-of-the-art results for instantaneous radial velocity capture via Doppler time-of-flight imaging and we explore new directions for optically probing global illumination, for example by de-scattering dynamic scenes and by non-line-of-sight motion detection via frequency gating. © 2016 ACM.

  1. BigSR: an empirical study of real-time expressive RDF stream reasoning on modern Big Data platforms

    OpenAIRE

    Ren, Xiangnan; Curé, Olivier; Naacke, Hubert; Xiao, Guohui

    2018-01-01

    The trade-off between language expressiveness and system scalability (E&S) is a well-known problem in RDF stream reasoning. Higher expressiveness supports more complex reasoning logic, however, it may also hinder system scalability. Current research mainly focuses on logical frameworks suitable for stream reasoning as well as the implementation and the evaluation of prototype systems. These systems are normally developed in a centralized setting which suffer from inherent limited scalability,...

  2. Emotional reasoning and parent-based reasoning in normal children.

    OpenAIRE

    Morren, M.; Muris, P.; Kindt, M.

    2004-01-01

    A previous study by Muris, Merckelbach, and Van Spauwen demonstrated that children display emotional reasoning irrespective of their anxiety levels. That is, when estimating whether a situation is dangerous, children rely not only on objective danger information but also on their own anxiety response. The present study further examined emotional reasoning in children aged 7-13 years (N=508). In addition, it was investigated whether children also show parent-based reasoning, which can be defined...

  3. A neural computational model for animal's time-to-collision estimation.

    Science.gov (United States)

    Wang, Ling; Yao, Dezhong

    2013-04-17

    The time-to-collision (TTC) is the time elapsed before a looming object hits the subject. An accurate estimation of TTC plays a critical role in the survival of animals in nature and acts as an important factor in artificial intelligence systems that depend on judging and avoiding potential dangers. The theoretic formula for TTC is 1/τ≈θ'/sin θ, where θ and θ' are the visual angle and its variation, respectively, and the widely used approximation computational model is θ'/θ. However, both of these measures are too complex to be implemented by a biological neuronal model. We propose a new simple computational model: 1/τ≈Mθ-P/(θ+Q)+N, where M, P, Q, and N are constants that depend on a predefined visual angle. This model, weighted summation of visual angle model (WSVAM), can achieve perfect implementation through a widely accepted biological neuronal model. WSVAM has additional merits, including a natural minimum consumption and simplicity. Thus, it yields a precise and neuronal-implemented estimation for TTC, which provides a simple and convenient implementation for artificial vision, and represents a potential visual brain mechanism.
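
    The classical approximation the abstract refers to is easy to check numerically: for an object of width w approaching at speed v, the ratio θ/θ' tracks the true TTC d/v closely while the visual angle stays small. The scene parameters below are made up for the demonstration and are not from the paper.

        import numpy as np

        w, v, d0, dt = 0.5, 10.0, 50.0, 0.01  # width (m), speed (m/s), range (m)

        def visual_angle(d):
            return 2 * np.arctan(w / (2 * d))

        for t in (0.0, 1.0, 2.0, 3.0):
            d = d0 - v * t
            theta = visual_angle(d)
            theta_dot = (visual_angle(d - v * dt) - theta) / dt  # finite diff.
            print(f"t={t}: true TTC={d / v:.2f}s, "
                  f"theta/theta'={theta / theta_dot:.2f}s")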

  4. Future computing needs for Fermilab

    International Nuclear Information System (INIS)

    1983-12-01

    The following recommendations are made: (1) Significant additional computing capacity and capability beyond the present procurement should be provided by 1986. A working group with representation from the principal computer user community should be formed to begin immediately to develop the technical specifications. High priority should be assigned to providing a large user memory, software portability and a productive computing environment. (2) A networked system of VAX-equivalent super-minicomputers should be established with at least one such computer dedicated to each reasonably large experiment for both online and offline analysis. The laboratory staff responsible for minicomputers should be augmented in order to handle the additional work of establishing, maintaining and coordinating this system. (3) The laboratory should move decisively to a more fully interactive environment. (4) A plan for networking both inside and outside the laboratory should be developed over the next year. (5) The laboratory resources devoted to computing, including manpower, should be increased over the next two to five years. A reasonable increase would be 50% over the next two years, increasing thereafter to a level of about twice the present one. (6) A standing computer coordinating group, with membership of experts from all the principal computer user constituents of the laboratory, should be appointed by and report to the director. This group should meet on a regularly scheduled basis and be charged with continually reviewing all aspects of the laboratory computing environment.

  5. Computing Whether She Belongs: Stereotypes Undermine Girls' Interest and Sense of Belonging in Computer Science

    Science.gov (United States)

    Master, Allison; Cheryan, Sapna; Meltzoff, Andrew N.

    2016-01-01

    Computer science has one of the largest gender disparities in science, technology, engineering, and mathematics. An important reason for this disparity is that girls are less likely than boys to enroll in necessary "pipeline courses," such as introductory computer science. Two experiments investigated whether high-school girls' lower…

  6. A general framework for reasoning on inconsistency

    CERN Document Server

    Martinez, Maria Vanina; Subrahmanian, VS; Amgoud, Leila

    2013-01-01

    This SpringerBrief proposes a general framework for reasoning about inconsistency in a wide variety of logics, including inconsistency resolution methods that have not yet been studied.  The proposed framework allows users to specify preferences on how to resolve inconsistency when there are multiple ways to do so. This empowers users to resolve inconsistency in data leveraging both their detailed knowledge of the data as well as their application needs. The brief shows that the framework is well-suited to handle inconsistency in several logics, and provides algorithms to compute preferred opt

  7. Symbolic reasoning about myocardial scintigrams in PROLOG

    International Nuclear Information System (INIS)

    Rosenberg, S.; Itti, R.; Benjelloun, L.

    1986-01-01

    PROLOG (PROgramming in LOGic) is the declarative programming language at the heart of the Japanese fifth-generation computer project. It is proposed that PROLOG is a suitable tool for symbolic image processing, once standard preprocessing has been done. In the present application, the problem of prediction of coronary anatomy from myocardial scintigrams is addressed. Uncertainty is dealt with by a combination of fuzzy-set theoretic and probabilistic reasoning. Heuristic classification rules are based on clinical experience and on a set of 247 myocardial scintigrams with their corresponding coronary angiograms. (orig.)

  8. Symbolic reasoning about myocardial scintigrams in PROLOG

    Energy Technology Data Exchange (ETDEWEB)

    Rosenberg, S; Itti, R; Benjelloun, L

    1986-06-01

    PROLOG (PROgramming in LOGic) is the declarative programming language at the heart of the Japanese fifth-generation computer project. It is proposed that PROLOG is a suitable tool for symbolic image processing, once standard preprocessing has been done. In the present application, the problem of prediction of coronary anatomy from myocardial scintigrams is addressed. Uncertainty is dealt with by a combination of fuzzy-set theoretic and probabilistic reasoning. Heuristic classification rules are based on clinical experience and on a set of 247 myocardial scintigrams with their corresponding coronary angiograms.

  9. A review of metaheuristic scheduling techniques in cloud computing

    Directory of Open Access Journals (Sweden)

    Mala Kalra

    2015-11-01

    Full Text Available Cloud computing has become a buzzword in the area of high-performance distributed computing as it provides on-demand access to a shared pool of resources over the Internet in a self-service, dynamically scalable and metered manner. Cloud computing is still in its infancy, so to reap its full benefits, much research is required across a broad array of topics. One of the important research issues which needs to be focused on for efficient performance is scheduling. The goal of scheduling is to map tasks to appropriate resources in a way that optimizes one or more objectives. Scheduling in cloud computing belongs to a category of problems known as NP-hard due to the large solution space, and thus it takes a long time to find an optimal solution. There are no algorithms which can produce an optimal solution within polynomial time for these problems. In a cloud environment, it is preferable to find a suboptimal solution in a short period of time. Metaheuristic-based techniques have been proved to achieve near-optimal solutions within reasonable time for such problems. In this paper, we provide an extensive survey and comparative analysis of various scheduling algorithms for cloud and grid environments based on three popular metaheuristic techniques: Ant Colony Optimization (ACO), Genetic Algorithm (GA) and Particle Swarm Optimization (PSO), and two novel techniques: League Championship Algorithm (LCA) and the BAT algorithm.
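
    For readers unfamiliar with how a metaheuristic attacks this NP-hard mapping problem, the sketch below applies a minimal genetic algorithm to assign tasks to virtual machines so as to minimize makespan. The task lengths, VM speeds, and GA parameters are illustrative assumptions, not values from the survey.

        import random

        random.seed(1)
        tasks = [random.randint(10, 100) for _ in range(30)]   # task lengths (MI)
        vm_speed = [10, 20, 40]                                # VM speeds (MIPS)

        def makespan(assign):
            # Finish time of the busiest VM under a task-to-VM assignment.
            load = [0.0] * len(vm_speed)
            for t, v in zip(tasks, assign):
                load[v] += t / vm_speed[v]
            return max(load)

        def evolve(pop_size=50, generations=200, p_mut=0.05):
            pop = [[random.randrange(len(vm_speed)) for _ in tasks]
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=makespan)
                elite = pop[: pop_size // 2]           # keep the better half
                children = []
                while len(children) < pop_size - len(elite):
                    a, b = random.sample(elite, 2)
                    cut = random.randrange(len(tasks))
                    child = a[:cut] + b[cut:]          # one-point crossover
                    child = [random.randrange(len(vm_speed))
                             if random.random() < p_mut else g
                             for g in child]           # per-gene mutation
                    children.append(child)
                pop = elite + children
            return min(pop, key=makespan)

        best = evolve()
        print("best makespan:", round(makespan(best), 2))

    A PSO or ACO variant differs mainly in how candidate mappings are perturbed and recombined, not in the overall generate-evaluate-select loop.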

  10. Variable dead time counters: 2. A computer simulation

    International Nuclear Information System (INIS)

    Hooton, B.W.; Lees, E.W.

    1980-09-01

    A computer model has been developed to give a pulse train which simulates that generated by a variable dead time counter (VDC) used in safeguards determination of Pu mass. The model is applied to two algorithms generally used for VDC analysis. It is used to determine their limitations at high counting rates and to investigate the effects of random neutrons from (α,n) reactions. Both algorithms are found to be deficient for use with masses of 240Pu greater than 100 g, and one commonly used algorithm is shown, by use of the model and also by theory, to yield a result which is dependent on the random neutron intensity. (author)

  11. Philosophy of computing and information technology

    NARCIS (Netherlands)

    Brey, Philip A.E.; Soraker, Johnny; Meijers, A.

    2009-01-01

    Philosophy has been described as having taken a “computational turn,” referring to the ways in which computers and information technology throw new light upon traditional philosophical issues, provide new tools and concepts for philosophical reasoning, and pose theoretical and practical questions

  12. An approach to the verification of a fault-tolerant, computer-based reactor safety system: A case study using automated reasoning: Volume 1: Interim report

    International Nuclear Information System (INIS)

    Chisholm, G.H.; Kljaich, J.; Smith, B.T.; Wojcik, A.S.

    1987-01-01

    The purpose of this project is to explore the feasibility of automating the verification process for computer systems. The intent is to demonstrate that both the software and hardware that comprise the system meet specified availability and reliability criteria, that is, total design analysis. The approach to automation is based upon the use of Automated Reasoning Software developed at Argonne National Laboratory. This approach is herein referred to as formal analysis and is based on previous work on the formal verification of digital hardware designs. Formal analysis represents a rigorous evaluation which is appropriate for system acceptance in critical applications, such as a Reactor Safety System (RSS). This report describes a formal analysis technique in the context of a case study, that is, it demonstrates the feasibility of applying formal analysis via application. The case study described is based on the Reactor Safety System (RSS) for the Experimental Breeder Reactor-II (EBR-II). This is a system where high reliability and availability are paramount to safety. The conceptual design for this case study incorporates a Fault-Tolerant Processor (FTP) for the computer environment. An FTP is a computer which has the ability to produce correct results even in the presence of any single fault. This technology was selected as it provides a computer-based equivalent to the traditional analog-based RSSs. This provides a more conservative design constraint than that imposed by the IEEE Standard, Criteria For Protection Systems For Nuclear Power Generating Stations (ANSI N42.7-1972).

  13. Symbolic Processing Combined with Model-Based Reasoning

    Science.gov (United States)

    James, Mark

    2009-01-01

    A computer program for the detection of present and prediction of future discrete states of a complex, real-time engineering system utilizes a combination of symbolic processing and numerical model-based reasoning. One of the biggest weaknesses of a purely symbolic approach is that it enables prediction of only future discrete states while missing all unmodeled states or leading to incorrect identification of an unmodeled state as a modeled one. A purely numerical approach is based on a combination of statistical methods and mathematical models of the applicable physics and necessitates development of a complete model to the level of fidelity required for prediction. In addition, a purely numerical approach does not afford the ability to qualify its results without some form of symbolic processing. The present software implements numerical algorithms to detect unmodeled events and symbolic algorithms to predict expected behavior, correlate the expected behavior with the unmodeled events, and interpret the results in order to predict future discrete states. The approach embodied in this software differs from that of the BEAM methodology (aspects of which have been discussed in several prior NASA Tech Briefs articles), which provides for prediction of future measurements in the continuous-data domain.

  14. Operational Circular nr 5 - October 2000 USE OF CERN COMPUTING FACILITIES

    CERN Multimedia

    Division HR

    2000-01-01

    New rules covering the use of CERN computing facilities have been drawn up. All users of CERN’s computing facilities are subject to these rules, as well as to the subsidiary rules of use. The Computing Rules explicitly address your responsibility for taking reasonable precautions to protect computing equipment and accounts. In particular, passwords must not be easily guessed or obtained by others. Given the difficulty of completely separating work and personal use of computing facilities, the rules define under which conditions limited personal use is tolerated. For example, limited personal use of e-mail, news groups or web browsing is tolerated in your private time, provided CERN resources and your official duties are not adversely affected. The full conditions governing use of CERN’s computing facilities are contained in Operational Circular N° 5, which you are requested to read. Full details are available at : http://www.cern.ch/ComputingRules Copies of the circular are also available in the Divis...

  15. Achieving high performance in numerical computations on RISC workstations and parallel systems

    Energy Technology Data Exchange (ETDEWEB)

    Goedecker, S. [Max-Planck Inst. for Solid State Research, Stuttgart (Germany); Hoisie, A. [Los Alamos National Lab., NM (United States)

    1997-08-20

    The nominal peak speeds of both serial and parallel computers are rising rapidly. At the same time, however, it is becoming increasingly difficult to extract a significant fraction of this high peak speed from modern computer architectures. In this tutorial the authors give the scientists and engineers involved in numerically demanding calculations and simulations the necessary basic knowledge to write reasonably efficient programs. The basic principles are rather simple and the possible rewards large. Writing a program by taking into account optimization techniques related to the computer architecture can significantly speed up your program, often by factors of 10--100. As such, optimizing a program can, for instance, be a much better solution than buying a faster computer. If a few basic optimization principles are applied during program development, the additional time needed for obtaining an efficient program is practically negligible. In-depth optimization is usually only needed for a few subroutines or kernels and the effort involved is therefore also acceptable.
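
    As a minimal illustration of the kind of architecture-aware rewrite the tutorial advocates (the example is ours, not the authors'), compare an interpreted element-wise loop with a single vectorized library call, which exploits SIMD units and the cache hierarchy:

        import time
        import numpy as np

        n = 5_000_000
        x = np.random.rand(n)
        y = np.random.rand(n)

        # Naive element-wise loop: heavy interpreter overhead, poor locality.
        t0 = time.perf_counter()
        s = 0.0
        for i in range(n):
            s += x[i] * y[i]
        t1 = time.perf_counter()

        # Vectorized dot product: one optimized, cache-friendly library call.
        t2 = time.perf_counter()
        s_vec = float(x @ y)
        t3 = time.perf_counter()

        print(f"loop {t1 - t0:.2f}s  vectorized {t3 - t2:.4f}s  "
              f"speedup {(t1 - t0) / (t3 - t2):.0f}x")

    The same principle, letting optimized kernels stream contiguous data through the memory hierarchy, carries over directly to compiled languages.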

  16. Programming Unconventional Computers: Dynamics, Development, Self-Reference

    Directory of Open Access Journals (Sweden)

    Susan Stepney

    2012-10-01

    Full Text Available Classical computing has well-established formalisms for specifying, refining, composing, proving, and otherwise reasoning about computations. These formalisms have matured over the past 70 years or so. Unconventional Computing includes the use of novel kinds of substrates–from black holes and quantum effects, through to chemicals, biomolecules, even slime moulds–to perform computations that do not conform to the classical model. Although many of these unconventional substrates can be coerced into performing classical computation, this is not how they “naturally” compute. Our ability to exploit unconventional computing is partly hampered by a lack of corresponding programming formalisms: we need models for building, composing, and reasoning about programs that execute in these substrates. What might, say, a slime mould programming language look like? Here I outline some of the issues and properties of these unconventional substrates that need to be addressed to find “natural” approaches to programming them. Important concepts include embodied real values, processes and dynamical systems, generative systems and their meta-dynamics, and embodied self-reference.

  17. Pertinent reasoning

    CSIR Research Space (South Africa)

    Britz, K

    2010-05-01

    Full Text Available In this paper the authors venture beyond one of the fundamental assumptions in the non-monotonic reasoning community, namely that non-monotonic entailment is supra-classical. They investigate reasoning which uses an infra-classical entailment...

  18. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure.

    Science.gov (United States)

    Wang, Henry; Ma, Yunzhi; Pratx, Guillem; Xing, Lei

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47× speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed.
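
    The structure of the described workflow, a master distributing batches of particle histories to independent workers and aggregating their partial results, can be sketched in a few lines. The sketch below uses Python's multiprocessing as a stand-in for the remote virtual cluster and MPI, and a toy scoring function in place of the EGS5 binaries:

        import multiprocessing as mp
        import random

        def simulate_batch(args):
            # Stand-in for a Monte Carlo worker: "transports" a batch of
            # particle histories and scores a toy deposited energy per history.
            n_histories, seed = args
            rng = random.Random(seed)
            return sum(rng.expovariate(1.0) for _ in range(n_histories))

        if __name__ == "__main__":
            total, workers = 1_000_000, 8
            batches = [(total // workers, seed) for seed in range(workers)]
            with mp.Pool(workers) as pool:               # master distributes batches
                partial = pool.map(simulate_batch, batches)
            print("total scored energy:", sum(partial))  # master aggregates results

    Because the histories are independent, run time scales roughly inversely with the number of workers, which is the behavior the authors report for their 100-node cloud cluster.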

  19. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    International Nuclear Information System (INIS)

    Wang, Henry; Ma Yunzhi; Pratx, Guillem; Xing Lei

    2011-01-01

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  20. Toward real-time Monte Carlo simulation using a commercial cloud computing infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Henry [Department of Electrical Engineering, Stanford University, Stanford, CA 94305 (United States); Ma Yunzhi; Pratx, Guillem; Xing Lei, E-mail: hwang41@stanford.edu [Department of Radiation Oncology, Stanford University School of Medicine, Stanford, CA 94305-5847 (United States)

    2011-09-07

    Monte Carlo (MC) methods are the gold standard for modeling photon and electron transport in a heterogeneous medium; however, their computational cost prohibits their routine use in the clinic. Cloud computing, wherein computing resources are allocated on-demand from a third party, is a new approach for high performance computing and is implemented to perform ultra-fast MC calculation in radiation therapy. We deployed the EGS5 MC package in a commercial cloud environment. Launched from a single local computer with Internet access, a Python script allocates a remote virtual cluster. A handshaking protocol designates master and worker nodes. The EGS5 binaries and the simulation data are initially loaded onto the master node. The simulation is then distributed among independent worker nodes via the message passing interface, and the results aggregated on the local computer for display and data analysis. The described approach is evaluated for pencil beams and broad beams of high-energy electrons and photons. The output of cloud-based MC simulation is identical to that produced by single-threaded implementation. For 1 million electrons, a simulation that takes 2.58 h on a local computer can be executed in 3.3 min on the cloud with 100 nodes, a 47x speed-up. Simulation time scales inversely with the number of parallel nodes. The parallelization overhead is also negligible for large simulations. Cloud computing represents one of the most important recent advances in supercomputing technology and provides a promising platform for substantially improved MC simulation. In addition to the significant speed up, cloud computing builds a layer of abstraction for high performance parallel computing, which may change the way dose calculations are performed and radiation treatment plans are completed. (note)

  1. Information systems outsourcing reasons and risks: a new assessment

    OpenAIRE

    González Ramírez, María Reyes; Gascó Gascó, José Luis; Llopis Taverner, Juan

    2010-01-01

    Outsourcing is currently going through a stage of unstoppable growth. This paper makes a proposal about the main reasons which may lead firms to adopt Outsourcing in Information Systems services. It will equally analyse the potential risks that IS clients are likely to face. An additional objective is to assess these reasons and risks in the case of large Spanish firms, while simultaneously examining their evolution over time. This study of outsourcing reasons and risks has been carried out f...

  2. Public Reason Renaturalized

    DEFF Research Database (Denmark)

    Tønder, Lars

    2014-01-01

    This article takes up recent discussions of nature and the sensorium in order to rethink public reason in deeply divided societies. The aim is not to reject the role of reason-giving but rather to infuse it with new meaning, bringing the reasonable back to its sensorially inflected circumstances. The article develops this argument via a sensorial orientation to politics that not only re-frames existing critiques of neo-Kantianism but also includes an alternative, renaturalized conception of public reason, one that allows us to overcome the disconnect between the account we give of reason and the way it is mobilized in a world of deep pluralism. The article concludes with a discussion of how a renaturalized conception of public reason might change the positioning of contemporary democratic theory vis-a-vis the struggle for empowerment and pluralization in an age of neo-liberalism and state-surveillance.

  3. Learning clinical reasoning.

    Science.gov (United States)

    Pinnock, Ralph; Welch, Paul

    2014-04-01

    Errors in clinical reasoning continue to account for significant morbidity and mortality, despite evidence-based guidelines and improved technology. Experts in clinical reasoning often use unconscious cognitive processes that they are not aware of unless they explain how they are thinking. Understanding the intuitive and analytical thinking processes provides a guide for instruction. How knowledge is stored is critical to expertise in clinical reasoning. Curricula should be designed so that trainees store knowledge in a way that is clinically relevant. Competence in clinical reasoning is acquired by supervised practice with effective feedback. Clinicians must recognise the common errors in clinical reasoning and how to avoid them. Trainees can learn clinical reasoning effectively in everyday practice if teachers provide guidance on the cognitive processes involved in making diagnostic decisions. © 2013 The Authors. Journal of Paediatrics and Child Health © 2013 Paediatrics and Child Health Division (Royal Australasian College of Physicians).

  4. Television Viewing, Computer Use, Time Driving and All‐Cause Mortality: The SUN Cohort

    Science.gov (United States)

    Basterra‐Gortari, Francisco Javier; Bes‐Rastrollo, Maira; Gea, Alfredo; Núñez‐Córdoba, Jorge María; Toledo, Estefanía; Martínez‐González, Miguel Ángel

    2014-01-01

    Background Sedentary behaviors have been directly associated with all‐cause mortality. However, little is known about different types of sedentary behaviors in relation to overall mortality. Our objective was to assess the association between different sedentary behaviors and all‐cause mortality. Methods and Results In this prospective, dynamic cohort study (the SUN Project) 13 284 Spanish university graduates with a mean age of 37 years were followed‐up for a median of 8.2 years. Television, computer, and driving time were assessed at baseline. Poisson regression models were fitted to examine the association between each sedentary behavior and total mortality. All‐cause mortality incidence rate ratios (IRRs) per 2 hours per day were 1.40 (95% confidence interval (CI): 1.06 to 1.84) for television viewing, 0.96 (95% CI: 0.79 to 1.18) for computer use, and 1.14 (95% CI: 0.90 to 1.44) for driving, after adjustment for age, sex, smoking status, total energy intake, Mediterranean diet adherence, body mass index, and physical activity. The risk of mortality was twofold higher for participants reporting ≥3 h/day of television viewing than for those reporting <1 h/day. Conclusions Television viewing was directly associated with all‐cause mortality. However, computer use and time spent driving were not significantly associated with higher mortality. Further cohort studies and trials designed to assess whether reductions in television viewing are able to reduce mortality are warranted. The lack of association between computer use or time spent driving and mortality needs further confirmation. PMID:24965030

  5. Towards Cache-Enabled, Order-Aware, Ontology-Based Stream Reasoning Framework

    Energy Technology Data Exchange (ETDEWEB)

    Yan, Rui; Praggastis, Brenda L.; Smith, William P.; McGuinness, Deborah L.

    2016-08-16

    While streaming data have become increasingly more popular in business and research communities, semantic models and processing software for streaming data have not kept pace. Traditional semantic solutions have not addressed transient data streams. Semantic web languages (e.g., RDF, OWL) have typically addressed static data settings and linked data approaches have predominantly addressed static or growing data repositories. Streaming data settings have some fundamental differences; in particular, data are consumed on the fly and data may expire. Stream reasoning, a combination of stream processing and semantic reasoning, has emerged with the vision of providing "smart" processing of streaming data. C-SPARQL is a prominent stream reasoning system that handles semantic (RDF) data streams. Many stream reasoning systems including C-SPARQL use a sliding window and use data arrival time to evict data. For data streams that include expiration times, a simple arrival time scheme is inadequate if the window size does not match the expiration period. In this paper, we propose a cache-enabled, order-aware, ontology-based stream reasoning framework. This framework consumes RDF streams with expiration timestamps assigned by the streaming source. Our framework utilizes both arrival and expiration timestamps in its cache eviction policies. In addition, we introduce the notion of "semantic importance" which aims to address the relevance of data to the expected reasoning, thus enabling the eviction algorithms to be more context- and reasoning-aware when choosing what data to maintain for question answering. We evaluate this framework by implementing three different prototypes and utilizing five metrics. The trade-offs of deploying the proposed framework are also discussed.
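
    One core idea here, evicting windowed triples by expiration timestamp rather than arrival time, can be sketched with a priority queue. This is our simplification, and it omits the paper's "semantic importance" weighting:

        import heapq

        class ExpiringWindow:
            # Toy cache for timestamped RDF triples, evicted by expiration time
            # rather than arrival time.

            def __init__(self):
                self._heap = []          # entries: (expiration_time, triple)

            def insert(self, triple, expires_at):
                heapq.heappush(self._heap, (expires_at, triple))

            def evict_expired(self, now):
                # Pop everything whose expiration timestamp has passed.
                while self._heap and self._heap[0][0] <= now:
                    heapq.heappop(self._heap)

            def contents(self):
                return [t for _, t in self._heap]

        w = ExpiringWindow()
        w.insert(("s1", "p", "o1"), expires_at=5)   # arrives first, expires late
        w.insert(("s2", "p", "o2"), expires_at=2)   # arrives later, expires early
        w.evict_expired(now=3)
        print(w.contents())   # only the triple still valid past t=3 remains

    Under an arrival-time policy the second triple would outlive the first; keying eviction on expiration keeps the window's contents consistent with the source's declared validity.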

  6. Defeasibility in Legal Reasoning

    OpenAIRE

    SARTOR, Giovanni

    2009-01-01

    I shall first introduce the idea of reasoning, and of defeasible reasoning in particular. I shall then argue that cognitive agents need to engage in defeasible reasoning for coping with a complex and changing environment. Consequently, defeasibility is needed in practical reasoning, and in particular in legal reasoning.

  7. Application verification research of cloud computing technology in the field of real time aerospace experiment

    Science.gov (United States)

    Wan, Junwei; Chen, Hongyan; Zhao, Jing

    2017-08-01

    According to the requirements of real-time operation, reliability and safety for aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to the field of aerospace experiments is tested and verified. Based on the analysis of the test results, a preliminary conclusion is obtained: a cloud computing platform can be applied to computation-intensive aerospace experiment workloads. For I/O-intensive workloads, it is recommended to use traditional physical machines.

  8. 21 CFR 10.20 - Submission of documents to Division of Dockets Management; computation of time; availability for...

    Science.gov (United States)

    2010-04-01

    ... Management; computation of time; availability for public disclosure. 10.20 Section 10.20 Food and Drugs FOOD... Management; computation of time; availability for public disclosure. (a) A submission to the Division of Dockets Management of a petition, comment, objection, notice, compilation of information, or any other...

  9. Workplace lactation support by New Jersey employers following US Reasonable Break Time for Nursing Mothers law.

    Science.gov (United States)

    Bai, Yeon K; Gaits, Susan I; Wunderlich, Shahla M

    2015-02-01

    Returning to an unsupportive work environment has been identified as a major reason for avoidance or early abandonment of breastfeeding among working mothers. This study aimed to examine the nature and extent of accommodations offered to breastfeeding employees among New Jersey employers since the US federal Reasonable Break Time for Nursing Mothers law enactment. A cross-sectional survey was conducted to measure current lactation support in the workplace in New Jersey. Using convenience sampling, the survey was sent to managerial personnel in hospitals and nonhospitals. The level of support was assessed on company policy, lactation room, and room amenity. A composite lactation amenity score was calculated based on responses about lactation room amenities. Respondents (N = 51) completed a 22-item online questionnaire during fall 2011. The support level was compared by type of organization: hospital (n = 37) versus nonhospital (n = 14). The amenity score of hospitals was significantly higher than nonhospitals (1.44 vs 0.45, P = .002). The mean amenity score (score = 0.95) for all employers was far below comprehensive (score = 3.0). Compared to nonhospitals, hospitals were more likely to offer lactation rooms (81% vs 36%, P = .003), have their own breastfeeding policy (35.1% vs 7.1%, P = .01), and provide additional breastfeeding support (eg, education classes, resources; P < .05). Employers, regardless of the type of organization, need to improve their current practices and create equity of lactation support in the workplace. © The Author(s) 2014.

  10. System matrix computation vs storage on GPU: A comparative study in cone beam CT.

    Science.gov (United States)

    Matenine, Dmitri; Côté, Geoffroi; Mascolo-Fortin, Julia; Goussard, Yves; Després, Philippe

    2018-02-01

    Iterative reconstruction algorithms in computed tomography (CT) require a fast method for computing the intersection distances between the trajectories of photons and the object, also called ray tracing or system matrix computation. This work, focused on the thin-ray model, is aimed at comparing different system matrix handling strategies using graphical processing units (GPUs). In this work, the system matrix is modeled by thin rays intersecting a regular grid of box-shaped voxels, known to be an accurate representation of the forward projection operator in CT. However, an uncompressed system matrix exceeds the random access memory (RAM) capacities of typical computers by one order of magnitude or more. Considering the RAM limitations of GPU hardware, several system matrix handling methods were compared: full storage of a compressed system matrix, on-the-fly computation of its coefficients, and partial storage of the system matrix with partial on-the-fly computation. These methods were tested on geometries mimicking a cone beam CT (CBCT) acquisition of a human head. Execution times of three routines of interest were compared: forward projection, backprojection, and ordered-subsets convex (OSC) iteration. A fully stored system matrix yielded the shortest backprojection and OSC iteration times, with a 1.52× acceleration for OSC when compared to the on-the-fly approach. Nevertheless, the maximum problem size was bound by the available GPU RAM and geometrical symmetries. On-the-fly coefficient computation did not require symmetries and was shown to be the fastest for forward projection. It also offered reasonable execution times of about 176.4 ms per view per OSC iteration for a detector of 512 × 448 pixels and a volume of 384³ voxels, using commodity GPU hardware. Partial system matrix storage showed performance similar to the on-the-fly approach, while still relying on symmetries. Partial system matrix storage was shown to yield the lowest relative
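
    The storage-versus-recomputation trade-off at the heart of this comparison can be reduced to a toy sketch: a forward projector that either looks up precomputed system-matrix rows or regenerates them per ray. The horizontal-ray geometry below is a deliberate simplification of thin-ray (Siddon-type) tracing:

        import numpy as np

        nx = ny = 4
        voxel_w = 1.0
        image = np.arange(nx * ny, dtype=float)      # flattened toy image

        def row_coeffs(ray):
            # On-the-fly "ray tracing" for one horizontal ray: the ray crosses
            # every voxel in its row, so each intersection length is voxel_w.
            return [(ray * nx + j, voxel_w) for j in range(nx)]

        # Strategy 1: full storage of the (sparse) system matrix; fast lookups
        # but memory grows with the number of rays times nonzeros per ray.
        stored = {ray: row_coeffs(ray) for ray in range(ny)}
        proj_stored = [sum(image[j] * a for j, a in stored[ray])
                       for ray in range(ny)]

        # Strategy 2: recompute coefficients per ray, trading FLOPs for memory.
        proj_fly = [sum(image[j] * a for j, a in row_coeffs(ray))
                    for ray in range(ny)]

        assert proj_stored == proj_fly   # identical projections either way
        print(proj_stored)

    Either strategy yields the same projection; what differs, as the study quantifies on real CBCT geometries, is where the cost lands: GPU memory for storage, arithmetic throughput for recomputation.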

  11. Reliability of real-time computing with radiation data feedback at accidental release

    International Nuclear Information System (INIS)

    Deme, S.; Feher, I.; Lang, E.

    1990-01-01

    At the first workshop in 1985 we reported on the real-time dose computing method used at the Paks Nuclear Power Plant and on the telemetric system developed for the normalization of the computed data. At present, the computing method normalized to the telemetric data represents the primary information for deciding on any necessary countermeasures in case of a nuclear reactor accident. In this connection we analyzed the reliability of the results obtained in this manner. The points of the analysis were: how the results are influenced by the choice of certain parameters that cannot be determined by direct methods, and how improperly chosen diffusion parameters would distort the determination of environmental radiation parameters normalized on the basis of the measurements (131I activity concentration, gamma dose rate) at points lying at a given distance from the measuring stations. A further source of errors may be that, when determining the level of gamma radiation, the radionuclide doses in the cloud and on the ground surface are measured together by the environmental monitoring stations, whereas these doses appear separately in the computations. At the Paks NPP it is the time integral of the airborne activity concentration of vapour-form 131I which is determined. This quantity includes neither the other physical and chemical forms of 131I nor the other isotopes of radioiodine. We gave numerical examples for the uncertainties due to the above factors. As a result, we arrived at the conclusion that, when deciding on accident-related measures based on the computing method, the dose uncertainties may reach one order of magnitude for points lying far from the monitoring stations. Different measures are discussed to make the uncertainties significantly lower.

  12. Time-dependent configurations in the perturbative formalism of string theory

    International Nuclear Information System (INIS)

    Durin, B.

    2006-01-01

    In this thesis, three time-dependent configurations are studied in the formalism of first-quantized string theory. These configurations are interesting because perturbative computation of correlation functions is possible and thus provides a tool to understand the interplay between the time-dependent geometry and the quantized string. In the first chapter, we explain the reasons for studying these configurations. In the second chapter, we describe the perturbative formalism and explain how we solved the technical problems we encountered. The third chapter is devoted to the physical description of the phenomena involved in these configurations, to the specific computations we made and to the insights we gained. Finally, we conclude and give some perspectives. (author)

  13. An Efficient Integer Coding and Computing Method for Multiscale Time Segment

    Directory of Open Access Journals (Sweden)

    TONG Xiaochong

    2016-12-01

    Full Text Available This article focuses on the existing problems and status of current time-segment coding and proposes a new approach: multi-scale time segment integer coding (MTSIC). This approach utilizes the tree structure and the size ordering formed among integers to reflect the relationships among multi-scale time segments: order, inclusion/containment, intersection, etc., and finally achieves a unified integer coding process for multi-scale time. On this foundation, this research also studies computing methods for calculating the time relationships of MTSIC, to support efficient calculation and querying based on time segments, and preliminarily discusses the application methods and prospects of MTSIC. Tests indicated that the implementation of MTSIC is convenient and reliable, that the transformation between it and the traditional method is convenient, and that it has very high efficiency in querying and calculation.
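
    The record is terse, so the following sketch shows one possible construction, ours rather than necessarily the authors', of integer codes for multi-scale time segments: heap-style numbering makes containment between segments at different scales a matter of integer arithmetic.

        def code(level, index):
            # Heap-style integer code for the index-th segment at a given scale:
            # level 0 is the whole axis, each deeper level halves the segments.
            return (1 << level) + index

        def parent(c):
            # Code of the enclosing segment one scale coarser.
            return c >> 1

        def contains(a, b):
            # True if segment a contains segment b (a is an ancestor of b).
            while b > a:
                b = parent(b)
            return a == b

        day = code(0, 0)                  # the whole day
        am, pm = code(1, 0), code(1, 1)   # two half-day segments
        q3 = code(2, 2)                   # third quarter of the day
        print(contains(day, q3), contains(am, q3), contains(pm, q3))
        # True False True

    Because codes at one scale sort in segment order, range and intersection queries also reduce to integer comparisons, which is the kind of efficiency gain the abstract reports.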

  14. Rigorous bounds on survival times in circular accelerators and efficient computation of fringe-field transfer maps

    International Nuclear Information System (INIS)

    Hoffstaetter, G.H.

    1994-12-01

    Analyzing stability of particle motion in storage rings contributes to the general field of stability analysis in weakly nonlinear motion. A method which we call pseudo-invariant estimation (PIE) is used to compute lower bounds on the survival time in circular accelerators. The pseudo-invariants needed for this approach are computed via nonlinear perturbative normal form theory, and the required global maxima of the highly complicated multivariate functions could only be rigorously bounded with an extension of interval arithmetic. The bounds on the survival times are large enough to be relevant; the same is true for the lower bounds on dynamical apertures, which can also be computed. The PIE method can lead to novel design criteria with the objective of maximizing the survival time. A major effort in the direction of rigorous predictions only makes sense if accurate models of accelerators are available. Fringe fields often have a significant influence on optical properties, but the computation of fringe-field maps by DA-based integration is slower by several orders of magnitude than DA evaluation of the propagator for main-field maps. A novel computation of fringe-field effects called symplectic scaling (SYSCA) is introduced. It exploits the advantages of Lie transformations, generating functions, and scaling properties and is extremely accurate. The computation of fringe-field maps is typically made nearly two orders of magnitude faster. (orig.)

  15. Design considerations for computationally constrained two-way real-time video communication

    Science.gov (United States)

    Bivolarski, Lazar M.; Saunders, Steven E.; Ralston, John D.

    2009-08-01

    Today's video codecs have evolved primarily to meet the requirements of the motion picture and broadcast industries, where high-complexity studio encoding can be utilized to create highly-compressed master copies that are then broadcast one-way for playback using less-expensive, lower-complexity consumer devices for decoding and playback. Related standards activities have largely ignored the computational complexity and bandwidth constraints of wireless or Internet-based real-time video communications using devices such as cell phones or webcams. Telecommunications industry efforts to develop and standardize video codecs for applications such as video telephony and video conferencing have not yielded image size, quality, and frame-rate performance that matches today's consumer expectations and market requirements for Internet and mobile video services. This paper reviews the constraints and the corresponding video codec requirements imposed by real-time, 2-way mobile video applications. Several promising elements of a new mobile video codec architecture are identified, and more comprehensive computational complexity metrics and video quality metrics are proposed in order to support the design, testing, and standardization of these new mobile video codecs.

  16. Putting the Co in Education: Timing, Reasons, and Consequences of College Coeducation from 1835 to the Present. NBER Working Paper No. 16281

    Science.gov (United States)

    Goldin, Claudia; Katz, Lawrence F.

    2010-01-01

    The history of coeducation in U.S. higher education is explored through an analysis of a database containing information on all institutions offering four-year undergraduate degrees that operated in 1897, 1924, 1934, or 1980, most of which still exist today. These data reveal surprises about the timing of coeducation and the reasons for its…

  17. The Effect of Functional Hearing and Hearing Aid Usage on Verbal Reasoning in a Large Community-Dwelling Population.

    Science.gov (United States)

    Keidser, Gitte; Rudner, Mary; Seeto, Mark; Hygge, Staffan; Rönnberg, Jerker

    2016-01-01

    Verbal reasoning performance is an indicator of the ability to think constructively in everyday life and relies on both crystallized and fluid intelligence. This study aimed to determine the effect of functional hearing on verbal reasoning when controlling for age, gender, and education. In addition, the study investigated whether hearing aid usage mitigated the effect and examined different routes from hearing to verbal reasoning. Cross-sectional data on 40- to 70-year-old community-dwelling participants from the UK Biobank resource were accessed. Data consisted of behavioral and subjective measures of functional hearing, assessments of numerical and linguistic verbal reasoning, measures of executive function, and demographic and lifestyle information. Data on 119,093 participants who had completed hearing and verbal reasoning tests were submitted to multiple regression analyses, and data on 61,688 of these participants, who had completed additional cognitive tests and provided relevant lifestyle information, were submitted to structural equation modeling. Poorer performance on the behavioral measure of functional hearing was significantly associated with poorer verbal reasoning in both the numerical and linguistic domains. Functional hearing significantly interacted with education, with the association with verbal reasoning varying by level of formal education. Among those with poor hearing, hearing aid usage had a significant positive, but not necessarily causal, effect on both numerical and linguistic verbal reasoning. Structural equation modeling of the routes from hearing to verbal reasoning showed that controlling for executive function eliminated the effect. However, when computer usage was controlled for, the eliminating effect of executive function was weakened. Poor functional hearing was associated with poor verbal reasoning in a 40- to 70-year-old community-dwelling population after controlling for age, gender, and education. The effect of functional hearing on verbal reasoning was significantly reduced among hearing aid users.

  18. Cloud computing platform for real-time measurement and verification of energy performance

    International Nuclear Information System (INIS)

    Ke, Ming-Tsun; Yeh, Chia-Hung; Su, Cheng-Jie

    2017-01-01

    Highlights: • Application of PSO algorithm can improve the accuracy of the baseline model. • M&V cloud platform automatically calculates energy performance. • M&V cloud platform can be applied in all energy conservation measures. • Real-time operational performance can be monitored through the proposed platform. • M&V cloud platform facilitates the development of EE programs and ESCO industries. - Abstract: Nations worldwide are vigorously promoting policies to improve energy efficiency. The use of measurement and verification (M&V) procedures to quantify energy performance is an essential topic in this field. Currently, energy performance M&V is accomplished via a combination of short-term on-site measurements and engineering calculations. This requires extensive amounts of time and labor and can result in a discrepancy between actual energy savings and calculated results. In addition, because the M&V period typically lasts for several months or up to a year, the failure to immediately detect abnormal energy performance not only degrades energy performance but also prevents timely corrections and misses the best opportunity to adjust or repair equipment and systems. In this study, a cloud computing platform for the real-time M&V of energy performance is developed. On this platform, particle swarm optimization and multivariate regression analysis are used to construct accurate baseline models. Instantaneous and automatic calculation of energy performance and access to long-term, cumulative information about energy performance are provided via a feature that allows direct uploads of energy consumption data. Finally, the feasibility of this real-time M&V cloud platform is tested in a case study involving improvements to a cold storage system in a hypermarket. The cloud computing platform for real-time energy performance M&V is applicable to any industry and energy conservation measure. With the M&V cloud platform, real-time
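
    Setting the platform specifics aside, the baseline-model idea can be illustrated with a minimal avoided-energy calculation: regress pre-retrofit consumption on its drivers, then report savings as baseline prediction minus metered use. Plain least squares stands in for the paper's PSO-assisted fitting, and all numbers are invented:

        import numpy as np

        # Invented pre-retrofit data: outdoor temperature (degC), production
        # index, and metered energy (kWh) for eight periods.
        temp = np.array([28, 30, 25, 22, 31, 27, 24, 29], dtype=float)
        prod = np.array([0.8, 1.0, 0.6, 0.5, 1.1, 0.9, 0.7, 1.0])
        energy = np.array([410, 480, 330, 290, 510, 430, 340, 470], dtype=float)

        # Multivariate regression baseline: energy ~ b0 + b1*temp + b2*prod.
        X = np.column_stack([np.ones_like(temp), temp, prod])
        beta, *_ = np.linalg.lstsq(X, energy, rcond=None)

        # Post-retrofit period with the same drivers: savings = baseline - metered.
        x_post = np.array([1.0, 29.0, 1.0])
        baseline = float(x_post @ beta)
        metered = 400.0
        print(f"baseline {baseline:.0f} kWh, savings {baseline - metered:.0f} kWh")

    Automating exactly this computation on freshly uploaded consumption data is what turns periodic M&V into the real-time monitoring the platform provides.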

  19. Computer-games for gravitational wave science outreach: Black Hole Pong and Space Time Quest

    International Nuclear Information System (INIS)

    Carbone, L; Bond, C; Brown, D; Brückner, F; Grover, K; Lodhia, D; Mingarelli, C M F; Fulda, P; Smith, R J E; Unwin, R; Vecchio, A; Wang, M; Whalley, L; Freise, A

    2012-01-01

    We have established a program aimed at developing computer applications and web applets to be used for educational purposes as well as gravitational wave outreach activities. These applications and applets teach gravitational wave physics and technology. The computer programs are generated in collaboration with undergraduates and summer students as part of our teaching activities, and are freely distributed on a dedicated website. As part of this program, we have developed two computer-games related to gravitational wave science: 'Black Hole Pong' and 'Space Time Quest'. In this article we present an overview of our computer related outreach activities and discuss the games and their educational aspects, and report on some positive feedback received.

  20. Combinational Reasoning of Quantitative Fuzzy Topological Relations for Simple Fuzzy Regions

    Science.gov (United States)

    Liu, Bo; Li, Dajun; Xia, Yuanping; Ruan, Jian; Xu, Lili; Wu, Huanyi

    2015-01-01

    In recent years, formalization and reasoning of topological relations have become a hot topic as a means to generate knowledge about the relations between spatial objects at the conceptual and geometrical levels. These mechanisms have been widely used in spatial data query, spatial data mining, evaluation of equivalence and similarity in a spatial scene, as well as for consistency assessment of the topological relations of multi-resolution spatial databases. The concept of computational fuzzy topological space is applied to simple fuzzy regions to efficiently and more accurately solve fuzzy topological relations. Thus, extending the existing research and improving upon the previous work, this paper presents a new method to describe fuzzy topological relations between simple spatial regions in Geographic Information Sciences (GIS) and Artificial Intelligence (AI). Firstly, we propose a new definition for simple fuzzy line segments and simple fuzzy regions based on computational fuzzy topology. Then, based on the new definitions, we propose a new combinational reasoning method to compute the topological relations between simple fuzzy regions. This study finds that there are (1) 23 different topological relations between a simple crisp region and a simple fuzzy region and (2) 152 different topological relations between two simple fuzzy regions. In the end, we discuss some examples to demonstrate the validity of the new method; through comparisons with existing fuzzy models, we show that the proposed method can compute more relations and is more expressive than the existing fuzzy models. PMID:25775452

  1. Could Elementary Mathematics Textbooks Help Give Attention to Reasons in the Classroom?

    Science.gov (United States)

    Newton, Douglas P.; Newton, Lynn D.

    2007-01-01

    Trainee teachers, new and non-specialist teachers of elementary mathematics have a tendency to avoid thought about reasons in mathematics. Instead, they tend to favour the development of computational skill through the rote application of procedures, routines and algorithms. Could elementary mathematics textbooks serve as models of practice and…

  2. Study and Implementation of a Real-Time Operating System on an ARM-Based Single Board Computer

    OpenAIRE

    A, Wiedjaja; M, Handi; L, Jonathan; Christian, Benyamin; Kristofel, Luis

    2014-01-01

    An operating system is an important piece of software in a computer system. For personal and office use, a general-purpose operating system is sufficient. However, mission-critical applications such as nuclear power plants and automotive braking systems (auto braking systems), which need a high level of reliability, require an operating system that operates in real time. This study aims to assess the implementation of a Linux-based real-time operating system on an ARM-based Single Board Computer (SBC), namely the Pandaboard ES with ...

  3. Connectionism vs. Computational Theory of Mind

    Directory of Open Access Journals (Sweden)

    Angel Garrido

    2010-01-01

    Full Text Available Problems in AI are often related to the philosophy of mind, and perhaps for this reason they are in essence very disputable. Consider, for instance, the famous question posed by Alan Turing [16]: Can a machine think? It may be the most decisive question, but for many people it would be nonsense. Two of the most fundamental and most often confronted positions on this issue are connectionism and the computational theory of mind. We analyze here their content, their past disputes, and the current situation.

  4. Analogical reasoning abilities of recovering alcoholics.

    Science.gov (United States)

    Gardner, M K; Clark, E; Bowman, M A; Miller, P J

    1989-08-01

    This study investigated analogical reasoning abilities of alcoholics who had been abstinent from alcohol for at least 1 year. Their performance was compared to that of nonalcoholic controls matched as a group for education, age, and gender. Solution times and error rates were modeled using a regression model. Results showed a nonsignificant trend for alcoholics to be faster, but more error prone, than controls. The same componential model applied to both groups, and fit them equally well. Although differences have been found in analogical reasoning ability between controls and alcoholics immediately following detoxification, we find no evidence of differences after extended periods of sobriety.

  5. SPEEDI: a computer code system for the real-time prediction of radiation dose to the public due to an accidental release

    International Nuclear Information System (INIS)

    Imai, Kazuhiko; Chino, Masamichi; Ishikawa, Hirohiko

    1985-10-01

    SPEEDI, a computer code system for the prediction of environmental doses from radioactive materials accidentally released from a nuclear plant, has been developed to assist the organizations responsible for emergency planning. For realistic simulation, two models have been developed: one which statistically predicts the basic wind data and then calculates the three-dimensional mass-consistent wind field by interpolating these predicted data, and one for calculating the diffusion of released materials using a combined model of random-walk and PICK methods. These calculations are carried out in conversational mode with the computer so that the system can be used with ease in an emergency. SPEEDI also has versatile files, which make it easy to control the complicated flow of calculations. In order to attain a short computation time, a large-scale computer with a performance of 25 MIPS and a vector processor of at most 250 MFLOPS are used for the model calculations, so that quick responses can be made. Simplified models are also prepared for calculation on the minicomputers widely used by local governments and research institutes, although the same precision of calculation as with the above models cannot be expected. The present report outlines the structure and functions of SPEEDI, the methods for prediction of the wind field and the models for calculating the concentration of released materials in air and on the ground, and the doses to the public. Some of the diffusion models have been compared with the field experiments which were carried out as part of the SPEEDI development program. The report also discusses the reliability of the diffusion models on the basis of the compared results, and shows that they can reasonably simulate the diffusion in the internal boundary layer which commonly occurs near coastal regions. (J.P.N.)

  6. Diagnostic causal reasoning with verbal information.

    Science.gov (United States)

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
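
    To make the setup concrete, the sketch below computes a posterior over two candidate causes from a sequence of observed effects, with an exponential decay on older evidence as one simple way to realize temporal weighting. The likelihoods are invented stand-ins for the numeric equivalents of verbal terms such as "X occasionally causes A", and this is not claimed to be the authors' exact model:

        import math

        prior = {"X": 0.5, "Y": 0.5}                  # two candidate causes
        likelihood = {                                # P(effect | cause)
            ("A", "X"): 0.3, ("A", "Y"): 0.7,
            ("B", "X"): 0.8, ("B", "Y"): 0.2,
        }

        def posterior(effects, decay=0.8):
            # P(cause | effects) with exponentially decaying weight on older
            # evidence; decay=1 recovers standard Bayesian updating.
            weights = [decay ** (len(effects) - 1 - i) for i in range(len(effects))]
            post = {}
            for cause, p in prior.items():
                loglik = sum(w * math.log(likelihood[(e, cause)])
                             for e, w in zip(effects, weights))
                post[cause] = p * math.exp(loglik)
            z = sum(post.values())
            return {c: v / z for c, v in post.items()}

        print(posterior(["A", "B"]))   # recency weighting lets the later effect
                                       # B dominate, favoring cause X

    Varying the decay parameter toward primacy or recency is one way such a model can capture the interindividual differences in temporal weighting that the study reports.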

  7. Improvements in fast-response flood modeling: desktop parallel computing and domain tracking

    Energy Technology Data Exchange (ETDEWEB)

    Judi, David R [Los Alamos National Laboratory; Mcpherson, Timothy N [Los Alamos National Laboratory; Burian, Steven J [UNIV. OF UTAH

    2009-01-01

    It is becoming increasingly important to have the ability to accurately forecast flooding, as flooding accounts for the most losses due to natural disasters in the world and the United States. Flood inundation modeling has been dominated by one-dimensional approaches. These models are computationally efficient and are considered by many engineers to produce reasonably accurate water surface profiles. However, because the profiles estimated in these models must be superimposed on digital elevation data to create a two-dimensional map, the result may be sensitive to the ability of the elevation data to capture relevant features (e.g. dikes/levees, roads, walls, etc...). Moreover, one-dimensional models do not explicitly represent the complex flow processes present in floodplains and urban environments, and because two-dimensional models based on the shallow water equations have significantly greater ability to determine flow velocity and direction, the National Research Council (NRC) has recommended that two-dimensional models be used over one-dimensional models for flood inundation studies. This paper has shown that two-dimensional flood modeling computational time can be greatly reduced through the use of Java multithreading on multi-core computers, which effectively provides a means for parallel computing on a desktop computer. In addition, this paper has shown that when desktop parallel computing is coupled with a domain tracking algorithm, significant computation time can be eliminated when computations are completed only on inundated cells. The drastic reduction in computational time shown here enhances the ability of two-dimensional flood inundation models to be used as a near-real-time flood forecasting tool, engineering design tool, or planning tool. Perhaps even of greater significance, the reduction in computation time makes the incorporation of risk and uncertainty/ensemble forecasting more feasible for flood inundation modeling (NRC 2000; Sayers et al

  8. Stereotypical Reasoning: Logical Properties

    OpenAIRE

    Lehmann, Daniel

    2002-01-01

    Stereotypical reasoning assumes that the situation at hand is one of a kind and that it enjoys the properties generally associated with that kind of situation. It is one of the most basic forms of nonmonotonic reasoning. A formal model for stereotypical reasoning is proposed and the logical properties of this form of reasoning are studied. Stereotypical reasoning is shown to be cumulative under weak assumptions.

  9. 29 CFR 531.3 - General determinations of “reasonable cost.”

    Science.gov (United States)

    2010-07-01

    ... employer: Provided, That if the total so computed is more than the fair rental value (or the fair price of the commodities or facilities offered for sale), the fair rental value (or the fair price of the... REGULATIONS WAGE PAYMENTS UNDER THE FAIR LABOR STANDARDS ACT OF 1938 Determinations of "Reasonable Cost" and ...

  10. A decision network account of reasoning about other people's choices

    Science.gov (United States)

    Jern, Alan; Kemp, Charles

    2015-01-01

    The ability to predict and reason about other people's choices is fundamental to social interaction. We propose that people reason about other people's choices using mental models that are similar to decision networks. Decision networks are extensions of Bayesian networks that incorporate the idea that choices are made in order to achieve goals. In our first experiment, we explore how people predict the choices of others. Our remaining three experiments explore how people infer the goals and knowledge of others by observing the choices that they make. We show that decision networks account for our data better than alternative computational accounts that do not incorporate the notion of goal-directed choice or that do not rely on probabilistic inference. PMID:26010559
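
    The core computation of such an account can be sketched compactly. Below is a minimal illustration, not the authors' model: an observer predicts another agent's choice by combining a probabilistic outcome model with a utility (goal) function; the options, probabilities and utilities are invented for the example.

```python
# Sketch of goal-directed choice prediction: a decision network pairs a
# probabilistic model of outcomes with a utility node, and the predicted
# choice is the one that maximises expected utility. All numbers are made up.

def expected_utilities(choices, p_outcome_given_choice, utility):
    return {c: sum(p * utility[o]
                   for o, p in p_outcome_given_choice[c].items())
            for c in choices}

p = {"cafe A": {"good coffee": 0.9, "bad coffee": 0.1},
     "cafe B": {"good coffee": 0.4, "bad coffee": 0.6}}
u = {"good coffee": 1.0, "bad coffee": -0.5}
eu = expected_utilities(["cafe A", "cafe B"], p, u)
print(max(eu, key=eu.get))  # predict the choice with highest expected utility
```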

  11. Computing time-series suspended-sediment concentrations and loads from in-stream turbidity-sensor and streamflow data

    Science.gov (United States)

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Doug; Ziegler, Andrew C.

    2010-01-01

    Over the last decade, use of a method for computing suspended-sediment concentration and loads using turbidity sensors—primarily nephelometry, but also optical backscatter—has proliferated. Because an in-situ turbidity sensor is capable of measuring turbidity instantaneously, a turbidity time series can be recorded and related directly to time-varying suspended-sediment concentrations. Depending on the suspended-sediment characteristics of the measurement site, this method can be more reliable and, in many cases, a more accurate means for computing suspended-sediment concentrations and loads than traditional U.S. Geological Survey computational methods. Guidelines and procedures for estimating time series of suspended-sediment concentration and loading as a function of turbidity and streamflow data have been published in a U.S. Geological Survey Techniques and Methods Report, Book 3, Chapter C4. This paper is a summary of these guidelines and discusses some of the concepts, statistical procedures, and techniques used to maintain a multiyear suspended-sediment time series.

  12. Using AberOWL for fast and scalable reasoning over BioPortal ontologies

    KAUST Repository

    Slater, Luke

    2016-08-08

    Background: Reasoning over biomedical ontologies using their OWL semantics has traditionally been a challenging task due to the high theoretical complexity of OWL-based automated reasoning. As a consequence, ontology repositories, as well as most other tools utilizing ontologies, either provide access to ontologies without use of automated reasoning, or limit the number of ontologies for which automated reasoning-based access is provided. Methods: We apply the AberOWL infrastructure to provide automated reasoning-based access to all accessible and consistent ontologies in BioPortal (368 ontologies). We perform an extensive performance evaluation to determine query times, both for queries of different complexity and for queries that are performed in parallel over the ontologies. Results and conclusions: We demonstrate that, with the exception of a few ontologies, even complex and parallel queries can now be answered in milliseconds, therefore allowing automated reasoning to be used on a large scale, to run in parallel, and with rapid response times.

  13. Green computing: power optimisation of vfi-based real-time multiprocessor dataflow applications

    NARCIS (Netherlands)

    Ahmad, W.; Holzenspies, P.K.F.; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2015-01-01

    Execution time is no longer the only performance metric for computer systems. In fact, a trend is emerging to trade raw performance for energy savings. Techniques like Dynamic Power Management (DPM, switching to a low-power state) and Dynamic Voltage and Frequency Scaling (DVFS, throttling processor voltage and frequency) ...

  14. In this issue: Time to replace doctors’ judgement with computers

    Directory of Open Access Journals (Sweden)

    Simon de Lusignan

    2015-11-01

    Informaticians continue to rise to the challenge, set by the English Health Minister, of trying to replace doctors’ judgement with computers. This issue describes successes and where there are barriers. However, whilst there is progress, it tends to be incremental, and there are grand challenges to be overcome before computers can replace clinicians. These grand challenges include: (1) improving usability so it is possible to more readily incorporate technology into clinical workflow; (2) rigorous new analytic methods that make use of the mass of available data, ‘Big data’, to create real-world evidence; (3) faster ways of meeting regulatory and legal requirements, including ensuring privacy; (4) provision of reimbursement models to fund innovative technology that can substitute for clinical time; and (5) recognition that innovations that improve quality also often increase cost. Informatics is more likely to support and augment clinical decision making than to replace clinicians.

  15. A New Methodology for Fuel Mass Computation of an operating Aircraft

    Directory of Open Access Journals (Sweden)

    M Souli

    2016-03-01

    The paper presents a new computational methodology for accurate computation of the fuel mass inside an aircraft wing during flight. The computation is carried out using hydrodynamic equations, classically known as the Navier-Stokes equations in the CFD community. For this purpose, a computational software is developed; the software computes the fuel mass inside the tank based on experimental data from pressure gauges inserted in the fuel tank. Currently, and for safety reasons, optical fiber sensors are used for fluid level detection. The optical system consists of an optically controlled acoustic transceiver system which measures the fuel level inside each compartment of the fuel tank. The system computes the fuel volume inside the tank and needs the density to compute the total fuel mass, so with the optical sensor technique a density measurement inside the tank is required. The method developed in the paper requires pressure measurements in each tank compartment; the density is then computed from these pressure measurements under hydrostatic assumptions. The methodology is tested using a fuel tank provided by Airbus for a time-history refueling process.
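
    The hydrostatic step described above reduces to a one-line formula. The sketch below is an illustration under simplifying assumptions (a static fluid column, known gauge separation, level flight), not the paper's software; all values are invented.

```python
# Density from differential pressure under hydrostatics: dp = rho * g * dh.

G = 9.81  # m/s^2; in flight the effective acceleration differs (assumption)

def fuel_density(p_lower_pa, p_upper_pa, gauge_separation_m):
    """rho = (p_lower - p_upper) / (g * dh), assuming a static fluid column."""
    return (p_lower_pa - p_upper_pa) / (G * gauge_separation_m)

def fuel_mass(density_kg_m3, volume_m3):
    return density_kg_m3 * volume_m3

rho = fuel_density(p_lower_pa=104_000.0, p_upper_pa=103_200.0,
                   gauge_separation_m=0.10)
print(round(rho, 1), round(fuel_mass(rho, volume_m3=1.2), 1))  # ~815.5, ~978.6
```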

  16. Time-of-Flight Sensors in Computer Graphics

    DEFF Research Database (Denmark)

    Kolb, Andreas; Barth, Erhardt; Koch, Reinhard

    2009-01-01

    , including Computer Graphics, Computer Vision and Man Machine Interaction (MMI). These technologies are starting to have an impact on research and commercial applications. The upcoming generation of ToF sensors, however, will be even more powerful and will have the potential to become “ubiquitous real...

  17. Object reasoning for waste remediation

    International Nuclear Information System (INIS)

    Pennock, K.A.; Bohn, S.J.; Franklin, A.L.

    1991-08-01

    A large number of contaminated waste sites across the United States await site remediation efforts. These sites can be physically complex, composed of multiple, possibly interacting, contaminants distributed throughout one or more media. The Remedial Action Assessment System (RAAS) is being designed and developed to support decisions concerning the selection of remediation alternatives. The goal of this system is to broaden the consideration of remediation alternatives, while reducing the time and cost of making these considerations. The Remedial Action Assessment System is a hybrid system, designed and constructed using object-oriented, knowledge-based systems, and structured programming techniques. RAAS uses a combination of quantitative and qualitative reasoning to consider and suggest remediation alternatives. The reasoning process that drives this application is centered around an object-oriented organization of remediation technology information. This paper describes the information structure and organization used to support this reasoning process. In addition, the paper describes the level of detail of the technology-related information used in RAAS, discusses required assumptions and the procedural implications of these assumptions, and provides the rationale for structuring RAAS in this manner. 3 refs., 3 figs

  18. Finding a Reasonable Foundation for Peace

    Directory of Open Access Journals (Sweden)

    Roberta Bayer

    2017-03-01

    Can world peace come about through a world federation of governments? Is growing agreement with, and appreciation of, the doctrine of equal human rights throughout the world inevitable? Such questions are raised by Mortimer Adler in How to Think about War and Peace. Adler argues in this book that both are possible, and in doing so he argues that the insights of liberal contract thinkers, particularly Immanuel Kant, are essentially true. Kant argues that each person has the capacity to discover within himself the foundation for human rights because they are self-evident. It follows that over time inequalities and prejudices will disappear, and people will gain the freedom to advance the cause of peace. About this account of the possibility of world peace I ask: is it indeed reasonable? For if it is reasonable, it is not reasonable for the reasons that would have been advanced by Aristotle or Plato or their medieval followers. In older political philosophy, it is agreement about the unchanging truth of things that can bring peace. To seek the unchanging truth of things, through philosophical speculation about God and things divine, is the highest human activity. It is the end to which life in this world is directed, and upon which human flourishing depends. Freedom depends upon our openness to unchanging eternal truth, even more than on self-evident rights; the exercise of speculative reasoning allows for political discourse and an open society.

  19. Analysis of students’ mathematical reasoning

    Science.gov (United States)

    Sukirwan; Darhim; Herman, T.

    2018-01-01

    Reasoning is one of the mathematical abilities with very complex implications. This complexity makes reasoning an ability that students do not attain easily. Likewise, studies dealing with reasoning are quite diverse, primarily concerned with the quality of mathematical reasoning. The objective of this study was to determine the quality of mathematical reasoning from Lithner's perspective. Lithner looked at how the environment affects mathematical reasoning and, in this regard, distinguished two perspectives, namely imitative reasoning and creative reasoning. Imitative reasoning comprises memorized and algorithmic reasoning. The results show that students generally still have problems in reasoning: they tend to rely on imitative reasoning, which means that they tend to use a routine procedure when dealing with reasoning tasks. It is also shown that the traditional approach still dominates students’ daily learning situations.

  20. Automatic measurement system for congenital hip dislocation using a computed radiography

    International Nuclear Information System (INIS)

    Komori, M.; Minato, K.; Hirakawa, A.; Kuwahara, M.

    1988-01-01

    The acetabular angle, which is a diagnostic parameter of congenital hip dislocation, has conventionally been measured manually on X-ray film. Using digital images provided directly by a computed radiography system, an automatic measurement system was developed for this parameter. The measurement process was completed within a reasonable time and was sufficiently accurate. The system was combined with an image database, so that it can serve as a measurement tool within PACS.

  1. Individual and family environmental correlates of television and computer time in 10- to 12-year-old European children: the ENERGY-project.

    Science.gov (United States)

    Verloigne, Maïté; Van Lippevelde, Wendy; Bere, Elling; Manios, Yannis; Kovács, Éva; Grillenberger, Monika; Maes, Lea; Brug, Johannes; De Bourdeaudhuij, Ilse

    2015-09-18

    The aim was to investigate which individual and family environmental factors are related to television and computer time separately in 10- to 12-year-old children within and across five European countries (Belgium, Germany, Greece, Hungary, Norway). Data were used from the ENERGY-project. Children and one of their parents completed a questionnaire, including questions on screen time behaviours and related individual and family environmental factors. Family environmental factors included social, political, economic and physical environmental factors. Complete data were obtained from 2022 child-parent dyads (53.8% girls, mean child age 11.2 ± 0.8 years; mean parental age 40.5 ± 5.1 years). To examine the association between individual and family environmental factors (i.e. independent variables) and television/computer time (i.e. dependent variables) in each country, multilevel regression analyses were performed using MLwiN 2.22, adjusting for children's sex and age. In all countries, children reported more television and/or computer time if children and their parents thought that the maximum recommended level for watching television and/or using the computer was higher, and if children had a higher preference for television watching and/or computer use and a lower self-efficacy to control television watching and/or computer use. Most physical and economic environmental variables were not significantly associated with television or computer time. Slightly more individual factors were related to children's computer time, and more parental social environmental factors to children's television time. We also found different correlates across countries: parental co-participation in television watching was significantly positively associated with children's television time in all countries except Greece. A higher level of parental television and computer time was only associated with a higher level of children's television and computer time in Hungary. Having rules

  2. Reasoning in Design: Idea Generation Condition Effects on Reasoning Processes and Evaluation of Ideas

    DEFF Research Database (Denmark)

    Cramer-Petersen, Claus Lundgaard; Ahmed-Kristensen, Saeema

    2015-01-01

    Reasoning is at the core of design activity and thinking. Thus, understanding and explaining reasoning in design is fundamental to understanding and supporting design practice. This paper investigates reasoning in design and its relationship to varying foci at the stage of idea generation and subsequent … to investigate idea generation sessions of two industry cases. Reasoning was found to appear in sequences of alternating reasoning types where the initiating reasoning type was decisive. The study found that abductive reasoning led to more radical ideas, whereas deductive reasoning led to ideas being for project … requirements, but having a higher proportion rejected as not valuable. The study sheds light on the conditions that promote these reasoning types. The study is one of the first of its kind; it advances an understanding of reasoning in design by empirical means and suggests a relationship between …

  3. Computer versus paper--does it make any difference in test performance?

    Science.gov (United States)

    Karay, Yassin; Schauber, Stefan K; Stosch, Christoph; Schüttpelz-Brauns, Katrin

    2015-01-01

    CONSTRUCT: In this study, we examine the differences in test performance between the paper-based and the computer-based version of the Berlin formative Progress Test. In this context it is the first study that allows controlling for students' prior performance. Computer-based tests make possible a more efficient examination procedure for test administration and review. Although university staff will benefit largely from computer-based tests, the question arises whether computer-based tests influence students' test performance. A total of 266 German students from the 9th and 10th semester of medicine (comparable with the 4th-year North American medical school schedule) participated in the study (paper = 132, computer = 134). The allocation of the test format was conducted as a randomized matched-pair design in which students were first sorted according to their prior test results. The organizational procedure, the examination conditions, the room, and seating arrangements, as well as the order of questions and answers, were identical in both groups. The sociodemographic variables and pretest scores of both groups were comparable. The test results from the paper and computer versions did not differ. The groups remained within the allotted time, but students using the computer version (particularly the high performers) needed significantly less time to complete the test. In addition, we found significant differences in guessing behavior: low performers using the computer version guess significantly more than low-performing students in the paper-pencil version. Participants in computer-based tests are not at a disadvantage in terms of their test results. The computer-based test required less processing time. The reason for the longer processing time with the paper-pencil version might be the time needed to write the answer down and to check that it was transferred correctly. It is still not known why students using the computer version (particularly low

  4. Fuzzy Reasoning as a Base for Collision Avoidance Decision Support System

    Directory of Open Access Journals (Sweden)

    Tanja Brcko

    2013-12-01

    Despite the generally high qualifications of seafarers, many maritime accidents are caused by human error; such accidents include capsizing, collision, and fire, and often result in pollution. Enough concern has been generated that researchers around the world have developed the study of the human factor into an independent scientific discipline. A great deal of progress has been made, particularly in the area of artificial intelligence. But since total autonomy is not yet expedient, decision support systems based on soft computing have been proposed to support human navigators and VTS operators in times of crisis as well as during the execution of everyday tasks, as a means of reducing risk levels. This paper considers a decision support system based on fuzzy logic integrated into an existing bridge collision avoidance system. The main goal is to determine the appropriate course of avoidance using fuzzy reasoning.
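
    A toy version of such fuzzy reasoning can be sketched in a few lines. This is an illustration only, not the paper's rule base: two crisp inputs (hypothetical DCPA and TCPA values) are fuzzified with triangular memberships, one danger rule fires, and a weighted average defuzzifies to a suggested course alteration.

```python
# Toy fuzzy inference for collision avoidance. Memberships and consequents
# are invented; a real system would encode COLREGs-aware rules.

def tri(x, a, b, c):
    """Triangular membership function with feet at a and c, peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def course_alteration(dcpa_nm, tcpa_min):
    # Rule: IF DCPA is small AND TCPA is short THEN situation is dangerous.
    danger = min(tri(dcpa_nm, -1.0, 0.0, 1.5), tri(tcpa_min, -5.0, 0.0, 15.0))
    safe = 1.0 - danger
    # Zero-order Sugeno defuzzification: 'large turn' (40 deg) vs 'hold' (0 deg).
    return (danger * 40.0 + safe * 0.0) / (danger + safe)

print(course_alteration(dcpa_nm=0.3, tcpa_min=6.0))  # suggests a marked turn
```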

  5. Reason, emotion and decision-making: risk and reward computation with feeling

    OpenAIRE

    Quartz, Steven R.

    2009-01-01

    Many models of judgment and decision-making posit distinct cognitive and emotional contributions to decision-making under uncertainty. Cognitive processes typically involve exact computations according to a cost-benefit calculus, whereas emotional processes typically involve approximate, heuristic processes that deliver rapid evaluations without mental effort. However, it remains largely unknown what specific parameters of uncertain decision the brain encodes, the extent to which these parame...

  6. Applications of soft computing in time series forecasting simulation and modeling techniques

    CERN Document Server

    Singh, Pritpal

    2016-01-01

    This book reports on an in-depth study of fuzzy time series (FTS) modeling. It reviews and summarizes previous research work in FTS modeling and also provides a brief introduction to other soft-computing techniques, such as artificial neural networks (ANNs), rough sets (RS) and evolutionary computing (EC), focusing on how these techniques can be integrated into different phases of the FTS modeling approach. In particular, the book describes novel methods resulting from the hybridization of FTS modeling approaches with neural networks and particle swarm optimization. It also demonstrates how a new ANN-based model can be successfully applied in the context of predicting Indian summer monsoon rainfall. Thanks to its easy-to-read style and the clear explanations of the models, the book can be used as a concise yet comprehensive reference guide to fuzzy time series modeling, and will be valuable not only for graduate students, but also for researchers and professionals working for academic, business and governmen...

  7. Tracking cognitive phases in analogical reasoning with event-related potentials.

    Science.gov (United States)

    Maguire, Mandy J; McClelland, M Michelle; Donovan, Colin M; Tillman, Gail D; Krawczyk, Daniel C

    2012-03-01

    Analogical reasoning consists of multiple phases. Four-term analogies (A:B::C:D) have an encoding period in which the A:B pair is evaluated prior to a mapping phase. The electrophysiological timing associated with analogical reasoning has remained unclear. We used event-related potentials to identify neural timing related to analogical reasoning relative to perceptual and semantic control conditions. Spatiotemporal principal-components analyses revealed differences primarily in left frontal electrodes during encoding and mapping phases of analogies relative to the other conditions. The timing of the activity differed depending upon the phase of the problem. During the encoding of A:B terms, analogies elicited a positive deflection compared to the control conditions between 400 and 1,200 ms, but for the mapping phase analogical processing elicited a negative deflection that occurred earlier and for a shorter time period, between 350 and 625 ms. These results provide neural and behavioral evidence that 4-term analogy problems involve a highly active evaluation phase of the A:B pair.

  8. Real-time computer treatment of THz passive device images with the high image quality

    Science.gov (United States)

    Trofimov, Vyacheslav A.; Trofimov, Vladislav V.

    2012-06-01

    We demonstrate real-time computer code that significantly improves the quality of images captured by a passive THz imaging system. The code is not designed only for passive THz devices: it can be applied to any such device, as well as to active THz imaging systems. We applied our code to computer processing of images captured by four passive THz imaging devices manufactured by different companies. It should be stressed that computer processing of images produced by different companies usually requires different spatial filters. The performance of the current version of the computer code is greater than one image per second for a THz image with more than 5000 pixels and 24-bit number representation. Processing of a single THz image produces about 20 images simultaneously, corresponding to the various spatial filters. The computer code allows the number of pixels of processed images to be increased without noticeable reduction of image quality, and its performance can be increased many times by using parallel algorithms for processing the image. We develop original spatial filters which allow one to see objects with sizes less than 2 cm. The imagery is produced by passive THz imaging devices which captured images of objects hidden under opaque clothes. For images with high noise, we develop an approach that suppresses the noise during computer processing and yields a good-quality image. To illustrate the efficiency of the developed approach, we demonstrate the detection of liquid explosive, ordinary explosive, a knife, a pistol, a metal plate, a CD, ceramics, chocolate and other objects hidden under opaque clothes. The results demonstrate the high efficiency of our approach for the detection of hidden objects and are a very promising solution for the security problem.
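
    The paper's spatial filters are its own; as a generic illustration of the kind of per-frame pipeline involved, the sketch below denoises a synthetic frame with a median filter and then stretches its contrast. Nothing here reproduces the authors' filters.

```python
import numpy as np

# Generic denoise-and-enhance pipeline on a synthetic frame (illustrative only).

def median_filter(img, k=3):
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.median(padded[i:i + k, j:j + k])
    return out

def contrast_stretch(img):
    lo, hi = img.min(), img.max()
    return (img - lo) / (hi - lo + 1e-12)

frame = np.random.rand(64, 64) + 0.5 * (np.arange(64) / 64.0)  # noise + gradient
print(contrast_stretch(median_filter(frame)).shape)  # (64, 64)
```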

  9. An Educational Approach to Computationally Modeling Dynamical Systems

    Science.gov (United States)

    Chodroff, Leah; O'Neal, Tim M.; Long, David A.; Hemkin, Sheryl

    2009-01-01

    Chemists have used computational science methodologies for a number of decades and their utility continues to be unabated. For this reason we developed an advanced lab in computational chemistry in which students gain understanding of general strengths and weaknesses of computation-based chemistry by working through a specific research problem.…

  10. Philosophy of computing and information technology

    OpenAIRE

    Brey, Philip A.E.; Soraker, Johnny; Meijers, A.

    2009-01-01

    Philosophy has been described as having taken a “computational turn,” referring to the ways in which computers and information technology throw new light upon traditional philosophical issues, provide new tools and concepts for philosophical reasoning, and pose theoretical and practical questions that cannot readily be approached within traditional philosophical frameworks. As such, computer technology is arguably the technology that has had the most profound impact on philosophy. Philosopher...

  11. Scheduling Method of Data-Intensive Applications in Cloud Computing Environments

    Directory of Open Access Journals (Sweden)

    Xiong Fu

    2015-01-01

    The virtualization of cloud computing improves the utilization of resources and energy, and a cloud user can deploy his/her own applications and related data on a pay-as-you-go basis. The communications between an application and a data storage node, as well as within the application, have a great impact on the execution efficiency of the application. The locations of the subtasks of an application, and of the data transferred between those subtasks, are the main reason why communication delay exists, and this delay can affect the completion time of the application. In this paper, we take into account the data transmission time and the communications between subtasks and propose a heuristic optimal virtual machine (VM) placement algorithm. Related simulations demonstrate that this algorithm can reduce the completion time of user tasks and ensure the feasibility and effectiveness of the overall network performance of applications running in a cloud computing environment.
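
    The flavour of such a placement heuristic can be sketched as a greedy loop. This is an illustration under assumed inputs (a traffic matrix, per-link unit costs, a designated data host), not the paper's algorithm.

```python
# Greedy VM placement: put each subtask's VM on the host minimising estimated
# communication cost to already-placed subtasks and to the data node.

def place_vms(subtasks, hosts, traffic, data_host, link_cost):
    """traffic[(a, b)]: volume between subtasks; link_cost[(h1, h2)]: cost/unit."""
    placement = {}
    for task in subtasks:
        def cost(host):
            c = traffic.get((task, "data"), 0) * link_cost[(host, data_host)]
            for other, h in placement.items():
                v = traffic.get((task, other), 0) + traffic.get((other, task), 0)
                c += v * link_cost[(host, h)]
            return c
        placement[task] = min(hosts, key=cost)
    return placement

hosts = ["h1", "h2"]
link = {("h1", "h1"): 0, ("h2", "h2"): 0, ("h1", "h2"): 1,
        ("h2", "h1"): 1, ("h1", "h3"): 2, ("h2", "h3"): 1}
print(place_vms(["t1", "t2"], hosts,
                {("t1", "t2"): 5, ("t1", "data"): 1}, "h3", link))
```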

  12. Non-Monotonic Spatial Reasoning with Answer Set Programming Modulo Theories

    OpenAIRE

    Wałęga, Przemysław Andrzej; Schultz, Carl; Bhatt, Mehul

    2016-01-01

    The systematic modelling of dynamic spatial systems is a key requirement in a wide range of application areas such as commonsense cognitive robotics, computer-aided architecture design, and dynamic geographic information systems. We present ASPMT(QS), a novel approach and fully-implemented prototype for non-monotonic spatial reasoning -a crucial requirement within dynamic spatial systems- based on Answer Set Programming Modulo Theories (ASPMT). ASPMT(QS) consists of a (qualitative) spatial re...

  13. An assessment of the real-time application capabilities of the SIFT computer system

    Science.gov (United States)

    Butler, R. W.

    1982-01-01

    The real-time capabilities of the SIFT computer system, a highly reliable multicomputer architecture developed to support the flight controls of a relaxed static stability aircraft, are discussed. The SIFT computer system was designed to meet extremely high reliability requirements and to facilitate a formal proof of its correctness. Although SIFT represents a significant achievement in fault-tolerant system research, it presents an unusual and restrictive interface to its users. The characteristics of the user interface and its impact on application system design are assessed.

  14. Computation of the Short-Time Linear Canonical Transform with Dual Window

    Directory of Open Access Journals (Sweden)

    Lei Huang

    2017-01-01

    The short-time linear canonical transform (STLCT), which maps the time-domain signal into the joint time-frequency domain, has recently attracted some attention in the area of signal processing. However, its applications are still limited by the fact that the selection of the coefficients of the short-time linear canonical series (STLCS) is not unique, because the time and frequency elementary functions (together known as the basis functions of the STLCS) do not constitute an orthogonal basis. To solve this problem, this paper investigates a dual-window solution. First, the non-orthogonality problem suffered by the original window is resolved by an orthogonality condition with a dual window. Then, based on the obtained condition, a dual-window computation approach for the GT is extended to the STLCS. In addition, simulations verify the validity of the proposed condition and solutions, and some possible directions for application are discussed.
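
    For readers unfamiliar with the transform, one common convention for the underlying linear canonical transform is given below; parameterizations vary across the literature, and this is not necessarily the convention used by the authors. Here A = (a, b; c, d) with ad - bc = 1 and b ≠ 0, and g is the analysis window.

```latex
% One common convention; normalizations differ across papers.
\mathcal{L}_A f(u) = \frac{1}{\sqrt{i\,2\pi b}}
  \int_{-\infty}^{\infty} f(t)\,
  e^{\frac{i}{2b}\left(a t^{2} - 2tu + d u^{2}\right)}\,dt,
\qquad
\mathrm{STLCT}_f(\tau,u) = \mathcal{L}_A\!\left[f(t)\,g^{*}(t-\tau)\right](u).
```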

  15. Manual cross check of computed dose times for motorised wedged fields

    International Nuclear Information System (INIS)

    Porte, J.

    2001-01-01

    If a mass of tissue equivalent material is exposed in turn to wedged and open radiation fields of the same size, for equal times, it is incorrect to assume that the resultant isodose pattern will be effectively that of a wedge having half the angle of the wedged field. Computer programs have been written to address the problem of creating an intermediate wedge field, commonly known as a motorized wedge. The total exposure time is apportioned between the open and wedged fields, to produce a beam modification equivalent to that of a wedged field of a given wedge angle. (author)
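
    For a rough manual cross check, a first-order approximation that is sometimes used treats the effective wedge angle as tan(theta_eff) ≈ w · tan(theta_wedge), where w is the fraction of dose delivered through the wedged field. The sketch below splits a total exposure time on that assumption; it ignores measured wedge (transmission) factors, which a real cross check must include, so treat it as illustrative only.

```python
import math

# Apportion exposure time between open and wedged fields for a motorized
# wedge, under the hedged approximation tan(theta_eff) ~ w * tan(theta_wedge).

def time_split(total_time, theta_eff_deg, theta_wedge_deg=60.0):
    w = (math.tan(math.radians(theta_eff_deg))
         / math.tan(math.radians(theta_wedge_deg)))
    if not 0.0 <= w <= 1.0:
        raise ValueError("requested angle not reachable with this wedge")
    return (1.0 - w) * total_time, w * total_time  # (open, wedged)

open_t, wedged_t = time_split(total_time=60.0, theta_eff_deg=30.0)
print(round(open_t, 1), round(wedged_t, 1))  # 40.0 s open, 20.0 s wedged
```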

  16. Historical reasoning: towards a framework for analyzing students' reasoning about the past

    NARCIS (Netherlands)

    van Drie, J.; van Boxtel, C.

    2008-01-01

    This article explores historical reasoning, an important activity in history learning. Based upon an extensive review of empirical literature on students’ thinking and reasoning about history, a theoretical framework of historical reasoning is proposed. The framework consists of six components:

  17. Adaptation of Johnson sequencing algorithm for job scheduling to minimise the average waiting time in cloud computing environment

    Directory of Open Access Journals (Sweden)

    Souvik Pal

    2016-09-01

    Cloud computing is an emerging paradigm of Internet-centric business computing where Cloud Service Providers (CSPs) provide services to customers according to their needs. The key idea behind cloud computing is on-demand sharing of the resources available in the resource pool provided by the CSP, which implies a new, emerging business model. Resources are provisioned when jobs arrive. Job scheduling and the minimization of waiting time are challenging issues in cloud computing: when a large number of jobs are requested, they have to wait to be allocated to servers, which in turn may increase the queue length and the waiting time. This paper presents a system design built around the Johnson scheduling algorithm, which provides the optimal sequence; with that sequence, service times can be obtained. The waiting time and queue length can then be reduced using a multi-server, finite-capacity queuing model, which improves the job scheduling model.
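
    Johnson's rule itself is short enough to state as code. The sketch below is a minimal textbook implementation for a two-machine flow shop (job names and times are invented), not the paper's full cloud model.

```python
# Johnson's rule for a two-machine flow shop: jobs whose first-stage time is
# shorter go to the front (in increasing order of first-stage time); the rest
# go to the back (in decreasing order of second-stage time).

def johnson_sequence(jobs):
    """jobs: {name: (time_on_m1, time_on_m2)} -> makespan-optimal job order."""
    front = sorted((j for j, (t1, t2) in jobs.items() if t1 <= t2),
                   key=lambda j: jobs[j][0])
    back = sorted((j for j, (t1, t2) in jobs.items() if t1 > t2),
                  key=lambda j: jobs[j][1], reverse=True)
    return front + back

jobs = {"J1": (3, 6), "J2": (5, 2), "J3": (1, 2), "J4": (6, 6), "J5": (7, 5)}
print(johnson_sequence(jobs))  # ['J3', 'J1', 'J4', 'J5', 'J2']
```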

  18. Farmers’ reasons for deregistering from organic farming

    DEFF Research Database (Denmark)

    Koesling, Matthias; Løes, Anne-Kristin; Flaten, Ola

    2012-01-01

    Every year since 2002, 150 to 200 farmers in Norway have deregistered from certified organic production. The aim of this study was to get behind these figures and improve our understanding of the reasoning leading to decisions to opt out. Four cases of deregistered organic farmers with grain, sheep, dairy or vegetable production were selected for in-depth studies. The cases were analysed from the perspective of individual competencies and the competencies available in the networks of the selected organic farmers. Besides the conspicuous reasons to opt out of certified organic farming, such as regulations getting stricter over time and low income, personal reasons such as disappointment and need for acceptance were also important. This shows that hard mechanisms, such as economic support and premium prices, are not sufficient to motivate farmers for sustained organic management. Support…

  19. Reservoir computer predictions for the Three Meter magnetic field time evolution

    Science.gov (United States)

    Perevalov, A.; Rojas, R.; Lathrop, D. P.; Shani, I.; Hunt, B. R.

    2017-12-01

    The source of the Earth's magnetic field is the turbulent flow of liquid metal in the outer core. Our experiment's goal is to create an Earth-like dynamo, to explore the mechanisms and to understand the dynamics of the magnetic and velocity fields. Since it is a complicated system, prediction of the magnetic field is a challenging problem. We present results of mimicking the Three Meter experiment with a reservoir computer deep learning algorithm. The experiment consists of a three-meter diameter outer sphere and a one-meter diameter inner sphere, with the gap filled with liquid sodium. The spheres can rotate at up to 4 and 14 Hz respectively, giving a Reynolds number near 10^8. Two external electromagnets apply magnetic fields, while an array of 31 external and 2 internal Hall sensors measure the resulting induced fields. We use this magnetic probe data to train a reservoir computer to predict the 3M time evolution and mimic waves in the experiment. Surprisingly accurate predictions can be made for several magnetic dipole time scales, showing that the behavior of such a complicated MHD system can be predicted. We gratefully acknowledge support from NSF EAR-1417148.
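
    A reservoir computer of the echo state network type can be sketched in a few lines. The following is an illustration only, with toy sizes and synthetic data standing in for the Hall-sensor time series; it is not the group's trained model.

```python
import numpy as np

# Echo state network sketch: a fixed random recurrent reservoir driven by the
# input; only the linear readout is trained (ridge regression). Toy sizes.

rng = np.random.default_rng(0)
n_in, n_res = 3, 200
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.abs(np.linalg.eigvals(W)).max()  # spectral radius < 1

def run_reservoir(inputs):
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ u)
        states.append(x.copy())
    return np.array(states)

U = rng.standard_normal((500, n_in))   # stand-in for sensor time series
Y = np.roll(U, -1, axis=0)             # target: one-step-ahead prediction
X = run_reservoir(U)
W_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n_res), X.T @ Y)  # ridge
print(((X @ W_out - Y) ** 2).mean())   # training error of the readout
```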

  20. A stable computational scheme for stiff time-dependent constitutive equations

    International Nuclear Information System (INIS)

    Shih, C.F.; Delorenzi, H.G.; Miller, A.K.

    1977-01-01

    Viscoplasticity and creep-type constitutive equations are increasingly being employed in finite element codes for evaluating the deformation of high-temperature structural members. These constitutive equations frequently exhibit stiff regimes, which make an analytical assessment of the structure very costly. A computational scheme for handling deformation in stiff regimes is proposed in this paper. Through the finite element discretization, the governing partial differential equations in the spatial (x) and time (t) variables are reduced to a system of nonlinear ordinary differential equations in the independent variable t. The constitutive equations are expanded in a Taylor series about selected values of t. The resulting system of differential equations is then integrated by an implicit scheme which employs a predictor technique to initiate the Newton-Raphson procedure. To examine the stability and accuracy of the computational scheme, a series of calculations was carried out for uniaxial specimens and thick-wall tubes subjected to mechanical and thermal loading. (Auth.)
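
    The overall recipe (implicit integration, with an explicit predictor feeding a Newton-Raphson corrector) can be illustrated on a scalar stand-in. The sketch below uses backward Euler on the stiff test equation y' = -k(y - cos t); it is not the paper's constitutive model.

```python
import math

# Backward Euler with an explicit predictor and Newton-Raphson corrector,
# applied to a stiff scalar test equation (stand-in for a constitutive law).

def f(t, y, k=1e4):
    return -k * (y - math.cos(t))

def dfdy(t, y, k=1e4):
    return -k

def backward_euler(y0, t0, t1, n):
    h, t, y = (t1 - t0) / n, t0, y0
    for _ in range(n):
        t_new = t + h
        y_new = y + h * f(t, y)              # explicit predictor
        for _ in range(20):                  # Newton-Raphson corrector
            g = y_new - y - h * f(t_new, y_new)
            y_new -= g / (1.0 - h * dfdy(t_new, y_new))
            if abs(g) < 1e-12:
                break
        t, y = t_new, y_new
    return y

print(backward_euler(y0=2.0, t0=0.0, t1=1.0, n=50))  # ~cos(1), despite stiffness
```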

  1. Decreasing Transition Times in Elementary School Classrooms: Using Computer-Assisted Instruction to Automate Intervention Components

    Science.gov (United States)

    Hine, Jeffrey F.; Ardoin, Scott P.; Foster, Tori E.

    2015-01-01

    Research suggests that students spend a substantial amount of time transitioning between classroom activities, which may reduce time spent academically engaged. This study used an ABAB design to evaluate the effects of a computer-assisted intervention that automated intervention components previously shown to decrease transition times. We examined…

  2. The first accident simulation of Angra-1 power plant using the ALMOD computer code

    International Nuclear Information System (INIS)

    Camargo, C.T.M.

    1981-01-01

    The implementation of the German computer code ALMOD and its application to the calculation of Angra-1, a nuclear power plant different from the KWU power plants, demanded study and adaptation of models, and for economic reasons simplifications and optimizations were necessary. The first results define the analytical potential of the computer code, confirm the adequacy of the adaptations made and provide relevant conclusions for the Angra-1 safety analysis, showing at the same time areas in which the model can be applied or further improved. (E.G.) [pt

  3. Metacognition and reasoning

    Science.gov (United States)

    Fletcher, Logan; Carruthers, Peter

    2012-01-01

    This article considers the cognitive architecture of human meta-reasoning: that is, metacognition concerning one's own reasoning and decision-making. The view we defend is that meta-reasoning is a cobbled-together skill comprising diverse self-management strategies acquired through individual and cultural learning. These approximate the monitoring-and-control functions of a postulated adaptive system for metacognition by recruiting mechanisms that were designed for quite other purposes. PMID:22492753

  4. A Systematic Review of Predictors of, and Reasons for, Adherence to Online Psychological Interventions.

    Science.gov (United States)

    Beatty, Lisa; Binnion, Claire

    2016-12-01

    A key issue regarding the provision of psychological therapy in a self-guided online format is low rates of adherence. The aim of this systematic review was to assess both quantitative and qualitative data on the predictors of adherence, as well as participant reported reasons for adhering or not adhering to online psychological interventions. Database searches of PsycINFO, Medline, and CINAHL identified 1721 potentially relevant articles published between 1 January 2000 and 25 November 2015. A further 34 potentially relevant articles were retrieved from reference lists. Articles that reported predictors of, or reasons for, adherence to an online psychological intervention were included. A total of 36 studies met the inclusion criteria. Predictors assessed included demographic, psychological, characteristics of presenting problem, and intervention/computer-related predictors. Evidence suggested that female gender, higher treatment expectancy, sufficient time, and personalized intervention content each predicted higher adherence. Age, baseline symptom severity, and control group allocation had mixed findings. The majority of assessed variables however, did not predict adherence. Few clear predictors of adherence emerged overall, and most results were either mixed or too preliminary to draw conclusions. More research of predictors associated with adherence to online interventions is warranted.

  5. Logical reasoning versus information processing in the dual-strategy model of reasoning.

    Science.gov (United States)

    Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc

    2017-01-01

    One of the major debates concerning the nature of inferential reasoning is between counterexample-based strategies, such as mental model theory, and the statistical strategies underlying probabilistic models. The dual-strategy model proposed by Verschueren, Schaeken, & d'Ydewalle (2005a, 2005b), which suggests that people might have access to both kinds of strategy, has been supported by several recent studies. These have shown that statistical reasoners make inferences by using information about premises to generate a likelihood estimate of conclusion probability. However, while results concerning counterexample reasoners are consistent with a counterexample detection model, those results could equally be interpreted as indicating a greater sensitivity to logical form. In order to distinguish these two interpretations, in Studies 1 and 2 we presented reasoners with Modus ponens (MP) inferences with statistical information about premise strength, and in Studies 3 and 4, naturalistic MP inferences with premises having many disabling conditions. Statistical reasoners accepted the MP inference more often than counterexample reasoners in Studies 1 and 2, while the opposite pattern was observed in Studies 3 and 4. The results show that these strategies must be defined in terms of information processing, with no clear relation to "logical" reasoning. These results have additional implications for the underlying debate about the nature of human reasoning.

  6. Probabilistic reasoning under time pressure: an assessment in Italian, Spanish and English psychology undergraduates

    Science.gov (United States)

    Agus, M.; Hitchcott, P. K.; Penna, M. P.; Peró-Cebollero, M.; Guàrdia-Olmos, J.

    2016-11-01

    Many studies have investigated the features of probabilistic reasoning developed in relation to different formats of problem presentation, showing that it is affected by various individual and contextual factors. Incomplete understanding of the identity and role of these factors may explain the inconsistent evidence concerning the effect of problem presentation format. Thus, superior performance has sometimes been observed for graphically, rather than verbally, presented problems. The present study was undertaken to address this issue. Psychology undergraduates without any statistical expertise (N = 173 in Italy; N = 118 in Spain; N = 55 in England) were administered statistical problems in two formats (verbal-numerical and graphical-pictorial) under a condition of time pressure. Students also completed additional measures indexing several potentially relevant individual dimensions (statistical ability, statistical anxiety, attitudes towards statistics and confidence). Interestingly, a facilitatory effect of graphical presentation was observed in the Italian and Spanish samples but not in the English one. Significantly, the individual dimensions predicting statistical performance also differed between the samples, highlighting a different role of confidence. Hence, these findings confirm previous observations concerning problem presentation format while simultaneously highlighting the importance of individual dimensions.

  7. Knowing, Applying, and Reasoning about Arithmetic: Roles of Domain-General and Numerical Skills in Multiple Domains of Arithmetic Learning

    Science.gov (United States)

    Zhang, Xiao; Räsänen, Pekka; Koponen, Tuire; Aunola, Kaisa; Lerkkanen, Marja-Kristiina; Nurmi, Jari-Erik

    2017-01-01

    The longitudinal relations of domain-general and numerical skills at ages 6-7 years to 3 cognitive domains of arithmetic learning, namely knowing (written computation), applying (arithmetic word problems), and reasoning (arithmetic reasoning) at age 11, were examined for a representative sample of 378 Finnish children. The results showed that…

  8. SPECT: Theoretical aspects and evolution of emission computed axial tomography

    International Nuclear Information System (INIS)

    Brunol, J.; Nuta, V.

    1981-01-01

    We have detailed certain elements of 3-D image reconstruction from axial projections. Two aspects specific to nuclear medicine have been analysed, namely self-absorption and statistics. In our view, the development of ECAT in the months to come must hence proceed in two essential directions. The first is application to dynamic cardiac imagery (multigated). Results of this type have been obtained over 8 months in the Radioisotope Service of Cochin Hospital in Paris. It must be stressed that the number of images to be processed then becomes considerable (multiplication by the gating factor yielding more than 100 images), the more so as the statistics are reduced owing to the temporal separation. Obtaining good image quality requires sophisticated quadri-dimensional processing. It follows that the computing times, with all the mini-computers available in nuclear medicine, then become much too great to envisage real application in hospital routine (several hours of computing). This is the reason why we connected an array processor to the IMAC system. This very powerful system (several tens of times the power of a mini-computer) will reduce the time of such computing to less than 10 minutes. The second direction is that new elements can be introduced into the reconstruction algorithm (the static case, as opposed to the foregoing one); these important elements of improvement come at the cost of space and hence of computing time. Here again, the use of an array processor appears indispensable. It should be recalled that ECAT is today a currently used method; the theoretical analyses it has necessitated have opened the way to new effective methods of 'slanted hole' tomography. (orig.) [de

  9. Real-time dynamics of lattice gauge theories with a few-qubit quantum computer

    Science.gov (United States)

    Martinez, Esteban A.; Muschik, Christine A.; Schindler, Philipp; Nigg, Daniel; Erhard, Alexander; Heyl, Markus; Hauke, Philipp; Dalmonte, Marcello; Monz, Thomas; Zoller, Peter; Blatt, Rainer

    2016-06-01

    Gauge theories are fundamental to our understanding of interactions between the elementary constituents of matter as mediated by gauge bosons. However, computing the real-time dynamics in gauge theories is a notorious challenge for classical computational methods. This has recently stimulated theoretical effort, using Feynman’s idea of a quantum simulator, to devise schemes for simulating such theories on engineered quantum-mechanical devices, with the difficulty that gauge invariance and the associated local conservation laws (Gauss laws) need to be implemented. Here we report the experimental demonstration of a digital quantum simulation of a lattice gauge theory, by realizing (1 + 1)-dimensional quantum electrodynamics (the Schwinger model) on a few-qubit trapped-ion quantum computer. We are interested in the real-time evolution of the Schwinger mechanism, describing the instability of the bare vacuum due to quantum fluctuations, which manifests itself in the spontaneous creation of electron-positron pairs. To make efficient use of our quantum resources, we map the original problem to a spin model by eliminating the gauge fields in favour of exotic long-range interactions, which can be directly and efficiently implemented on an ion trap architecture. We explore the Schwinger mechanism of particle-antiparticle generation by monitoring the mass production and the vacuum persistence amplitude. Moreover, we track the real-time evolution of entanglement in the system, which illustrates how particle creation and entanglement generation are directly related. Our work represents a first step towards quantum simulation of high-energy theories using atomic physics experiments—the long-term intention is to extend this approach to real-time quantum simulations of non-Abelian lattice gauge theories.

  10. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Directory of Open Access Journals (Sweden)

    Yeqing Zhang

    2018-02-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully.

  11. Low Computational Signal Acquisition for GNSS Receivers Using a Resampling Strategy and Variable Circular Correlation Time

    Science.gov (United States)

    Zhang, Yeqing; Wang, Meiling; Li, Yafeng

    2018-01-01

    For the objective of essentially decreasing computational complexity and time consumption of signal acquisition, this paper explores a resampling strategy and variable circular correlation time strategy specific to broadband multi-frequency GNSS receivers. In broadband GNSS receivers, the resampling strategy is established to work on conventional acquisition algorithms by resampling the main lobe of received broadband signals with a much lower frequency. Variable circular correlation time is designed to adapt to different signal strength conditions and thereby increase the operation flexibility of GNSS signal acquisition. The acquisition threshold is defined as the ratio of the highest and second highest correlation results in the search space of carrier frequency and code phase. Moreover, computational complexity of signal acquisition is formulated by amounts of multiplication and summation operations in the acquisition process. Comparative experiments and performance analysis are conducted on four sets of real GPS L2C signals with different sampling frequencies. The results indicate that the resampling strategy can effectively decrease computation and time cost by nearly 90–94% with just slight loss of acquisition sensitivity. With circular correlation time varying from 10 ms to 20 ms, the time cost of signal acquisition has increased by about 2.7–5.6% per millisecond, with most satellites acquired successfully. PMID:29495301
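
    The circular correlation at the heart of such acquisition schemes is classically computed with FFTs, searching all code phases at once; the peak-to-second-peak ratio then serves as the detection metric, as in the threshold definition above. The sketch below is generic and uses a synthetic stand-in for the PRN code and received signal, with no Doppler search.

```python
import numpy as np

# FFT-based circular code-phase search with a peak-ratio detection metric.
# Code, noise level and true phase are synthetic stand-ins.

rng = np.random.default_rng(1)
code = rng.choice([-1.0, 1.0], size=1023)           # stand-in PRN code
true_phase = 317
signal = np.roll(code, true_phase) + 0.5 * rng.standard_normal(1023)

corr = np.abs(np.fft.ifft(np.fft.fft(signal) * np.conj(np.fft.fft(code))))
peak = int(np.argmax(corr))
ratio = corr[peak] / np.max(np.delete(corr, peak))  # acquisition metric

print(peak, round(float(ratio), 2))                  # 317, ratio well above 1
```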

  12. Mask CD relationship to temperature at the time backscatter is received

    Science.gov (United States)

    Zable, Harold; Kronmiller, Tom; Pearman, Ryan; Guthrie, Bill; Shirali, Nagesh; Masuda, Yukihiro; Kamikubo, Takashi; Nakayamada, Noriaki; Fujimura, Aki

    2017-07-01

    Mask writers need to be able to write sub-50nm features accurately. Nano-imprint lithography (NIL) masters need to create sub-20nm line and space (L:S) patterns reliably. Increasingly slower resists are deployed, but mask write times need to remain reasonable. The leading-edge EBM-9500 offers 1200 A/cm2 current density to shoot variable shaped beam (VSB) to write the masks. Last year, thermal effect correction (TEC) was introduced by NuFlare in the EBM-9500. It is a GPU-accelerated inline correction for the effect that the temperature of the resist has on CD. For example, a 100nm CD may print at 102nm where that area was at a comparably high temperature at the time of the shot. Since the thermal effect is temporal, the simulated temperature of the mask surface is dynamically updated for each shot in order to accurately predict the temperature at the location of the shot at the time of the shot, and therefore its impact on CD. The shot dose is changed to reverse the effects of the temperature change. This paper for the first time reveals an enhancement to this thermal model and a simulator for it. It turns out that the temperature at the time each location receives backscatter from other shots also makes a difference to the CD. The effect is secondary, but still measurable for some resists and substrates. Results of a test-chip study will be presented. The computation required for the backscatter effect is substantial; it has been demonstrated that this calculation can be performed fast enough to be inline with the EBM-9500 with a reasonably sized computing platform. Run-time results and the computing architecture will be presented.

  13. Differences in autonomic physiological responses between good and poor inductive reasoners.

    Science.gov (United States)

    Melis, C; van Boxtel, A

    2001-11-01

    We investigated individual- and task-related differences in autonomic physiological responses induced by time limited figural and verbal inductive reasoning tasks. In a group of 52 participants, the percentage of correctly responded task items was evaluated together with nine different autonomic physiological response measures and respiration rate (RR). Weighted multidimensional scaling analyses of the physiological responses revealed three underlying dimensions, primarily characterized by RR, parasympathetic, and sympathetic activity. RR and sympathetic activity appeared to be relatively more important response dimensions for poor reasoners, whereas parasympathetic responsivity was relatively more important for good reasoners. These results suggest that poor reasoners showed higher levels of cognitive processing intensity than good reasoners. Furthermore, for the good reasoners, the dimension of sympathetic activity was relatively more important during the figural than during the verbal reasoning task, which was explained in terms of hemispheric lateralization in autonomic function.

  14. Artificial neuron operations and spike-timing-dependent plasticity using memristive devices for brain-inspired computing

    Science.gov (United States)

    Marukame, Takao; Nishi, Yoshifumi; Yasuda, Shin-ichi; Tanamoto, Tetsufumi

    2018-04-01

    The use of memristive devices for creating artificial neurons is promising for brain-inspired computing from the viewpoints of computation architecture and learning protocol. We present an energy-efficient multiplier accumulator based on a memristive array architecture incorporating both analog and digital circuitries. The analog circuitry is used to full advantage for neural networks, as demonstrated by the spike-timing-dependent plasticity (STDP) in fabricated AlOx/TiOx-based metal-oxide memristive devices. STDP protocols for controlling periodic analog resistance with long-range stability were experimentally verified using a variety of voltage amplitudes and spike timings.
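
    The pair-based STDP rule such devices emulate is compactly expressed as code. Below is the standard textbook form with illustrative constants (the amplitudes and time constant are not taken from the paper).

```python
import math

# Pair-based STDP: pre-before-post spike pairs potentiate, post-before-pre
# depress, with exponentially decaying magnitude. Constants are illustrative.

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Weight change for spike-time difference dt = t_post - t_pre."""
    if dt_ms > 0:                                  # pre precedes post: LTP
        return a_plus * math.exp(-dt_ms / tau_ms)
    return -a_minus * math.exp(dt_ms / tau_ms)     # post precedes pre: LTD

for dt in (2.0, 10.0, -2.0, -10.0):
    print(dt, round(stdp_dw(dt), 5))
```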

  15. Real time recording system of radioisotopes by local area network (LAN) computer system and user input processing

    International Nuclear Information System (INIS)

    Shinohara, Kunio; Ito, Atsushi; Kawaguchi, Hajime; Yanase, Makoto; Uno, Kiyoshi.

    1991-01-01

    A computer-assisted real-time recording system was developed for the management of radioisotopes. The system is composed of two personal computers forming a LAN, an identification-card (ID-card) reader, and an electrically operated door lock. One computer is operated by the radiation safety staff and stores the records of radioisotopes; the users of radioisotopes are registered on this computer. The other computer is installed in front of the storage room for radioisotopes. This computer is made ready for operation by a registered ID-card and accepts data input by the user. After the completion of data input, the door to the storage room is unlocked. The present system offers the following merits: radiation safety staff can easily keep up with the present state of radioisotopes in the storage room and save much labor; radioactivity values are always corrected for decay; the upper limit of radioactivity in use per day is automatically checked, and users are restricted accordingly when they input the amounts to be used; and users can obtain storage records of radioisotopes at any time. In addition, the system is applicable to facilities which have more than two storage rooms. (author)
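
    As a hedged guess at what the decay correction involves, the standard formula suffices: activity decays as A(t) = A0 · exp(-ln2 · t / T1/2). The sketch below is illustrative only; half-lives would come from the system's nuclide table.

```python
import math

# Standard radioactive decay correction of a stored activity to the current
# date. Values are illustrative.

def decay_corrected_activity(a0_mbq, half_life_days, elapsed_days):
    return a0_mbq * math.exp(-math.log(2) * elapsed_days / half_life_days)

# 10 MBq of P-32 (half-life ~14.3 d) registered 30 days ago:
print(round(decay_corrected_activity(10.0, 14.3, 30.0), 2))  # ~2.34 MBq
```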

  16. Quantification of Artifact Reduction With Real-Time Cine Four-Dimensional Computed Tomography Acquisition Methods

    International Nuclear Information System (INIS)

    Langner, Ulrich W.; Keall, Paul J.

    2010-01-01

    Purpose: To quantify the magnitude and frequency of artifacts in simulated four-dimensional computed tomography (4D CT) images using three real-time acquisition methods (direction-dependent displacement acquisition, simultaneous displacement and phase acquisition, and simultaneous displacement and velocity acquisition) and to compare these methods with commonly used retrospective phase sorting. Methods and Materials: Image acquisition for the four 4D CT methods was simulated with different displacement and velocity tolerances for spheres with radii of 0.5 cm, 1.5 cm, and 2.5 cm, using 58 patient-measured tumors and respiratory motion traces. The magnitude and frequency of artifacts, CT doses, and acquisition times were computed for each method. Results: The mean artifact magnitude was 50% smaller for the three real-time methods than for retrospective phase sorting. The dose was ∼50% lower, but the acquisition time was 20% to 100% longer for the real-time methods than for retrospective phase sorting. Conclusions: Real-time acquisition methods can reduce the frequency and magnitude of artifacts in 4D CT images, as well as the imaging dose, but they increase the image acquisition time. The results suggest that direction-dependent displacement acquisition is the preferred real-time 4D CT acquisition method, because on average, the lowest dose is delivered to the patient and the acquisition time is the shortest for the resulting number and magnitude of artifacts.

  17. A distributed agent architecture for real-time knowledge-based systems: Real-time expert systems project, phase 1

    Science.gov (United States)

    Lee, S. Daniel

    1990-01-01

    We propose a distributed agent architecture (DAA) that can support a variety of paradigms based on both traditional real-time computing and artificial intelligence. DAA consists of distributed agents that are classified into two categories: reactive and cognitive. Reactive agents can be implemented directly in Ada to meet hard real-time requirements and be deployed on on-board embedded processors. A traditional real-time computing methodology under consideration is the rate monotonic theory that can guarantee schedulability based on analytical methods. AI techniques under consideration for reactive agents are approximate or anytime reasoning that can be implemented using Bayesian belief networks as in Guardian. Cognitive agents are traditional expert systems that can be implemented in ART-Ada to meet soft real-time requirements. During the initial design of cognitive agents, it is critical to consider the migration path that would allow initial deployment on ground-based workstations with eventual deployment on on-board processors. ART-Ada technology enables this migration while Lisp-based technologies make it difficult if not impossible. In addition to reactive and cognitive agents, a meta-level agent would be needed to coordinate multiple agents and to provide meta-level control.

  18. Quantum computing without wavefunctions: time-dependent density functional theory for universal quantum computation.

    Science.gov (United States)

    Tempel, David G; Aspuru-Guzik, Alán

    2012-01-01

    We prove that the theorems of TDDFT can be extended to a class of qubit Hamiltonians that are universal for quantum computation. The theorems of TDDFT applied to universal Hamiltonians imply that single-qubit expectation values can be used as the basic variables in quantum computation and information theory, rather than wavefunctions. From a practical standpoint this opens the possibility of approximating observables of interest in quantum computations directly in terms of single-qubit quantities (i.e., as density functionals). We also demonstrate that TDDFT provides an exact prescription for simulating universal Hamiltonians with other universal Hamiltonians that have different, and possibly easier-to-realize, two-qubit interactions. This establishes the foundations of TDDFT for quantum computation and opens the possibility of developing density functionals for use in quantum algorithms.

  19. Review of resistance temperature detector time response characteristics. Safety evaluation report

    International Nuclear Information System (INIS)

    1981-08-01

    A Resistance Temperature Detector (RTD) is used extensively for monitoring water temperatures in nuclear reactor plants. The RTD element does not respond instantaneously to changes in water temperature; rather, there is a time delay before the element senses the temperature change, and in nuclear reactors this delay must be factored into the computation of safety setpoints. For this reason it is necessary to have an accurate description of the RTD time response. This report is a review of the current state of the art of describing and measuring this time response.
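
    A common first approximation of such a delayed response is a first-order lag with a single time constant. The report reviews more elaborate descriptions, so the sketch below is only the simplest illustrative model, with an assumed time constant.

```python
import math

def rtd_response(t_s: float, t_initial: float, t_final: float, tau_s: float) -> float:
    """Indicated temperature after a step change in water temperature,
    for a first-order sensor model with time constant tau_s (seconds)."""
    return t_final + (t_initial - t_final) * math.exp(-t_s / tau_s)

tau = 4.0  # assumed time constant in seconds, for illustration only
for t in (0.0, tau, 3 * tau):
    print(f"t = {t:4.1f} s -> {rtd_response(t, 20.0, 80.0, tau):6.2f} °C")
# After one time constant ~63% of the step is sensed; after 3*tau, ~95%.
```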

  20. Contextual object understanding through geospatial analysis and reasoning (COUGAR)

    Science.gov (United States)

    Douglas, Joel; Antone, Matthew; Coggins, James; Rhodes, Bradley J.; Sobel, Erik; Stolle, Frank; Vinciguerra, Lori; Zandipour, Majid; Zhong, Yu

    2009-05-01

    Military operations in urban areas often require detailed knowledge of the location and identity of commonly occurring objects and spatial features. The ability to rapidly acquire and reason over urban scenes is critically important to such tasks as mission and route planning, visibility prediction, communications simulation, target recognition, and inference of higher-level form and function. Under DARPA's Urban Reasoning and Geospatial ExploitatioN Technology (URGENT) Program, the BAE Systems team has developed a system that combines a suite of complementary feature extraction and matching algorithms with higher-level inference and contextual reasoning to detect, segment, and classify urban entities of interest in a fully automated fashion. Our system operates solely on colored 3D point clouds, and considers object categories with a wide range of specificity (fire hydrants, windows, parking lots), scale (street lights, roads, buildings, forests), and shape (compact shapes, extended regions, terrain). As no single method can recognize the diverse set of categories under consideration, we have integrated multiple state-of-the-art technologies that couple hierarchical associative reasoning with robust computer vision and machine learning techniques. Our solution leverages contextual cues and evidence propagation from features to objects to scenes in order to exploit the combined descriptive power of 3D shape, appearance, and learned inter-object spatial relationships. The result is a set of tools designed to significantly enhance the productivity of analysts in exploiting emerging 3D data sources.

  1. Computation and control with neural nets

    Energy Technology Data Exchange (ETDEWEB)

    Corneliusen, A.; Terdal, P.; Knight, T.; Spencer, J.

    1989-10-04

    As energies have increased exponentially with time so have the size and complexity of accelerators and control systems. NN may offer the kinds of improvements in computation and control that are needed to maintain acceptable functionality. For control their associative characteristics could provide signal conversion or data translation. Because they can do any computation such as least squares, they can close feedback loops autonomously to provide intelligent control at the point of action rather than at a central location that requires transfers, conversions, hand-shaking and other costly repetitions like input protection. Both computation and control can be integrated on a single chip, printed circuit or an optical equivalent that is also inherently faster through full parallel operation. For such reasons one expects lower costs and better results. Such systems could be optimized by integrating sensor and signal processing functions. Distributed nets of such hardware could communicate and provide global monitoring and multiprocessing in various ways e.g. via token, slotted or parallel rings (or Steiner trees) for compatibility with existing systems. Problems and advantages of this approach such as an optimal, real-time Turing machine are discussed. Simple examples are simulated and hardware implemented using discrete elements that demonstrate some basic characteristics of learning and parallelism. Future microprocessors' are predicted and requested on this basis. 19 refs., 18 figs.

  2. Computation and control with neural nets

    International Nuclear Information System (INIS)

    Corneliusen, A.; Terdal, P.; Knight, T.; Spencer, J.

    1989-01-01

    As energies have increased exponentially with time so have the size and complexity of accelerators and control systems. NN may offer the kinds of improvements in computation and control that are needed to maintain acceptable functionality. For control their associative characteristics could provide signal conversion or data translation. Because they can do any computation such as least squares, they can close feedback loops autonomously to provide intelligent control at the point of action rather than at a central location that requires transfers, conversions, hand-shaking and other costly repetitions like input protection. Both computation and control can be integrated on a single chip, printed circuit or an optical equivalent that is also inherently faster through full parallel operation. For such reasons one expects lower costs and better results. Such systems could be optimized by integrating sensor and signal processing functions. Distributed nets of such hardware could communicate and provide global monitoring and multiprocessing in various ways e.g. via token, slotted or parallel rings (or Steiner trees) for compatibility with existing systems. Problems and advantages of this approach such as an optimal, real-time Turing machine are discussed. Simple examples are simulated and hardware implemented using discrete elements that demonstrate some basic characteristics of learning and parallelism. Future 'microprocessors' are predicted and requested on this basis. 19 refs., 18 figs

  3. Thermodynamic heuristics with case-based reasoning: combined insights for RNA pseudoknot secondary structure.

    Science.gov (United States)

    Al-Khatib, Ra'ed M; Rashid, Nur'Aini Abdul; Abdullah, Rosni

    2011-08-01

    The secondary structure of RNA pseudoknots has been extensively inferred and scrutinized by computational approaches. Experimental methods for determining RNA structure are time-consuming and tedious; therefore, predictive computational approaches are required. Predicting the most accurate and energy-stable pseudoknot RNA secondary structure has been proven to be an NP-hard problem. In this paper, a new RNA folding approach, termed MSeeker, is presented; it includes KnotSeeker (a heuristic method) and Mfold (a thermodynamic algorithm). The global optimization of this thermodynamic heuristic approach was further enhanced by using a case-based reasoning technique as a local optimization method. MSeeker is a proposed algorithm for predicting RNA pseudoknot structure from individual sequences, especially long ones. This research demonstrates that MSeeker improves the sensitivity and specificity of existing RNA pseudoknot structure predictions. The performance and structural results from this proposed method were evaluated against seven other state-of-the-art pseudoknot prediction methods. The MSeeker method had better sensitivity than the DotKnot, FlexStem, HotKnots, pknotsRG, ILM, NUPACK and pknotsRE methods, with 79% of the predicted pseudoknot base-pairs being correct.
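
    Sensitivity and specificity at the base-pair level are computed by comparing the predicted pairs with a reference structure. A minimal sketch with made-up pairs (not data from the paper):

```python
# How base-pair-level sensitivity is typically computed when comparing a
# predicted secondary structure against a reference; the pairs below are
# invented toy data, not from the paper.
reference = {(1, 20), (2, 19), (3, 18), (5, 15)}   # (i, j) base pairs
predicted = {(1, 20), (2, 19), (4, 16), (5, 15)}

true_positives = len(reference & predicted)
sensitivity = true_positives / len(reference)      # TP / (TP + FN)
ppv = true_positives / len(predicted)              # TP / (TP + FP)
print(f"sensitivity = {sensitivity:.2f}, PPV = {ppv:.2f}")
```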

  4. Development of abstract mathematical reasoning: the case of algebra.

    Science.gov (United States)

    Susac, Ana; Bubic, Andreja; Vrbanc, Andrija; Planinic, Maja

    2014-01-01

    Algebra typically represents the students' first encounter with abstract mathematical reasoning and it therefore causes significant difficulties for students who still reason concretely. The aim of the present study was to investigate the developmental trajectory of the students' ability to solve simple algebraic equations. 311 participants between the ages of 13 and 17 were given a computerized test of equation rearrangement. Equations consisted of an unknown and two other elements (numbers or letters), and the operations of multiplication/division. The obtained results showed that younger participants are less accurate and slower in solving equations with letters (symbols) than those with numbers. This difference disappeared for older participants (16-17 years), suggesting that they had reached an abstract reasoning level, at least for this simple task. A corresponding conclusion arises from the analysis of their strategies which suggests that younger participants mostly used concrete strategies such as inserting numbers, while older participants typically used more abstract, rule-based strategies. These results indicate that the development of algebraic thinking is a process which unfolds over a long period of time. In agreement with previous research, we can conclude that, on average, children at the age of 15-16 transition from using concrete to abstract strategies while solving the algebra problems addressed within the present study. A better understanding of the timing and speed of students' transition from concrete arithmetic reasoning to abstract algebraic reasoning might help in designing better curricula and teaching materials that would ease that transition.

  5. Reasons for Trying E-cigarettes and Risk of Continued Use.

    Science.gov (United States)

    Bold, Krysten W; Kong, Grace; Cavallo, Dana A; Camenga, Deepa R; Krishnan-Sarin, Suchitra

    2016-09-01

    Longitudinal research is needed to identify predictors of continued electronic cigarette (e-cigarette) use among youth. We expected that certain reasons for first trying e-cigarettes would predict continued use over time (eg, good flavors, friends use), whereas other reasons would not predict continued use (eg, curiosity). Longitudinal surveys from middle and high school students from fall 2013 (wave 1) and spring 2014 (wave 2) were used to examine reasons for trying e-cigarettes as predictors of continued e-cigarette use over time. Ever e-cigarette users (n = 340) at wave 1 were categorized into those using or not using e-cigarettes at wave 2. Among those who continued using e-cigarettes, reasons for trying e-cigarettes were examined as predictors of use frequency, measured as the number of days using e-cigarettes in the past 30 days at wave 2. Covariates included age, sex, race, and smoking of traditional cigarettes. Several reasons for first trying e-cigarettes predicted continued use, including low cost, the ability to use e-cigarettes anywhere, and to quit smoking regular cigarettes. Trying e-cigarettes because of low cost also predicted more days of e-cigarette use at wave 2. Being younger or a current smoker of traditional cigarettes also predicted continued use and more frequent use over time. Regulatory strategies such as increasing cost or prohibiting e-cigarette use in certain places may be important for preventing continued use in youth. In addition, interventions targeting current cigarette smokers and younger students may also be needed. Copyright © 2016 by the American Academy of Pediatrics.

  6. Reasons for Trying E-cigarettes and Risk of Continued Use

    Science.gov (United States)

    Kong, Grace; Cavallo, Dana A.; Camenga, Deepa R.; Krishnan-Sarin, Suchitra

    2016-01-01

    BACKGROUND: Longitudinal research is needed to identify predictors of continued electronic cigarette (e-cigarette) use among youth. We expected that certain reasons for first trying e-cigarettes would predict continued use over time (eg, good flavors, friends use), whereas other reasons would not predict continued use (eg, curiosity). METHODS: Longitudinal surveys from middle and high school students from fall 2013 (wave 1) and spring 2014 (wave 2) were used to examine reasons for trying e-cigarettes as predictors of continued e-cigarette use over time. Ever e-cigarette users (n = 340) at wave 1 were categorized into those using or not using e-cigarettes at wave 2. Among those who continued using e-cigarettes, reasons for trying e-cigarettes were examined as predictors of use frequency, measured as the number of days using e-cigarettes in the past 30 days at wave 2. Covariates included age, sex, race, and smoking of traditional cigarettes. RESULTS: Several reasons for first trying e-cigarettes predicted continued use, including low cost, the ability to use e-cigarettes anywhere, and to quit smoking regular cigarettes. Trying e-cigarettes because of low cost also predicted more days of e-cigarette use at wave 2. Being younger or a current smoker of traditional cigarettes also predicted continued use and more frequent use over time. CONCLUSIONS: Regulatory strategies such as increasing cost or prohibiting e-cigarette use in certain places may be important for preventing continued use in youth. In addition, interventions targeting current cigarette smokers and younger students may also be needed. PMID:27503349

  7. Case-Based FCTF Reasoning System

    Directory of Open Access Journals (Sweden)

    Jing Lu

    2015-10-01

    Case-based reasoning uses old information to infer answers to new problems. In case-based reasoning, a reasoner first records previous cases, then searches the list of previous cases for one similar to the current case and uses it to solve the new case. Case-based reasoning means adapting old solutions to new situations. This paper proposes a reasoning system based on the case-based reasoning method. We first present the theoretical structure and algorithm of the from-coarse-to-fine (FCTF) reasoning system, and then demonstrate that it can successfully learn and reason about new information. Finally, we use our system to predict practical weather conditions based on previous observations; experiments show that prediction accuracy increases with further learning of the FCTF reasoning system.
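
    The retrieve/reuse/retain cycle at the heart of case-based reasoning can be illustrated compactly. The weather cases and distance measure below are toy assumptions, not the FCTF system's actual representation.

```python
# A minimal retrieve/reuse/retain loop in the spirit of case-based
# reasoning; features and cases are invented toy data. In practice the
# features would be normalized so that no single one dominates the metric.
case_base = [
    # (temperature °C, humidity %, pressure hPa) -> observed outcome
    ((18.0, 85.0, 1002.0), "rain"),
    ((28.0, 40.0, 1018.0), "sunny"),
    ((22.0, 70.0, 1008.0), "cloudy"),
]

def distance(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def solve(query):
    """Retrieve the nearest stored case and reuse its outcome."""
    _, outcome = min(case_base, key=lambda case: distance(case[0], query))
    return outcome

query = (20.0, 80.0, 1004.0)
prediction = solve(query)
print(prediction)                       # -> "rain"
case_base.append((query, prediction))   # retain: the base grows with learning
```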

  8. Real time computer control of a nonlinear Multivariable System via Linearization and Stability Analysis

    International Nuclear Information System (INIS)

    Raza, K.S.M.

    2004-01-01

    This paper demonstrates that if a complicated nonlinear, non-square, state-coupled multivariable system is carefully linearized and subjected to a thorough stability analysis, then we can achieve our design objectives via a controller that is quite simple (in terms of resource usage and execution time) and very efficient (in terms of robustness). A further aim is to implement this controller on a computer in a real-time environment. Therefore, a nonlinear mathematical model of the system is first derived. The multivariable system is then decoupled, and linearization and stability analysis techniques are employed to develop a linearized and mathematically sound control law. Nonlinearities such as saturation in the actuators are also catered for. The controller is then discretized using Runge-Kutta integration. Finally, the discretized control law is programmed on a computer in a real-time environment. The program is written under RT-Linux using GNU C for the real-time realization of the control scheme. The real-time processes, such as sampling and controlled actuation, and the non-real-time processes, such as the graphical user interface and display, are programmed as separate tasks. The issue of inter-process communication between real-time and non-real-time tasks is addressed quite carefully. The results of this research pursuit are presented graphically. (author)
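
    A minimal sketch of such a discretized control loop, assuming a placeholder first-order plant, a proportional gain, and classical 4th-order Runge-Kutta; none of these are the paper's actual system.

```python
# Sketch of a fixed-rate digital control loop in which the continuous
# dynamics are advanced with classical 4th-order Runge-Kutta; the plant
# (a first-order lag) and the gain are placeholders, not the paper's system.
def f(x: float, u: float) -> float:
    return -2.0 * x + u            # dx/dt for a simple stable plant

def rk4_step(x: float, u: float, dt: float) -> float:
    k1 = f(x, u)
    k2 = f(x + 0.5 * dt * k1, u)
    k3 = f(x + 0.5 * dt * k2, u)
    k4 = f(x + dt * k3, u)
    return x + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6.0

dt, x, setpoint, kp = 0.01, 0.0, 1.0, 5.0
for _ in range(500):               # 5 s of simulated time
    u = kp * (setpoint - x)        # proportional control, computed each sample
    u = max(-10.0, min(10.0, u))   # actuator saturation, as in the paper
    x = rk4_step(x, u, dt)
print(f"state after 5 s: {x:.3f}")  # settles near kp/(kp+2) ≈ 0.714
```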

  9. Methods for solving reasoning problems in abstract argumentation – A survey

    Science.gov (United States)

    Charwat, Günther; Dvořák, Wolfgang; Gaggl, Sarah A.; Wallner, Johannes P.; Woltran, Stefan

    2015-01-01

    Within the last decade, abstract argumentation has emerged as a central field in Artificial Intelligence. Besides providing a core formalism for many advanced argumentation systems, abstract argumentation has also served to capture several non-monotonic logics and other AI related principles. Although the idea of abstract argumentation is appealingly simple, several reasoning problems in this formalism exhibit high computational complexity. This calls for advanced techniques when it comes to implementation issues, a challenge which has been recently faced from different angles. In this survey, we give an overview on different methods for solving reasoning problems in abstract argumentation and compare their particular features. Moreover, we highlight available state-of-the-art systems for abstract argumentation, which put these methods to practice. PMID:25737590
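
    For a small argumentation framework these reasoning problems can be solved by brute-force enumeration, which also makes the exponential blow-up tangible. A sketch over a three-argument framework of our own choosing:

```python
from itertools import combinations

# Toy abstract argumentation framework: a attacks b, and b attacks c.
args = {"a", "b", "c"}
attacks = {("a", "b"), ("b", "c")}

def conflict_free(s):
    return all((x, y) not in attacks for x in s for y in s)

def defends(s, arg):
    """s defends arg if every attacker of arg is attacked by some member of s."""
    attackers = {x for (x, y) in attacks if y == arg}
    return all(any((z, x) in attacks for z in s) for x in attackers)

def admissible(s):
    return conflict_free(s) and all(defends(s, arg) for arg in s)

# Checking every subset is exponential in the number of arguments, which is
# exactly why the survey's advanced solving techniques are needed at scale.
subsets = [set(c) for r in range(len(args) + 1)
           for c in combinations(sorted(args), r)]
print([sorted(s) for s in subsets if admissible(s)])
# -> [[], ['a'], ['a', 'c']]  ({'c'} alone cannot defend itself against b,
#    while 'a' counterattacks b on its behalf)
```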

  10. The computational psychiatry of reward: Broken brains or misguided minds?

    Directory of Open Access Journals (Sweden)

    Michael eMoutoussis

    2015-09-01

    Research into the biological basis of emotional and motivational disorders is in danger of riding roughshod over a patient-centred psychiatry and falling into the dualist errors of the past, i.e. by treating mind and brain as conceptually distinct. We argue that a psychiatry informed by computational neuroscience, computational psychiatry, can obviate this danger. Through a focus on the reasoning processes by which humans attempt to maximise reward (and minimise punishment), and how such reasoning is expressed neurally, computational psychiatry can render obsolete the polarity between biological and psychosocial conceptions of illness. Here, the term 'psychological' comes to refer to information processing performed by biological agents, seen in light of underlying goals. We reflect on the implications of this perspective for a definition of mental disorder, including what is entailed in asserting that a particular disorder is 'biological' or 'psychological' in origin. We propose that a computational approach assists in understanding the topography of mental disorder, while cautioning that the point at which eccentric reasoning constitutes disorder often remains a matter of cultural judgement.

  11. A first accident simulation for Angra-1 power plant using the ALMOD computer code

    International Nuclear Information System (INIS)

    Camargo, C.T.M.

    1981-02-01

    The acquisition of the ALMOD computer code from GRS-Munich by CNEN has permitted the calculation of transients in PWR nuclear power plants in which no loss of coolant occurs. The implementation of the German computer code ALMOD and its application to the calculation of Angra-1, a nuclear power plant different from the KWU power plants, demanded study and model adaptation; for economic reasons, simplifications and optimizations were also necessary. The first results define the analytical potential of the computer code, confirm the adequacy of the adaptations made, and provide relevant conclusions about the Angra-1 safety analysis, showing at the same time areas in which the model can be applied or simply improved. (Author) [pt

  12. Real-time skin feature identification in a time-sequential video stream

    Science.gov (United States)

    Kramberger, Iztok

    2005-04-01

    Skin color can be an important feature when tracking skin-colored objects, particularly for computer-vision-based human-computer interfaces (HCI). Humans have a highly developed feeling for space and, therefore, it is reasonable to support this within intelligent HCI, where the importance of augmented reality can be foreseen. Incorporating human-like interaction techniques within multimodal HCI could become a valuable feature of modern mobile telecommunication devices. On the other hand, real-time processing plays an important role in achieving more natural and physically intuitive ways of human-machine interaction. The main scope of this work is the development of a stereoscopic computer-vision hardware-accelerated framework for real-time skin feature identification in the sense of a single-pass image segmentation process. The hardware-accelerated preprocessing stage is presented with the purpose of color and spatial filtering, where the skin color model within the hue-saturation-value (HSV) color space is given by a polyhedron of threshold values representing the basis of the filter model. An adaptive filter management unit is suggested to achieve better segmentation results; it enables the adaptation of filter parameters to the current scene conditions. Implementation of the suggested hardware structure is given at the level of field-programmable system-level integrated circuit (FPSLIC) devices using an embedded microcontroller as their main feature. A stereoscopic cue is obtained using a time-sequential video stream, but this makes no difference to the real-time processing requirements in terms of hardware complexity. The experimental results for the hardware-accelerated preprocessing stage are given by efficiency estimation of the presented hardware structure using a simple motion-detection algorithm based on a binary function.
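
    A software analogue of the described filter, assuming an axis-aligned box as the simplest special case of the threshold polyhedron; the numeric bounds are illustrative, not the paper's values.

```python
import colorsys

# Illustrative HSV bounds (assumptions, not the paper's calibrated values):
H_RANGE = (0.0, 0.14)    # hue as a fraction of the color circle
S_RANGE = (0.2, 0.7)     # saturation
V_RANGE = (0.35, 1.0)    # value (brightness)

def is_skin(r: int, g: int, b: int) -> bool:
    """Classify one RGB pixel by testing its HSV coordinates against the box."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return (H_RANGE[0] <= h <= H_RANGE[1]
            and S_RANGE[0] <= s <= S_RANGE[1]
            and V_RANGE[0] <= v <= V_RANGE[1])

# Single-pass segmentation: each pixel is classified independently, which is
# what makes the hardware pipeline described in the record straightforward.
pixels = [(224, 172, 140), (30, 80, 200)]
print([is_skin(*p) for p in pixels])   # -> [True, False]
```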

  13. SCRY: Enabling quantitative reasoning in SPARQL queries

    NARCIS (Netherlands)

    Meroño-Peñuela, A.; Stringer, Bas; Loizou, Antonis; Abeln, Sanne; Heringa, Jaap

    2015-01-01

    The inability to include quantitative reasoning in SPARQL queries slows down the application of Semantic Web technology in the life sciences. SCRY, our SPARQL-compatible service layer, improves this by executing services at query time and making their outputs query-accessible, generating RDF data on demand.

  14. The heuristic-analytic theory of reasoning: extension and evaluation.

    Science.gov (United States)

    Evans, Jonathan St B T

    2006-06-01

    An extensively revised heuristic-analytic theory of reasoning is presented incorporating three principles of hypothetical thinking. The theory assumes that reasoning and judgment are facilitated by the formation of epistemic mental models that are generated one at a time (singularity principle) by preconscious heuristic processes that contextualize problems in such a way as to maximize relevance to current goals (relevance principle). Analytic processes evaluate these models but tend to accept them unless there is good reason to reject them (satisficing principle). At a minimum, analytic processing of models is required so as to generate inferences or judgments relevant to the task instructions, but more active intervention may result in modification or replacement of default models generated by the heuristic system. Evidence for this theory is provided by a review of a wide range of literature on thinking and reasoning.

  15. Guidelines and Procedures for Computing Time-Series Suspended-Sediment Concentrations and Loads from In-Stream Turbidity-Sensor and Streamflow Data

    Science.gov (United States)

    Rasmussen, Patrick P.; Gray, John R.; Glysson, G. Douglas; Ziegler, Andrew C.

    2009-01-01

    In-stream continuous turbidity and streamflow data, calibrated with measured suspended-sediment concentration data, can be used to compute a time series of suspended-sediment concentration and load at a stream site. Development of a simple linear (ordinary least squares) regression model for computing suspended-sediment concentrations from instantaneous turbidity data is the first step in the computation process. If the model standard percentage error (MSPE) of the simple linear regression model meets a minimum criterion, this model should be used to compute a time series of suspended-sediment concentrations. Otherwise, a multiple linear regression model using paired instantaneous turbidity and streamflow data is developed and compared to the simple regression model. If the inclusion of the streamflow variable proves to be statistically significant and the uncertainty associated with the multiple regression model results in an improvement over that for the simple linear model, the turbidity-streamflow multiple linear regression model should be used to compute a suspended-sediment concentration time series. The computed concentration time series is subsequently used with its paired streamflow time series to compute suspended-sediment loads by standard U.S. Geological Survey techniques. Once an acceptable regression model is developed, it can be used to compute suspended-sediment concentration beyond the period of record used in model development with proper ongoing collection and analysis of calibration samples. Regression models to compute suspended-sediment concentrations are generally site specific and should never be considered static, but they represent a set period in a continually dynamic system in which additional data will help verify any change in sediment load, type, and source.
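
    The two-step model choice can be sketched with ordinary least squares on made-up calibration data; the acceptance criterion itself (the MSPE threshold and the significance test) is summarized only in a comment.

```python
import numpy as np

# Sketch of the two-step model choice described above, on invented data:
# fit SSC ~ turbidity first, and add streamflow only if it helps.
turb = np.array([12.0, 35.0, 60.0, 90.0, 140.0, 210.0])   # turbidity (FNU)
flow = np.array([ 3.0,  9.0, 14.0, 22.0,  38.0,  55.0])   # streamflow (m^3/s)
ssc  = np.array([20.0, 55.0, 95.0, 150.0, 240.0, 350.0])  # SSC (mg/L)

def fit(design: np.ndarray):
    coef, residuals, *_ = np.linalg.lstsq(design, ssc, rcond=None)
    rss = float(residuals[0]) if residuals.size else 0.0
    return coef, rss

ones = np.ones_like(turb)
simple_coef, simple_rss = fit(np.column_stack([ones, turb]))
multi_coef,  multi_rss  = fit(np.column_stack([ones, turb, flow]))
print(f"simple model RSS: {simple_rss:.1f}, with streamflow: {multi_rss:.1f}")
# The procedure keeps the simple model unless adding streamflow is both
# statistically significant and reduces model uncertainty; an F-test on the
# two residual sums of squares would formalize that comparison here.
```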

  16. The level 1 and 2 specification for parallel benchmark and a benchmark test of scalar-parallel computer SP2 based on the specifications

    International Nuclear Information System (INIS)

    Orii, Shigeo

    1998-06-01

    A benchmark specification for the performance evaluation of parallel computers for numerical analysis is proposed. The Level 1 benchmark, a conventional benchmark based on processing time, measures the performance of computers running a code. The Level 2 benchmark proposed in this report is intended to give the reasons for that performance. As an example, the scalar-parallel computer SP2 is evaluated with this benchmark specification in the case of a molecular dynamics code. As a result, the main causes suppressing parallel performance are found to be the maximum bandwidth and the start-up time of communication between nodes. In particular, the start-up time is proportional not only to the number of processors but also to the number of particles. (author)

  17. Eye Movements Reveal Optimal Strategies for Analogical Reasoning.

    Science.gov (United States)

    Vendetti, Michael S; Starr, Ariel; Johnson, Elizabeth L; Modavi, Kiana; Bunge, Silvia A

    2017-01-01

    Analogical reasoning refers to the process of drawing inferences on the basis of the relational similarity between two domains. Although this complex cognitive ability has been the focus of inquiry for many years, most models rely on measures that cannot capture individuals' thought processes moment by moment. In the present study, we used participants' eye movements to investigate reasoning strategies in real time while solving visual propositional analogy problems (A:B::C:D). We included both a semantic and a perceptual lure on every trial to determine how these types of distracting information influence reasoning strategies. Participants spent more time fixating the analogy terms and the target relative to the other response choices, and made more saccades between the A and B items than between any other items. Participants' eyes were initially drawn to perceptual lures when looking at response choices, but they nonetheless performed the task accurately. We used participants' gaze sequences to classify each trial as representing one of three classic analogy problem solving strategies and related strategy usage to analogical reasoning performance. A project-first strategy, in which participants first extrapolate the relation between the AB pair and then generalize that relation for the C item, was both the most commonly used strategy as well as the optimal strategy for solving visual analogy problems. These findings provide new insight into the role of strategic processing in analogical problem solving.

  18. Eye Movements Reveal Optimal Strategies for Analogical Reasoning

    Directory of Open Access Journals (Sweden)

    Michael S. Vendetti

    2017-06-01

    Analogical reasoning refers to the process of drawing inferences on the basis of the relational similarity between two domains. Although this complex cognitive ability has been the focus of inquiry for many years, most models rely on measures that cannot capture individuals' thought processes moment by moment. In the present study, we used participants' eye movements to investigate reasoning strategies in real time while solving visual propositional analogy problems (A:B::C:D). We included both a semantic and a perceptual lure on every trial to determine how these types of distracting information influence reasoning strategies. Participants spent more time fixating the analogy terms and the target relative to the other response choices, and made more saccades between the A and B items than between any other items. Participants' eyes were initially drawn to perceptual lures when looking at response choices, but they nonetheless performed the task accurately. We used participants' gaze sequences to classify each trial as representing one of three classic analogy problem solving strategies and related strategy usage to analogical reasoning performance. A project-first strategy, in which participants first extrapolate the relation between the AB pair and then generalize that relation for the C item, was both the most commonly used strategy as well as the optimal strategy for solving visual analogy problems. These findings provide new insight into the role of strategic processing in analogical problem solving.

  19. Real-Time Incompressible Fluid Simulation on the GPU

    Directory of Open Access Journals (Sweden)

    Xiao Nie

    2015-01-01

    We present a parallel framework for simulating incompressible fluids with predictive-corrective incompressible smoothed particle hydrodynamics (PCISPH) on the GPU in real time. To this end, we propose an efficient GPU streaming pipeline to map the entire computational task onto the GPU, fully exploiting the massive computational power of state-of-the-art GPUs. In PCISPH-based simulations, neighbor search is the major performance obstacle because this process is performed several times at each time step. To eliminate this bottleneck, an efficient parallel sorting method for this time-consuming step is introduced. Moreover, we discuss several optimization techniques, including using fast on-chip shared memory to avoid global memory bandwidth limitations and thus further improve performance on modern GPU hardware. With our framework, the realism of real-time fluid simulation is significantly improved, since our method enforces the incompressibility constraint, which is typically ignored for efficiency reasons in previous GPU-based SPH methods. The performance results illustrate that our approach can efficiently simulate realistic incompressible fluid in real time and results in a speed-up factor of up to 23 on a high-end NVIDIA GPU in comparison to a single-threaded CPU-based implementation.
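
    The sort-based grid neighbor search that removes this bottleneck can be sketched on the CPU; on the GPU the argsort below would be a parallel radix sort, and all sizes here are illustrative.

```python
import numpy as np

# Hash particles to grid cells, sort by cell id, and let each query scan
# only the 27 surrounding cells. All parameters are illustrative.
rng = np.random.default_rng(0)
h = 0.1                                    # smoothing radius = cell size
n = int(np.ceil(1.0 / h))                  # cells per axis in the unit box
pos = rng.random((1000, 3)).astype(np.float32)

cell = np.floor(pos / h).astype(np.int64)  # integer cell coordinates
cell_id = (cell[:, 0] * n + cell[:, 1]) * n + cell[:, 2]

order = np.argsort(cell_id)                # GPU analogue: parallel radix sort
pos_s, ids_s = pos[order], cell_id[order]
# starts[c] = first sorted index belonging to cell c (sentinel at the end)
starts = np.searchsorted(ids_s, np.arange(n ** 3 + 1))

def neighbors(i: int) -> list:
    """Sorted-array indices of particles within distance h of particle i."""
    cx, cy, cz = (pos_s[i] // h).astype(np.int64)
    out = []
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for dz in (-1, 0, 1):
                x, y, z = cx + dx, cy + dy, cz + dz
                if not (0 <= x < n and 0 <= y < n and 0 <= z < n):
                    continue
                c = (x * n + y) * n + z
                for j in range(starts[c], starts[c + 1]):
                    if j != i and np.linalg.norm(pos_s[j] - pos_s[i]) < h:
                        out.append(j)
    return out

print(len(neighbors(0)), "neighbors of particle 0")
```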

  20. Research on SDG-Based Qualitative Reasoning in Conceptual Design

    Directory of Open Access Journals (Sweden)

    Kai Li

    2013-01-01

    Conceptual design is the initial stage of the product life cycle, whose main purposes include function creation, function decomposition, and function and subfunction design. At this stage, the information about product function and structure is imprecise, incomplete, and qualitative, which affects the validity of the conceptual design. In this paper, the signed directed graph is used to reveal the inherent causal relationships and interactions among the variables and to find qualitative interactions between design variables and the design purpose, with the help of causal sequence analysis and constraint propagation. In the case of incomplete information, qualitative reasoning, which has the function of qualitative behavior prediction, can improve the level of computer-aided conceptual design. To some extent, qualitative reasoning plays a supplementary role in evaluating schemes and predicting function. Finally, with the problem of planar four-bar mechanism design, a qualitative reasoning flowchart based on the signed directed graph is introduced, and an analysis is made of how to adjust design parameters so that the trajectory of a moving point reaches the predetermined position, meeting the design requirements and achieving the effect that designers expect in conceptual design.
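
    Sign propagation along a signed directed graph is the core mechanism here. The sketch below uses invented four-bar-linkage variable names and is not the paper's model.

```python
# A minimal sign-propagation sketch over a signed directed graph: each edge
# carries +1 or -1, and the qualitative effect of a design variable on the
# design purpose is the product of edge signs along a causal path. The
# variable names are invented for illustration.
edges = {
    ("crank_length", "coupler_angle"): +1,
    ("coupler_angle", "tip_height"): +1,
    ("frame_length", "coupler_angle"): -1,
}

def effect(source: str, target: str, sign: int = +1, seen=()):
    """Return the set of qualitative effects (+1/-1) of source on target."""
    if source == target:
        return {sign}
    results = set()
    for (a, b), s in edges.items():
        if a == source and b not in seen:   # 'seen' guards against cycles
            results |= effect(b, target, sign * s, seen + (b,))
    return results

print(effect("crank_length", "tip_height"))   # -> {1}: increase to raise tip
print(effect("frame_length", "tip_height"))   # -> {-1}: decrease to raise tip
```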

  1. Person-related determinants of TV viewing and computer time in a cohort of young Dutch adults: Who sits the most?

    NARCIS (Netherlands)

    Uijtdewilligen, L.; Singh, A.S.; Chin A Paw, M.J.M.; Twisk, J.W.R.; van Mechelen, W.

    2015-01-01

    We aimed to assess the associations of person-related factors with leisure time television (TV) viewing and computer time among young adults. We analyzed self-reported TV viewing (h/week) and leisure computer time (h/week) from 475 Dutch young adults (47% male) who had participated in the Amsterdam Growth and Health Longitudinal Study.

  2. Timing comparison of two-dimensional discrete-ordinates codes for criticality calculations

    International Nuclear Information System (INIS)

    Miller, W.F. Jr.; Alcouffe, R.E.; Bosler, G.E.; Brinkley, F.W. Jr.; O'dell, R.D.

    1979-01-01

    The authors compare two-dimensional discrete-ordinates neutron transport computer codes to solve reactor criticality problems. The fundamental interest is in determining which code requires the minimum Central Processing Unit (CPU) time for a given numerical model of a reasonably realistic fast reactor core and peripherals. The computer codes considered are the most advanced available and, in three cases, are not officially released. The conclusion, based on the study of four fast reactor core models, is that for this class of problems the diffusion synthetic accelerated version of TWOTRAN, labeled TWOTRAN-DA, is superior to the other codes in terms of CPU requirements

  3. Y2K issues for real time computer systems for fast breeder test reactor

    International Nuclear Information System (INIS)

    Swaminathan, P.

    1999-01-01

    The presentation shows the classification of real-time systems related to the operation, control and monitoring of the fast breeder test reactor. The software life cycle includes software requirement specification, software design description, coding, commissioning, operation and management. A software scheme in the supervisory computer of the fast breeder test reactor is described, drawing on twenty years of experience in the design, development, installation, commissioning, operation and maintenance of computer-based supervisory control systems for nuclear installations, with a particular emphasis on solving the Y2K problem.

  4. Spatial and Temporal Wind Power Forecasting by Case-Based Reasoning Using Big-Data

    Directory of Open Access Journals (Sweden)

    Fabrizio De Caro

    2017-02-01

    The massive penetration of wind generators in electrical power systems calls for effective wind power forecasting tools, which should be highly reliable, in order to mitigate the effects of uncertain generation profiles, and fast enough to enhance power system operation. To address these two conflicting objectives, this paper advocates the role of knowledge discovery from big data, proposing the integration of adaptive Case-Based Reasoning models with cardinality reduction techniques based on Partial Least Squares Regression and Principal Component Analysis. The main idea is to learn, from a large database of historical climatic observations, how to solve the wind forecasting problem while avoiding complex and time-consuming computations. To assess the benefits derived from the application of the proposed methodology in complex application scenarios, the experimental results obtained in a real case study are presented and discussed.
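
    The pipeline, cardinality reduction followed by nearest-case retrieval, can be sketched with a PCA computed via SVD; the feature set, case base, and dimensions below are synthetic assumptions, not the paper's data.

```python
import numpy as np

# Reduce the dimensionality of historical observations with PCA, then
# answer a forecast query by retrieving the most similar past case.
rng = np.random.default_rng(1)
X = rng.normal(size=(500, 12))            # 500 past observations, 12 features
y = X[:, 0] * 3.0 + rng.normal(size=500)  # historical wind power output

# PCA via SVD on centered data, keeping k principal components
mean = X.mean(axis=0)
_, _, vt = np.linalg.svd(X - mean, full_matrices=False)
k = 3
Z = (X - mean) @ vt[:k].T                 # reduced-cardinality case base

def forecast(query: np.ndarray) -> float:
    """Case-based estimate: output of the nearest case in PCA space."""
    z = (query - mean) @ vt[:k].T
    nearest = np.argmin(np.linalg.norm(Z - z, axis=1))
    return float(y[nearest])

print(f"forecast: {forecast(rng.normal(size=12)):.2f}")
```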

  5. Identification of Genetic Loci Jointly Influencing Schizophrenia Risk and the Cognitive Traits of Verbal-Numerical Reasoning, Reaction Time, and General Cognitive Function.

    Science.gov (United States)

    Smeland, Olav B; Frei, Oleksandr; Kauppi, Karolina; Hill, W David; Li, Wen; Wang, Yunpeng; Krull, Florian; Bettella, Francesco; Eriksen, Jon A; Witoelar, Aree; Davies, Gail; Fan, Chun C; Thompson, Wesley K; Lam, Max; Lencz, Todd; Chen, Chi-Hua; Ueland, Torill; Jönsson, Erik G; Djurovic, Srdjan; Deary, Ian J; Dale, Anders M; Andreassen, Ole A

    2017-10-01

    Schizophrenia is associated with widespread cognitive impairments. Although cognitive deficits are one of the factors most strongly associated with functional outcome in schizophrenia, current treatment strategies largely fail to ameliorate these impairments. To develop more efficient treatment strategies in patients with schizophrenia, a better understanding of the pathogenesis of these cognitive deficits is needed. Accumulating evidence indicates that genetic risk of schizophrenia may contribute to cognitive dysfunction. To identify genomic regions jointly influencing schizophrenia and the cognitive domains of reaction time and verbal-numerical reasoning, as well as general cognitive function, a phenotype that captures the shared variation in performance across cognitive domains. Combining data from genome-wide association studies from multiple phenotypes using conditional false discovery rate analysis provides increased power to discover genetic variants and could elucidate shared molecular genetic mechanisms. Data from the following genome-wide association studies, published from July 24, 2014, to January 17, 2017, were combined: schizophrenia in the Psychiatric Genomics Consortium cohort (n = 79 757 [cases, 34 486; controls, 45 271]); verbal-numerical reasoning (n = 36 035) and reaction time (n = 111 483) in the UK Biobank cohort; and general cognitive function in CHARGE (Cohorts for Heart and Aging Research in Genomic Epidemiology) (n = 53 949) and COGENT (Cognitive Genomics Consortium) (n = 27 888). Genetic loci identified by conditional false discovery rate analysis. Brain messenger RNA expression and brain expression quantitative trait locus functionality were determined. Among the participants in the genome-wide association studies, 21 loci jointly influencing schizophrenia and cognitive traits were identified: 2 loci shared between schizophrenia and verbal-numerical reasoning, 6 loci shared between schizophrenia and

  6. The role of representation in Bayesian reasoning: Correcting common misconceptions

    OpenAIRE

    Gigerenzer, G.; Hoffrage, U.

    2007-01-01

    The terms nested sets, partitive frequencies, inside-outside view, and dual processes add little but confusion to our original analysis (Gigerenzer & Hoffrage 1995; 1999). The idea of nested set was introduced because of an oversight; it simply rephrases two of our equations. Representation in terms of chances, in contrast, is a novel contribution yet consistent with our computational analysis - it uses exactly the same numbers as natural frequencies. We show that non-Bayesian reasoning in ch...
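
    The point about representations can be made concrete with the standard textbook screening example; the numbers below are the common teaching illustration, not taken from the commentary itself. Natural frequencies reduce the posterior to a single ratio, while the probability format requires Bayes' rule with exactly the same numbers.

```python
# Natural-frequency format: out of 1000 women, 10 have the disease; 8 of
# those test positive, and 95 of the 990 without it also test positive.
with_disease_pos = 8
without_disease_pos = 95

posterior = with_disease_pos / (with_disease_pos + without_disease_pos)
print(f"P(disease | positive) = {posterior:.3f}")   # ~0.078

# Probability format: the same numbers expressed as rates now require
# Bayes' rule explicitly, which is where untrained reasoners go wrong.
p_d, p_pos_d, p_pos_nd = 0.01, 0.8, 95 / 990
bayes = p_d * p_pos_d / (p_d * p_pos_d + (1 - p_d) * p_pos_nd)
print(f"Bayes' rule gives the same answer: {bayes:.3f}")
```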

  7. Elastic Spatial Query Processing in OpenStack Cloud Computing Environment for Time-Constraint Data Analysis

    Directory of Open Access Journals (Sweden)

    Wei Huang

    2017-03-01

    Geospatial big data analysis (GBDA) is extremely significant for time-constraint applications such as disaster response. However, time-constraint analysis is not yet a trivial task in the cloud computing environment. Spatial query processing (SQP) is typically computation-intensive and indispensable for GBDA, and the spatial range query, join query, and nearest neighbor query algorithms are not scalable without using MapReduce-like frameworks. Parallel SQP algorithms (PSQPAs) are trapped in skew-processing, which is a known issue in Geoscience. To satisfy time-constrained GBDA, we propose an elastic SQP approach in this paper. First, Spark is used to implement PSQPAs. Second, Kubernetes-managed Core Operation System (CoreOS) clusters provide self-healing Docker containers for running Spark clusters in the cloud. Spark-based PSQPAs are submitted to Docker containers, where Spark master instances reside. Finally, the horizontal pod auto-scaler (HPA) scales Docker containers out and in to supply on-demand computing resources. Combined with an auto-scaling group of virtual instances, HPA helps to find each of the five nearest neighbors for 46,139,532 query objects from 834,158 spatial data objects in less than 300 s. The experiments conducted on an OpenStack cloud demonstrate that auto-scaling containers can satisfy time-constraint GBDA in clouds.

  8. Radioactivity computation of steady-state and pulsed fusion reactors operation

    International Nuclear Information System (INIS)

    Attaya, H.

    1994-11-01

    The International Thermonuclear Experimental Reactor (ITER) is expected to operate in a pulsed mode. Accurate radioactivity calculations that take this mode of operation into account are required in order to determine precisely the different safety aspects of ITER. The authors previously examined analytically the effect of pulsed operation in ITER and showed how it depends on the burn time, the dwell time, and the half-lives. That analysis also showed that for ITER's low duty factor, using the continuous-operation assumption would considerably overestimate the radioactivities over a wide range of half-lives. At the same time, the large improvements in the quality and quantity of the decay and cross-section data libraries have considerably increased the computation times of radioactivity calculations. For both reasons it is imperative to seek different methods of solution that reduce the computational time and can easily be adapted to the treatment of pulsed operation. In this work, they have developed algorithms based on several mathematical methods that were chosen for their generality, reliability, stability, accuracy, and efficiency. These methods are the matrix Schur decomposition, the eigenvector decomposition, and the Padé approximation for the matrix exponential functions.
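
    Pulsed-operation bookkeeping with matrix exponentials can be sketched for a toy two-nuclide chain; the rate constants, pulse lengths, and source term below are invented for illustration and are not ITER data.

```python
import numpy as np
from scipy.linalg import expm

# The nuclide vector obeys N' = A @ N + S during a burn pulse (production
# plus decay) and N' = A @ N during the dwell (decay only).
lam1, lam2 = 1e-3, 5e-5          # decay constants (1/s), invented
prod = 1e-2                      # production rate of nuclide 1 during burn
A = np.array([[-lam1, 0.0],
              [ lam1, -lam2]])   # nuclide 1 decays into nuclide 2
S = np.array([prod, 0.0])        # source term active only during the pulse

burn, dwell = 400.0, 1200.0      # seconds; a low duty factor, as for ITER
N = np.zeros(2)
for _ in range(100):             # 100 pulses
    # burn: constant source + decay, N(t) = e^{At} N0 + A^{-1}(e^{At} - I) S
    E = expm(A * burn)
    N = E @ N + np.linalg.solve(A, (E - np.eye(2)) @ S)
    # dwell: pure decay
    N = expm(A * dwell) @ N
print(N)   # activities lam_i * N_i follow directly
```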

  9. Large holographic displays for real-time applications

    Science.gov (United States)

    Schwerdtner, A.; Häussler, R.; Leister, N.

    2008-02-01

    Holography is generally accepted as the ultimate approach to displaying three-dimensional scenes or objects. In principle, the reconstruction of an object from a perfect hologram would appear indistinguishable from viewing the corresponding real-world object. Up to now, two main obstacles have prevented large-screen Computer-Generated Holograms (CGH) from reaching a satisfactory laboratory prototype, not to mention a marketable one. The reason is the small cell pitch of a CGH, which results in a huge number of hologram cells and a very high computational load for encoding the CGH. For a long time these seemingly inevitable technological hurdles were not cleared, limiting the use of holography to special applications such as optical filtering, interference, beam forming, digital holography for capturing the 3-D shape of objects, and others. SeeReal Technologies has developed a new approach for real-time-capable CGH using the so-called Tracked Viewing Windows technology to overcome these problems. The paper will show that today's state-of-the-art reconfigurable Spatial Light Modulators (SLM), especially today's feasible LCD panels, are suited to reconstructing large 3-D scenes that can be observed from large viewing angles. To achieve this, the original holographic concept of containing information from the entire scene in each part of the CGH has been abandoned. This substantially reduces the hologram resolution, and thus the computational load, by several orders of magnitude, making real-time computation possible. A monochrome real-time prototype measuring 20 inches has been built and demonstrated at last year's SID conference and exhibition 2007 and at several other events.

  10. Small sets of interacting proteins suggest functional linkage mechanisms via Bayesian analogical reasoning.

    Science.gov (United States)

    Airoldi, Edoardo M; Heller, Katherine A; Silva, Ricardo

    2011-07-01

    Proteins and protein complexes coordinate their activity to execute cellular functions. In a number of experimental settings, including synthetic genetic arrays, genetic perturbations and RNAi screens, scientists identify a small set of protein interactions of interest. A working hypothesis is often that these interactions are the observable phenotypes of some functional process, which is not directly observable. Confirmatory analysis requires finding other pairs of proteins whose interaction may be additional phenotypical evidence about the same functional process. Extant methods for finding additional protein interactions rely heavily on the information in the newly identified set of interactions. For instance, these methods leverage the attributes of the individual proteins directly, in a supervised setting, in order to find relevant protein pairs. A small set of protein interactions provides a small sample to train parameters of prediction methods, thus leading to low confidence. We develop RBSets, a computational approach to ranking protein interactions rooted in analogical reasoning; that is, the ability to learn and generalize relations between objects. Our approach is tailored to situations where the training set of protein interactions is small, and leverages the attributes of the individual proteins indirectly, in a Bayesian ranking setting that is perhaps closest to propensity scoring in mathematical psychology. We find that RBSets leads to good performance in identifying additional interactions starting from a small evidence set of interacting proteins, for which an underlying biological logic in terms of functional processes and signaling pathways can be established with some confidence. Our approach is scalable and can be applied to large databases with minimal computational overhead. Our results suggest that analogical reasoning within a Bayesian ranking problem is a promising new approach for real-time biological discovery. Java code is available at

  11. Robotics Vision-based Heuristic Reasoning for Underwater Target Tracking and Navigation

    OpenAIRE

    Kia, Chua; Arshad, Mohd Rizal

    2006-01-01

    This paper presents a robotics vision-based heuristic reasoning system for underwater target tracking and navigation. This system is introduced to improve the level of automation of underwater Remote Operated Vehicles (ROVs) operations. A prototype which combines computer vision with an underwater robotics system is successfully designed and developed to perform target tracking and intelligent navigation. This study focuses on developing image processing algorithms and fuzzy inference system ...

  12. Real-Time Head Pose Estimation on Mobile Platforms

    Directory of Open Access Journals (Sweden)

    Jianfeng Ren

    2010-06-01

    Many computer vision applications, such as augmented reality, require head pose estimation. As far as the real-time implementation of head pose estimation on relatively resource-limited mobile platforms is concerned, it is required to satisfy real-time constraints while maintaining reasonable head pose estimation accuracy. The head pose estimation approach introduced in this paper is an attempt to meet this objective. The approach consists of the following components: Viola-Jones face detection, color-based face tracking using an online calibration procedure, and head pose estimation using Hu moment features and a Fisher linear discriminant. Experimental results running on an actual mobile device are reported, exhibiting both the real-time and accuracy aspects of the developed approach.
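
    The Hu-moment feature step of this pipeline can be sketched with OpenCV; the synthetic "face crop" below stands in for a tracked face, and the trained Fisher discriminant is only referenced in a comment.

```python
import cv2
import numpy as np

# Extract the seven Hu moment invariants from a (here synthetic) grayscale
# face crop. In the full system these features would be projected with a
# Fisher linear discriminant trained on example poses.
face_crop = (np.random.default_rng(0).random((64, 64)) * 255).astype(np.uint8)

moments = cv2.moments(face_crop)          # raw image moments
hu = cv2.HuMoments(moments).flatten()     # 7 rotation/scale invariants

# Log-scaling is customary because the invariants span many magnitudes.
hu_log = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print(hu_log)   # 7-dimensional feature vector for the pose classifier
```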

  13. Clinical reasoning: concept analysis.

    Science.gov (United States)

    Simmons, Barbara

    2010-05-01

    This paper is a report of a concept analysis of clinical reasoning in nursing. Clinical reasoning is an ambiguous term that is often used synonymously with decision-making and clinical judgment. Clinical reasoning has not been clearly defined in the literature. Healthcare settings are increasingly filled with uncertainty, risk and complexity due to increased patient acuity, multiple comorbidities, and enhanced use of technology, all of which require clinical reasoning. Data sources: literature for this concept analysis was retrieved from several databases, including CINAHL, PubMed, PsycINFO, ERIC and OvidMEDLINE, for the years 1980 to 2008. Rodgers's evolutionary method of concept analysis was used because of its applicability to concepts that are still evolving. Multiple terms have been used synonymously to describe the thinking skills that nurses use. Research in the past 20 years has elucidated differences among these terms and identified the cognitive processes that precede judgment and decision-making. Our concept analysis defines one of these terms, 'clinical reasoning,' as a complex process that uses cognition, metacognition, and discipline-specific knowledge to gather and analyse patient information, evaluate its significance, and weigh alternative actions. This concept analysis provides a middle-range descriptive theory of clinical reasoning in nursing that helps clarify meaning and gives direction for future research. Appropriate instruments to operationalize the concept need to be developed. Research is needed to identify additional variables that have an impact on clinical reasoning and what the consequences of clinical reasoning are in specific situations.

  14. Towards OpenVL: Improving Real-Time Performance of Computer Vision Applications

    Science.gov (United States)

    Shen, Changsong; Little, James J.; Fels, Sidney

    Meeting constraints for real-time performance is a main issue for computer vision, especially for embedded computer vision systems. This chapter presents our progress on our open vision library (OpenVL), a novel software architecture to address efficiency through facilitating hardware acceleration, reusability, and scalability for computer vision systems. A logical image understanding pipeline is introduced to allow parallel processing. We also discuss progress on our middleware, the vision library utility toolkit (VLUT), which enables applications to operate transparently over a heterogeneous collection of hardware implementations. OpenVL works as a state machine, with an event-driven mechanism to provide users with application-level interaction. Various explicit or implicit synchronization and communication methods are supported among distributed processes in the logical pipelines. The intent of OpenVL is to allow users to quickly and easily recover useful information from multiple scenes, in a cross-platform, cross-language manner across various software environments and hardware platforms. To validate the critical underlying concepts of OpenVL, a human tracking system and a local positioning system are implemented and described. The novel architecture separates the specification of algorithmic details from the underlying implementation, allowing for different components to be implemented on an embedded system without recompiling code.

  15. Natural representation of the deduction; applying to the temporal reasoning for expert systems based on production rules

    International Nuclear Information System (INIS)

    Baudin, Patrick

    1990-01-01

    The development of expert systems in a real-time context requires mastering both the reasoning about time and the response time needed for that reasoning. Although rigorous temporal logic formalisms exist, strategies for temporal reasoning are either incomplete or imply unacceptable response times. The first part presents the logic formalism upon which the production system is based. This formalism comprises a three-valued logic system with a truth-value matrix, and a deductive system with a formal system. It provides a rigorous treatment of this non-standard logic, for which the notions of consistency and completeness can be studied. Its development rests on the intent to formalize the reasoning used when elaborating strategies, so as to make them as explicit as the natural deduction method. The second part proposes an extension of the source logic formalism to take time explicitly into account. The approach is demonstrated through 'TANIS', a prototype of such an expert system shell using natural reasoning. At generation time, it allows the implementation within the expert system of a deduction strategy adapted to symbolic temporal reasoning which is complete and eases the determination of the response time. (author) [fr
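
    The record does not reproduce the thesis's own truth-value matrices, so as an illustration the sketch below encodes the standard strong-Kleene connectives, under which a not-yet-established temporal fact stays unknown.

```python
# Standard strong-Kleene three-valued connectives over True, False and
# U (unknown), encoded as 1, 0.5 and 0; an illustrative stand-in for the
# thesis's truth-value matrix, which the record does not give.
T, U, F = 1.0, 0.5, 0.0

def k_not(a): return 1.0 - a
def k_and(a, b): return min(a, b)
def k_or(a, b): return max(a, b)

# A temporal fact whose truth is not yet established stays unknown until
# the reasoning (or the passage of time) resolves it:
alarm_active, sensor_ok = U, T
print(k_and(alarm_active, sensor_ok))         # -> 0.5: still unknown
print(k_or(alarm_active, k_not(sensor_ok)))   # -> 0.5
```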

  16. Celebrating 50 years of the CERN Computing Operations group

    CERN Multimedia

    Katarina Anthony

    2013-01-01

    Last week, former and current computing operations staff, managers and system engineers were reunited at CERN. They came together to celebrate a milestone not only for the IT Department but also for CERN: the 50th anniversary of the CERN Operations group and the 40th birthday of the Computer Centre. The reunion was organised by former chief operator Pierre Bénassi and took place from 26 to 27 April. Among the 44 attendees were Neil Spoonley and Charles Symons, who together created the Operations group back in 1963. “At that time, working in the Operations group was a very physical job,” recalls former Operations Group Leader David Underhill. “For that reason, many of the first operators were former firemen.” A few of the participants enjoyed a tour of CERN landmarks during their visit (see photo). The group toured the CERN Computing Centre (accompanied by IT Department Head Frédéric Hemmer), as well as the ATLAS cavern.

  17. Accuracy of computer-assisted cervical pedicle screw installation

    International Nuclear Information System (INIS)

    Zhang Honglei; Zhou Dongsheng; Jang Zhensong

    2009-01-01

    Objective: To investigate the accuracy of computer-assisted cervical pedicle screw installation and the reasons for screw malposition. Methods: A total of 172 cervical pedicle screws were installed under computer-assisted navigation in 30 patients with lower cervical spinal diseases. All patients were examined by X-ray and CT after operation. Screw position and direction were measured on the sagittal and transectional images from intraoperative navigation and from post-operative CT, and linear regression analysis was then performed between the navigational and post-operative CT images. Results: Two screws perforated the upper pedicle wall and 3 perforated the lateral pedicle wall. There was a positive linear correlation between the navigational and post-operative CT images. Conclusion: Computer-assisted navigation provides high accuracy in cervical pedicle screw installation; the excursion phenomenon is the reason for screw malposition. (authors)
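
    The regression analysis mentioned above can be sketched as follows. The measurements are invented placeholders, not data from the study; only the method (ordinary least squares with a correlation coefficient) is illustrated.

```python
# Hedged sketch of the regression check reported above: screw positions
# measured on intraoperative navigation images (x) are regressed against the
# same screws measured on post-operative CT (y). The numbers are illustrative
# placeholders, not data from the study.

import statistics

nav = [4.1, 4.8, 5.2, 5.9, 6.4]   # navigation measurements (e.g. mm), invented
ct = [4.0, 4.9, 5.3, 5.8, 6.6]    # post-operative CT measurements, invented

mx, my = statistics.mean(nav), statistics.mean(ct)
sxy = sum((x - mx) * (y - my) for x, y in zip(nav, ct))
sxx = sum((x - mx) ** 2 for x in nav)
syy = sum((y - my) ** 2 for y in ct)

slope = sxy / sxx
intercept = my - slope * mx
r = sxy / (sxx * syy) ** 0.5

print(f"y = {slope:.3f} x + {intercept:.3f}, r = {r:.3f}")
# A slope near 1 and r near 1 mean the intraoperative navigation images
# predict the post-operative CT measurements well, i.e. a positive linear
# correlation as the study reports.
```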

  18. Reflective ability and moral reasoning in final year medical students: a semi-qualitative cohort study.

    Science.gov (United States)

    Chalmers, Patricia; Dunngalvin, Audrey; Shorten, George

    2011-01-01

    Moral reasoning and reflective ability are important concepts in medical education. To date, the association between reflective ability and moral reasoning in medical students has not been measured. This study tested the hypotheses that, amongst final year medical students, (1) moral reasoning and reflective ability improve over time and (2) positive change in reflective ability favourably influences moral reasoning. With institutional ethical approval, 56 medical students (of a class of 110) participated fully both at the beginning and end of the final academic year. Reflective ability and moral reasoning were assessed at each time point using Sobral's reflection-in-learning scale (RLS), Boenink's overall reflection (BOR) score, and Kohlberg's schema for moral reasoning. The most important findings were that (1) students' reflective ability scores related to medicine decreased significantly over the course of the year, (2) students demonstrated a predominantly conventional level of moral reasoning at both the beginning and end of the year, (3) moral reasoning scores tended to decrease over the course of the year, and (4) RLS is a strong predictor of change in moral reasoning over time. This study confirms the usefulness of Sobral's RLS and the BOR score for evaluating moral development in the context of medical education. It further documents regression and levelling in the moral reasoning of final year medical students, and a decrease in reflective ability applied in the medical context. Further studies are required to determine factors that would favourably influence reflective ability and moral reasoning among final year medical students.

  19. Resolving time of scintillation camera-computer system and methods of correction for counting loss, 2

    International Nuclear Information System (INIS)

    Iinuma, Takeshi; Fukuhisa, Kenjiro; Matsumoto, Toru

    1975-01-01

    Following the previous work, the counting-rate performance of camera-computer systems was investigated for two modes of data acquisition. The first was the "LIST" mode, in which image data and timing signals were sequentially stored on magnetic disk or tape via a buffer memory. The second was the "HISTOGRAM" mode, in which image data were stored in a core memory as digital images and the images were then transferred to magnetic disk or tape on the frame-timing signal. Firstly, the counting-rates stored in the buffer memory were measured as a function of the display event-rates of the scintillation camera for the two modes. For both modes, the stored counting-rates (M) were expressed by the formula M = N(1 - Nτ), where N was the display event-rate of the camera and τ was the resolving time, including analog-to-digital conversion time and memory cycle time. The resolving time for each mode may have been different, but it was about 10 μsec for both modes in our computer system (TOSBAC 3400 model 31). Secondly, the data transfer speed from the buffer memory to the external memory, such as magnetic disk or tape, was considered for the two modes. For the "LIST" mode, the maximum value of the stored counting-rate from the camera was expressed in terms of the size of the buffer memory and the access time and data transfer-rate of the external memory. For the "HISTOGRAM" mode, the minimum frame time was determined by the size of the buffer memory and the access time and transfer rate of the external memory. In our system, the maximum stored counting-rate was about 17,000 counts/sec with a buffer size of 2,000 words, and the minimum frame time was about 130 msec with a buffer size of 1,024 words. These values agree well with the calculated ones. The present analysis makes it possible to design camera-computer systems for quantitative dynamic imaging, and future improvements are suggested. (author)
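
    The counting-loss formula lends itself to a quick worked example. The sketch below plugs the reported resolving time of about 10 μsec into M = N(1 - Nτ) for an arbitrary event-rate and locates the model's theoretical peak; only τ and the 17,000 counts/sec LIST-mode figure come from the abstract, while the event-rate N is an invented illustration.

```python
# Worked example of the counting-loss model quoted above, M = N(1 - N*tau),
# using the resolving time tau of about 10 microseconds reported for the
# system. The display event-rate N below is an arbitrary illustrative value.

tau = 10e-6   # resolving time (ADC conversion + memory cycle), seconds
N = 20_000    # display event-rate of the camera, events/s (illustrative)

M = N * (1 - N * tau)   # counting-rate actually stored in the buffer memory
loss = 1 - M / N        # fractional counting loss
print(f"stored rate M = {M:.0f} counts/s, loss = {loss:.1%}")

# The model peaks where dM/dN = 0, i.e. at N = 1/(2*tau):
N_peak = 1 / (2 * tau)
M_peak = N_peak * (1 - N_peak * tau)
print(f"model peak: N = {N_peak:.0f} events/s -> M = {M_peak:.0f} counts/s")
# In practice the LIST-mode limit reported above (~17,000 counts/s with a
# 2,000-word buffer) is set by buffer size and disk/tape transfer rate,
# well below this model peak.
```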

  20. Diagnosing component faults in a generic nuclear power plant using counterfactual and temporal reasoning

    International Nuclear Information System (INIS)

    Oehrstroem, P.; Nielsen, F.R.; Pedersen, S.A.

    1992-01-01

    The subject of main interest is the logical and epistemological aspects of diagnostic reasoning. The aim was to understand the role of conditionals and causality in this respect. A model of causal and temporal reasoning was developed and evaluated in a controlled but complex setting, with the generic nuclear power plant used as a test ground. The coherence and scope of a logical theory of diagnostic reasoning were studied in order to discover whether the theory constitutes an adequate tool for making correct diagnoses of component faults in a generic nuclear power plant. A diagnosing system based on the CIMP system was run on a computer model of a nuclear power plant, into which various errors were then introduced. The aim of the diagnosis is mainly explanation and only partly repair. The causal field defines a conceptual framework within which the diagnostic purpose is given and within which various diagnostic possibilities and causal relationships are given, here with regard to error detection in a control room. The causal field is tacitly given and related to the operator's training and experience. The logical aspects of the diagnosis problem are described. The computer model is described and the symptom language is introduced. The process of reasoning about the possible diagnoses is presented. The utilization of ideas similar to heuristic classification is discussed. A database command language for manipulating lists of symptoms is described, and the design of a CIMP user interface for symptom language visualization is outlined. (AB)
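
    The idea of diagnosing within a causal field by matching observed symptoms against candidate explanations can be sketched as a toy rule base. Everything below (symptom names, faults, scoring) is invented for illustration and is not the CIMP system or its symptom language.

```python
# Toy sketch of symptom-driven fault diagnosis in the style of heuristic
# classification. The symptom names and the rule base are invented for
# illustration; they are not the CIMP symptom language or its knowledge base.

# A causal field: each candidate fault is associated with the control-room
# symptoms it would explain.
CAUSAL_FIELD = {
    "feedwater pump failure": {"low steam generator level", "high pump vibration"},
    "steam leak": {"low steam generator level", "rising containment humidity"},
    "level sensor fault": {"low steam generator level"},
}


def diagnose(observed: set) -> list:
    """Rank candidate faults by the fraction of their expected symptoms observed."""
    scores = []
    for fault, expected in CAUSAL_FIELD.items():
        coverage = len(expected & observed) / len(expected)
        if coverage > 0:
            scores.append((fault, coverage))
    return sorted(scores, key=lambda s: -s[1])


observed = {"low steam generator level", "high pump vibration"}
for fault, score in diagnose(observed):
    print(f"{fault}: {score:.0%} of expected symptoms observed")
```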