WorldWideScience

Sample records for interactive computational tool

  1. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    NARCIS (Netherlands)

    Shen, Jie; Wenzhe, Shi; Pantic, Maja

In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish/Subscribe (P/S) architecture [13][14] to facilitate efficient
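The record above names only the Publish/Subscribe (P/S) pattern; as a point of reference, a minimal, generic sketch of that pattern is given below. It is an illustrative assumption, not the HCI^2 Workbench API: the Broker class, topic names, and message payloads are invented for the example.

```python
# Minimal Publish/Subscribe sketch (illustrative only; not the HCI^2 Workbench API).
from collections import defaultdict
from typing import Any, Callable


class Broker:
    """Routes messages from publishers to all subscribers of a topic."""

    def __init__(self) -> None:
        self._subscribers = defaultdict(list)  # topic -> list of handlers

    def subscribe(self, topic: str, handler: Callable[[Any], None]) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, message: Any) -> None:
        # Modules never call each other directly; they only exchange topic messages.
        for handler in self._subscribers[topic]:
            handler(message)


broker = Broker()
broker.subscribe("face.expression", lambda msg: print("fusion module received:", msg))
broker.publish("face.expression", {"label": "smile", "confidence": 0.93})
```

The point of the pattern is the loose coupling: sensing, fusion, and output modules only agree on topics and message formats, so modules can be added or replaced without changing the others.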

  2. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Science.gov (United States)

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

The focus of the research is a teaching module that incorporates manufacturing, mechanical design planning, control systems based on microprocessor technology, and robot maneuverability. Computer-interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. The research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. It was conducted as design-and-development research with the objective of producing a learning tool in the form of intelligent robot modules and a kit based on Computer Interactive Learning and Computer Assisted Learning. Data from the Indonesia Robot Contest for the period 2009-2015 indicate that the developed modules satisfy the fourth stage of the development method, dissemination. The modules guide students in producing an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback regarding the robotics module and computer-based interactive learning.

  3. Enhancing pediatric safety: assessing and improving resident competency in life-threatening events with a computer-based interactive resuscitation tool

    International Nuclear Information System (INIS)

    Lerner, Catherine; Gaca, Ana M.; Frush, Donald P.; Ancarana, Anjanette; Hohenhaus, Sue; Seelinger, Terry A.; Frush, Karen

    2009-01-01

    Though rare, allergic reactions occur as a result of administration of low osmolality nonionic iodinated contrast material to pediatric patients. Currently available resuscitation aids are inadequate in guiding radiologists' initial management of such reactions. To compare radiology resident competency with and without a computer-based interactive resuscitation tool in the management of life-threatening events in pediatric patients. The study was approved by the IRB. Radiology residents (n=19; 14 male, 5 female; 19 certified in basic life support/advanced cardiac life support; 1 certified in pediatric advanced life support) were videotaped during two simulated 5-min anaphylaxis scenarios involving 18-month-old and 8-year-old mannequins (order randomized). No advance warning was given. In half of the scenarios, a computer-based interactive resuscitation tool with a response-driven decision tree was available to residents (order randomized). Competency measures included: calling a code, administering oxygen and epinephrine, and correctly dosing epinephrine. Residents performed significantly more essential interventions with the computer-based resuscitation tool than without (72/76 vs. 49/76, P<0.001). Significantly more residents appropriately dosed epinephrine with the tool than without (17/19 vs. 1/19; P<0.001). More residents called a code with the tool than without (17/19 vs. 14/19; P = 0.08). A learning effect was present: average times to call a code, request oxygen, and administer epinephrine were shorter in the second scenario (129 vs. 93 s, P=0.24; 52 vs. 30 s, P<0.001; 152 vs. 82 s, P=0.025, respectively). All the trainees found the resuscitation tool helpful and potentially useful in a true pediatric emergency. A computer-based interactive resuscitation tool significantly improved resident performance in managing pediatric emergencies in the radiology department. (orig.)

  4. Interactive Computer Graphics

    Science.gov (United States)

    Kenwright, David

    2000-01-01

    Aerospace data analysis tools that significantly reduce the time and effort needed to analyze large-scale computational fluid dynamics simulations have emerged this year. The current approach for most postprocessing and visualization work is to explore the 3D flow simulations with one of a dozen or so interactive tools. While effective for analyzing small data sets, this approach becomes extremely time consuming when working with data sets larger than one gigabyte. An active area of research this year has been the development of data mining tools that automatically search through gigabyte data sets and extract the salient features with little or no human intervention. With these so-called feature extraction tools, engineers are spared the tedious task of manually exploring huge amounts of data to find the important flow phenomena. The software tools identify features such as vortex cores, shocks, separation and attachment lines, recirculation bubbles, and boundary layers. Some of these features can be extracted in a few seconds; others take minutes to hours on extremely large data sets. The analysis can be performed off-line in a batch process, either during or following the supercomputer simulations. These computations have to be performed only once, because the feature extraction programs search the entire data set and find every occurrence of the phenomena being sought. Because the important questions about the data are being answered automatically, interactivity is less critical than it is with traditional approaches.

  5. Translator-computer interaction in action

    DEFF Research Database (Denmark)

    Bundgaard, Kristine; Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

Though we lack empirically-based knowledge of the impact of computer-aided translation (CAT) tools on translation processes, it is generally agreed that all professional translators are now involved in some kind of translator-computer interaction (TCI), using O'Brien's (2012) term. Taking a TCI perspective, this paper investigates the relationship between machines and humans in the field of translation, analysing a CAT process in which machine-translation (MT) technology was integrated into a translation-memory (TM) suite. After a review of empirical research into the impact of CAT tools, the study indicates that the tool helps the translator conform to project and customer requirements.

  6. The Impact of Computer Simulations as Interactive Demonstration Tools on the Performance of Grade 11 Learners in Electromagnetism

    Science.gov (United States)

    Kotoka, Jonas; Kriek, Jeanne

    2014-01-01

    The impact of computer simulations on the performance of 65 grade 11 learners in electromagnetism in a South African high school in the Mpumalanga province is investigated. Learners did not use the simulations individually, but teachers used them as an interactive demonstration tool. Basic concepts in electromagnetism are difficult to understand…

  7. New tools to aid in scientific computing and visualization

    International Nuclear Information System (INIS)

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

In this paper, two computer programs are described which aid in the pre- and post-processing of computer-generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent, user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository.

  8. Computer-Based Interaction Analysis with DEGREE Revisited

    Science.gov (United States)

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  9. iDoRNA: An Interacting Domain-based Tool for Designing RNA-RNA Interaction Systems

    Directory of Open Access Journals (Sweden)

    Jittrawan Thaiprasit

    2016-03-01

RNA-RNA interactions play a crucial role in gene regulation in living organisms. They have gained increasing interest in the field of synthetic biology because of their potential applications in medicine and biotechnology. However, few novel regulators based on RNA-RNA interactions with desired structures and functions have been developed, owing to the challenges of developing design tools. Recently, we proposed a novel tool, called iDoDe, for designing RNA-RNA interacting sequences by first decomposing RNA structures into interacting domains and then designing each domain using a stochastic algorithm. However, iDoDe did not provide an optimal solution because it still lacks a mechanism to optimize the design. In this work, we have further developed the tool by incorporating a genetic algorithm (GA) to find an RNA solution with maximized structural similarity and minimized hybridized RNA energy, and renamed the tool iDoRNA. A set of suitable parameters for the genetic algorithm was determined: a weighting factor of 0.7, a crossover rate of 0.9, a mutation rate of 0.1, and a population size of 8. We demonstrated the performance of iDoRNA in comparison with iDoDe using six RNA-RNA interaction models. It was found that iDoRNA could efficiently generate all models of interacting RNAs with far greater accuracy and far less computational time than iDoDe. Moreover, we compared the design performance of our tool against existing design tools using forty-four RNA-RNA interaction models. The results showed that iDoRNA performs better than RiboMaker when considering the ensemble defect, the fitness score and computation time, although it appears to be outperformed by NUPACK and RNAiFold 2.0 on the ensemble defect. Nevertheless, iDoRNA can still be a useful alternative tool for designing novel RNA-RNA interactions in synthetic biology research. The source code of i
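The record reports the genetic-algorithm settings that worked best (weighting factor 0.7, crossover rate 0.9, mutation rate 0.1, population of 8) but not the implementation. The sketch below is a generic GA loop under those reported settings; the fitness terms are placeholders standing in for iDoRNA's structural-similarity and hybridization-energy scores, and none of the function names come from the tool itself.

```python
# Generic GA loop using the parameter values reported for iDoRNA; the fitness
# terms below are placeholders, not iDoRNA's actual scoring functions.
import random

BASES = "ACGU"
WEIGHT, CROSSOVER_RATE, MUTATION_RATE, POP_SIZE = 0.7, 0.9, 0.1, 8


def structural_similarity(seq: str) -> float:   # placeholder score in [0, 1]
    return seq.count("G") / len(seq)


def hybridization_energy(seq: str) -> float:    # placeholder; lower (more negative) is better
    return -2.0 * seq.count("C")


def fitness(seq: str) -> float:
    # Weighted combination: reward similarity, reward low hybridization energy.
    return WEIGHT * structural_similarity(seq) - (1 - WEIGHT) * hybridization_energy(seq)


def crossover(a: str, b: str) -> str:
    cut = random.randrange(1, len(a))
    return a[:cut] + b[cut:]


def mutate(seq: str) -> str:
    return "".join(random.choice(BASES) if random.random() < MUTATION_RATE else c
                   for c in seq)


def evolve(length: int = 20, generations: int = 50) -> str:
    pop = ["".join(random.choice(BASES) for _ in range(length)) for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: POP_SIZE // 2]          # keep the fitter half as parents
        children = []
        while len(children) < POP_SIZE:
            a, b = random.sample(parents, 2)
            child = crossover(a, b) if random.random() < CROSSOVER_RATE else a
            children.append(mutate(child))
        pop = children
    return max(pop, key=fitness)


print(evolve())
```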

  10. An Interactive Computer Tool for Teaching About Desalination and Managing Water Demand in the US

    Science.gov (United States)

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

This paper presents an interactive tool to geospatially and temporally analyze desalination developments and trends in the US over the period 1950-2013, desalination's current contribution to satisfying water demands, and its future potential. The computer tool is open access and can be used by any user with an Internet connection, thus facilitating interactive learning about water resources. The tool can also be used by stakeholders and policy makers for decision-making support and for designing sustainable water management strategies. Desalination technology has been acknowledged as a solution for sustainable management of water demand stemming from many sectors, including municipalities, industry, agriculture, power generation, and other users. Desalination has been applied successfully in the US and many countries around the world since the 1950s. As of 2013, around 1,336 desalination plants were operating in the US alone, with a daily production capacity of 2 BGD (billion gallons per day) (GWI, 2013). Despite a steady increase in the number of new desalination plants and growing production capacity, in many regions the costs of desalination are still prohibitive. At the same time, the technology offers a tremendous potential for 'enormous supply expansion that exceeds all likely demands' (Chowdhury et al., 2013). The model and tool are based on data from Global Water Intelligence (GWI, 2013). The analysis shows that more than 90% of all the plants in the US are small-scale plants with a capacity below 4.31 MGD. Most of the plants (and especially the larger plants) are located on the US East Coast, as well as in California, Texas, Oklahoma, and Florida. The models and the tool provide information about the economic feasibility of potential new desalination plants based on access to feed water, energy sources, water demand, and the experience of other plants in the region.

  11. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Directory of Open Access Journals (Sweden)

    Olena V. Semenikhina

    2014-08-01

The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools a teacher can rely on when teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, of formulating new tasks from a limited set of tools, and of fast automated checking is outlined. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's need to rethink forms and methods of training, the search for creative problems, the rational choice of environment, the checking of electronic solutions, and common mistakes in the use of computer tools.

  12. INTERACTIONS: DESIGN, IMPLEMENTATION AND EVALUATION OF A COMPUTATIONAL TOOL FOR TEACHING INTERMOLECULAR FORCES IN HIGHER EDUCATION

    Directory of Open Access Journals (Sweden)

    Francisco Geraldo Barbosa

    2015-12-01

Intermolecular forces are a useful concept that can explain the attraction between particulate matter as well as numerous phenomena in our lives such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Professors and students, both graduate and undergraduate, were questioned about the software's quality, intuitiveness of use, ease of navigation, and pedagogical application using a Likert scale. The results led to the conclusion that the developed computer application can serve as an auxiliary tool to assist teachers in their lectures and students in their learning of intermolecular forces.

  13. Supporting collaborative computing and interaction

    International Nuclear Information System (INIS)

    Agarwal, Deborah; McParland, Charles; Perry, Marcia

    2002-01-01

    To enable collaboration on the daily tasks involved in scientific research, collaborative frameworks should provide lightweight and ubiquitous components that support a wide variety of interaction modes. We envision a collaborative environment as one that provides a persistent space within which participants can locate each other, exchange synchronous and asynchronous messages, share documents and applications, share workflow, and hold videoconferences. We are developing the Pervasive Collaborative Computing Environment (PCCE) as such an environment. The PCCE will provide integrated tools to support shared computing and task control and monitoring. This paper describes the PCCE and the rationale for its design

  14. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    Science.gov (United States)

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  15. The Particle Beam Optics Interactive Computer Laboratory

    International Nuclear Information System (INIS)

    Gillespie, George H.; Hill, Barrey W.; Brown, Nathan A.; Babcock, R. Chris; Martono, Hendy; Carey, David C.

    1997-01-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab

  16. Computations and interaction

    NARCIS (Netherlands)

    Baeten, J.C.M.; Luttik, S.P.; Tilburg, van P.J.A.; Natarajan, R.; Ojo, A.

    2011-01-01

    We enhance the notion of a computation of the classical theory of computing with the notion of interaction. In this way, we enhance a Turing machine as a model of computation to a Reactive Turing Machine that is an abstract model of a computer as it is used nowadays, always interacting with the user

  17. A computational tool to predict the evolutionarily conserved protein-protein interaction hot-spot residues from the structure of the unbound protein.

    Science.gov (United States)

    Agrawal, Neeraj J; Helk, Bernhard; Trout, Bernhardt L

    2014-01-21

    Identifying hot-spot residues - residues that are critical to protein-protein binding - can help to elucidate a protein's function and assist in designing therapeutic molecules to target those residues. We present a novel computational tool, termed spatial-interaction-map (SIM), to predict the hot-spot residues of an evolutionarily conserved protein-protein interaction from the structure of an unbound protein alone. SIM can predict the protein hot-spot residues with an accuracy of 36-57%. Thus, the SIM tool can be used to predict the yet unknown hot-spot residues for many proteins for which the structure of the protein-protein complexes are not available, thereby providing a clue to their functions and an opportunity to design therapeutic molecules to target these proteins. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.

  18. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Science.gov (United States)

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
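The abstract describes the tool as recomputing projected CO2 emissions and costs whenever the user changes the portfolio mix. A toy sketch of that bookkeeping is shown below; the technology names, emission factors, and costs are made-up placeholders, not values from the actual tool or its underlying data.

```python
# Toy portfolio calculator: recompute emissions and cost whenever the mix changes.
# All factors below are made-up placeholders, not data from the actual tool.
EMISSIONS = {"coal_ccs": 0.15, "natural_gas": 0.40, "nuclear": 0.01,
             "wind": 0.01, "efficiency": 0.0}          # t CO2 per MWh (illustrative)
COST = {"coal_ccs": 90, "natural_gas": 60, "nuclear": 95,
        "wind": 70, "efficiency": 30}                  # $ per MWh (illustrative)


def evaluate(portfolio):
    """portfolio: {technology: share of generation}; shares must sum to 1."""
    assert abs(sum(portfolio.values()) - 1.0) < 1e-9, "shares must sum to 1"
    co2 = sum(share * EMISSIONS[tech] for tech, share in portfolio.items())
    cost = sum(share * COST[tech] for tech, share in portfolio.items())
    return co2, cost


mix = {"coal_ccs": 0.2, "natural_gas": 0.2, "nuclear": 0.3, "wind": 0.2, "efficiency": 0.1}
co2, cost = evaluate(mix)
print(f"portfolio average: {co2:.3f} t CO2/MWh at ${cost:.0f}/MWh")
```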

  19. Introducing Handheld Computing for Interactive Medical Education

    Directory of Open Access Journals (Sweden)

    Joseph Finkelstein

    2005-04-01

The goals of this project were: (1) development of an interactive multimedia medical education tool (CO-ED) utilizing modern features of handheld computing (PDA) and major constructs of adult learning theories, and (2) pilot testing of the computer-assisted education in residents and clinicians. Comparison of the knowledge scores using a paired t-test demonstrated a statistically significant increase in subject knowledge (p<0.01) after using CO-ED. Attitudinal surveys were analyzed by total score (TS) calculation represented as a percentage of the maximal possible score. The mean TS was 74.5±7.1%. None of the subjects (N=10) had a TS less than 65%, and in half of the subjects (N=5) the TS was higher than 75%. Analysis of the semi-structured in-depth interviews showed strong support among the study subjects for using a PDA as an educational tool, and high acceptance of the CO-ED user interface. We concluded that PDAs have significant potential as a tool for clinician education.

  20. Discerning molecular interactions: A comprehensive review on biomolecular interaction databases and network analysis tools.

    Science.gov (United States)

    Miryala, Sravan Kumar; Anbarasu, Anand; Ramaiah, Sudha

    2018-02-05

Computational analysis of biomolecular interaction networks is now gaining importance as a way to understand the functions of novel genes and proteins. Gene interaction (GI) network analysis and protein-protein interaction (PPI) network analysis play a major role in predicting the functionality of interacting genes or proteins and give insight into the functional relationships and evolutionary conservation of interactions among genes. An interaction network is a graphical representation of a gene/protein interactome, where each gene/protein is a node and each interaction between genes/proteins is an edge. In this review, we discuss the popular open-source databases that serve as data repositories for searching and collecting protein/gene interaction data, as well as the tools available for generating, visualizing and analyzing interaction networks. Network analysis approaches, such as topological and clustering approaches for studying network properties, and functional enrichment servers that illustrate the functions and pathways of genes and proteins, are also discussed. The distinctive aim of this review is therefore not only to provide an overview of tools and web servers for gene and protein-protein interaction (PPI) network analysis, but also to show how to extract useful and meaningful information from interaction networks. Copyright © 2017 Elsevier B.V. All rights reserved.
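As a concrete illustration of the graph view described above (each gene/protein a node, each interaction an edge), the short sketch below builds a tiny interaction network with the networkx library and reports a few topological properties. The protein names are hypothetical; a real analysis would load interaction pairs exported from the databases reviewed here.

```python
# Tiny illustration of treating an interactome as a graph (hypothetical proteins).
import networkx as nx

interactions = [("ProtA", "ProtB"), ("ProtA", "ProtC"),
                ("ProtB", "ProtC"), ("ProtC", "ProtD")]

g = nx.Graph()
g.add_edges_from(interactions)          # each protein is a node, each interaction an edge

# Simple topological properties of the kind network-analysis tools report.
print("degree:", dict(g.degree()))                              # number of partners per protein
print("clustering:", nx.clustering(g))                          # local clustering coefficient
print("hubs:", sorted(g.degree, key=lambda kv: kv[1], reverse=True)[:2])
```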

  1. The Particle Beam Optics Interactive Computer Laboratory

    International Nuclear Information System (INIS)

    Gillespie, G.H.; Hill, B.W.; Brown, N.A.; Babcock, R.C.; Martono, H.; Carey, D.C.

    1997-01-01

    The Particle Beam Optics Interactive Computer Laboratory (PBO Lab) is an educational software concept to aid students and professionals in learning about charged particle beams and particle beam optical systems. The PBO Lab is being developed as a cross-platform application and includes four key elements. The first is a graphic user interface shell that provides for a highly interactive learning session. The second is a knowledge database containing information on electric and magnetic optics transport elements. The knowledge database provides interactive tutorials on the fundamental physics of charged particle optics and on the technology used in particle optics hardware. The third element is a graphical construction kit that provides tools for students to interactively and visually construct optical beamlines. The final element is a set of charged particle optics computational engines that compute trajectories, transport beam envelopes, fit parameters to optical constraints and carry out similar calculations for the student designed beamlines. The primary computational engine is provided by the third-order TRANSPORT code. Augmenting TRANSPORT is the multiple ray tracing program TURTLE and a first-order matrix program that includes a space charge model and support for calculating single particle trajectories in the presence of the beam space charge. This paper describes progress on the development of the PBO Lab. copyright 1997 American Institute of Physics

  2. Tools for computational finance

    CERN Document Server

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains:    Several new parts such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; Additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
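As one hedged illustration of the tree methods the book covers, the sketch below prices a European call with a plain Cox-Ross-Rubinstein binomial tree; it is a standard textbook construction, not code taken from the book.

```python
# Cox-Ross-Rubinstein binomial tree for a European call -- a standard textbook
# method of the kind the book covers; not an example from the book itself.
import math


def crr_european_call(s0, strike, rate, sigma, maturity, steps):
    dt = maturity / steps
    u = math.exp(sigma * math.sqrt(dt))       # up factor
    d = 1.0 / u                               # down factor
    p = (math.exp(rate * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-rate * dt)

    # Payoffs at maturity for every terminal node (j up-moves out of `steps`).
    values = [max(s0 * u**j * d**(steps - j) - strike, 0.0) for j in range(steps + 1)]

    # Step backwards through the tree, discounting risk-neutral expectations.
    for _ in range(steps):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]


print(crr_european_call(s0=100, strike=100, rate=0.05, sigma=0.2, maturity=1.0, steps=200))
```

With enough steps the tree price converges toward the continuous-time (Black-Scholes) value, which is the usual motivation for the trinomial and multidimensional refinements mentioned in the description.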

  3. Integrating interactive computational modeling in biology curricula.

    Directory of Open Access Journals (Sweden)

    Tomáš Helikar

    2015-03-01

While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  4. Integrating interactive computational modeling in biology curricula.

    Science.gov (United States)

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.

  5. BUILD-IT : a computer vision-based interaction technique for a planning tool

    NARCIS (Netherlands)

    Rauterberg, G.W.M.; Fjeld, M.; Krueger, H.; Bichsel, M.; Leonhardt, U.; Meier, M.; Thimbleby, H.; O'Conaill, B.; Thomas, P.J.

    1997-01-01

    Shows a method that goes beyond the established approaches of human-computer interaction. We first bring a serious critique of traditional interface types, showing their major drawbacks and limitations. Promising alternatives are offered by virtual (or immersive) reality (VR) and by augmented

  6. An interactive computer lab of the galvanic cell for students in biochemistry.

    Science.gov (United States)

    Ahlstrand, Emma; Buetti-Dinh, Antoine; Friedman, Ran

    2018-01-01

We describe an interactive module that can be used to teach basic concepts in electrochemistry and thermodynamics to first year natural science students. The module is used together with an experimental laboratory and improves the students' understanding of thermodynamic quantities such as ΔrG, ΔrH, and ΔrS that are calculated but not directly measured in the lab. We also discuss how new technologies can substitute for some parts of experimental chemistry courses and improve accessibility to course material. Cloud computing platforms such as CoCalc facilitate the distribution of computer code and allow students to access and apply interactive course tools beyond the course's scope. Despite some limitations imposed by cloud computing, the students appreciated the approach and the enhanced opportunities to discuss study questions with their classmates and instructor as facilitated by the interactive tools. © 2017 The International Union of Biochemistry and Molecular Biology, 46(1):58-65, 2018.
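The abstract notes that ΔrG, ΔrH, and ΔrS are calculated rather than measured directly. A minimal sketch of the standard relations used for a galvanic cell (ΔrG = -nFE, ΔrS = nF·(dE/dT), ΔrH = ΔrG + TΔrS) is given below; the numerical inputs are illustrative placeholders, not data from the course module.

```python
# Thermodynamic quantities of a galvanic cell from its EMF and its temperature
# coefficient; the numerical inputs below are illustrative placeholders.
F = 96485.0          # Faraday constant, C/mol


def cell_thermodynamics(n, emf, demf_dt, temperature):
    """Return (dG, dS, dH) in J/mol, J/(mol K), J/mol."""
    dG = -n * F * emf                 # ΔrG = -nFE
    dS = n * F * demf_dt              # ΔrS = nF (dE/dT)_p
    dH = dG + temperature * dS        # ΔrH = ΔrG + TΔrS
    return dG, dS, dH


dG, dS, dH = cell_thermodynamics(n=2, emf=1.10, demf_dt=-4.0e-4, temperature=298.15)
print(f"ΔrG = {dG/1000:.1f} kJ/mol, ΔrS = {dS:.1f} J/(mol·K), ΔrH = {dH/1000:.1f} kJ/mol")
```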

  7. A new strategic neurosurgical planning tool for brainstem cavernous malformations using interactive computer graphics with multimodal fusion images.

    Science.gov (United States)

    Kin, Taichi; Nakatomi, Hirofumi; Shojima, Masaaki; Tanaka, Minoru; Ino, Kenji; Mori, Harushi; Kunimatsu, Akira; Oyama, Hiroshi; Saito, Nobuhito

    2012-07-01

In this study, the authors used preoperative simulation employing 3D computer graphics (interactive computer graphics) to fuse all imaging data for brainstem cavernous malformations. The authors evaluated whether interactive computer graphics or 2D imaging correlated better with the actual operative field, particularly in identifying a developmental venous anomaly (DVA). The study population consisted of 10 patients scheduled for surgical treatment of brainstem cavernous malformations. Data from preoperative imaging (MRI, CT, and 3D rotational angiography) were automatically fused using a normalized mutual information method, and then reconstructed by a hybrid method combining surface rendering and volume rendering. With surface rendering, multimodality and multithreshold techniques for one tissue were applied. The completed interactive computer graphics were used for simulation of surgical approaches and assumed surgical fields. Preoperative diagnostic rates for a DVA associated with brainstem cavernous malformation were compared between conventional 2D imaging and interactive computer graphics employing receiver operating characteristic (ROC) analysis. The time required for reconstruction of 3D images was 3-6 hours for interactive computer graphics. Observation in interactive mode required approximately 15 minutes. Detailed anatomical information for operative procedures, from the craniotomy to microsurgical operations, could be visualized and simulated three-dimensionally as a single computer graphic using interactive computer graphics. Virtual surgical views were consistent with actual operative views. This technique was very useful for examining various surgical approaches. The mean (±SEM) area under the ROC curve for the rate of DVA diagnosis was significantly better for interactive computer graphics (1.000±0.000) than for 2D imaging (0.766±0.091); DVAs were identified more reliably with interactive computer graphics than with 2D images. Interactive computer graphics was also useful in helping to plan the surgical
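The fusion step is described only as "a normalized mutual information method". The sketch below shows one common form of that similarity measure, NMI = (H(A)+H(B))/H(A,B), computed from a joint histogram with numpy; it is a generic assumption about the technique, not the authors' implementation.

```python
# Normalized mutual information between two images, a common registration
# similarity measure; a generic sketch, not the authors' implementation.
import numpy as np


def normalized_mutual_information(img_a, img_b, bins=32):
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()          # joint intensity distribution
    px = pxy.sum(axis=1)               # marginal of image A
    py = pxy.sum(axis=0)               # marginal of image B

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    # NMI = (H(A) + H(B)) / H(A, B); higher means better aligned intensities.
    return (entropy(px) + entropy(py)) / entropy(pxy.ravel())


a = np.random.rand(64, 64)
b = a + 0.05 * np.random.rand(64, 64)   # roughly aligned "second modality"
print(normalized_mutual_information(a, b))
```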

  8. A Complete Interactive Graphical Computer-Aided Instruction System.

    Science.gov (United States)

    Abrams, Steven Selby

    The use of interactive graphics in computer-aided instruction systems is discussed with emphasis placed on two requirements of such a system. The first is the need to provide the teacher with a useful tool with which to design and modify teaching sessions tailored to the individual needs and capabilities of the students. The second is the…

  9. Hypercard Another Computer Tool.

    Science.gov (United States)

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  10. Communication Styles of Interactive Tools for Self-Improvement

    OpenAIRE

    Niess, Jasmin; Diefenbach, Sarah

    2016-01-01

Background: Interactive products for self-improvement (e.g., online trainings to reduce stress, fitness gadgets) have become increasingly popular among consumers and healthcare providers. In line with the idea of positive computing, these tools aim to support their users on their way to improved well-being and human flourishing. As an interdisciplinary domain, the design of self-improvement technologies requires psychological, technological, and design expertise. One needs to know how to suppo...

  11. NASCENT: an automatic protein interaction network generation tool for non-model organisms.

    Science.gov (United States)

    Banky, Daniel; Ordog, Rafael; Grolmusz, Vince

    2009-04-24

Large quantities of reliable protein interaction data are available for model organisms in public depositories (e.g., MINT, DIP, HPRD, INTERACT). Most data correspond to experiments with the proteins of Saccharomyces cerevisiae, Drosophila melanogaster, Homo sapiens, Caenorhabditis elegans, Escherichia coli and Mus musculus. For other important organisms, data availability is poor or non-existent. Here we present NASCENT, a completely automatic web-based tool and also a downloadable Java program, capable of modeling and generating protein interaction networks even for non-model organisms. The tool performs protein interaction network modeling through gene-name mapping, and outputs the resulting network in graphical form and also in computer-readable graph forms, directly applicable in popular network modeling software. http://nascent.pitgroup.org.

  12. Visualization Tools for Teaching Computer Security

    Science.gov (United States)

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  13. IPython: components for interactive and parallel computing across disciplines. (Invited)

    Science.gov (United States)

    Perez, F.; Bussonnier, M.; Frederic, J. D.; Froehle, B. M.; Granger, B. E.; Ivanov, P.; Kluyver, T.; Patterson, E.; Ragan-Kelley, B.; Sailer, Z.

    2013-12-01

    Scientific computing is an inherently exploratory activity that requires constantly cycling between code, data and results, each time adjusting the computations as new insights and questions arise. To support such a workflow, good interactive environments are critical. The IPython project (http://ipython.org) provides a rich architecture for interactive computing with: 1. Terminal-based and graphical interactive consoles. 2. A web-based Notebook system with support for code, text, mathematical expressions, inline plots and other rich media. 3. Easy to use, high performance tools for parallel computing. Despite its roots in Python, the IPython architecture is designed in a language-agnostic way to facilitate interactive computing in any language. This allows users to mix Python with Julia, R, Octave, Ruby, Perl, Bash and more, as well as to develop native clients in other languages that reuse the IPython clients. In this talk, I will show how IPython supports all stages in the lifecycle of a scientific idea: 1. Individual exploration. 2. Collaborative development. 3. Production runs with parallel resources. 4. Publication. 5. Education. In particular, the IPython Notebook provides an environment for "literate computing" with a tight integration of narrative and computation (including parallel computing). These Notebooks are stored in a JSON-based document format that provides an "executable paper": notebooks can be version controlled, exported to HTML or PDF for publication, and used for teaching.
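As an illustration of the "JSON-based document format" mentioned above, the sketch below writes a minimal two-cell notebook by hand, assuming the nbformat 4 layout; in practice the nbformat library would normally be used to create and validate such files.

```python
# Minimal IPython/Jupyter notebook written as plain JSON (assuming the nbformat 4 layout).
import json

notebook = {
    "nbformat": 4,
    "nbformat_minor": 4,
    "metadata": {},
    "cells": [
        {"cell_type": "markdown", "metadata": {},
         "source": ["# Narrative text lives next to the computation\n"]},
        {"cell_type": "code", "execution_count": None, "metadata": {},
         "outputs": [],
         "source": ["print('executable paper')\n"]},
    ],
}

with open("minimal.ipynb", "w") as fp:
    json.dump(notebook, fp, indent=1)   # version-controllable, exportable to HTML/PDF
```

Because the document is plain JSON, notebooks can be diffed, version controlled, and post-processed like any other text file, which is what makes the "executable paper" workflow described above practical.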

  14. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Science.gov (United States)

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
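A minimal numpy sketch of the band-ratio-and-threshold idea described above is given below; it is a generic illustration on synthetic data, not RATIO_TOOL's original C/XView code, and the band indices and thresholds are arbitrary.

```python
# Band ratio followed by thresholding into classes -- the core idea behind
# RATIO_TOOL, sketched generically in numpy (not the original C/XView code).
import numpy as np

rng = np.random.default_rng(0)
cube = rng.random((10, 128, 128))          # synthetic data cube: 10 spectral bands

band_a, band_b = cube[3], cube[7]          # two bands chosen for a diagnostic ratio
ratio = band_a / np.maximum(band_b, 1e-6)  # guard against division by zero

# Histogram of ratio values helps in choosing thresholds.
counts, edges = np.histogram(ratio, bins=50)
print("most common ratio bin starts at:", edges[counts.argmax()])

# Segment the ratio image into classes (here: below / within / above a band of interest).
classes = np.digitize(ratio, bins=[0.8, 1.25])
print("pixels per class:", np.bincount(classes.ravel()))
```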

  15. Computer-aided translation tools

    DEFF Research Database (Denmark)

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  16. INTERFACING INTERACTIVE DATA ANALYSIS TOOLS WITH THE GRID: THE PPDG CS-11 ACTIVITY

    International Nuclear Information System (INIS)

    Perl, Joseph

    2003-01-01

For today's physicists, who work in large geographically distributed collaborations, the data grid promises significantly greater capabilities for analysis of experimental data and production of physics results than is possible with today's "remote access" technologies. The goal of letting scientists at their home institutions interact with and analyze data as if they were physically present at the major laboratory that houses their detector and computer center has yet to be accomplished. The Particle Physics Data Grid project (www.ppdg.net) has recently embarked on an effort to "Interface and Integrate Interactive Data Analysis Tools with the grid and identify Common Components and Services". The initial activities are to collect known and identify new requirements for grid services and analysis tools from a range of current and future experiments to determine if existing plans for tools and services meet these requirements. Follow-on activities will foster the interaction between grid service developers, analysis tool developers, experiment analysis framework developers and end user physicists, and will identify and carry out specific development/integration work so that interactive analysis tools utilizing grid services actually provide the capabilities that users need. This talk will summarize what we know of requirements for analysis tools and grid services, as well as describe the identified areas where more development work is needed.

  17. ESHOPPS: A COMPUTATIONAL TOOL TO AID THE TEACHING OF SHORTEST PATH ALGORITHMS

    Directory of Open Access Journals (Sweden)

    S. J. de A. LIMA

    2015-07-01

The development of a computational tool called EShoPPS – Environment for Shortest Path Problem Solving, which is used to assist students in understanding the working of Dijkstra, Greedy search and A* (star) algorithms, is presented in this paper. Such algorithms are commonly taught in graduate and undergraduate courses in Engineering and Informatics and are used for solving many optimization problems that can be characterized as shortest path problems. EShoPPS is an interactive tool that allows students to create a graph representing the problem and also helps in developing their knowledge of each specific algorithm. Experiments performed with 155 students of undergraduate and graduate courses such as Industrial Engineering, Computer Science and Information Systems have shown that by using the EShoPPS tool students were able to improve their interpretation of the investigated algorithms.
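For readers who want to see what one of the tool's target algorithms looks like in code, below is a compact, standard Dijkstra implementation over an adjacency-list graph; it is a textbook version, independent of EShoPPS itself.

```python
# Standard Dijkstra shortest-path algorithm (textbook version, not EShoPPS code).
import heapq


def dijkstra(graph, source):
    """graph: {node: [(neighbour, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, node = heapq.heappop(heap)
        if d > dist.get(node, float("inf")):
            continue                      # stale queue entry, already improved
        for neighbour, weight in graph.get(node, []):
            nd = d + weight
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                heapq.heappush(heap, (nd, neighbour))
    return dist


g = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)], "C": [("D", 1)], "D": []}
print(dijkstra(g, "A"))   # {'A': 0, 'B': 2, 'C': 3, 'D': 4}
```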

  18. Examining the Effects of Field Dependence-Independence on Learners' Problem-Solving Performance and Interaction with a Computer Modeling Tool: Implications for the Design of Joint Cognitive Systems

    Science.gov (United States)

    Angeli, Charoula

    2013-01-01

    An investigation was carried out to examine the effects of cognitive style on learners' performance and interaction during complex problem solving with a computer modeling tool. One hundred and nineteen undergraduates volunteered to participate in the study. Participants were first administered a test, and based on their test scores they were…

  19. PIMiner: A web tool for extraction of protein interactions from biomedical literature

    KAUST Repository

    Chowdhary, Rajesh

    2013-01-01

Information on Protein Interactions (PIs) is valuable for biomedical research, but often lies buried in the scientific literature and cannot be readily retrieved. While much progress has been made over the years in extracting PIs from the literature using computational methods, there is a lack of free, public, user-friendly tools for the discovery of PIs. We developed an online tool for the extraction of PI relationships from PubMed abstracts, which we name PIMiner. Protein pairs and the words that describe their interactions are reported by PIMiner so that new interactions can be easily detected within text. The interaction likelihood levels are reported too. The option to extract only specific types of interactions is also provided. The PIMiner server can be accessed through a web browser or remotely through a client's command line. PIMiner can process 50,000 PubMed abstracts in approximately 7 min and thus appears suitable for large-scale processing of biological/biomedical literature. Copyright © 2013 Inderscience Enterprises Ltd.

  20. Cross-cultural human-computer interaction and user experience design a semiotic perspective

    CERN Document Server

    Brejcha, Jan

    2015-01-01

    This book describes patterns of language and culture in human-computer interaction (HCI). Through numerous examples, it shows why these patterns matter and how to exploit them to design a better user experience (UX) with computer systems. It provides scientific information on the theoretical and practical areas of the interaction and communication design for research experts and industry practitioners and covers the latest research in semiotics and cultural studies, bringing a set of tools and methods to benefit the process of designing with the cultural background in mind.

  1. The Benefits & Drawbacks of Integrating Cloud Computing and Interactive Whiteboards in Teacher Preparation

    Science.gov (United States)

    Blue, Elfreda; Tirotta, Rose

    2011-01-01

Twenty-first century technology has changed the way tools are used to support and enhance learning and instruction. Cloud computing and interactive whiteboards make it possible for learners to interact, simulate, collaborate, and document learning experiences and real world problem-solving. This article discusses how various technologies (blogs,…

  2. Integrating Computational Science Tools into a Thermodynamics Course

    Science.gov (United States)

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and to model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired to computer simulations to implement these modules have a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  3. Interactive computer enhanced remote viewing system

    International Nuclear Information System (INIS)

    Smith, D.A.; Tourtellott, J.A.

    1994-01-01

The Interactive, Computer Enhanced, Remote Viewing System (ICERVS) is a volumetric data system designed to help the Department of Energy (DOE) improve remote operations in hazardous sites by providing reliable and accurate maps of task spaces where robots will clean up nuclear wastes. The ICERVS mission is to acquire, store, integrate and manage all the sensor data for a site and to provide the necessary tools to facilitate its visualization and interpretation. Empirical sensor data enters through the Common Interface for Sensors and, after initial processing, is stored in the Volumetric Database. The data can be analyzed and displayed via a Graphic User Interface with a variety of visualization tools. Other tools permit the construction of geometric objects, such as wire frame models, to represent objects which the operator may recognize in the live TV image. A computer image can be generated that matches the viewpoint of the live TV camera at the remote site, facilitating access to site data. Lastly, the data can be gathered, processed, and transmitted in acceptable form to a robotic controller. Descriptions are given of all these components. The final phase of the ICERVS project, which has just begun, will produce a full-scale system and demonstrate it at a DOE site to be selected. A task added to this phase will adapt the ICERVS to meet the needs of the Dismantlement and Decommissioning (D and D) work at the Oak Ridge National Laboratory (ORNL)

  4. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

    Science.gov (United States)

    Yip, George W; Rajendran, Kanagasuntheram

    2008-06-01

Computer-aided instruction materials are becoming increasingly popular in medical education and particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing a three-dimensional visualization of human structure that is essential to applications in clinical practice and the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

  5. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    International Nuclear Information System (INIS)

    Ranken, D.; George, J.

    1993-01-01

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities

  6. Tools for Embedded Computing Systems Software

    Science.gov (United States)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  7. From Human-Computer Interaction to Human-Robot Social Interaction

    OpenAIRE

    Toumi, Tarek; Zidani, Abdelmadjid

    2014-01-01

Human-Robot Social Interaction has become one of the active research fields in which researchers from different areas propose solutions and directives leading robots to improve their interactions with humans. In this paper we propose to introduce works in both human-robot interaction and human-computer interaction and to build a bridge between them, i.e. to integrate the emotion and capability concepts of the robot into the human-computer model so that it becomes adequate for human-robot interaction, and to discuss chall...

  8. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Science.gov (United States)

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  9. Child-Computer Interaction: ICMI 2012 special session

    NARCIS (Netherlands)

    Nijholt, Antinus; Morency, L.P.; Bohus, L.; Aghajan, H.; Nijholt, Antinus; Cassell, J.; Epps, J.

    2012-01-01

    This is a short introduction to the special session on child computer interaction at the International Conference on Multimodal Interaction 2012 (ICMI 2012). In human-computer interaction users have become participants in the design process. This is not different for child computer interaction

  10. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, user and multiuser interaction, which can hardly be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples show how to implement interactive parallel and distributed processing.

  11. Augmented reality as a tool for linguistic research: Intercepting and manipulating multimodal interaction

    OpenAIRE

    Pitsch, Karola; Neumann, Alexander; Schnier, Christian; Hermann, Thomas

    2013-01-01

    We suggest that an Augmented Reality (AR) system for coupled interaction partners provides a new tool for linguistic research that makes it possible to manipulate the coparticipants’ real-time perception and action. It encompasses novel facilities for recording heterogeneous sensor-rich data sets to be accessed in parallel with qualitative/manual and quantitative/computational methods.

  12. Occupational stress in human computer interaction.

    Science.gov (United States)

    Smith, M J; Conway, F T; Karsh, B T

    1999-04-01

    There have been a variety of research approaches that have examined the stress issues related to human computer interaction including laboratory studies, cross-sectional surveys, longitudinal case studies and intervention studies. A critical review of these studies indicates that there are important physiological, biochemical, somatic and psychological indicators of stress that are related to work activities where human computer interaction occurs. Many of the stressors of human computer interaction at work are similar to those stressors that have historically been observed in other automated jobs. These include high workload, high work pressure, diminished job control, inadequate employee training to use new technology, monotonous tasks, poor supervisory relations, and fear for job security. New stressors have emerged that can be tied primarily to human computer interaction. These include technology breakdowns, technology slowdowns, and electronic performance monitoring. The effects of the stress of human computer interaction in the workplace are increased physiological arousal; somatic complaints, especially of the musculoskeletal system; mood disturbances, particularly anxiety, fear and anger; and diminished quality of working life, such as reduced job satisfaction. Interventions to reduce the stress of computer technology have included improved technology implementation approaches and increased employee participation in implementation. Recommendations for ways to reduce the stress of human computer interaction at work are presented. These include proper ergonomic conditions, increased organizational support, improved job content, proper workload to decrease work pressure, and enhanced opportunities for social support. A model approach to the design of human computer interaction at work that focuses on the system "balance" is proposed.

  13. Specalyzer—an interactive online tool to analyze spectral reflectance measurements

    Directory of Open Access Journals (Sweden)

    Alexander Koc

    2018-06-01

    Low-cost phenotyping using proximal sensors is becoming increasingly popular in plant breeding. As these techniques generate a large amount of data, analysis pipelines that do not require expertise in computer programming can benefit a broader user base. In this work, a new online tool Specalyzer is presented that allows interactive analysis of the spectral reflectance data generated by proximal spectroradiometers. Specalyzer can be operated from any web browser allowing data uploading, analysis, interactive plots and exporting by point and click using a simple graphical user interface. Specalyzer is evaluated with case study data from a winter wheat fertilizer trial with two fertilizer treatments. Specalyzer can be accessed online at http://www.specalyzer.org.
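
    Specalyzer itself is an online, point-and-click application, but the kind of band averaging and index calculation it automates can be sketched in a few lines. The following Python snippet (our illustration, not Specalyzer code) computes a normalized difference vegetation index (NDVI) per plot from a reflectance matrix; the wavelength windows and the random data are assumptions made for the example.

      # Minimal sketch of a spectral-index calculation over reflectance spectra.
      import numpy as np
      import pandas as pd

      # Hypothetical reflectance matrix: rows = field plots, columns = wavelengths (nm).
      wavelengths = np.arange(350, 2501)
      reflectance = pd.DataFrame(
          np.random.default_rng(0).uniform(0.05, 0.60, size=(4, wavelengths.size)),
          index=["plot_1", "plot_2", "plot_3", "plot_4"],
          columns=wavelengths,
      )

      def band_mean(df, lo, hi):
          """Mean reflectance over the wavelength window [lo, hi] nm."""
          cols = [w for w in df.columns if lo <= w <= hi]
          return df[cols].mean(axis=1)

      red = band_mean(reflectance, 660, 680)   # red band
      nir = band_mean(reflectance, 780, 800)   # near-infrared band
      ndvi = (nir - red) / (nir + red)
      print(ndvi.round(3))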

  14. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    CERN Document Server

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB is accessible after just one semester of calculus. It introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  15. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  16. Interactive computing in BASIC an introduction to interactive computing and a practical course in the BASIC language

    CERN Document Server

    Sanderson, Peter C

    1973-01-01

    Interactive Computing in BASIC: An Introduction to Interactive Computing and a Practical Course in the BASIC Language provides a general introduction to the principles of interactive computing and a comprehensive practical guide to the programming language Beginners All-purpose Symbolic Instruction Code (BASIC). The book starts by providing an introduction to computers and discussing the aspects of terminal usage, programming languages, and the stages in writing and testing a program. The text then discusses BASIC with regard to methods in writing simple arithmetical programs, control statements

  17. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  18. Interactive Media and Simulation Tools for Technical Training

    Science.gov (United States)

    Gramoll, Kurt

    1997-01-01

    Over the last several years, integration of multiple media sources into a single information system has been rapidly developing. It has been found that when sound, graphics, text, animations, and simulations are skillfully integrated, the sum of the parts exceeds the individual parts for effective learning. In addition, simulations can be used to design and understand complex engineering processes. With the recent introduction of many high-level authoring, animation, modeling, and rendering programs for personal computers, significant multimedia programs can be developed by practicing engineers, scientists and even managers for both training and education. However, even with these new tools, a considerable amount of time is required to produce an interactive multimedia program. The development of both CD-ROM and Web-based programs are discussed in addition to the use of technically oriented animations. Also examined are various multimedia development tools and how they are used to develop effective engineering education courseware. Demonstrations of actual programs in engineering mechanics are shown.

  19. Development of a computer tool to support scenario analysis for safety assessment of HLW geological disposal

    International Nuclear Information System (INIS)

    Makino, Hitoshi; Kawamura, Makoto; Wakasugi, Keiichiro; Okubo, Hiroo; Takase, Hiroyasu

    2007-02-01

    In the 'H12 Project to Establishing Technical Basis for HLW Disposal in Japan', a systematic approach based on an international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated through the domestic and international peer review. However, it was also suggested that there were issues related to improving the transparency and traceability of the procedure. To achieve this, improvement of the scenario analysis method has been studied. In this study, based on an improved method for the treatment of FEP interactions, a computer tool to support scenario analysis by performance assessment specialists has been developed. The anticipated effects of this tool are to improve the efficiency of the complex and time-consuming scenario analysis work and to reduce the possibility of human errors in this work. The tool also makes it possible to describe interactions among a vast number of FEPs and the related information as an interaction matrix, and to analyse those interactions from a variety of perspectives. (author)

  20. Communication Styles of Interactive Tools for Self-Improvement.

    Science.gov (United States)

    Niess, Jasmin; Diefenbach, Sarah

    Interactive products for self-improvement (e.g., online trainings to reduce stress, fitness gadgets) have become increasingly popular among consumers and healthcare providers. In line with the idea of positive computing, these tools aim to support their users on their way to improved well-being and human flourishing. As an interdisciplinary domain, the design of self-improvement technologies requires psychological, technological, and design expertise. One needs to know how to support people in behavior change, and one needs to find ways to do this through technology design. However, as recent reviews show, the interlocking relationship between these disciplines can still be improved. Many existing technologies for self-improvement neglect psychological theory on behavior change; in particular, motivational factors are not sufficiently considered. To counteract this, we suggest a focus on the dialog and emerging communication between product and user, considering the self-improvement tool as an interactive coach and advisor. The present qualitative interview study (N = 18) explored the user experience of self-improvement technologies. A special focus was on the perceived dialog between tool and user, which we analyzed in terms of models from communication psychology. Our findings show that users are sensitive to the way the product "speaks to them" and consider this as essential for their experience and successful change. Analysis revealed different communication styles of self-improvement tools (e.g., helpful-cooperative, rational-distanced, critical-aggressive), each linked to specific emotional consequences. These findings form one starting point for a more psychologically founded design of self-improvement technology. On a more general level, our approach aims to contribute to a better integration of psychological and technological knowledge and, in consequence, to supporting users on their way to enhanced well-being.

  1. Computation as Medium

    DEFF Research Database (Denmark)

    Jochum, Elizabeth Ann; Putnam, Lance

    2017-01-01

    Artists increasingly utilize computational tools to generate art works. Computational approaches to art making open up new ways of thinking about agency in interactive art because they invite participation and allow for unpredictable outcomes. Computational art is closely linked to the participatory turn in visual art, wherein spectators physically participate in visual art works. Unlike purely physical methods of interaction, computer assisted interactivity affords artists and spectators more nuanced control of artistic outcomes. Interactive art brings together human bodies, computer code, and nonliving objects to create emergent art works. Computation is more than just a tool for artists; it is a medium for investigating new aesthetic possibilities for choreography and composition. We illustrate this potential through two artistic projects: an improvisational dance performance between a human...

  2. Applications of computational tools in biosciences and medical engineering

    CERN Document Server

    Altenbach, Holm

    2015-01-01

     This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine the fields of engineering and bio/medical. Nevertheless, there are still hurdles since both directions are based on quite different ways of education. Often even the “language” can vary from discipline to discipline.

  3. Affective Computing and Intelligent Interaction

    CERN Document Server

    2012-01-01

    2012 International Conference on Affective Computing and Intelligent Interaction (ICACII 2012) was the most comprehensive conference focused on the various aspects of advances in Affective Computing and Intelligent Interaction. The conference provided a rare opportunity to bring together worldwide academic researchers and practitioners for exchanging the latest developments and applications in this field such as Intelligent Computing, Affective Computing, Machine Learning, Business Intelligence and HCI.   This volume is a collection of 119 papers selected from 410 submissions from universities and industries all over the world, based on their quality and relevancy to the conference. All of the papers have been peer-reviewed by selected experts.  

  4. An educational tool for interactive parallel and distributed processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2012-01-01

    In this article we try to describe how the modular interactive tiles system (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing a hands-on educational tool that allows a change in the representation of abstract problems related to designing interactive parallel and distributed systems. Indeed, the MITS seems to bring a series of goals into education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, and user and multi-user interaction which can rarely be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples we show how to implement interactive parallel and distributed processing.

  5. Interactive visualization of Earth and Space Science computations

    Science.gov (United States)

    Hibbard, William L.; Paul, Brian E.; Santek, David A.; Dyer, Charles R.; Battaiola, Andre L.; Voidrot-Martinez, Marie-Francoise

    1994-01-01

    Computers have become essential tools for scientists simulating and observing nature. Simulations are formulated as mathematical models but are implemented as computer algorithms to simulate complex events. Observations are also analyzed and understood in terms of mathematical models, but the number of these observations usually dictates that we automate analyses with computer algorithms. In spite of their essential role, computers are also barriers to scientific understanding. Unlike hand calculations, automated computations are invisible and, because of the enormous numbers of individual operations in automated computations, the relation between an algorithm's input and output is often not intuitive. This problem is illustrated by the behavior of meteorologists responsible for forecasting weather. Even in this age of computers, many meteorologists manually plot weather observations on maps, then draw isolines of temperature, pressure, and other fields by hand (special pads of maps are printed for just this purpose). Similarly, radiologists use computers to collect medical data but are notoriously reluctant to apply image-processing algorithms to that data. To these scientists with life-and-death responsibilities, computer algorithms are black boxes that increase rather than reduce risk. The barrier between scientists and their computations can be bridged by techniques that make the internal workings of algorithms visible and that allow scientists to experiment with their computations. Here we describe two interactive systems developed at the University of Wisconsin-Madison Space Science and Engineering Center (SSEC) that provide these capabilities to Earth and space scientists.

  6. Fundamentals of human-computer interaction

    CERN Document Server

    Monk, Andrew F

    1985-01-01

    Fundamentals of Human-Computer Interaction aims to sensitize the systems designer to the problems faced by the user of an interactive system. The book grew out of a course entitled "The User Interface: Human Factors for Computer-based Systems" which has been run annually at the University of York since 1981. This course has been attended primarily by systems managers from the computer industry. The book is organized into three parts. Part One focuses on the user as processor of information with studies on visual perception; extracting information from printed and electronically presented

  7. An interactive visualization tool for mobile objects

    Science.gov (United States)

    Kobayashi, Tetsuo

    Recent advancements in mobile devices---such as Global Positioning System (GPS), cellular phones, car navigation system, and radio-frequency identification (RFID)---have greatly influenced the nature and volume of data about individual-based movement in space and time. Due to the prevalence of mobile devices, vast amounts of mobile objects data are being produced and stored in databases, overwhelming the capacity of traditional spatial analytical methods. There is a growing need for discovering unexpected patterns, trends, and relationships that are hidden in the massive mobile objects data. Geographic visualization (GVis) and knowledge discovery in databases (KDD) are two major research fields that are associated with knowledge discovery and construction. Their major research challenges are the integration of GVis and KDD, enhancing the ability to handle large volume mobile objects data, and high interactivity between the computer and users of GVis and KDD tools. This dissertation proposes a visualization toolkit to enable highly interactive visual data exploration for mobile objects datasets. Vector algebraic representation and online analytical processing (OLAP) are utilized for managing and querying the mobile object data to accomplish high interactivity of the visualization tool. In addition, reconstructing trajectories at user-defined levels of temporal granularity with time aggregation methods allows exploration of the individual objects at different levels of movement generality. At a given level of generality, individual paths can be combined into synthetic summary paths based on three similarity measures, namely, locational similarity, directional similarity, and geometric similarity functions. A visualization toolkit based on the space-time cube concept exploits these functionalities to create a user-interactive environment for exploring mobile objects data. Furthermore, the characteristics of visualized trajectories are exported to be utilized for data
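
    The abstract names three similarity functions but does not spell out their formulas, so the Python sketch below uses plausible stand-ins (our assumption, not the dissertation's definitions): a locational measure based on point-to-point distance, a directional measure based on the headings of successive movement vectors, and a geometric measure that compares shapes after removing absolute position. Two trajectories are assumed to be sampled at the same time stamps.

      # Illustrative similarity measures for two equally sampled (x, y) trajectories.
      import numpy as np

      def locational_similarity(a, b):
          """Negative mean point-to-point distance (higher means more similar)."""
          return -np.mean(np.linalg.norm(a - b, axis=1))

      def directional_similarity(a, b):
          """Mean cosine similarity between successive movement vectors."""
          da, db = np.diff(a, axis=0), np.diff(b, axis=0)
          dots = np.sum(da * db, axis=1)
          norms = np.linalg.norm(da, axis=1) * np.linalg.norm(db, axis=1)
          return np.mean(dots / np.where(norms == 0.0, 1.0, norms))

      def geometric_similarity(a, b):
          """Shape comparison after translating both paths to a common origin."""
          return -np.mean(np.linalg.norm((a - a[0]) - (b - b[0]), axis=1))

      a = np.array([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 2.5]])
      b = np.array([[0.0, 1.0], [1.0, 2.0], [2.0, 3.0], [3.0, 3.5]])
      print(locational_similarity(a, b), directional_similarity(a, b), geometric_similarity(a, b))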

  8. Interactive computer graphics applications for compressible aerodynamics

    Science.gov (United States)

    Benson, Thomas J.

    1994-01-01

    Three computer applications have been developed to solve inviscid compressible fluids problems using interactive computer graphics. The first application is a compressible flow calculator which solves for isentropic flow, normal shocks, and oblique shocks or centered expansions produced by two dimensional ramps. The second application couples the solutions generated by the first application to a more graphical presentation of the results to produce a desk top simulator of three compressible flow problems: 1) flow past a single compression ramp; 2) flow past two ramps in series; and 3) flow past two opposed ramps. The third application extends the results of the second to produce a design tool which solves for the flow through supersonic external or mixed compression inlets. The applications were originally developed to run on SGI or IBM workstations running GL graphics. They are currently being extended to solve additional types of flow problems and modified to operate on any X-based workstation.
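
    The original applications were written for GL-based SGI and IBM workstations, but the relations they evaluate are textbook results. As a minimal sketch of the normal-shock part of such a calculator (plain Python written for this digest, not the NASA code), the standard relations for a calorically perfect gas are:

      # Normal-shock relations for a calorically perfect gas (default gamma = 1.4).
      def normal_shock(m1, gamma=1.4):
          """Return downstream Mach number and static pressure/density/temperature ratios."""
          if m1 <= 1.0:
              raise ValueError("a normal shock requires a supersonic upstream Mach number")
          m2 = ((1.0 + 0.5 * (gamma - 1.0) * m1**2) /
                (gamma * m1**2 - 0.5 * (gamma - 1.0))) ** 0.5
          p_ratio = 1.0 + 2.0 * gamma / (gamma + 1.0) * (m1**2 - 1.0)
          rho_ratio = (gamma + 1.0) * m1**2 / ((gamma - 1.0) * m1**2 + 2.0)
          t_ratio = p_ratio / rho_ratio
          return m2, p_ratio, rho_ratio, t_ratio

      m2, p21, rho21, t21 = normal_shock(2.0)
      print(f"M2 = {m2:.3f}, p2/p1 = {p21:.3f}, rho2/rho1 = {rho21:.3f}, T2/T1 = {t21:.3f}")

    For an upstream Mach number of 2.0 this reproduces the familiar tabulated values (M2 of about 0.577, p2/p1 = 4.5), a convenient sanity check for any such calculator.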

  9. Advanced computational simulations of water waves interacting with wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Freniere, Cole; Raessi, Mehdi

    2017-03-01

    Wave energy converter (WEC) devices harness the renewable ocean wave energy and convert it into useful forms of energy, e.g. mechanical or electrical. This paper presents an advanced 3D computational framework to study the interaction between water waves and WEC devices. The computational tool solves the full Navier-Stokes equations and considers all important effects impacting the device performance. To enable large-scale simulations in fast turnaround times, the computational solver was developed in an MPI parallel framework. A fast multigrid preconditioned solver is introduced to solve the computationally expensive pressure Poisson equation. The computational solver was applied to two surface-piercing WEC geometries: bottom-hinged cylinder and flap. Their numerically simulated response was validated against experimental data. Additional simulations were conducted to investigate the applicability of Froude scaling in predicting full-scale WEC response from the model experiments.
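
    Since the paper examines the applicability of Froude scaling, a short reminder of the standard scaling laws may help. The Python sketch below (ours, with made-up model-test numbers) applies the usual Froude relations for a geometric scale ratio, assuming the same fluid at model and full scale:

      # Standard Froude scaling of model-test quantities to full scale (same fluid).
      def froude_scale(scale, period_s, force_n, power_w):
          """Scale a wave period, a force, and a power by the geometric ratio 'scale'."""
          return {
              "period_s": period_s * scale ** 0.5,   # time scales with lambda^(1/2)
              "force_N": force_n * scale ** 3,       # force scales with lambda^3
              "power_W": power_w * scale ** 3.5,     # power scales with lambda^(3.5)
          }

      # Hypothetical 1:25 model test of a bottom-hinged flap device.
      print(froude_scale(scale=25, period_s=1.6, force_n=120.0, power_w=4.0))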

  10. Hybrid Design Tools Intuit Interaction

    NARCIS (Netherlands)

    Wendrich, Robert E.; Kyvsgaard Hansen, P.; Rasmussen, J.; Jorgensen, K.A.; Tollestrup, C.

    2012-01-01

    Non-linear, non-explicit, non-standard thinking and ambiguity in design tools has a great impact on enhancement of creativity during ideation and conceptualization. Tacit-tangible representation based on a mere idiosyncratic and individual approach combined with computational assistance allows the

  11. Partitioned Fluid-Structure Interaction for Full Rotor Computations Using CFD

    DEFF Research Database (Denmark)

    Heinz, Joachim Christian

    The aero-elastic response of wind turbines is traditionally computed with a blade element momentum (BEM) based aerodynamic model which is computationally cheap but includes several limitations and corrections in order to account for three-dimensional and unsteady effects. The present work discusses the development of an aero-elastic simulation tool where high-fidelity computational fluid dynamics (CFD) is used to model the aerodynamics of the flexible wind turbine rotor. Respective CFD computations are computationally expensive but do not show the limitations of the BEM-based models. It is one of the first times that high-fidelity fluid-structure interaction (FSI) simulations are used to model the aero-elastic response of an entire wind turbine rotor. The work employs a partitioned FSI coupling between the multi-body-based structural model of the aero-elastic solver HAWC2 and the finite volume CFD solver EllipSys3D. In order to establish an FSI coupling of sufficient time accuracy and sufficient numerical

  12. MVPACK: a computer-aided design tool for multivariable control systems

    International Nuclear Information System (INIS)

    Mensah, S.; Frketich, G.

    1985-10-01

    The design and analysis of high-performance controllers for complex plants require a collection of interactive, powerful computer software. MVPACK, an open-ended package for the computer-aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive and includes a comprehensive state-of-the-art mathematical library to support development of complex, multivariable, control algorithms. Coded in RATFOR, MVPACK is portable with minimal changes. It operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. The existence of a help mechanism enhances the simplicity of package utilization. This paper provides a brief tutorial overview of the package. It reviews the specifications used in the design and implementation of the package and briefly describes the database structure, supporting libraries and some design and analysis modules of MVPACK. Several application examples to illustrate the capability of the package are given. Experience with MVPACK shows that the package provides a synergistic environment for the design of control and regulation systems, and that it is a unique tool for training of control system engineers

  13. Heterogeneous computing architecture for fast detection of SNP-SNP interactions.

    Science.gov (United States)

    Sluga, Davor; Curk, Tomaz; Zupan, Blaz; Lotric, Uros

    2014-06-25

    The extent of data in a typical genome-wide association study (GWAS) poses considerable computational challenges to software tools for gene-gene interaction discovery. Exhaustive evaluation of all interactions among hundreds of thousands to millions of single nucleotide polymorphisms (SNPs) may require weeks or even months of computation. Massively parallel hardware within a modern Graphic Processing Unit (GPU) and Many Integrated Core (MIC) coprocessors can shorten the run time considerably. While the utility of GPU-based implementations in bioinformatics has been well studied, MIC architecture has been introduced only recently and may provide a number of comparative advantages that have yet to be explored and tested. We have developed a heterogeneous, GPU and Intel MIC-accelerated software module for SNP-SNP interaction discovery to replace the previously single-threaded computational core in the interactive web-based data exploration program SNPsyn. We report on differences between these two modern massively parallel architectures and their software environments. Their utility resulted in an order of magnitude shorter execution times when compared to the single-threaded CPU implementation. The GPU implementation on a single Nvidia Tesla K20 runs twice as fast as that for the MIC architecture-based Xeon Phi P5110 coprocessor, but also requires considerably more programming effort. General purpose GPUs are a mature platform with large amounts of computing power capable of tackling inherently parallel problems, but can prove demanding for the programmer. On the other hand, the new MIC architecture, albeit lacking in raw performance, reduces the programming effort and makes up for it with a more general architecture suitable for a wider range of problems.
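
    The computational core being accelerated is, in essence, an exhaustive scan over all SNP pairs with an information-based interaction score. The CPU-only Python sketch below (our simplification with random data, not the GPU/MIC module) shows the structure of that scan and why it parallelizes so naturally: every pair is scored independently.

      # Exhaustive pairwise SNP interaction scan with an information-gain style score.
      import itertools
      import numpy as np

      def entropy(labels):
          _, counts = np.unique(labels, return_counts=True)
          p = counts / counts.sum()
          return -np.sum(p * np.log2(p))

      def mutual_info(x, y):
          # Joint entropy via an injective pairing of the two integer codes (y < 10).
          return entropy(x) + entropy(y) - entropy(x * 10 + y)

      def interaction_gain(snp_a, snp_b, phenotype):
          pair = snp_a * 3 + snp_b                       # joint genotype code 0..8
          return (mutual_info(pair, phenotype)
                  - mutual_info(snp_a, phenotype)
                  - mutual_info(snp_b, phenotype))

      rng = np.random.default_rng(1)
      genotypes = rng.integers(0, 3, size=(500, 20))     # 500 samples x 20 SNPs (toy size)
      phenotype = rng.integers(0, 2, size=500)           # binary case/control status

      scores = {(i, j): interaction_gain(genotypes[:, i], genotypes[:, j], phenotype)
                for i, j in itertools.combinations(range(genotypes.shape[1]), 2)}
      best = max(scores, key=scores.get)
      print("highest-scoring pair:", best, "gain:", round(scores[best], 4))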

  14. Eye Tracking Based Control System for Natural Human-Computer Interaction

    Directory of Open Access Journals (Sweden)

    Xuebai Zhang

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of eye tracking technique in user-computer dialogue, a novel eye control system with integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by only using user’s eye. The usage flow of the proposed system is designed to perfectly follow human natural habits. Additionally, a magnifier module is proposed to allow the accurate operation. In the experiment, two interactive tasks with different difficulty (searching article and browsing multimedia web) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  15. Eye Tracking Based Control System for Natural Human-Computer Interaction.

    Science.gov (United States)

    Zhang, Xuebai; Liu, Xiaolong; Yuan, Shyan-Ming; Lin, Shu-Fan

    2017-01-01

    Eye movement can be regarded as a pivotal real-time input medium for human-computer communication, which is especially important for people with physical disability. In order to improve the reliability, mobility, and usability of eye tracking technique in user-computer dialogue, a novel eye control system with integrating both mouse and keyboard functions is proposed in this paper. The proposed system focuses on providing a simple and convenient interactive mode by only using user's eye. The usage flow of the proposed system is designed to perfectly follow human natural habits. Additionally, a magnifier module is proposed to allow the accurate operation. In the experiment, two interactive tasks with different difficulty (searching article and browsing multimedia web) were done to compare the proposed eye control tool with an existing system. The Technology Acceptance Model (TAM) measures are used to evaluate the perceived effectiveness of our system. It is demonstrated that the proposed system is very effective with regard to usability and interface design.

  16. Fluid-structure interaction computations for geometrically resolved rotor simulations using CFD

    DEFF Research Database (Denmark)

    Heinz, Joachim Christian; Sørensen, Niels N.; Zahle, Frederik

    2016-01-01

    This paper presents a newly developed high-fidelity fluid–structure interaction simulation tool for geometrically resolved rotor simulations of wind turbines. The tool consists of a partitioned coupling between the structural part of the aero-elastic solver HAWC2 and the finite volume computational fluid dynamics (CFD) solver EllipSys3D. The paper shows that the implemented loose coupling scheme, despite a non-conservative force transfer, maintains a sufficient numerical stability and a second-order time accuracy. The use of a strong coupling is found to be redundant. In a first test case, the newly developed coupling between HAWC2 and EllipSys3D (HAWC2CFD) is utilized to compute the aero-elastic response of the NREL 5-MW reference wind turbine (RWT) under normal operational conditions. A comparison with the low-fidelity but state-of-the-art aero-elastic solver HAWC2 reveals a very good agreement.

  17. pulver: an R package for parallel ultra-rapid p-value computation for linear regression interaction terms.

    Science.gov (United States)

    Molnos, Sophie; Baumbach, Clemens; Wahl, Simone; Müller-Nurasyid, Martina; Strauch, Konstantin; Wang-Sattler, Rui; Waldenberger, Melanie; Meitinger, Thomas; Adamski, Jerzy; Kastenmüller, Gabi; Suhre, Karsten; Peters, Annette; Grallert, Harald; Theis, Fabian J; Gieger, Christian

    2017-09-29

    Genome-wide association studies allow us to understand the genetics of complex diseases. Human metabolism provides information about the disease-causing mechanisms, so it is usual to investigate the associations between genetic variants and metabolite levels. However, only considering genetic variants and their effects on one trait ignores the possible interplay between different "omics" layers. Existing tools only consider single-nucleotide polymorphism (SNP)-SNP interactions, and no practical tool is available for large-scale investigations of the interactions between pairs of arbitrary quantitative variables. We developed an R package called pulver to compute p-values for the interaction term in a very large number of linear regression models. Comparisons based on simulated data showed that pulver is much faster than the existing tools. This is achieved by using the correlation coefficient to test the null-hypothesis, which avoids the costly computation of inversions. Additional tricks are a rearrangement of the order, when iterating through the different "omics" layers, and implementing this algorithm in the fast programming language C++. Furthermore, we applied our algorithm to data from the German KORA study to investigate a real-world problem involving the interplay among DNA methylation, genetic variants, and metabolite levels. The pulver package is a convenient and rapid tool for screening huge numbers of linear regression models for significant interaction terms in arbitrary pairs of quantitative variables. pulver is written in R and C++, and can be downloaded freely from CRAN at https://cran.r-project.org/web/packages/pulver/ .
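
    pulver is an R package, so the snippet below is only our reading of the underlying principle, written in Python rather than taken from the package: by the Frisch-Waugh-Lovell theorem, the t-test on the interaction coefficient of y ~ x1 + x2 + x1*x2 is equivalent to a correlation test between y and the product term after both have been residualized on the main effects, which is why a correlation coefficient is enough to obtain the p-value. The data are simulated.

      # Correlation-based p-value for a linear-regression interaction term.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n = 1000
      x1, x2 = rng.normal(size=n), rng.normal(size=n)
      y = 0.5 * x1 - 0.3 * x2 + 0.2 * x1 * x2 + rng.normal(size=n)

      def residualize(v, design):
          """Residuals of v after ordinary least squares on the design matrix."""
          beta, *_ = np.linalg.lstsq(design, v, rcond=None)
          return v - design @ beta

      design = np.column_stack([np.ones(n), x1, x2])
      r_y = residualize(y, design)
      r_prod = residualize(x1 * x2, design)

      r = np.corrcoef(r_y, r_prod)[0, 1]
      df = n - 4                                   # intercept, x1, x2, interaction
      t = r * np.sqrt(df) / np.sqrt(1.0 - r**2)
      p = 2.0 * stats.t.sf(abs(t), df)
      print(f"interaction t = {t:.2f}, p = {p:.2e}")

    The resulting t and p match what a full least-squares fit of the four-parameter model would report for the interaction coefficient.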

  18. Novel 3D Approach to Flare Modeling via Interactive IDL Widget Tools

    Science.gov (United States)

    Nita, G. M.; Fleishman, G. D.; Gary, D. E.; Kuznetsov, A.; Kontar, E. P.

    2011-12-01

    Currently, and soon-to-be, available sophisticated 3D models of particle acceleration and transport in solar flares require a new level of user-friendly visualization and analysis tools allowing quick and easy adjustment of the model parameters and computation of realistic radiation patterns (images, spectra, polarization, etc). We report the current state of the art of these tools in development, already proved to be highly efficient for the direct flare modeling. We present an interactive IDL widget application intended to provide a flexible tool that allows the user to generate spatially resolved radio and X-ray spectra. The object-based architecture of this application provides full interaction with imported 3D magnetic field models (e.g., from an extrapolation) that may be embedded in a global coronal model. Various tools provided allow users to explore the magnetic connectivity of the model by generating magnetic field lines originating in user-specified volume positions. Such lines may serve as reference lines for creating magnetic flux tubes, which are further populated with user-defined analytical thermal/non thermal particle distribution models. By default, the application integrates IDL callable DLL and Shared libraries containing fast GS emission codes developed in FORTRAN and C++ and soft and hard X-ray codes developed in IDL. However, the interactive interface allows interchanging these default libraries with any user-defined IDL or external callable codes designed to solve the radiation transfer equation in the same or other wavelength ranges of interest. To illustrate the tool capacity and generality, we present a step-by-step real-time computation of microwave and X-ray images from realistic magnetic structures obtained from a magnetic field extrapolation preceding a real event, and compare them with the actual imaging data obtained by NORH and RHESSI instruments. We discuss further anticipated developments of the tools needed to accommodate

  19. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    Science.gov (United States)

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method that involves using recently developed energy discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominating undergoing interaction between the x-ray and the sample is the incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem, due to its dependence on energy, material composition, and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a CdTe single PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where the incoherent scattering interactions become prevailing (>50 keV).
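
    McXtrace is the actual ray-tracing framework used in the paper; the bare-bones Python sketch below (ours, with made-up attenuation coefficients) only illustrates the elementary Monte Carlo step such simulations repeat: sample the depth of the first interaction from an exponential distribution, then decide between absorption and incoherent scattering according to the relative cross sections.

      # First-collision Monte Carlo sketch for mono-energetic photons in a slab.
      import numpy as np

      rng = np.random.default_rng(7)
      n_photons = 100_000
      thickness_cm = 2.0
      mu_absorb, mu_scatter = 0.15, 0.35          # hypothetical coefficients, 1/cm
      mu_total = mu_absorb + mu_scatter

      # Depth of the first interaction, sampled from an exponential free-path distribution.
      depth = rng.exponential(1.0 / mu_total, size=n_photons)
      interacts = depth < thickness_cm

      # Split interacting photons into absorbed vs. scattered by relative cross section.
      scattered = interacts & (rng.uniform(size=n_photons) < mu_scatter / mu_total)
      absorbed = interacts & ~scattered

      print("transmitted unscattered:", int(np.sum(~interacts)))
      print("photo-absorbed:         ", int(np.sum(absorbed)))
      print("scattered at least once:", int(np.sum(scattered)))

    In a full scattering-correction workflow, the scattered photons would then be tracked further and tallied on the detector to estimate the scatter contribution that is removed from the measured signal.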

  20. Mutually Beneficial Foreign Language Learning: Creating Meaningful Interactions through Video-Synchronous Computer-Mediated Communication

    Science.gov (United States)

    Kato, Fumie; Spring, Ryan; Mori, Chikako

    2016-01-01

    Providing learners of a foreign language with meaningful opportunities for interactions, specifically with native speakers, is especially challenging for instructors. One way to overcome this obstacle is through video-synchronous computer-mediated communication tools such as Skype software. This study reports quantitative and qualitative data from…

  1. Accelerator physics analysis with interactive tools

    International Nuclear Information System (INIS)

    Holt, J.A.; Michelotti, L.

    1993-05-01

    Work is in progress on interactive tools for linear and nonlinear accelerator design, analysis, and simulation using X-based graphics. The BEAMLINE and MXYZPTLK class libraries were used with an X Windows graphics library to build a program for interactively editing lattices and studying their properties.

  2. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
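
    To make the geometry concrete, the Python sketch below (ours, not the portable tool described in the abstract) produces a crude parallel-beam DRR from a synthetic CT volume: CT numbers are mapped to linear attenuation coefficients, the attenuation is integrated along each ray (here simply along one axis), and the transmitted intensity is turned into an image. The conversion constants are rough illustrative values.

      # Crude parallel-beam digitally reconstructed radiograph from a synthetic CT volume.
      import numpy as np

      MU_WATER = 0.02          # 1/mm, rough attenuation of water at diagnostic energies
      VOXEL_MM = 1.0

      # Synthetic volume in Hounsfield units: air background with a soft-tissue sphere.
      z, y, x = np.mgrid[0:64, 0:64, 0:64]
      ct_hu = np.full((64, 64, 64), -1000.0)
      ct_hu[(x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2] = 60.0

      mu = np.clip(MU_WATER * (1.0 + ct_hu / 1000.0), 0.0, None)   # HU -> attenuation
      path_integral = mu.sum(axis=0) * VOXEL_MM                    # line integrals along z
      drr = 1.0 - np.exp(-path_integral)                           # film-like density image
      print(drr.shape, float(drr.min()), round(float(drr.max()), 3))

    A clinical DRR tool additionally models divergent beams, source penumbra, and user-defined windowing, features of the kind that the calculation parameters described above expose.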

  3. Computer Assisted Advising Tool (CAAT).

    Science.gov (United States)

    Matsen, Marie E.

    Lane Community College's Computer Assisted Advising Tool (CAAT) is used by counselors to assist students in developing a plan for the completion of a degree or certificate. CAAT was designed to facilitate student advisement from matriculation to graduation by comparing degree requirements with the courses completed by students. Three major sources…

  4. HPCToolkit: performance tools for scientific computing

    Energy Technology Data Exchange (ETDEWEB)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  5. HPCToolkit: performance tools for scientific computing

    International Nuclear Information System (INIS)

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  6. iTools: a framework for classification, categorization and integration of computational biology resources.

    Directory of Open Access Journals (Sweden)

    Ivo D Dinov

    2008-05-01

    The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  7. Computer interfacing

    CERN Document Server

    Dixey, Graham

    1994-01-01

    This book explains how computers interact with the world around them and therefore how to make them a useful tool. Topics covered include descriptions of all the components that make up a computer, principles of data exchange, interaction with peripherals, serial communication, input devices, recording methods, computer-controlled motors, and printers.In an informative and straightforward manner, Graham Dixey describes how to turn what might seem an incomprehensible 'black box' PC into a powerful and enjoyable tool that can help you in all areas of your work and leisure. With plenty of handy

  8. Human-computer interaction : Guidelines for web animation

    OpenAIRE

    Galyani Moghaddam, Golnessa; Moballeghi, Mostafa

    2006-01-01

    Human-computer interaction in the large is an interdisciplinary area which attracts researchers, educators, and practitioners from many different fields. Human-computer interaction studies a human and a machine in communication; it draws from supporting knowledge on both the machine and the human side. This paper is related to the human side of human-computer interaction and focuses on animations. The growing use of animation in Web pages testifies to the increasing ease with which such multim...

  9. A Survey of Educational Games as Interaction Design Tools for Affective Learning: Thematic Analysis Taxonomy

    Science.gov (United States)

    Yusoff, Zarwina; Kamsin, Amirrudin; Shamshirband, Shahaboddin; Chronopoulos, Anthony T.

    2018-01-01

    A Computer game is the new platform in generating learning experiences for educational purposes. There are many educational games that have been used as an interaction design tool in a learning environment to enhance students learning outcomes. However, research also claims that playing video games can have a negative impact on student behavior,…

  10. Hybrid integral-differential simulator of EM force interactions/scenario-assessment tool with pre-computed influence matrix in applications to ITER

    Science.gov (United States)

    Rozov, V.; Alekseev, A.

    2015-08-01

    A necessity to address a wide spectrum of engineering problems in ITER determined the need for efficient tools for modeling of the magnetic environment and force interactions between the main components of the magnet system. The assessment of the operating window for the machine, determined by the electro-magnetic (EM) forces, and the check of feasibility of particular scenarios play an important role for ensuring the safety of exploitation. Such analysis-powered prevention of damages forms an element of the Machine Operations and Investment Protection strategy. The corresponding analysis is a necessary step in preparation of the commissioning, which finalizes the construction phase. It shall be supported by the development of the efficient and robust simulators and multi-physics/multi-system integration of models. The developed numerical model of interactions in the ITER magnetic system, based on the use of pre-computed influence matrices, facilitated immediate and complete assessment and systematic specification of EM loads on magnets in all foreseen operating regimes, their maximum values, envelopes and the most critical scenarios. The common principles of interaction in typical bilateral configurations have been generalized for asymmetry conditions, inspired by the plasma and by the hardware, including asymmetric plasma event and magnetic system fault cases. The specification of loads is supported by the technology of functional approximation of nodal and distributed data by continuous patterns/analytical interpolants. The global model of interactions together with the mesh-independent analytical format of output provides the source of self-consistent and transferable data on the spatial distribution of the system of forces for assessments of structural performance of the components, assemblies and supporting structures. The numerical model used is fully parametrized, which makes it very suitable for multi-variant and sensitivity studies (positioning, off
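
    The abstract does not reproduce the actual ITER matrices, so the Python sketch below is only a generic illustration of the pre-computed influence-matrix idea: once a geometry-dependent matrix has been assembled, the force on each coil for any current scenario reduces to cheap matrix-vector products that can be swept over a whole waveform; the matrix entries and currents are random placeholders.

      # Generic influence-matrix force evaluation for a set of coils.
      import numpy as np

      rng = np.random.default_rng(3)
      n_coils = 6
      G = rng.normal(scale=1e-6, size=(n_coils, n_coils))   # hypothetical N/A^2 entries
      G = 0.5 * (G + G.T)                                    # mutual influences are symmetric

      def coil_forces(currents_a):
          """One force component per coil: F_i = I_i * sum_j G[i, j] * I_j."""
          return currents_a * (G @ currents_a)

      # Sweep a hypothetical scenario: 100 time points of coil currents in amperes.
      scenario = rng.uniform(1.0e4, 4.0e4, size=(100, n_coils))
      forces = np.array([coil_forces(i_t) for i_t in scenario])
      print("peak |F| per coil [N]:", np.round(np.abs(forces).max(axis=0), 1))

    The payoff described in the abstract is exactly this separation: the expensive field computation is done once when building the matrix, after which multi-variant and sensitivity studies over operating scenarios become inexpensive.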

  11. QR Code: An Interactive Mobile Advertising Tool

    Directory of Open Access Journals (Sweden)

    Ela Sibel Bayrak Meydanoglu

    2013-10-01

    Easy and rapid interaction between consumers and marketers enabled by mobile technology prompted an increase in the usage of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is the QR code (Quick Response Code). Interactive advertising brings some advantages for the companies that apply it. For example, interaction with consumers provides significant information about consumers' preferences. Marketers can use information obtained from consumers for various marketing activities such as customizing advertisement messages, determining the target audience, and improving future products and services. QR codes used in marketing campaigns can provide links to specific websites in which, through various tools (e.g. questionnaires, voting), information about the needs and wants of customers is collected. The aim of this basic research is to illustrate the contribution of QR codes to the realization of the advantages gained by interactive advertising.

  12. Spatial computing in interactive architecture

    NARCIS (Netherlands)

    S.O. Dulman (Stefan); M. Krezer; L. Hovestad

    2014-01-01

    Distributed computing is the theoretical foundation for applications and technologies like interactive architecture, wearable computing, and smart materials. It evolves continuously, following needs rising from scientific developments, novel uses of technology, or simply the curiosity to

  13. Child-Computer Interaction SIG

    DEFF Research Database (Denmark)

    Hourcade, Juan Pablo; Revelle, Glenda; Zeising, Anja

    2016-01-01

    This SIG will provide child-computer interaction researchers and practitioners an opportunity to discuss four topics that represent new challenges and opportunities for the community. The four areas are: interactive technologies for children under the age of five, technology for inclusion, privacy...... and information security in the age of the quantified self, and the maker movement....

  14. INFN-Pisa scientific computation environment (GRID, HPC and Interactive Analysis)

    International Nuclear Information System (INIS)

    Arezzini, S; Carboni, A; Caruso, G; Ciampa, A; Coscetti, S; Mazzoni, E; Piras, S

    2014-01-01

    The INFN-Pisa Tier2 infrastructure is described, optimized not only for GRID CPU and Storage access, but also for a more interactive use of the resources in order to provide good solutions for the final data analysis step. The Data Center, equipped with about 6700 production cores, permits the use of modern analysis techniques realized via advanced statistical tools (like RooFit and RooStat) implemented in multicore systems. In particular, a POSIX file storage access integrated with standard SRM access is provided. The unified storage infrastructure, based on GPFS and Xrootd and used both as an SRM data repository and for interactive POSIX access, is therefore described. Such a common infrastructure allows transparent access to the Tier2 data for the users' interactive analysis. The organization of a specialized many-core CPU facility devoted to interactive analysis is also described, along with the login mechanism integrated with the INFN-AAI (National INFN Infrastructure) to extend site access and use to a geographically distributed community. Such infrastructure is also used for a national computing facility serving the INFN theoretical community; it enables a synergic use of computing and storage resources. Our Center, initially developed for the HEP community, is now growing and also includes fully integrated HPC resources. In recent years a cluster facility (1000 cores, parallel use via InfiniBand connection) has been installed and managed, and we are now updating this facility so that it will provide resources for all the intermediate-level HPC computing needs of the INFN theoretical national community.

  15. Evaluating interactive computer-based scenarios designed for learning medical technology.

    Science.gov (United States)

    Persson, Johanna; Dalholm, Elisabeth Hornyánszky; Wallergård, Mattias; Johansson, Gerd

    2014-11-01

    The use of medical equipment is growing in healthcare, resulting in an increased need for resources to educate users in how to manage the various devices. Learning the practical operation of a device is one thing, but learning how to work with the device in the actual clinical context is more challenging. This paper presents a computer-based simulation prototype for learning medical technology in the context of critical care. Properties from simulation and computer games have been adopted to create a visualization-based, interactive and contextually bound tool for learning. A participatory design process, including three researchers and three practitioners from a clinic for infectious diseases, was adopted to adjust the form and content of the prototype to the needs of the clinical practice and to create a situated learning experience. An evaluation with 18 practitioners showed that practitioners were positive to this type of tool for learning and that it served as a good platform for eliciting and sharing knowledge. Our conclusion is that this type of tools can be a complement to traditional learning resources to situate the learning in a context without requiring advanced technology or being resource-demanding. Copyright © 2014 Elsevier Ltd. All rights reserved.

  16. A communication protocol for interactively controlling software tools

    NARCIS (Netherlands)

    Wulp, van der J.

    2008-01-01

    We present a protocol for interactively using software tools in a loosely coupled tool environment. Such an environment can assist the user in doing tasks that require the use of multiple tools. For example, it can invoke tools on certain input, set processing parameters, await task completion and

  17. Qudit-Basis Universal Quantum Computation Using χ^{(2)} Interactions.

    Science.gov (United States)

    Niu, Murphy Yuezhen; Chuang, Isaac L; Shapiro, Jeffrey H

    2018-04-20

    We prove that universal quantum computation can be realized-using only linear optics and χ^{(2)} (three-wave mixing) interactions-in any (n+1)-dimensional qudit basis of the n-pump-photon subspace. First, we exhibit a strictly universal gate set for the qubit basis in the one-pump-photon subspace. Next, we demonstrate qutrit-basis universality by proving that χ^{(2)} Hamiltonians and photon-number operators generate the full u(3) Lie algebra in the two-pump-photon subspace, and showing how the qutrit controlled-Z gate can be implemented with only linear optics and χ^{(2)} interactions. We then use proof by induction to obtain our general qudit result. Our induction proof relies on coherent photon injection or subtraction, a technique enabled by χ^{(2)} interaction between the encoding modes and ancillary modes. Finally, we show that coherent photon injection is more than a conceptual tool, in that it offers a route to preparing high-photon-number Fock states from single-photon Fock states.

  18. An interactive computer code for calculation of gas-phase chemical equilibrium (EQLBRM)

    Science.gov (United States)

    Pratt, B. S.; Pratt, D. T.

    1984-01-01

    A user-friendly, menu-driven, interactive computer program known as EQLBRM, which calculates the adiabatic equilibrium temperature and product composition resulting from the combustion of hydrocarbon fuels with air at specified constant pressure and enthalpy, is discussed. The program was developed primarily as an instructional tool to be run on small computers, allowing the user to economically and efficiently explore the effects of varying fuel type, air/fuel ratio, inlet air and/or fuel temperature, and operating pressure on the performance of continuous-combustion devices such as gas turbine combustors, Stirling engine burners, and power-generation furnaces.
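
    The EQLBRM code itself is not reproduced here; as a hedged sketch of the same constant-pressure, constant-enthalpy (HP) equilibrium calculation, the example below uses the open-source Cantera library with its bundled GRI-Mech 3.0 mechanism. The fuel, equivalence ratio, and inlet state are arbitrary illustrative choices.

```python
# Constant-pressure, constant-enthalpy (HP) equilibrium of a methane/air mixture,
# the kind of calculation EQLBRM performs. Uses Cantera as a stand-in; the inlet
# state and equivalence ratio below are arbitrary examples.
import cantera as ct

gas = ct.Solution("gri30.yaml")          # GRI-Mech 3.0 for hydrocarbon/air chemistry
gas.TP = 300.0, ct.one_atm               # inlet temperature [K] and pressure [Pa]
gas.set_equivalence_ratio(phi=0.8, fuel="CH4", oxidizer="O2:1.0, N2:3.76")

gas.equilibrate("HP")                    # hold enthalpy and pressure fixed

print(f"Adiabatic flame temperature: {gas.T:.1f} K")
print("Major products (mole fractions):")
for sp in ("CO2", "H2O", "N2", "O2", "CO"):
    print(f"  {sp}: {gas[sp].X[0]:.4f}")
```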

  19. QR Code: An Interactive Mobile Advertising Tool

    OpenAIRE

    Ela Sibel Bayrak Meydanoglu

    2013-01-01

    Easy and rapid interaction between consumers and marketers enabled by mobile technology prompted  an increase in the usage of mobile media as an interactive marketing tool in recent years. One of the mobile technologies that can be used in interactive marketing for advertising is QR code (Quick Response Code). Interactive advertising brings back some advantages for the companies that apply it. For example, interaction with consumers provides significant information about consumers' preference...

  20. Modeling multimodal human-computer interaction

    NARCIS (Netherlands)

    Obrenovic, Z.; Starcevic, D.

    2004-01-01

    Incorporating the well-known Unified Modeling Language into a generic modeling framework makes research on multimodal human-computer interaction accessible to a wide range off software engineers. Multimodal interaction is part of everyday human discourse: We speak, move, gesture, and shift our gaze

  1. Language evolution and human-computer interaction

    Science.gov (United States)

    Grudin, Jonathan; Norman, Donald A.

    1991-01-01

    Many of the issues that confront designers of interactive computer systems also appear in natural language evolution. Natural languages and human-computer interfaces share as their primary mission the support of extended 'dialogues' between responsive entities. Because in each case one participant is a human being, some of the pressures operating on natural languages, causing them to evolve in order to better support such dialogue, also operate on human-computer 'languages' or interfaces. This does not necessarily push interfaces in the direction of natural language - since one entity in this dialogue is not a human, this is not to be expected. Nonetheless, by discerning where the pressures that guide natural language evolution also appear in human-computer interaction, we can contribute to the design of computer systems and obtain a new perspective on natural languages.

  2. Affective Computing used in an imaging interaction paradigm

    DEFF Research Database (Denmark)

    Schultz, Nette

    2003-01-01

    This paper combines affective computing with an imaging interaction paradigm. An imaging interaction paradigm means that human and computer communicates primarily by images. Images evoke emotions in humans, so the computer must be able to behave emotionally intelligent. An affective image selection...

  3. A Utopian agenda in Child-Computer Interaction

    DEFF Research Database (Denmark)

    Iversen, Ole Sejer; Dindler, Christian

    2012-01-01

    While participatory techniques and practices have become commonplace in parts of the Child-Computer Interaction (CCI) related literature we believe that the tradition of Participatory Design has more to offer CCI. In particular, the Scandinavian Cooperative Design tradition, manifest through...... the Utopia project, provides a valuable resource for setting an agenda for CCI research that explicitly addresses ideals and values in research and practice. Based on a revisit of the Utopia project we position the ideals of democracy, skilfulness, and emancipation as the core ideals of a Utopian agenda...... and discuss how these resonate with issues and challenges in CCI research. Moreover, we propose that a Utopian agenda entails an explicit alignment between these ideals, a participatory epistemology, and methodology in terms of tools and techniques in CCI practice....

  4. Minimal mobile human computer interaction

    NARCIS (Netherlands)

    el Ali, A.

    2013-01-01

    In the last 20 years, the widespread adoption of personal, mobile computing devices in everyday life, has allowed entry into a new technological era in Human Computer Interaction (HCI). The constant change of the physical and social context in a user's situation made possible by the portability of

  5. An Interactive Simulation Tool for Production Planning in Bacon Factories

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Nielsen, Kirsten Mølgaard

    1994-01-01

    The paper describes an interactive simulation tool for production planning in bacon factories. The main aim of the tool is to make it possible to combine the production plans of all parts of the factory....

  6. Foundational Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  7. An Interactive Tool for Creating Multi-Agent Systems and Interactive Agent-based Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    Utilizing principles from parallel and distributed processing combined with inspiration from modular robotics, we developed the modular interactive tiles. As an educational tool, the modular interactive tiles facilitate the learning of multi-agent systems and interactive agent-based games...

  8. The Past, Present and Future of Human Computer Interaction

    KAUST Repository

    Churchill, Elizabeth

    2018-01-16

    Human Computer Interaction (HCI) focuses on how people interact with, and are transformed by, computation. Our current technology landscape is changing rapidly. Interactive applications, devices and services are increasingly becoming embedded into our environments, from our homes to the urban and rural spaces we traverse every day. We are increasingly able to, and often required to, manage and configure multiple, interconnected devices and program their interactions. Artificial intelligence (AI) techniques are being used to create dynamic services that learn about us and others, that draw conclusions about our intents and affiliations, and that mould our digital interactions based on predictions about our actions and needs, nudging us toward certain behaviors. Computation is also increasingly embedded into our bodies. Understanding human interactions in this everyday digital and physical context is therefore a central concern. During this lecture, Elizabeth Churchill, Director of User Experience at Google, will talk about how this emerging landscape invites us to revisit old methods and tactics for understanding how people interact with computers and computation, and how it challenges us to think about new methods and frameworks for understanding the future of human-centered computation.

  9. The Impact of an Interactive Computer Game on the Quality of Life of Children Undergoing Chemotherapy.

    Science.gov (United States)

    Fazelniya, Zahra; Najafi, Mostafa; Moafi, Alireza; Talakoub, Sedigheh

    2017-01-01

    Quality of life (QOL) of children with cancer is reduced right from the diagnosis of the disease and the start of treatment. Computer games in medicine are utilized to interact with patients and to improve their health-related behaviors. This study aimed to investigate the effect of an interactive computer game on the QOL of children undergoing chemotherapy. In this clinical trial, 64 children with cancer aged between 8 and 12 years were selected through convenience sampling and randomly assigned to an experimental or control group. The experimental group played a computer game for 3 hours a week for 4 consecutive weeks and the control group only received routine care. The data collection tool was the Pediatric Quality of Life Inventory (PedsQL) 3.0 Cancer Module Child self-report designed for children aged between 8 and 12 years. Data were analyzed using descriptive and inferential statistics in SPSS software. Before the intervention, there was no significant difference between the two groups in terms of mean total QOL score (p = 0.87). However, immediately after the intervention (p = 0.02) and 1 month after the intervention (p < 0.001), the mean total QOL score was significantly higher in the intervention group than in the control group. Based on these findings, computer games seem to be effective as a tool in influencing health-related behavior and improving the QOL of children undergoing chemotherapy. Therefore, according to the findings of this study, computer games can be used to improve the QOL of children undergoing chemotherapy.

  10. Computational tools for high-throughput discovery in biology

    OpenAIRE

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  11. Interacting electrons theory and computational approaches

    CERN Document Server

    Martin, Richard M; Ceperley, David M

    2016-01-01

    Recent progress in the theory and computation of electronic structure is bringing an unprecedented level of capability for research. Many-body methods are becoming essential tools vital for quantitative calculations and understanding materials phenomena in physics, chemistry, materials science and other fields. This book provides a unified exposition of the most-used tools: many-body perturbation theory, dynamical mean field theory and quantum Monte Carlo simulations. Each topic is introduced with a less technical overview for a broad readership, followed by in-depth descriptions and mathematical formulation. Practical guidelines, illustrations and exercises are chosen to enable readers to appreciate the complementary approaches, their relationships, and the advantages and disadvantages of each method. This book is designed for graduate students and researchers who want to use and understand these advanced computational tools, get a broad overview, and acquire a basis for participating in new developments.

  12. Advanced Computing Tools and Models for Accelerator Physics

    International Nuclear Information System (INIS)

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  13. Computer tools for systems engineering at LaRC

    Science.gov (United States)

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed toward this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper reviews the current configuration of these tools and provides information on future milestones and directions.

  14. Creating Electronic Books-Chapters for Computers and Tablets Using Easy Java/JavaScript Simulations, EjsS Modeling Tool

    OpenAIRE

    Wee, Loo Kang

    2015-01-01

    This paper shares my journey (tools used, design principles derived and modeling pedagogy implemented) when creating electronic books-chapters (epub3 format) for computers and tablets using Easy Java/JavaScript Simulations, (old name EJS, new EjsS) Modeling Tool. The theory underpinning this work grounded on learning by doing through dynamic and interactive simulation-models that can be more easily made sense of instead of the static nature of printed materials. I started combining related co...

  15. Software Tools For Large Scale Interactive Hydrodynamic Modeling

    NARCIS (Netherlands)

    Donchyts, G.; Baart, F.; van Dam, A; Jagers, B; van der Pijl, S.; Piasecki, M.

    2014-01-01

    Developing easy-to-use software that combines components for simultaneous visualization, simulation and interaction is a great challenge. Mainly, because it involves a number of disciplines, like computational fluid dynamics, computer graphics, high-performance computing. One of the main

  16. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
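
    To make the kind of computation these tools automate concrete, the sketch below implements one simple, widely reported single-case effect-size index, the percentage of nonoverlapping data (PND). It is offered only as an illustration, not as a metric endorsed by the WWC standards, and the phase data are invented.

```python
# Percentage of nonoverlapping data (PND), one simple single-case effect-size
# index of the sort these computing tools automate. The phase data are invented.
def pnd(baseline, treatment, higher_is_better=True):
    """Share of treatment points exceeding the most extreme baseline point."""
    if higher_is_better:
        threshold = max(baseline)
        exceed = sum(1 for y in treatment if y > threshold)
    else:
        threshold = min(baseline)
        exceed = sum(1 for y in treatment if y < threshold)
    return 100.0 * exceed / len(treatment)

baseline_phase = [2, 3, 3, 4, 2]       # phase A observations
treatment_phase = [5, 6, 4, 7, 8, 6]   # phase B observations
print(f"PND = {pnd(baseline_phase, treatment_phase):.1f}%")
```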

  17. How Our Cognition Shapes and Is Shaped by Technology: A Common Framework for Understanding Human Tool-Use Interactions in the Past, Present, and Future

    Directory of Open Access Journals (Sweden)

    François Osiurak

    2018-03-01

    Full Text Available Over the course of evolution, humans have constantly developed and improved their technologies. This evolution began with the use of physical tools, those tools that increase our sensorimotor abilities (e.g., first stone tools, modern knives, hammers, pencils). Although we still use some of these tools, we also employ in daily life more sophisticated tools for which we do not systematically understand the underlying physical principles (e.g., computers, cars). Current research is also turned toward the development of brain–computer interfaces directly linking our brain activity to machines (i.e., symbiotic tools). The ultimate goal of research on this topic is to identify the key cognitive processes involved in these different modes of interaction. As a primary step toward fulfilling this goal, we offer a first attempt at a common framework, based on the idea that humans shape technologies, which also shape us in return. The proposed framework is organized into three levels, describing how we interact when using physical (Past), sophisticated (Present), and symbiotic (Future) technologies. Here we emphasize the role played by technical reasoning and practical reasoning, two key cognitive processes that could nevertheless be progressively suppressed by the proficient use of sophisticated and symbiotic tools. We hope that this framework will provide a common ground for researchers interested in the cognitive basis of human tool-use interactions, from paleoanthropology to neuroergonomics.

  18. Proxemics in Human-Computer Interaction

    OpenAIRE

    Greenberg, Saul; Honbaek, Kasper; Quigley, Aaron; Reiterer, Harald; Rädle, Roman

    2014-01-01

    In 1966, anthropologist Edward Hall coined the term "proxemics." Proxemics is an area of study that identifies the culturally dependent ways in which people use interpersonal distance to understand and mediate their interactions with others. Recent research has demonstrated the use of proxemics in human-computer interaction (HCI) for supporting users' explicit and implicit interactions in a range of uses, including remote office collaboration, home entertainment, and games. One promise of pro...

  19. Interactive computation of coverage regions for indoor wireless communication

    Science.gov (United States)

    Abbott, A. Lynn; Bhat, Nitin; Rappaport, Theodore S.

    1995-12-01

    This paper describes a system which assists in the strategic placement of rf base stations within buildings. Known as the site modeling tool (SMT), this system allows the user to display graphical floor plans and to select base station transceiver parameters, including location and orientation, interactively. The system then computes and highlights estimated coverage regions for each transceiver, enabling the user to assess the total coverage within the building. For single-floor operation, the user can choose between distance-dependent and partition-dependent path-loss models. Similar path-loss models are also available for the case of multiple floors. This paper describes the method used by the system to estimate coverage for both directional and omnidirectional antennas. The site modeling tool is intended to be simple to use by individuals who are not experts at wireless communication system design, and is expected to be very useful in the specification of indoor wireless systems.
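
    The abstract does not give the SMT model parameters, so the sketch below shows a generic log-distance path-loss estimate with a per-partition loss term, which is the family of distance- and partition-dependent models described. The path-loss exponent, reference loss, per-wall loss, and receiver sensitivity are all assumed values for illustration.

```python
# Log-distance path-loss estimate with an optional per-partition loss term,
# similar in spirit to the distance- and partition-dependent models offered
# by the site modeling tool. All numeric values are illustrative assumptions.
import math

def path_loss_db(d_m, n=3.0, pl_d0_db=40.0, d0_m=1.0,
                 partitions=0, loss_per_partition_db=5.0):
    """Estimated path loss in dB at distance d_m (meters)."""
    if d_m < d0_m:
        d_m = d0_m
    return pl_d0_db + 10.0 * n * math.log10(d_m / d0_m) + partitions * loss_per_partition_db

def in_coverage(tx_power_dbm, d_m, partitions, rx_sensitivity_dbm=-90.0):
    """True if the received-power estimate exceeds the receiver sensitivity."""
    rx_dbm = tx_power_dbm - path_loss_db(d_m, partitions=partitions)
    return rx_dbm >= rx_sensitivity_dbm

# Example: 20 dBm transmitter, receiver 25 m away behind two interior walls.
print(in_coverage(tx_power_dbm=20.0, d_m=25.0, partitions=2))
```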

  20. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    OpenAIRE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software that are used in solving tasks, and of the tools a teacher can rely on when teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, of formulating new tasks with a limited number of tools, and of fast automated checking is discussed. Some methodological comments on the application of computer tools and ...

  1. Caesy: A software tool for computer-aided engineering

    Science.gov (United States)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  2. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-05-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.

  3. Interactive Computer-Enhanced Remote Viewing System (ICERVS): Final report, November 1994--September 1996

    International Nuclear Information System (INIS)

    1997-01-01

    The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface

  4. Human-computer interaction and management information systems

    CERN Document Server

    Galletta, Dennis F

    2014-01-01

    ""Human-Computer Interaction and Management Information Systems: Applications"" offers state-of-the-art research by a distinguished set of authors who span the MIS and HCI fields. The original chapters provide authoritative commentaries and in-depth descriptions of research programs that will guide 21st century scholars, graduate students, and industry professionals. Human-Computer Interaction (or Human Factors) in MIS is concerned with the ways humans interact with information, technologies, and tasks, especially in business, managerial, organizational, and cultural contexts. It is distinctiv

  5. MOLNs: A CLOUD PLATFORM FOR INTERACTIVE, REPRODUCIBLE, AND SCALABLE SPATIAL STOCHASTIC COMPUTATIONAL EXPERIMENTS IN SYSTEMS BIOLOGY USING PyURDME.

    Science.gov (United States)

    Drawert, Brian; Trogdon, Michael; Toor, Salman; Petzold, Linda; Hellander, Andreas

    2016-01-01

    Computational experiments using spatial stochastic simulations have led to important new biological insights, but they require specialized tools and a complex software stack, as well as large and scalable compute and data analysis resources due to the large computational cost associated with Monte Carlo computational workflows. The complexity of setting up and managing a large-scale distributed computation environment to support productive and reproducible modeling can be prohibitive for practitioners in systems biology. This results in a barrier to the adoption of spatial stochastic simulation tools, effectively limiting the type of biological questions addressed by quantitative modeling. In this paper, we present PyURDME, a new, user-friendly spatial modeling and simulation package, and MOLNs, a cloud computing appliance for distributed simulation of stochastic reaction-diffusion models. MOLNs is based on IPython and provides an interactive programming platform for development of sharable and reproducible distributed parallel computational experiments.
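
    The following sketch is not the PyURDME API; it is a minimal, well-mixed Gillespie stochastic simulation of a birth-death process, included only to illustrate the kind of Monte Carlo workflow whose spatial, distributed version MOLNs is built to manage. Rate constants and the random seed are arbitrary.

```python
# Minimal Gillespie SSA for a birth-death process (well-mixed, not spatial).
# This is NOT the PyURDME API; it only illustrates the kind of stochastic
# simulation whose spatial, distributed version MOLNs supports.
import random

def gillespie_birth_death(k_birth=10.0, k_death=0.1, x0=0, t_end=100.0, seed=1):
    rng = random.Random(seed)
    t, x = 0.0, x0
    times, states = [t], [x]
    while t < t_end:
        a_birth = k_birth            # propensity of X -> X + 1
        a_death = k_death * x        # propensity of X -> X - 1
        a_total = a_birth + a_death
        if a_total == 0.0:
            break
        t += rng.expovariate(a_total)              # time to next reaction
        if rng.random() * a_total < a_birth:       # choose which reaction fires
            x += 1
        else:
            x -= 1
        times.append(t)
        states.append(x)
    return times, states

times, states = gillespie_birth_death()
print(f"final copy number after {times[-1]:.1f} time units: {states[-1]}")
```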

  6. Computer-based tools for decision support at the Hanford Site

    International Nuclear Information System (INIS)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the 'glue' or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  7. Computer-based tools for decision support at the Hanford Site

    Energy Technology Data Exchange (ETDEWEB)

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form that can be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the 'glue' or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  9. Towards early software reliability prediction for computer forensic tools (case study).

    Science.gov (United States)

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component based system. It is used, for instance, to analyze the reliability of the state machines of real time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
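
    As a toy sketch of the underlying idea, the code below computes the reliability of a small component-based tool from an absorbing discrete-time Markov chain, in the spirit of Cheung-style architecture-based models. The transition probabilities are invented, and the COSMIC-FFP sizing step described in the abstract is not modeled.

```python
# Architecture-based reliability sketch (absorbing discrete-time Markov chain).
# States 0..2 are components (transient); "success" and "failure" are absorbing.
# All transition probabilities below are made-up examples.
import numpy as np

# Transition probabilities between components, already weighted by each
# component's own reliability; leftover probability flows to the failure state.
P = np.array([
    # comp0  comp1  comp2  success failure
    [0.00,  0.70,  0.25,  0.00,   0.05],   # component 0 (entry point)
    [0.00,  0.00,  0.60,  0.35,   0.05],   # component 1
    [0.00,  0.00,  0.00,  0.92,   0.08],   # component 2
])

Q = P[:, :3]          # transient-to-transient block
b = P[:, 3]           # one-step probability of reaching "success"

# Probability of eventually being absorbed in the success state from state 0.
N = np.linalg.inv(np.eye(3) - Q)       # fundamental matrix
reliability = (N @ b)[0]
print(f"estimated tool reliability: {reliability:.3f}")
```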

  10. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.

  11. Short Paper: Design Tools, Hybridization Exploring Intuitive Interaction

    NARCIS (Netherlands)

    Wendrich, Robert E.; Kuhlen, Torsten; Coquillart, Sabine; Interrante, Victoria

    2010-01-01

    Design and Design Engineering is about making abstract representations often based on fuzzy notions, ideas or prerequisite requirements with the use of various design tools. This paper introduces an interactive hybrid design tool to assist and support singular design activity or multiple

  12. Animal-Computer Interaction (ACI) : An analysis, a perspective, and guidelines

    NARCIS (Netherlands)

    van den Broek, E.L.

    2016-01-01

    Animal-Computer Interaction (ACI)’s founding elements are discussed in relation to its overarching discipline Human-Computer Interaction (HCI). Its basic dimensions are identified: agent, computing machinery, and interaction, and their levels of processing: perceptual, cognitive, and affective.

  13. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    CERN Document Server

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  14. Qudit-Basis Universal Quantum Computation Using χ(2 ) Interactions

    Science.gov (United States)

    Niu, Murphy Yuezhen; Chuang, Isaac L.; Shapiro, Jeffrey H.

    2018-04-01

    We prove that universal quantum computation can be realized—using only linear optics and χ(2 ) (three-wave mixing) interactions—in any (n +1 )-dimensional qudit basis of the n -pump-photon subspace. First, we exhibit a strictly universal gate set for the qubit basis in the one-pump-photon subspace. Next, we demonstrate qutrit-basis universality by proving that χ(2 ) Hamiltonians and photon-number operators generate the full u (3 ) Lie algebra in the two-pump-photon subspace, and showing how the qutrit controlled-Z gate can be implemented with only linear optics and χ(2 ) interactions. We then use proof by induction to obtain our general qudit result. Our induction proof relies on coherent photon injection or subtraction, a technique enabled by χ(2 ) interaction between the encoding modes and ancillary modes. Finally, we show that coherent photon injection is more than a conceptual tool, in that it offers a route to preparing high-photon-number Fock states from single-photon Fock states.

  15. The Interactive Candidate Assessment Tool: A New Way to Interview Residents.

    Science.gov (United States)

    Platt, Michael P; Akhtar-Khavari, Vafa; Ortega, Rafael; Schneider, Jeffrey I; Fineberg, Tabitha; Grundfast, Kenneth M

    2017-06-01

    The purpose of the residency interview is to determine the extent to which a well-qualified applicant is a good fit with a residency program. However, questions asked during residency interviews tend to be standard and repetitive, and they may not elicit information that best differentiates one applicant from another. The iCAT (interactive Candidate Assessment Tool) is a novel interview instrument that allows both interviewers and interviewees to learn about each other in a meaningful way. The iCAT uses a tablet computer to enable the candidate to select questions from an array of video and nonvideo vignettes. Vignettes include recorded videos regarding some aspect of the program, while other icons include questions within recognizable categories. Postinterview surveys demonstrated advantages over traditional interview methods, with 93% agreeing that it was an innovative and effective tool for conducting residency program interviews. The iCAT for residency interviews is a technological advancement that facilitates in-depth candidate assessment.

  16. A computer-aided software-tool for sustainable process synthesis-intensification

    DEFF Research Database (Denmark)

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    and determine within the design space, the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis...... operations as well as reported hybrid/intensified unit operations is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze...... constraints while also matching the design targets, they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more...

  17. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    Science.gov (United States)

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  18. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    Science.gov (United States)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
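
    OPTHYLIC's internals are not reproduced here; the sketch below only illustrates the CLs construction itself for a single-bin counting experiment with no systematic uncertainties, using the observed count as the test statistic. Observed yield, background, and the scan grid are arbitrary assumptions.

```python
# Toy CLs for a single-bin counting experiment (no systematic uncertainties).
# This only illustrates the CLs construction; OPTHYLIC additionally handles
# multi-bin inputs, channel combination, and nuisance-parameter treatment.
import numpy as np

rng = np.random.default_rng(42)

def cls(n_obs, s, b, n_toys=200_000):
    """CLs = CL_{s+b} / CL_b, using the observed count as the test statistic."""
    toys_sb = rng.poisson(s + b, n_toys)
    toys_b = rng.poisson(b, n_toys)
    cl_sb = np.mean(toys_sb <= n_obs)          # p-value under signal+background
    cl_b = np.mean(toys_b <= n_obs)            # p-value under background only
    return cl_sb / cl_b if cl_b > 0 else 0.0

# Scan the signal yield until CLs drops below 0.05 -> approximate 95% CL upper limit.
n_obs, b = 3, 2.5
for s in np.arange(0.5, 15.0, 0.5):
    if cls(n_obs, s, b) < 0.05:
        print(f"approximate 95% CL upper limit on signal yield: s = {s}")
        break
```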

  19. Modeling The Interaction Effects Between Tools And The Work Piece For Metal Forming Processes

    International Nuclear Information System (INIS)

    Franzke, Martin; Puchhala, Sreedhar; Dackweiler, Harald

    2007-01-01

    In metal forming processes, especially in cold forming, the elastic deformation of the tools has a big impact on the final shape of the work-piece. Computing such processes with the plastic behaviour of the work-piece and the elastic deformation of the tools in a single FE model makes it difficult to manage the convergence criteria. The situation is further aggravated if contact situations (e.g., between working and support rolls) have to be considered in the simulation, which requires a very fine discretization of the contact zones of both the tool and the work-piece. This paper presents a recently developed concept that meets these demands very effectively. Within this concept, the computation of the elastic effects of the tools is separated from the process simulation (which considers the elastic-plastic behaviour of the work-piece). The two simulations are coupled via automatic, bidirectional data interchange, because each simulation influences the other. The advantages of this concept include contact situations that are much easier to handle in the process simulation, a smaller stiffness matrix compared with a single-model approach, and good convergence of the computation. The concept is highly general and has been successfully applied to simulate rolling, drawing, extrusion and forging processes. It is being implemented in the FE packages PEP and LARSTRAN/SHAPE. Rolling experiments were conducted in duo and quarto configurations, and an optical three-dimensional digitizing system was used to measure the deformations within the machine and the work-piece profile; these results are used to validate the FE simulations. This work is sponsored by the German Research Foundation (DFG) through the project 'Interaction effects between processes and structures' (SPP1180).
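
    The bidirectional coupling described above can be pictured as a fixed-point iteration between the two separate computations. The sketch below shows such a loop with hypothetical placeholder solvers; solve_workpiece and solve_tool_deflection stand in for the real FE computations and are not PEP or LARSTRAN/SHAPE calls.

```python
# Minimal fixed-point coupling loop between a work-piece simulation and a
# separate elastic tool model, exchanging boundary data each iteration.
# `solve_workpiece` and `solve_tool_deflection` are hypothetical placeholders
# for the two FE computations described above, not real PEP/LARSTRAN calls.
def solve_workpiece(tool_shape):
    """Elastic-plastic work-piece step; returns contact loads on the tool."""
    return [0.9 * x + 1.0 for x in tool_shape]        # toy stand-in

def solve_tool_deflection(contact_loads):
    """Elastic tool step; returns the deflected tool surface."""
    return [0.05 * f for f in contact_loads]          # toy stand-in

tool_shape = [0.0] * 5                                # undeformed tool surface
for iteration in range(50):
    loads = solve_workpiece(tool_shape)
    new_shape = solve_tool_deflection(loads)
    change = max(abs(a - b) for a, b in zip(new_shape, tool_shape))
    tool_shape = new_shape
    if change < 1e-6:                                 # converged coupling
        print(f"coupling converged after {iteration + 1} iterations")
        break
```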

  20. System-level tools and reconfigurable computing for next-generation HWIL systems

    Science.gov (United States)

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing the data streams, the tools must complement this so that a systems engineer is able to construct the final system. The paper presents work in the area of integrating system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This demonstrates how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 x 1024 resolution at 400 Hz frame rates. The processing elements use the latest generation of FPGAs, which implies that the presented systems will be rated in terms of tera (10^12) operations per second.

  1. Humans, computers and wizards human (simulated) computer interaction

    CERN Document Server

    Fraser, Norman; McGlashan, Scott; Wooffitt, Robin

    2013-01-01

    Using data taken from a major European Union funded project on speech understanding, the SunDial project, this book considers current perspectives on human computer interaction and argues for the value of an approach taken from sociology which is based on conversation analysis.

  2. LATTICE: an interactive lattice computer code

    International Nuclear Information System (INIS)

    Staples, J.

    1976-10-01

    LATTICE is a computer code which enables an interactive user to calculate the functions of a synchrotron lattice. This program satisfies the requirements at LBL for a simple interactive lattice program by borrowing ideas from both TRANSPORT and SYNCH. A fitting routine is included
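
    LATTICE itself is not available here; as a generic textbook illustration of the kind of lattice function such codes compute, the sketch below builds the transfer matrix of a thin-lens FODO cell and extracts the phase advance (and hence the tune contribution) from its trace. The drift length and quadrupole focal length are arbitrary.

```python
# One-turn (per-cell) transfer matrix for a simple thin-lens FODO cell and the
# resulting phase advance, a generic calculation of the kind LATTICE automates.
# The magnet strengths and drift lengths below are arbitrary illustrative values.
import numpy as np

def drift(L):
    return np.array([[1.0, L], [0.0, 1.0]])

def thin_quad(f):
    return np.array([[1.0, 0.0], [-1.0 / f, 1.0]])

# FODO cell: half-strength F quad, drift, D quad, drift, half-strength F quad.
L, f = 2.0, 3.0
M = thin_quad(2 * f) @ drift(L) @ thin_quad(-f) @ drift(L) @ thin_quad(2 * f)

cos_mu = 0.5 * np.trace(M)              # cos(phase advance per cell)
if abs(cos_mu) < 1.0:                   # stability condition |Tr M| < 2
    mu = np.arccos(cos_mu)
    print(f"phase advance per cell: {np.degrees(mu):.1f} deg, "
          f"tune per cell: {mu / (2 * np.pi):.3f}")
else:
    print("cell is unstable for these parameters")
```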

  3. 8th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – which is a forum to discuss the latest advancements in the parallel tools.

  4. CMS offline web tools

    International Nuclear Information System (INIS)

    Metson, S; Newbold, D; Belforte, S; Kavka, C; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Tuura, L; Evans, D; Fanfani, A; Feichtinger, D; Kuznetsov, V; Lingen, F van; Wakefield, S

    2008-01-01

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  5. CMS offline web tools

    Energy Technology Data Exchange (ETDEWEB)

    Metson, S; Newbold, D [H.H. Wills Physics Laboratory, University of Bristol, Tyndall Avenue, Bristol BS8 1TL (United Kingdom); Belforte, S; Kavka, C [INFN, Sezione di Trieste (Italy); Bockelman, B [University of Nebraska Lincoln, Lincoln, NE (United States); Dziedziniewicz, K [CERN, Geneva (Switzerland); Egeland, R [University of Minnesota Twin Cities, Minneapolis, MN (United States); Elmer, P [Princeton (United States); Eulisse, G; Tuura, L [Northeastern University, Boston, MA (United States); Evans, D [Fermilab MS234, Batavia, IL (United States); Fanfani, A [Universita degli Studi di Bologna (Italy); Feichtinger, D [PSI, Villigen (Switzerland); Kuznetsov, V [Cornell University, Ithaca, NY (United States); Lingen, F van [California Institute of Technology, Pasedena, CA (United States); Wakefield, S [Blackett Laboratory, Imperial College, London (United Kingdom)

    2008-07-15

    We describe a relatively new effort within CMS to converge on a set of web-based tools, using state-of-the-art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described alongside currently planned developments. The CMS collaboration comprises nearly 3000 people from all over the world. As well as its collaborators, its computing resources are spread all over the globe and are accessed via the LHC grid to run analysis, large-scale production and data transfer tasks. Due to the distributed nature of the collaboration, effective provision of collaborative tools is essential to maximise physics exploitation of the CMS experiment, especially when the size of the CMS data set is considered. CMS has chosen to provide such tools over the world wide web as a top-level service, enabling all members of the collaboration to interact with the various offline computing components. Traditionally, web interfaces have been added in HEP experiments as an afterthought. In the CMS offline project we have decided to put web interfaces, and the development of a common CMS web framework, on an equal footing with the rest of the offline development. Tools exist within CMS to transfer and catalogue data (PhEDEx and DBS/DLS), run Monte Carlo production (ProdAgent) and submit analysis (CRAB). Effective human interfaces to these systems are required for users with different agendas and practical knowledge of the systems to effectively use the CMS computing system. The CMS web tools project aims to provide a consistent interface to all these tools.

  6. Best of Affective Computing and Intelligent Interaction 2013 in Multimodal Interactions

    NARCIS (Netherlands)

    Soleymani, Mohammad; Soleymani, M.; Pun, T.; Pun, Thierry; Nijholt, Antinus

    The fifth biannual Humaine Association Conference on Affective Computing and Intelligent Interaction (ACII 2013) was held in Geneva, Switzerland. This conference featured the recent advancement in affective computing and relevant applications in education, entertainment and health. A number of

  7. Computational tool for postoperative evaluation of cochlear implant patients

    International Nuclear Information System (INIS)

    Giacomini, Guilherme; Pavan, Ana Luiza M.; Pina, Diana R. de; Altemani, Joao M.C.; Castilho, Arthur M.

    2016-01-01

    The aim of this study was to develop a tool to calculate the insertion depth angle of cochlear implants from computed tomography exams. The tool uses different image processing techniques, such as thresholding and active contour. We then compared the average insertion depth angle of three different implant manufacturers. The developed tool can be used, in the future, to compare the insertion depth angle of the cochlear implant with the patient's postoperative hearing response. (author)
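
    The tool's segmentation pipeline is not reproduced here; the sketch below only shows the final geometric step one might use: measuring the angle swept from the round window to the electrode tip around the cochlear center, given landmark coordinates. The coordinates and the assumption of a planar, single-turn measurement are hypothetical simplifications (deep insertions can exceed 360 degrees).

```python
# Minimal sketch of the final geometric step: an insertion depth angle measured
# around the cochlear center, from the round window to the electrode tip.
# Landmark coordinates (in pixels) are hypothetical; in the tool they would come
# from the thresholding / active-contour segmentation stage. Angles beyond one
# full turn are not handled in this simplified version.
import math

def insertion_depth_angle(center, round_window, electrode_tip):
    """Angle (degrees) swept from the round window to the electrode tip."""
    a0 = math.atan2(round_window[1] - center[1], round_window[0] - center[0])
    a1 = math.atan2(electrode_tip[1] - center[1], electrode_tip[0] - center[0])
    return math.degrees(a1 - a0) % 360.0

center = (128.0, 120.0)
round_window = (160.0, 120.0)
electrode_tip = (110.0, 90.0)
angle = insertion_depth_angle(center, round_window, electrode_tip)
print(f"insertion depth angle: {angle:.1f} deg")
```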

  8. Portable computing - A fielded interactive scientific application in a small off-the-shelf package

    Science.gov (United States)

    Groleau, Nicolas; Hazelton, Lyman; Frainier, Rich; Compton, Michael; Colombano, Silvano; Szolovits, Peter

    1993-01-01

    Experience with the design and implementation of a portable computing system for STS crew-conducted science is discussed. Principal-Investigator-in-a-Box (PI) will help the SLS-2 astronauts perform vestibular (human orientation system) experiments in flight. PI is an interactive system that provides data acquisition and analysis, experiment step rescheduling, and various other forms of reasoning to astronaut users. The hardware architecture of PI consists of a computer and an analog interface box. 'Off-the-shelf' equipment is employed in the system wherever possible in an effort to use widely available tools and then to add custom functionality and application codes to them. Other projects which can help prospective teams to learn more about portable computing in space are also discussed.

  9. The impact of an interactive computer game on the quality of life of children undergoing chemotherapy

    Directory of Open Access Journals (Sweden)

    Zahra Fazelniya

    2017-01-01

    Full Text Available Background: Quality of life (QOL) of children with cancer is reduced right from the diagnosis of the disease and the start of treatment. Computer games in medicine are utilized to interact with patients and to improve their health-related behaviors. This study aimed to investigate the effect of an interactive computer game on the QOL of children undergoing chemotherapy. Materials and Methods: In this clinical trial, 64 children with cancer aged between 8 and 12 years were selected through convenience sampling and randomly assigned to an experimental or control group. The experimental group played a computer game for 3 hours a week for 4 consecutive weeks and the control group only received routine care. The data collection tool was the Pediatric Quality of Life Inventory (PedsQL) 3.0 Cancer Module Child self-report designed for children aged between 8 and 12 years. Data were analyzed using descriptive and inferential statistics in SPSS software. Results: Before the intervention, there was no significant difference between the two groups in terms of mean total QOL score (p = 0.87). However, immediately after the intervention (p = 0.02) and 1 month after the intervention (p < 0.001), the overall mean QOL score was significantly higher in the intervention group than the control group. Conclusions: Based on the findings, computer games seem to be effective as a tool in influencing health-related behavior and improving the QOL of children undergoing chemotherapy. Therefore, according to the findings of this study, computer games can be used to improve the QOL of children undergoing chemotherapy.
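
    For readers who want to see what such a between-group comparison looks like in code, the sketch below runs a generic independent-samples t-test on two QOL score arrays with SciPy. The numbers are invented placeholders, not the study's data, and the study's actual analysis in SPSS may have used different tests.

```python
# Generic between-group comparison of QOL scores, of the kind reported above.
# The score arrays are invented placeholders, not the study's data.
import numpy as np
from scipy import stats

qol_game_group = np.array([72, 68, 75, 80, 66, 71, 77, 74])
qol_control_group = np.array([61, 65, 58, 70, 63, 60, 67, 62])

t_stat, p_value = stats.ttest_ind(qol_game_group, qol_control_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```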

  10. Measuring Multimodal Synchrony for Human-Computer Interaction

    NARCIS (Netherlands)

    Reidsma, Dennis; Nijholt, Antinus; Tschacher, Wolfgang; Ramseyer, Fabian; Sourin, A.

    2010-01-01

    Nonverbal synchrony is an important and natural element in human-human interaction. It can also play various roles in human-computer interaction. In particular this is the case in the interaction between humans and the virtual humans that inhabit our cyberworlds. Virtual humans need to adapt their

  11. Computer Assistance for Writing Interactive Programs: TICS.

    Science.gov (United States)

    Kaplow, Roy; And Others

    1973-01-01

    Investigators developed an on-line, interactive programming system--the Teacher-Interactive Computer System (TICS)--to provide assistance to those who were not programmers, but nevertheless wished to write interactive instructional programs. TICS had two components: an author system and a delivery system. Underlying assumptions were that…

  12. General aviation design synthesis utilizing interactive computer graphics

    Science.gov (United States)

    Galloway, T. L.; Smith, M. R.

    1976-01-01

    Interactive computer graphics is a fast growing area of computer application, due to such factors as substantial cost reductions in hardware, general availability of software, and expanded data communication networks. In addition to allowing faster and more meaningful input/output, computer graphics permits the use of data in graphic form to carry out parametric studies for configuration selection and for assessing the impact of advanced technologies on general aviation designs. The incorporation of interactive computer graphics into a NASA developed general aviation synthesis program is described, and the potential uses of the synthesis program in preliminary design are demonstrated.

  13. Modeling with data tools and techniques for scientific computing

    CERN Document Server

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  14. Synchronous Computer-Mediated Communication and Interaction

    Science.gov (United States)

    Ziegler, Nicole

    2016-01-01

    The current study reports on a meta-analysis of the relative effectiveness of interaction in synchronous computer-mediated communication (SCMC) and face-to-face (FTF) contexts. The primary studies included in the analysis were journal articles and dissertations completed between 1990 and 2012 (k = 14). Results demonstrate that interaction in SCMC…

  15. Design tools for computer-generated display of information to operators

    International Nuclear Information System (INIS)

    O'Brien, J.F.; Cain, D.G.; Sun, B.K.H.

    1985-01-01

    More and more computers are being used to process and display information to operators who control nuclear power plants. Implementation of computer-generated displays in power plant control rooms represents a considerable design challenge for industry designers. Over the last several years, EPRI has conducted research aimed at providing industry designers with tools to meet this new design challenge. These tools provide guidance in defining more 'intelligent' information for plant control and in developing effective displays to communicate this information to the operators. (orig./HP)

  16. An integrated computational tool for precipitation simulation

    Science.gov (United States)

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  17. Scratch as a Computational Modelling Tool for Teaching Physics

    Science.gov (United States)

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  18. Final Report: Correctness Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  19. A tool for computing diversity and consideration on differences between diversity indices

    OpenAIRE

    Palaghianu, Ciprian

    2016-01-01

    Diversity represents a key concept in ecology, and there are various methods of assessing it. The multitude of diversity indices is quite puzzling and sometimes difficult to compute for a large volume of data. This paper promotes a computational tool used to assess the diversity of different entities. The BIODIV software is a user-friendly tool, developed using Microsoft Visual Basic. It can compute several diversity indices such as: Shannon, Simpson, Pielou, Brillouin, Berger-Park...
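
    For readers unfamiliar with these indices, a minimal Python sketch of the standard formulas (not the BIODIV code; the paper's exact variants, e.g. Simpson's D versus the Gini-Simpson form, may differ) is:

    # Common diversity indices computed from per-species abundance counts.
    import math

    def diversity_indices(counts):
        n = sum(counts)
        p = [c / n for c in counts if c > 0]
        shannon = -sum(pi * math.log(pi) for pi in p)               # Shannon entropy H'
        simpson = 1 - sum(pi ** 2 for pi in p)                      # Gini-Simpson index 1 - D
        pielou = shannon / math.log(len(p)) if len(p) > 1 else 0.0  # evenness J' = H'/ln(S)
        berger_parker = max(p)                                      # dominance of the most abundant species
        return {"shannon": shannon, "simpson": simpson,
                "pielou": pielou, "berger_parker": berger_parker}

    print(diversity_indices([50, 30, 15, 5]))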

  20. Problems and Issues in Using Computer-Based Support Tools to Enhance 'Soft' Systems Methodologies

    Directory of Open Access Journals (Sweden)

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real-world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  1. Computer-Based Tools for Evaluating Graphical User Interfaces

    Science.gov (United States)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  2. Computer tool to evaluate the cue reactivity of chemically dependent individuals.

    Science.gov (United States)

    Silva, Meire Luci da; Frère, Annie France; Oliveira, Henrique Jesus Quintino de; Martucci Neto, Helio; Scardovelli, Terigi Augusto

    2017-03-01

    Anxiety is one of the major influences on relapse and dropout from substance abuse treatment. Chemically dependent individuals (CDI) need to be aware of their emotional state in situations of risk during their treatment. Many patients do not agree with the diagnosis of the therapist when considering them vulnerable to environmental stimuli related to drugs. This research presents a cue reactivity detection tool based on a device that acquires physiological signals and is connected to a personal computer. Depending on the variations of the emotional state of the drug addict, alterations of the physiological signals will be detected by the computer tool (CT), which will modify the displayed virtual sets without intervention of the therapist. Developed in 3ds Max® software, the CT is composed of scenarios and objects that are part of the daily life of marijuana- and cocaine-dependent individuals. The interaction with the environment is accomplished using a Human-Computer Interface (HCI) that converts incoming physiological signals indicating an anxiety state into commands that change the scenes. Anxiety was characterized by the average variability of cardiac and respiratory rates of 30 volunteers submitted to stressful environmental situations. To evaluate the effectiveness of cue reactivity detection, a total of 50 volunteers dependent on marijuana, cocaine, or both were followed. Prior to CT exposure, the results demonstrated a poor correlation between the therapists' predictions and those of the chemically dependent individuals. After exposure to the CT, there was a significant increase of 73% in awareness of the risks of relapse. We confirmed the hypothesis that the CT, controlled only by physiological signals, increases the perception of vulnerability to risk situations of individuals with dependence on marijuana, cocaine or both. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
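
    As an illustration only: one simple way such a physiological front end could flag an anxiety-like state and trigger a scene change is sketched below. The baseline-ratio rule and thresholds are assumptions for the example, not the authors' detector.

    # Illustrative sketch of converting physiological measurements into a scene-change command.
    import numpy as np

    def anxiety_flag(heart_rate_bpm, resp_rate_bpm, hr_baseline, rr_baseline,
                     hr_gain=1.15, rr_gain=1.20):
        """Return True when both mean rates rise well above the resting baseline."""
        return (np.mean(heart_rate_bpm) > hr_gain * hr_baseline and
                np.mean(resp_rate_bpm) > rr_gain * rr_baseline)

    # The virtual-environment loop would poll this flag over each signal window and, when it
    # fires, switch the displayed scene without therapist intervention.
    print(anxiety_flag([88, 92, 95], [19, 20, 21], hr_baseline=72, rr_baseline=15))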

  3. A database and tool, IM Browser, for exploring and integrating emerging gene and protein interaction data for Drosophila

    Directory of Open Access Journals (Sweden)

    Parrish Jodi R

    2006-04-01

    Full Text Available Abstract. Background: Biological processes are mediated by networks of interacting genes and proteins. Efforts to map and understand these networks are resulting in the proliferation of interaction data derived from both experimental and computational techniques for a number of organisms. The volume of this data combined with the variety of specific forms it can take has created a need for comprehensive databases that include all of the available data sets, and for exploration tools to facilitate data integration and analysis. One powerful paradigm for the navigation and analysis of interaction data is an interaction graph or map that represents proteins or genes as nodes linked by interactions. Several programs have been developed for graphical representation and analysis of interaction data, yet there remains a need for alternative programs that can provide casual users with rapid easy access to many existing and emerging data sets. Description: Here we describe a comprehensive database of Drosophila gene and protein interactions collected from a variety of sources, including low and high throughput screens, genetic interactions, and computational predictions. We also present a program for exploring multiple interaction data sets and for combining data from different sources. The program, referred to as the Interaction Map (IM) Browser, is a web-based application for searching and visualizing interaction data stored in a relational database system. Use of the application requires no downloads and minimal user configuration or training, thereby enabling rapid initial access to interaction data. IM Browser was designed to readily accommodate and integrate new types of interaction data as it becomes available. Moreover, all information associated with interaction measurements or predictions and the genes or proteins involved are accessible to the user. This allows combined searches and analyses based on either common or technique-specific attributes
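
    The interaction-map paradigm itself is easy to illustrate. The sketch below uses networkx and invented gene names; IM Browser itself is a web application backed by a relational database, not this code.

    # Small illustration of an interaction graph with per-edge evidence, enabling
    # technique-specific filtering of the kind described above.
    import networkx as nx

    g = nx.Graph()
    g.add_edge("dpp", "tkv", source="yeast two-hybrid")
    g.add_edge("dpp", "Mad", source="genetic interaction")
    g.add_edge("Mad", "Medea", source="computational prediction")

    # Combined search: neighbors of a gene, restricted to experimentally derived edges.
    experimental = [(u, v) for u, v, d in g.edges("Mad", data=True)
                    if d["source"] != "computational prediction"]
    print(experimental)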

  4. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I.; Bradley, D.; Livny, M.

    2009-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.
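
    To make the idea concrete, here is a hypothetical sketch of pseudo-interactive monitoring: a helper that a batch system could run alongside a job, executing only whitelisted, read-only diagnostic commands in the job's working directory and returning their output. This is not Condor's actual mechanism, just an illustration of the concept.

    import subprocess

    ALLOWED = {"ls", "cat", "top", "ps", "lsof", "netstat"}

    def run_diagnostic(command, job_workdir):
        """Run a whitelisted diagnostic command in the job's working directory."""
        if command[0] not in ALLOWED:
            raise ValueError(f"command {command[0]!r} not allowed")
        result = subprocess.run(command, cwd=job_workdir, capture_output=True,
                                text=True, timeout=30)
        return result.stdout

    # Interactive tools such as vi are not viable this way, but one-shot commands are,
    # e.g. run_diagnostic(["ps", "-ef"], "/path/to/job/scratch/dir")
    print(run_diagnostic(["ls", "-l"], "."))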

  5. Pseudo-interactive monitoring in distributed computing

    International Nuclear Information System (INIS)

    Sfiligoi, I; Bradley, D; Livny, M

    2010-01-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  6. Pseudo-interactive monitoring in distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Sfiligoi, I.; /Fermilab; Bradley, D.; Livny, M.; /Wisconsin U., Madison

    2009-05-01

    Distributed computing, and in particular Grid computing, enables physicists to use thousands of CPU days worth of computing every day, by submitting thousands of compute jobs. Unfortunately, a small fraction of such jobs regularly fail; the reasons vary from disk and network problems to bugs in the user code. A subset of these failures result in jobs being stuck for long periods of time. In order to debug such failures, interactive monitoring is highly desirable; users need to browse through the job log files and check the status of the running processes. Batch systems typically don't provide such services; at best, users get job logs at job termination, and even this may not be possible if the job is stuck in an infinite loop. In this paper we present a novel approach of using regular batch system capabilities of Condor to enable users to access the logs and processes of any running job. This does not provide true interactive access, so commands like vi are not viable, but it does allow operations like ls, cat, top, ps, lsof, netstat and dumping the stack of any process owned by the user; we call this pseudo-interactive monitoring. It is worth noting that the same method can be used to monitor Grid jobs in a glidein-based environment. We further believe that the same mechanism could be applied to many other batch systems.

  7. Interactive simulation of nuclear power systems using a dedicated minicomputer - computer graphics facility

    International Nuclear Information System (INIS)

    Tye, C.; Sezgen, A.O.

    1980-01-01

    The design of control systems and operational procedures for large scale nuclear power plant poses a difficult optimization problem requiring a lot of computational effort. Plant dynamic simulation using digital minicomputers offers the prospect of relatively low cost computing and when combined with graphical input/output provides a powerful tool for studying such problems. The paper discusses the results obtained from a simulation study carried out at the Computer Graphics Unit of the University of Manchester using a typical station control model for an Advanced Gas Cooled reactor. Particular reference is placed on the use of computer graphics for information display, parameter and control system optimization and techniques for using graphical input for defining and/or modifying the control system topology. Experience gained from this study has shown that a relatively modest minicomputer system can be used for simulating large scale dynamic systems and that highly interactive computer graphics can be used to advantage to relieve the designer of many of the tedious aspects of simulation leaving him free to concentrate on the more creative aspects of his work. (author)

  8. Human-Computer Interaction in Smart Environments

    Science.gov (United States)

    Paravati, Gianluca; Gatteschi, Valentina

    2015-01-01

    Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  9. A Web-Based Interactive Tool for Multi-Resolution 3D Models of a Maya Archaeological Site

    Science.gov (United States)

    Agugiaro, G.; Remondino, F.; Girardi, G.; von Schwerin, J.; Richards-Rissetto, H.; De Amicis, R.

    2011-09-01

    Continuous technological advances in surveying, computing and digital-content delivery are strongly contributing to a change in the way Cultural Heritage is "perceived": new tools and methodologies for documentation, reconstruction and research are being created to assist not only scholars, but also to reach more potential users (e.g. students and tourists) willing to access more detailed information about art history and archaeology. 3D computer-simulated models, sometimes set in virtual landscapes, offer for example the chance to explore possible hypothetical reconstructions, while on-line GIS resources can help interactive analyses of relationships and change over space and time. While for some research purposes a traditional 2D approach may suffice, this is not the case for more complex analyses concerning spatial and temporal features of architecture, like for example the relationship of architecture and landscape, visibility studies etc. The project therefore aims at creating a tool, called "QueryArch3D", which enables the web-based visualisation and queries of an interactive, multi-resolution 3D model in the framework of Cultural Heritage. More specifically, a complete Maya archaeological site, located in Copan (Honduras), has been chosen as case study to test and demonstrate the platform's capabilities. Much of the site has been surveyed and modelled at different levels of detail (LoD) and the geometric model has been semantically segmented and integrated with attribute data gathered from several external data sources. The paper describes the characteristics of the research work, along with its implementation issues and the initial results of the developed prototype.

  10. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    Energy Technology Data Exchange (ETDEWEB)

    Wirth, Brian D., E-mail: bdwirth@utk.edu [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Nuclear Science and Engineering Directorate, Oak Ridge National Laboratory, Oak Ridge, TN (United States); Hammond, K.D. [Department of Nuclear Engineering, University of Tennessee, Knoxville, TN 37996 (United States); Krasheninnikov, S.I. [University of California, San Diego, La Jolla, CA (United States); Maroudas, D. [University of Massachusetts, Amherst, Amherst, MA 01003 (United States)

    2015-08-15

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification.

  11. Challenges and opportunities of modeling plasma–surface interactions in tungsten using high-performance computing

    International Nuclear Information System (INIS)

    Wirth, Brian D.; Hammond, K.D.; Krasheninnikov, S.I.; Maroudas, D.

    2015-01-01

    The performance of plasma facing components (PFCs) is critical for ITER and future magnetic fusion reactors. The ITER divertor will be tungsten, which is the primary candidate material for future reactors. Recent experiments involving tungsten exposure to low-energy helium plasmas reveal significant surface modification, including the growth of nanometer-scale tendrils of “fuzz” and formation of nanometer-sized bubbles in the near-surface region. The large span of spatial and temporal scales governing plasma surface interactions are among the challenges to modeling divertor performance. Fortunately, recent innovations in computational modeling, increasingly powerful high-performance computers, and improved experimental characterization tools provide a path toward self-consistent, experimentally validated models of PFC and divertor performance. Recent advances in understanding tungsten–helium interactions are reviewed, including such processes as helium clustering, which serve as nuclei for gas bubbles; and trap mutation, dislocation loop punching and bubble bursting; which together initiate surface morphological modification

  12. The epistemology and ontology of human-computer interaction

    NARCIS (Netherlands)

    Brey, Philip A.E.

    2005-01-01

    This paper analyzes epistemological and ontological dimensions of Human-Computer Interaction (HCI) through an analysis of the functions of computer systems in relation to their users. It is argued that the primary relation between humans and computer systems has historically been epistemic:

  13. Human-Computer Interaction in Smart Environments

    Directory of Open Access Journals (Sweden)

    Gianluca Paravati

    2015-08-01

    Full Text Available Here, we provide an overview of the content of the Special Issue on “Human-computer interaction in smart environments”. The aim of this Special Issue is to highlight technologies and solutions encompassing the use of mass-market sensors in current and emerging applications for interacting with Smart Environments. Selected papers address this topic by analyzing different interaction modalities, including hand/body gestures, face recognition, gaze/eye tracking, biosignal analysis, speech and activity recognition, and related issues.

  14. NPR Lenses : Interactive Tools for Non-photorealistic Line Drawings

    NARCIS (Netherlands)

    Neumann, Petra; Isenberg, Tobias; Carpendale, Sheelagh

    2007-01-01

    NPR Lenses is an interactive technique for producing expressive non-photorealistic renderings. It provides an intuitive visual interaction tool for illustrators, allowing them to seamlessly apply a large variety of emphasis techniques. Advantages of 3D scene manipulation are combined with the

  15. Nsite, NsiteH and NsiteM Computer Tools for Studying Transcription Regulatory Elements

    KAUST Repository

    Shahmuradov, Ilham

    2015-07-02

    Summary: Gene transcription is mostly conducted through interactions of various transcription factors and their binding sites on DNA (regulatory elements, REs). Today, we are still far from understanding the real regulatory content of promoter regions. Computer methods for identification of REs remain a widely used tool for studying and understanding transcriptional regulation mechanisms. The Nsite, NsiteH and NsiteM programs perform searches for statistically significant (non-random) motifs of known human, animal and plant one-box and composite REs in a single genomic sequence, in a pair of aligned homologous sequences and in a set of functionally related sequences, respectively.
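
    The core idea of testing whether a motif occurs more often than chance can be shown with a toy Python sketch (not the Nsite code; the equiprobable-base background model below is a deliberate simplification):

    # Count overlapping motif hits and compare with the count expected by chance.
    import re

    def motif_enrichment(sequence, motif_regex, motif_length):
        hits = len(re.findall(f"(?={motif_regex})", sequence))        # overlapping matches
        # Expected hits if bases were independent and equiprobable (a crude background model).
        expected = (len(sequence) - motif_length + 1) * 0.25 ** motif_length
        return hits, expected

    seq = "TATAAAGGCCTATAAATTTATAAACGT"
    print(motif_enrichment(seq, "TATAAA", 6))   # observed vs. expected TATA-like hits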

  16. A computer tool to support in design of industrial Ethernet.

    Science.gov (United States)

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply including the Power over Ethernet (PoE) concept, and wireless links) and the occupation rate (amount of information transmitted on the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool's software presents a detailed view of the network to the user, highlights possible problems in the network, and offers an extremely friendly environment.
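
    A back-of-the-envelope version of the occupation-rate check reads as follows; the formula is an assumption about what such a check computes, not necessarily the one implemented in the tool.

    def occupation_rate(bits_per_cycle, scan_time_s, bandwidth_bps=100e6):
        """Fraction of the network capacity consumed per controller scan cycle."""
        return bits_per_cycle / (bandwidth_bps * scan_time_s)

    # e.g. 80 kbit of process data every 2 ms on 100 Mbit/s Ethernet -> 40% occupation
    print(f"{occupation_rate(80_000, 0.002):.0%}")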

  17. Cloud Computing as a Tool for Improving Business Competitiveness

    Directory of Open Access Journals (Sweden)

    Wišniewski Michał

    2014-08-01

    Full Text Available This article organizes knowledge on cloud computing, presenting the classification of deployment models, characteristics and service models. The author, looking at the problem from the entrepreneur's perspective, draws attention to the differences in the benefits depending on the cloud computing deployment model and considers an effective way of selecting cloud computing services according to the specificity of the organization. Within this work, the thesis that in economic terms cloud computing is not always the best solution for an organization was considered. This raises the question, "What kind of tools should be used to estimate the usefulness of cloud computing service models in the enterprise?"

  18. Human-Computer Interaction The Agency Perspective

    CERN Document Server

    Oliveira, José

    2012-01-01

    Agent-centric theories, approaches and technologies are helping to enrich interactions between users and computers. This book aims at highlighting the influence of the agency perspective in Human-Computer Interaction through a careful selection of research contributions. Split into five sections (Users as Agents, Agents and Accessibility, Agents and Interactions, Agent-centric Paradigms and Approaches, and Collective Agents), the book covers a wealth of novel, original and fully updated material, offering: coherent, in-depth, and timely material on the agency perspective in HCI; an authoritative treatment of the subject matter by carefully selected authors; balanced and broad coverage of the subject area, including human, organizational, social, and technological concerns; and hands-on experience through representative case studies and essential design guidelines. The book will appeal to a broad audience of resea...

  19. Brain Computer Interfaces for Enhanced Interaction with Mobile Robot Agents

    Science.gov (United States)

    2016-07-27

    Brain Computer Interfaces (BCIs) show great potential in allowing humans to interact with computational environments in a... (Final report covering 17-Sep-2013 to 16-Sep-2014, dated 27-07-2016; distribution unlimited.)

  20. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    International Nuclear Information System (INIS)

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  1. Introduction to human-computer interaction

    CERN Document Server

    Booth, Paul

    2014-01-01

    Originally published in 1989 this title provided a comprehensive and authoritative introduction to the burgeoning discipline of human-computer interaction for students, academics, and those from industry who wished to know more about the subject. Assuming very little knowledge, the book provides an overview of the diverse research areas that were at the time only gradually building into a coherent and well-structured field. It aims to explain the underlying causes of the cognitive, social and organizational problems typically encountered when computer systems are introduced. It is clear and co

  2. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    Science.gov (United States)

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning of science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  3. Symbolic computation of nonlinear wave interactions on MACSYMA

    International Nuclear Information System (INIS)

    Bers, A.; Kulp, J.L.; Karney, C.F.F.

    1976-01-01

    In this paper the use of a large symbolic computation system - MACSYMA - in determining approximate analytic expressions for the nonlinear coupling of waves in an anisotropic plasma is described. MACSYMA was used to implement the solution of the nonlinear partial differential equations of a fluid plasma model by perturbation expansions and subsequent iterative analytic computations. By interacting with the details of the symbolic computation, the physical processes responsible for particular nonlinear wave interactions could be uncovered and appropriate approximations introduced so as to simplify the final analytic result. Details of the MACSYMA system and its use are discussed and illustrated. (Auth.)
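
    The plasma model itself is not reproduced here; the SymPy sketch below only illustrates the general workflow of a symbolic perturbation expansion, applied to the toy nonlinear equation x = 1 + eps*x**2 and solved order by order in eps.

    import sympy as sp

    eps, x0, x1, x2 = sp.symbols("epsilon x0 x1 x2")
    x = x0 + eps * x1 + eps**2 * x2                      # perturbation ansatz
    residual = sp.expand(x - 1 - eps * x**2)             # equation written as residual = 0

    # Collect and solve the O(1), O(eps), O(eps^2) balances.
    order_eqs = [sp.Eq(residual.coeff(eps, n), 0) for n in range(3)]
    solution = sp.solve(order_eqs, [x0, x1, x2], dict=True)
    print(solution)   # x ~ 1 + eps + 2*eps**2 + ...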

  4. 9th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  5. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for estimating the likelihood of threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network-based methodological approach for modelling adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool has the capability to analyze the most critical path and quantify the probability of effectiveness of the system as a performance measure.
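
    A hedged sketch of the network idea follows: adversary path elements as graph edges with detection probabilities, and the "most critical" path taken as the one the adversary is least likely to be detected on. The element names and probabilities are invented for illustration and are not from the cited work.

    import math
    import networkx as nx

    g = nx.DiGraph()
    edges = [  # (from, to, probability of detection at that protection element)
        ("outside", "fence", 0.3), ("fence", "door", 0.6), ("fence", "window", 0.2),
        ("door", "vault", 0.8), ("window", "vault", 0.5),
    ]
    for u, v, p_detect in edges:
        # Minimizing the sum of -log(1 - p) maximizes the probability of evading all sensors.
        g.add_edge(u, v, weight=-math.log(1.0 - p_detect), p_detect=p_detect)

    path = nx.shortest_path(g, "outside", "vault", weight="weight")
    p_evade = math.exp(-nx.shortest_path_length(g, "outside", "vault", weight="weight"))
    print(path, "detection effectiveness along the most critical path:", 1 - p_evade)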

  6. Computer Tools for Construction, Modification and Analysis of Petri Nets

    DEFF Research Database (Denmark)

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  7. Approach and tool for computer animation of fields in electrical apparatus

    International Nuclear Information System (INIS)

    Miltchev, Radoslav; Yatchev, Ivan S.; Ritchie, Ewen

    2002-01-01

    The paper presents a technical approach and post-processing tool for creating and displaying computer animation. The approach enables handling of two- and three-dimensional physical field results obtained from finite element software, and display of movement processes in electrical apparatus simulations. The main goal of this work is to extend the auxiliary features built into general-purpose CAD software working in the Windows environment. Different storage techniques were examined and the one employing image capturing was chosen. The developed tool provides the benefits of independent visualisation, creation of scenarios, and facilities for exporting animations in common file formats for distribution on different computer platforms. It also provides a valuable educational tool. (Author)

  8. 10th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany – a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role for numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists. Some tools have been commercialized, while others are operated as open source by a growing research community.

  9. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Science.gov (United States)

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories, are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource, Roundup, using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool Roundup as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
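
    The record does not give the authors' runtime model; the toy sketch below only illustrates the stated idea of predicting job runtimes from genome sizes and ordering jobs so that billed instance-hours are filled rather than wasted. The constant k and the longest-first packing heuristic are assumptions made for the example.

    import math

    def estimated_runtime_h(size_a_mb, size_b_mb, k=2e-6):
        """Assumed runtime model: proportional to the product of the two genome sizes."""
        return k * size_a_mb * size_b_mb

    def schedule(jobs, n_nodes):
        """Greedy longest-processing-time-first assignment of jobs to cloud nodes."""
        loads = [0.0] * n_nodes
        plan = [[] for _ in range(n_nodes)]
        for name, hours in sorted(jobs, key=lambda j: j[1], reverse=True):
            i = loads.index(min(loads))          # put the next-longest job on the idlest node
            loads[i] += hours
            plan[i].append(name)
        # Billing: each node pays for its load rounded up to whole instance-hours.
        return plan, sum(math.ceil(l) for l in loads)

    jobs = [("humanXmouse", estimated_runtime_h(3000, 2700)),
            ("ecoliXyeast", estimated_runtime_h(4.6, 12.1)),
            ("flyXworm", estimated_runtime_h(140, 100))]
    print(schedule(jobs, 2))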

  10. On Computational Fluid Dynamics Tools in Architectural Design

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    engineering computational fluid dynamics (CFD) simulation program ANSYS CFX and a CFD based representative program RealFlow are investigated. These two programs represent two types of CFD based tools available for use during phases of an architectural design process. However, as outlined in two case studies...

  11. Benefits of Subliminal Feedback Loops in Human-Computer Interaction

    OpenAIRE

    Walter Ritter

    2011-01-01

    A lot of efforts have been directed to enriching human-computer interaction to make the user experience more pleasing or efficient. In this paper, we briefly present work in the fields of subliminal perception and affective computing, before we outline a new approach to add analog communication channels to the human-computer interaction experience. In this approach, in addition to symbolic predefined mappings of input to output, a subliminal feedback loop is used that provides feedback in evo...

  12. An Educational Tool for Creating Distributed Physical Games

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Pagliarini, Luigi

    2011-01-01

    The development of physical interactive games demands extensive knowledge in engineering, computer science and gaming. In this paper we describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming for physical games development. This is done by providing an educational tool that allows a change of representation of the problems related to game design from a virtual to a physical representation. Indeed, MITS seems to be a valuable system for bringing into education a vast number of issues (such as parallel programming, distribution, communication protocols, master dependency, connectivity, topology, island modeling, software behavioral models, adaptive interactivity, feedback, user and multi-user game interaction, etc.). This can both improve the education-related issues in computer...

  13. An Interactive Tool for Outdoor Computer Controlled Cultivation of Microalgae in a Tubular Photobioreactor System

    Directory of Open Access Journals (Sweden)

    Raquel Dormido

    2014-03-01

    Full Text Available This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize the biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate essential variables for microalgae growth to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations.

  14. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    Science.gov (United States)

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgery planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletonization-based cross-sectional geometry analysis tool. Hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. CFD results showed the relative importance of the various surgically created features such as pouch size, vena cava to pulmonary artery (PA) flare and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model

  15. Humor in Human-Computer Interaction : A Short Survey

    NARCIS (Netherlands)

    Nijholt, Anton; Niculescu, Andreea; Valitutti, Alessandro; Banchs, Rafael E.; Joshi, Anirudha; Balkrishan, Devanuj K.; Dalvi, Girish; Winckler, Marco

    2017-01-01

    This paper is a short survey on humor in human-computer interaction. It describes how humor is designed and interacted with in social media, virtual agents, social robots and smart environments. Benefits and future use of humor in interactions with artificial entities are discussed based on

  16. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Science.gov (United States)

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources of SDR cloud data centers and the numerous session requests at certain hours of a day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools to evaluate different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupations and that a tradeoff exists between cluster size and algorithm complexity.
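
    As a hedged illustration of the allocation problem (invented numbers and a plain first-fit rule, not the authors' algorithms): each new session needs a block of processing capacity, and a cluster-local search tries to place it before the request is blocked.

    def allocate(clusters, required_mops):
        """Place an SDR transceiver chain on the first cluster with enough spare capacity.

        clusters      -- list of dicts with 'capacity' and 'used' processing budgets (MOPS)
        required_mops -- processing demand of the new user session
        """
        for i, c in enumerate(clusters):
            if c["capacity"] - c["used"] >= required_mops:
                c["used"] += required_mops
                return i            # index of the hosting cluster
        return None                 # blocked: no cluster can host the session

    clusters = [{"capacity": 10_000, "used": 9_500}, {"capacity": 10_000, "used": 2_000}]
    print(allocate(clusters, 1_200))   # -> 1 (the second cluster hosts the new transceiver)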

  17. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Divisions's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed off line from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving off-line data analysis environment on the USC computers

  18. Database tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  19. Interactive Computer-Assisted Instruction in Acid-Base Physiology for Mobile Computer Platforms

    Science.gov (United States)

    Longmuir, Kenneth J.

    2014-01-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ~20 screens of information, on the subjects…

  20. Numerical tool development of fluid-structure interactions for investigation of obstructive sleep apnea

    Science.gov (United States)

    Huang, Chien-Jung; White, Susan; Huang, Shao-Ching; Mallya, Sanjay; Eldredge, Jeff

    2016-11-01

    Obstructive sleep apnea (OSA) is a medical condition characterized by repetitive partial or complete occlusion of the airway during sleep. The soft tissues in the upper airway of OSA patients are prone to collapse under the low pressure loads incurred during breathing. The ultimate goal of this research is the development of a versatile numerical tool for simulation of air-tissue interactions in patient-specific upper airway geometries. This tool is expected to capture several phenomena, including flow-induced vibration (snoring) and large deformations during airway collapse of the complex airway geometry in respiratory flow conditions. Here, we present our ongoing progress toward this goal. To avoid mesh regeneration, a sharp-interface embedded boundary method on Cartesian grids is used for the flow model to resolve the fluid-structure interface, while a cut-cell finite element method is used for the structural model. Also, to properly resolve large displacements, a non-linear elasticity model is used. The fluid and structure solvers are connected through a strongly coupled iterative algorithm. The parallel computation is achieved with the numerical library PETSc. Some two- and three-dimensional preliminary results are shown to demonstrate the ability of this tool.
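
    The strongly coupled iteration can be pictured as a fixed-point loop with under-relaxation between the two solvers. The sketch below is schematic: the two "solvers" are linear stand-ins, not the embedded-boundary flow solver or the cut-cell FEM structure solver used in the work.

    def fluid_solver(displacement):
        """Placeholder: return a wall pressure load for a given airway-wall displacement."""
        return -50.0 - 400.0 * displacement

    def structure_solver(pressure):
        """Placeholder: return the wall displacement produced by a given pressure load."""
        return pressure / 2000.0

    def coupled_step(d=0.0, omega=0.5, tol=1e-10, max_iter=100):
        for it in range(max_iter):
            p = fluid_solver(d)                    # fluid sub-iteration on the current interface
            d_new = structure_solver(p)            # structure response to the new load
            if abs(d_new - d) < tol:
                return d_new, it                   # converged interface state for this time step
            d = (1 - omega) * d + omega * d_new    # under-relaxed interface update
        raise RuntimeError("coupling did not converge")

    print(coupled_step())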

  1. Scalable space-time adaptive simulation tools for computational electrocardiology

    OpenAIRE

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  2. Use of online interactive tools in an open distance learning context: Health studies students' perspective*

    Directory of Open Access Journals (Sweden)

    Kefiloe A. Maboe

    2017-10-01

    Full Text Available Background: Open distance learning (ODL) institutions provide educational challenges with specific reference to the training of nurses. They have adopted online technologies to facilitate teaching and learning. However, it is observed that most nurses do not use, or only minimally use, tools such as a discussion forum for online interaction to facilitate teaching and learning. Objective: The purpose of this study was to determine how the discussion forum, as an online interactive tool, can be used in an ODL institution to enhance student-to-student and student-to-lecturer online interactions. Design: Quantitative and descriptive in nature. Method: No sampling was done. An online questionnaire was sent to all 410 second- and third-year Health Services Management students around the world registered with a specific ODL institution during the second semester. Eighty-seven students responded to the questionnaire. Data analysis was done quantitatively and descriptively in the form of diagrams. Results: The findings indicated that 84.9% of students own computers and 100% own cellular phones, but only 3.8% participated in the online discussion forum. Some students indicated that they were technologically challenged. Some lecturers interact minimally online and are not supportive of them. The institution does not give them the support they need to acquire the necessary skills to utilise these technologies. Conclusion: The article suggests that lecturers' active interaction in an online discussion forum, as a way of supporting students, is fundamental to effective teaching and learning. The university should consider providing intensive mentoring to students to enable them to utilise the available technologies optimally.

  3. Data visualization, bar naked: A free tool for creating interactive graphics.

    Science.gov (United States)

    Weissgerber, Tracey L; Savic, Marko; Winham, Stacey J; Stanisavljevic, Dejana; Garovic, Vesna D; Milic, Natasa M

    2017-12-15

    Although bar graphs are designed for categorical data, they are routinely used to present continuous data in studies that have small sample sizes. This presentation is problematic, as many data distributions can lead to the same bar graph, and the actual data may suggest different conclusions from the summary statistics. To address this problem, many journals have implemented new policies that require authors to show the data distribution. This paper introduces a free, web-based tool for creating an interactive alternative to the bar graph (http://statistika.mfub.bg.ac.rs/interactive-dotplot/). This tool allows authors with no programming expertise to create customized interactive graphics, including univariate scatterplots, box plots, and violin plots, for comparing values of a continuous variable across different study groups. Individual data points may be overlaid on the graphs. Additional features facilitate visualization of subgroups or clusters of non-independent data. A second tool enables authors to create interactive graphics from data obtained with repeated independent experiments (http://statistika.mfub.bg.ac.rs/interactive-repeated-experiments-dotplot/). These tools are designed to encourage exploration and critical evaluation of the data behind the summary statistics and may be valuable for promoting transparency, reproducibility, and open science in basic biomedical research. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.

  4. Computer graphics and research projects

    International Nuclear Information System (INIS)

    Ingtrakul, P.

    1994-01-01

    This report was prepared as an account of scientific visualization tools and application tools for scientists and engineers. It provides a set of tools to create pictures and to interact with them in natural ways. It applies many techniques of computer graphics and computer animation through a number of full-color presentations, such as computer-animated commercials, 3D computer graphics, dynamic and environmental simulations, scientific modeling and visualization, physically based modelling, and behavioral, skeletal, dynamics, and particle animation. It also takes an in-depth look at the original hardware and the limitations of existing PC graphics adapters that constrain system performance, especially with graphics-intensive application programs and user interfaces.

  5. Human-computer systems interaction backgrounds and applications 3

    CERN Document Server

    Kulikowski, Juliusz; Mroczek, Teresa; Wtorek, Jerzy

    2014-01-01

    This book contains an interesting and state-of-the-art collection of papers on the recent progress in Human-Computer System Interaction (H-CSI). It contributes a profound description of the current status of the H-CSI field and also provides a solid base for further development and research in the discussed area. The contents of the book are divided into the following parts: I. General human-system interaction problems; II. Health monitoring and disabled people helping systems; and III. Various information processing systems. This book is intended for a wide audience of readers who are not necessarily experts in computer science, machine learning or knowledge engineering, but are interested in Human-Computer Systems Interaction. The level of the particular papers and their specific distribution into the parts is a reason why this volume makes fascinating reading. This gives the reader a much deeper insight than he/she might glean from research papers or talks at conferences. It touches on all deep issues that ...

  6. Legal assessment tool (LAT): an interactive tool to address privacy and data protection issues for data sharing.

    Science.gov (United States)

    Kuchinke, Wolfgang; Krauth, Christian; Bergmann, René; Karakoyun, Töresin; Woollard, Astrid; Schluender, Irene; Braasch, Benjamin; Eckert, Martin; Ohmann, Christian

    2016-07-07

    Data in the life sciences are generated at an unprecedented rate and stored in many different databases. An ever-increasing part of these data is human health data and therefore falls under legal data protection regulations. As part of the BioMedBridges project, which created infrastructures that connect more than 10 ESFRI research infrastructures (RI), the legal and ethical prerequisites of data sharing were examined employing a novel and pragmatic approach. We employed concepts from computer science to create legal requirement clusters that enable legal interoperability between databases for the areas of data protection, data security, Intellectual Property (IP) and security of biosample data. We analysed and extracted access rules and constraints from all data providers (databases) involved in the building of data bridges, covering many of Europe's most important databases. These requirement clusters were applied to five usage scenarios representing the data flow in different data bridges: Image bridge, Phenotype data bridge, Personalised medicine data bridge, Structural data bridge, and Biosample data bridge. A matrix was built to relate the important concepts from data protection regulations (e.g. pseudonymisation, identifiability, access control, consent management) with the results of the requirement clusters. An interactive user interface for querying the matrix for the requirements necessary for compliant data sharing was created. To guide researchers through legal requirements without the need for legal expert knowledge, an interactive tool, the Legal Assessment Tool (LAT), was developed. LAT interactively provides researchers with a selection process to characterise the involved types of data and databases and provides suitable requirements and recommendations for concrete data access and sharing situations. The results provided by LAT are based on an analysis of the data access and sharing conditions for different kinds of data of major databases in Europe

  7. Interactive Construction Digital Tools With Real Time Analysis

    DEFF Research Database (Denmark)

    Klitgaard, Jens; Kirkegaard, Poul Henning

    2007-01-01

    The recent developments in computational design tools have evolved into a sometimes purely digital process, which opens up for new perspectives and problems in the sketching process. One of the interesting possibilities lies within the hybrid practitioner- or architect-engineer approach, where an architect-engineer or hybrid practitioner works simultaneously with both aesthetic and technical design requirements. In this paper the problem of a vague or non-existing link between digital design tools, used by architects and designers, and the analysis tools developed by and for engineers is considered. The aim of this research is to look into integrated digital design and analysis tools in order to find out if they are suited for use by architects and designers or only by specialists and technicians - and, if not, to look at what can be done to make them more available to architects and designers.

  8. Interactive computer-assisted instruction in acid-base physiology for mobile computer platforms.

    Science.gov (United States)

    Longmuir, Kenneth J

    2014-03-01

    In this project, the traditional lecture hall presentation of acid-base physiology in the first-year medical school curriculum was replaced by interactive, computer-assisted instruction designed primarily for the iPad and other mobile computer platforms. Three learning modules were developed, each with ∼20 screens of information, on the subjects of the CO2-bicarbonate buffer system, other body buffer systems, and acid-base disorders. Five clinical case modules were also developed. For the learning modules, the interactive, active learning activities were primarily step-by-step learner control of explanations of complex physiological concepts, usually presented graphically. For the clinical cases, the active learning activities were primarily question-and-answer exercises that related clinical findings to the relevant basic science concepts. The student response was remarkably positive, with the interactive, active learning aspect of the instruction cited as the most important feature. Also, students cited the self-paced instruction, extensive use of interactive graphics, and side-by-side presentation of text and graphics as positive features. Most students reported that it took less time to study the subject matter with this online instruction compared with subject matter presented in the lecture hall. However, the approach to learning was highly examination driven, with most students delaying the study of the subject matter until a few days before the scheduled examination. Wider implementation of active learning computer-assisted instruction will require that instructors present subject matter interactively, that students fully embrace the responsibilities of independent learning, and that institutional administrations measure instructional effort by criteria other than scheduled hours of instruction.

  9. Fluctuating hyperfine interactions: computational implementation

    International Nuclear Information System (INIS)

    Zacate, M. O.; Evenson, W. E.

    2010-01-01

    A library of computational routines has been created to assist in the analysis of stochastic models of hyperfine interactions. We call this library the stochastic hyperfine interactions modeling library (SHIML). It provides routines written in the C programming language that (1) read a text description of a model for fluctuating hyperfine fields, (2) set up the Blume matrix, upon which the evolution operator of the system depends, and (3) find the eigenvalues and eigenvectors of the Blume matrix so that theoretical spectra of experimental hyperfine interaction measurements can be calculated. Example model calculations are included in the SHIML package to illustrate its use and to generate perturbed angular correlation spectra for the special case of polycrystalline samples when anisotropy terms of higher order than A22 can be neglected.
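
    As a schematic illustration of the computational pattern described above (eigendecomposition of a small fluctuation matrix and a spectrum built as a weighted sum of exponentials), the sketch below uses NumPy; it is not the SHIML C API, and the matrix entries and weights are purely illustrative.

```python
# Schematic illustration (not the SHIML C API): eigendecompose a small
# Blume-type matrix and build a perturbation function as a weighted sum of
# exponentials exp(lambda_i * t). All numbers are purely illustrative.
import numpy as np

B = np.array([[-0.2 + 1.0j, 0.2],            # toy "Blume matrix": fluctuation
              [0.2, -0.2 - 1.0j]])           # rates coupling two states

eigvals, eigvecs = np.linalg.eig(B)
weights = np.abs(eigvecs[0, :]) ** 2         # illustrative spectral weights
weights /= weights.sum()

t = np.linspace(0.0, 20.0, 200)              # time axis (arbitrary units)
G = sum(w * np.exp(lam * t) for w, lam in zip(weights, eigvals))
print("G(t=0) =", G[0].real)                 # ~1 by normalization
```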

  10. The Design of Tools for Sketching Sensor-Based Interaction

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2012-01-01

    In this paper we motivate, present, and give an initial evaluation of DUL Radio, a small wireless toolkit for sketching sensor-based interaction. In the motivation, we discuss the purpose of this specific platform, which aims to balance ease-of-use (learning, setup, initialization), size, speed, flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound), and we discuss the general issues facing this category of embodied interaction design tools. We then present the platform in more detail, both regarding hardware and software. In the brief evaluation, we present our initial experiences with the platform both in design projects and in teaching. We conclude that DUL Radio does seem to be a relatively easy-to-use tool for sketching sensor-based interaction compared to other solutions, but that there are many ways to improve it. Target ...

  11. Computers in the Classroom: From Tool to Medium.

    Science.gov (United States)

    Perrone, Corrina; Repenning, Alexander; Spencer, Sarah; Ambach, James

    1996-01-01

    Discusses the computer as a communication medium to support learning. Illustrates the benefits of this reconceptualization in the context of having students author and play interactive simulation games and exchange them over the Internet. (RS)

  12. Applying systemic-structural activity theory to design of human-computer interaction systems

    CERN Document Server

    Bedny, Gregory Z; Bedny, Inna

    2015-01-01

    Human-Computer Interaction (HCI) is an interdisciplinary field that has gained recognition as an important field in ergonomics. HCI draws on ideas and theoretical concepts from computer science, psychology, industrial design, and other fields. Human-Computer Interaction is no longer limited to trained software users. Today people interact with various devices such as mobile phones, tablets, and laptops. How can you make such interaction user friendly, even when user proficiency levels vary? This book explores methods for assessing the psychological complexity of computer-based tasks. It also p

  13. VISTA - computational tools for comparative genomics

    Energy Technology Data Exchange (ETDEWEB)

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate the capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes for the kinesin family member 3A (KIF3A) protein.

  14. Mobile human-computer interaction perspective on mobile learning

    CSIR Research Space (South Africa)

    Botha, Adèle

    2010-10-01

    Full Text Available Applying a Mobile Human Computer Interaction (MHCI) view to the domain of education using Mobile Learning (Mlearning), the research outlines its understanding of the influences and effects of different interactions on the use of mobile technology...

  15. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  16. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  17. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  18. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Science.gov (United States)

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
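
    As a minimal illustration of the a posteriori (Bayesian) adjustment the review refers to, the sketch below computes a MAP clearance estimate for a one-compartment bolus model from a single measured concentration; all model choices and numbers are hypothetical and are not taken from any of the reviewed programs.

```python
# Minimal sketch of Bayesian (MAP) individualization for a one-compartment
# IV bolus model: estimate clearance from one measured concentration using
# a log-normal population prior and a proportional residual error.
# All numbers (dose, prior, error model) are hypothetical.
import numpy as np
from scipy.optimize import minimize_scalar

dose, volume = 500.0, 40.0          # mg, L (volume assumed known)
t_obs, c_obs = 12.0, 4.2            # h, mg/L measured concentration
cl_pop, omega = 4.0, 0.3            # population clearance (L/h), log-sd
sigma = 0.15                        # proportional residual error

def neg_log_posterior(log_cl):
    cl = np.exp(log_cl)
    c_pred = (dose / volume) * np.exp(-(cl / volume) * t_obs)
    resid = (np.log(c_obs) - np.log(c_pred)) ** 2 / (2 * sigma**2)
    prior = (log_cl - np.log(cl_pop)) ** 2 / (2 * omega**2)
    return resid + prior

res = minimize_scalar(neg_log_posterior, bounds=(-2, 4), method="bounded")
cl_map = np.exp(res.x)
print(f"MAP clearance: {cl_map:.2f} L/h")
# A new dosage regimen could then be chosen so the predicted concentration
# stays within the target range.
```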

  19. A least-squares computational ''tool kit''

    International Nuclear Information System (INIS)

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational ''tool kit'' to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications
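
    For reference, the generalized least-squares solution that such a tool kit implements can be written in the standard textbook form below (the report's own notation may differ):

```latex
% Textbook form of the generalized least-squares solution for a linear model
% y = A x + e with data covariance V (notation may differ from the report's).
\begin{align*}
  \hat{x} &= \left(A^{\mathsf{T}} V^{-1} A\right)^{-1} A^{\mathsf{T}} V^{-1} y, \\
  \operatorname{Cov}(\hat{x}) &= \left(A^{\mathsf{T}} V^{-1} A\right)^{-1},
\end{align*}
% which reduces to the simple least-squares result when V is proportional
% to the identity matrix.
```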

  20. Developing site-specific interactive environmental management tools: An exciting method of communicating training, procedures, and other information

    Energy Technology Data Exchange (ETDEWEB)

    Jaeckels, J.M.

    1999-07-01

    Environmental managers are faced with numerous programs that must be communicated throughout their organizations. Among these are regulatory training programs, internal environmental policy, regulatory guidance/procedures and internal guidance/procedures. Traditional methods of delivering this type of information are typically confined to written materials and classroom training. There are many challenges faced by environmental managers with these traditional approaches including: determining if recipients of written plans or procedures are reading and comprehending the information; scheduling training sessions to reach all affected people across multiple schedules/shifts; and maintaining adequate training records. In addition, current trends toward performance-based or competency-based training requires a more consistent method of measuring and documenting performance. The use of interactive computer applications to present training or procedural information is a new and exciting tool for delivering environmental information to employees. Site-specific pictures, text, sound, and even video can be combined with multimedia software to create informative and highly interactive applications. Some of the applications that can be produced include integrated environmental training, educational pieces, and interactive environmental procedures. They can be executed from a CD-ROM, hard drive, network or a company Intranet. Collectively, the authors refer to these as interactive environmental management tools (IEMTs). This paper focuses on site-specific, interactive training as an example of an IEMT. Interactive training not only delivers a highly effective message, but can also be designed to focus on site-specific environmental issues that are unique to each company. Interactive training also lends itself well to automated record keeping functions and to reaching all affected employees.

  1. Radiotherapy: an interactive learning tool

    International Nuclear Information System (INIS)

    Frenzel, T.; Kruell, A.; Schmidt, R.; Dobrucki, W.; Malys, B.

    1998-01-01

    The program is primarily intended for radiological medical technicians, student nurses, students of medicine and physics, and doctors. It is designed as a tool for vocational training and further training and gives comprehensive insight into the daily routines of a radiotherapy unit. The chapters deal with: fundamental biological aspects - fundamental physical aspects - radiation sources and irradiation systems - preparatory examinations - therapies and concepts - irradiation planning - irradiation performance - termination of irradiation treatment. For every page displayed, spoken texts and written, on-screen keywords, illustrations, animated sequences and a large number of videos have been combined in a way that is easy to digest. The software also permits handling by learners less familiar with computer-based learning. (orig.) [de]

  2. Interactive software tool to comprehend the calculation of optimal sequence alignments with dynamic programming.

    Science.gov (United States)

    Ibarra, Ignacio L; Melo, Francisco

    2010-07-01

    Dynamic programming (DP) is a general optimization strategy that is successfully used across various disciplines of science. In bioinformatics, it is widely applied in calculating the optimal alignment between pairs of protein or DNA sequences. These alignments form the basis of new, verifiable biological hypotheses. Despite its importance, there are no interactive tools available for training and education on understanding the DP algorithm. Here, we introduce an interactive computer application with a graphical interface, for the purpose of educating students about DP. The program displays the DP scoring matrix and the resulting optimal alignment(s), while allowing the user to modify key parameters such as the values in the similarity matrix, the sequence alignment algorithm version and the gap opening/extension penalties. We hope that this software will be useful to teachers and students of bioinformatics courses, as well as researchers who implement the DP algorithm for diverse applications. The software is freely available at: http://melolab.org/sat. The software is written in the Java computer language, thus it runs on all major platforms and operating systems including Windows, Mac OS X and LINUX. All inquiries or comments about this software should be directed to Francisco Melo at fmelo@bio.puc.cl.
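
    A compact sketch of the algorithm the tool visualizes, global alignment by dynamic programming (Needleman-Wunsch), is given below; the scoring parameters are illustrative, not the tool's defaults.

```python
# Compact Needleman-Wunsch global alignment: fill the DP scoring matrix,
# then trace back one optimal alignment. Scoring parameters are illustrative,
# not the defaults of the tool described above.
def needleman_wunsch(a, b, match=1, mismatch=-1, gap=-2):
    n, m = len(a), len(b)
    score = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        score[i][0] = i * gap
    for j in range(1, m + 1):
        score[0][j] = j * gap
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            diag = score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch)
            score[i][j] = max(diag, score[i-1][j] + gap, score[i][j-1] + gap)
    # trace back from the bottom-right corner
    ai, bi, i, j = [], [], n, m
    while i > 0 or j > 0:
        if i > 0 and j > 0 and score[i][j] == score[i-1][j-1] + (match if a[i-1] == b[j-1] else mismatch):
            ai.append(a[i-1]); bi.append(b[j-1]); i, j = i - 1, j - 1
        elif i > 0 and score[i][j] == score[i-1][j] + gap:
            ai.append(a[i-1]); bi.append("-"); i -= 1
        else:
            ai.append("-"); bi.append(b[j-1]); j -= 1
    return score[n][m], "".join(reversed(ai)), "".join(reversed(bi))

print(needleman_wunsch("GATTACA", "GCATGCU"))
```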

  3. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Directory of Open Access Journals (Sweden)

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is among the most frequently performed machining processes, and if there is improper percussion as the tool's position is changed, the spindle bearing can be damaged. A spindle malfunction can cause problems, such as a knife being dropped or bias in a machined hole. The measures currently taken to avoid such issues, which arose from the available machine tools, only involve determining whether the clapping knife's state is correct using a spindle and the air adhesion method, which is also used to satisfy the high precision required from mechanical components. Therefore, they cannot be used with every type of machine tool; in addition, improper tapping of the spindle during an automatic tool change cannot be detected. This study therefore proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, with which tool changes are automatically diagnosed using an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.
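
    One simple first step for this kind of vibration-based diagnosis is a threshold test on the RMS of the signal recorded during a tool change. The sketch below illustrates that idea only; the baseline, threshold factor and synthetic signal are assumptions, not the framework's actual method.

```python
# Minimal sketch: flag an abnormal tool change when the RMS of the vibration
# signal recorded during the change exceeds a learned baseline threshold.
# The signal, baseline and threshold factor are illustrative values only.
import math
import random

def rms(samples):
    return math.sqrt(sum(x * x for x in samples) / len(samples))

random.seed(0)
baseline_rms = 0.12                       # learned from normal tool changes
threshold = 3.0 * baseline_rms            # simple 3x-baseline rule

vibration_window = [random.gauss(0.0, 0.4) for _ in range(1000)]  # suspect event
if rms(vibration_window) > threshold:
    print("abnormal tool change: possible improper spindle percussion")
else:
    print("tool change within normal vibration limits")
```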

  4. An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.

    Science.gov (United States)

    Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V

    2014-07-01

    We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counter factual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.

  5. Multimodal Challenge: Analytics Beyond User-computer Interaction Data

    NARCIS (Netherlands)

    Di Mitri, Daniele; Schneider, Jan; Specht, Marcus; Drachsler, Hendrik

    2018-01-01

    This contribution describes one of the challenges explored in the Fourth LAK Hackathon. This challenge aims at shifting the focus from learning situations which can be easily traced through user-computer interaction data and concentrating more on user-world interaction events, typical of co-located and

  6. Guest editorial: Brain/neuronal computer games interfaces and interaction

    OpenAIRE

    Coyle, D.; Principe, J.; Lotte, F.; Nijholt, Antinus

    2013-01-01

    Nowadays brainwave or electroencephalogram (EEG) controlled game controllers are adding new options to satisfy the continual demand for new ways to interact with games, following trends such as the Nintendo® Wii, Microsoft® Kinect and Playstation® Move, which are based on accelerometers and motion capture. EEG-based brain-computer game interaction is controlled through brain-computer interface (BCI) technology, which requires sophisticated signal processing to produce a low communication ban...

  7. Framework for computer-aided systems design

    International Nuclear Information System (INIS)

    Esselman, W.H.

    1992-01-01

    Advanced computer technology, analytical methods, graphics capabilities, and expert systems contribute to significant changes in the design process. Continued progress is expected. Achieving the ultimate benefits of these computer-based design tools depends on successful research and development on a number of key issues. A fundamental understanding of the design process is a prerequisite to developing these computer-based tools. In this paper a hierarchical systems design approach is described, and methods by which computers can assist the designer are examined. A framework is presented for developing computer-based design tools for power plant design. These tools include expert experience bases, tutorials, aids in decision making, and tools to develop the requirements, constraints, and interactions among subsystems and components. Early consideration of the functional tasks is encouraged. Methods of acquiring an expert's experience base are a fundamental research problem. Computer-based guidance should be provided in a manner that supports the creativity, heuristic approaches, decision making, and meticulousness of a good designer

  8. Computerized Cognitive Rehabilitation: Comparing Different Human-Computer Interactions.

    Science.gov (United States)

    Quaglini, Silvana; Alloni, Anna; Cattani, Barbara; Panzarasa, Silvia; Pistarini, Caterina

    2017-01-01

    In this work we describe an experiment involving aphasic patients, where the same speech rehabilitation exercise was administered in three different modalities, two of which are computer-based. In particular, one modality exploits the "Makey Makey", an electronic board which allows interacting with the computer using physical objects.

  9. Interaction of anthraquinone anti-cancer drugs with DNA: Experimental and computational quantum chemical study

    Science.gov (United States)

    Al-Otaibi, Jamelah S.; Teesdale Spittle, Paul; El Gogary, Tarek M.

    2017-01-01

    Anthraquinones form the basis of several anticancer drugs. Anthraquinone anticancer drugs carry out their cytotoxic activities through their interaction with DNA and inhibition of topoisomerase II activity. Anthraquinones (AQ4 and AQ4H) were synthesized and studied along with 1,4-DAAQ by computational and experimental tools. The purpose of this study is to shed more light on the mechanism of interaction between anthraquinone DNA-affinic agents and different types of DNA, providing information useful for drug design and development. Molecular structures were optimized using DFT B3LYP/6-31+G(d). Depending on intramolecular hydrogen bonding interactions, two conformers of AQ4 were detected and computed to be 25.667 kcal/mol apart. Molecular reactivity of the anthraquinone compounds was explored using global and condensed descriptors (electrophilicity and Fukui functions). Molecular docking studies for the inhibition of CDK2 and DNA binding were carried out to explore the anticancer potency of these drugs. NMR and UV-VIS electronic absorption spectra of anthraquinones/DNA were investigated at physiological pH. The interactions of the three anthraquinones (AQ4, AQ4H and 1,4-DAAQ) with three DNAs (calf thymus DNA, Poly[dA].Poly[dT] and Poly[dG].Poly[dC]) were studied. The NMR study shows a qualitative pattern of drug/DNA interaction in terms of band shift and broadening, while UV-VIS electronic absorption spectra were employed to measure the affinity constants of drug/DNA binding using Scatchard analysis.
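
    For reference, Scatchard analysis of the UV-VIS titration data rests on the standard relation below (textbook form for n independent, identical binding sites; the symbols are not taken from the paper):

```latex
% Standard Scatchard relation for drug/DNA binding (textbook form, assuming
% n independent, identical sites): r is the ratio of bound drug to DNA,
% c_f the free drug concentration, K the binding constant.
\[
  \frac{r}{c_f} = K\,(n - r),
\]
% so a plot of r/c_f versus r gives a slope of -K and an x-intercept of n.
```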

  10. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    Science.gov (United States)

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  11. Human-scale interaction for virtual model displays: a clear case for real tools

    Science.gov (United States)

    Williams, George C.; McDowall, Ian E.; Bolas, Mark T.

    1998-04-01

    We describe a hand-held user interface for interacting with virtual environments displayed on a Virtual Model Display. The tool, constructed entirely of transparent materials, is see-through. We render a graphical counterpart of the tool on the display and map it one-to-one with the real tool. This feature, combined with a capability for touch- sensitive, discrete input, results in a useful spatial input device that is visually versatile. We discuss the tool's design and interaction techniques it supports. Briefly, we look at the human factors issues and engineering challenges presented by this tool and, in general, by the class of hand-held user interfaces that are see-through.

  12. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun; Li, Yue; Peng, Chengbin; Moses, Alan M.; Zhang, Zhaolei

    2015-01-01

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  13. Computational learning on specificity-determining residue-nucleotide interactions

    KAUST Repository

    Wong, Ka-Chun

    2015-11-02

    The protein–DNA interactions between transcription factors and transcription factor binding sites are essential activities in gene regulation. To decipher the binding codes, it is a long-standing challenge to understand the binding mechanism across different transcription factor DNA binding families. Past computational learning studies usually focus on learning and predicting the DNA binding residues on protein side. Taking into account both sides (protein and DNA), we propose and describe a computational study for learning the specificity-determining residue-nucleotide interactions of different known DNA-binding domain families. The proposed learning models are compared to state-of-the-art models comprehensively, demonstrating its competitive learning performance. In addition, we describe and propose two applications which demonstrate how the learnt models can provide meaningful insights into protein–DNA interactions across different DNA binding families.

  14. The classification and evaluation of Computer-Aided Software Engineering tools

    OpenAIRE

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itse...

  15. On Biblical Hebrew and Computer Science: Inspiration, Models, Tools, And Cross-fertilization

    DEFF Research Database (Denmark)

    Sandborg-Petersen, Ulrik

    2011-01-01

    Eep Talstra's work has been an inspiration to many researchers, both within and outside of the field of Old Testament scholarship. Among others, Crist-Jan Doedens and the present author have been heavily influenced by Talstra in their own work within the field of computer science. The present... of the present author. In addition, the tools surrounding Emdros, including SESB, Libronis, and the Emdros Query Tool, are described. Examples ... Biblical Hebrew scholar. Thus the inspiration of Talstra comes full circle: from Biblical Hebrew databases to computer science and back into Biblical Hebrew scholarship....

  16. Application of computer tools to the diagnosis of the combustion in motors

    International Nuclear Information System (INIS)

    Agudelo S, John R; Delgado M, Alvaro; Gutierrez V, Elkin

    2001-01-01

    This paper describes the fundamental topics concerning the analysis of the combustion process in internal combustion engines when latest-generation computational tools are employed. To this end, DIATERM was developed using graphical programming languages. The thermodynamic model on which DIATERM is based is also described. The paper likewise shows the potential of this computational tool when it is applied to the analysis of pressure data from the combustion chamber of a turbocharged diesel engine, varying the load while the rotational speed is kept constant.
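
    Pressure-based combustion diagnosis of this kind is commonly built on a single-zone apparent heat-release analysis; the standard rate equation is reproduced below for reference (textbook form, which may differ from the exact model implemented in DIATERM):

```latex
% Common single-zone apparent heat-release-rate equation used in
% pressure-based combustion diagnosis (textbook form; DIATERM's exact
% formulation may differ). Here p is cylinder pressure, V cylinder volume,
% theta crank angle, and gamma the ratio of specific heats.
\[
  \frac{dQ}{d\theta}
  = \frac{\gamma}{\gamma - 1}\, p\, \frac{dV}{d\theta}
  + \frac{1}{\gamma - 1}\, V\, \frac{dp}{d\theta}.
\]
```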

  17. Anticipated Ongoing Interaction versus Channel Effects of Relational Communication in Computer-Mediated Interaction.

    Science.gov (United States)

    Walther, Joseph B.

    1994-01-01

    Assesses the related effects of anticipated future interaction and different communication media (computer-mediated versus face-to-face communication) on the communication of relational intimacy and composure. Shows that the assignment of long-term versus short-term partnerships has a larger impact on anticipated future interaction reported by…

  18. Database tools for enhanced analysis of TMX-U data. Revision 1

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    A commercial database software package has been used to create several databases and tools that assist and enhance the ability of experimental physicists to analyze data from the Tandem Mirror Experiment-Upgrade (TMX-U) experiment. This software runs on a DEC-20 computer in M-Division's User Service Center at Lawrence Livermore National Laboratory (LLNL), where data can be analyzed offline from the main TMX-U acquisition computers. When combined with interactive data analysis programs, these tools provide the capability to do batch-style processing or interactive data analysis on the computers in the USC or the supercomputers of the National Magnetic Fusion Energy Computer Center (NMFECC) in addition to the normal processing done by the TMX-U acquisition system. One database tool provides highly reduced data for searching and correlation analysis of several diagnostic signals within a single shot or over many shots. A second database tool provides retrieval and storage of unreduced data for use in detailed analysis of one or more diagnostic signals. We will show how these database tools form the core of an evolving offline data analysis environment on the USC computers

  19. Human-computer interaction handbook fundamentals, evolving technologies and emerging applications

    CERN Document Server

    Sears, Andrew

    2007-01-01

    This second edition of The Human-Computer Interaction Handbook provides an updated, comprehensive overview of the most important research in the field, including insights that are directly applicable throughout the process of developing effective interactive information technologies. It features cutting-edge advances to the scientific knowledge base, as well as visionary perspectives and developments that fundamentally transform the way in which researchers and practitioners view the discipline. As the seminal volume of HCI research and practice, The Human-Computer Interaction Handbook feature

  20. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Science.gov (United States)

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  1. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Directory of Open Access Journals (Sweden)

    Danli Wang

    2014-01-01

    Full Text Available Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  2. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    Science.gov (United States)

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short, typically 20-24 nt in length, noncoding RNAs known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanism. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets facilitated the computational analysis of the sRNA patterns of variation within biological samples such as time point experiments, tissue series or various treatments. Rapid technological advances enable larger experiments, often with biological replicates leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed
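
    As a minimal illustration of one common normalization approach compared in such workflows, the sketch below applies reads-per-million scaling to sRNA counts; the counts are made up and the code is not the Workbench's implementation.

```python
# Minimal sketch of reads-per-million (RPM) normalization of sRNA counts,
# one common approach when comparing expression across samples.
# Counts are illustrative; this is not the Workbench's own implementation.
def rpm_normalize(counts):
    total = sum(counts.values())
    return {srna: 1e6 * c / total for srna, c in counts.items()}

sample_a = {"sRNA_1": 1500, "sRNA_2": 30, "sRNA_3": 470}
sample_b = {"sRNA_1": 900,  "sRNA_2": 45, "sRNA_3": 255}

norm_a, norm_b = rpm_normalize(sample_a), rpm_normalize(sample_b)
for srna in sample_a:
    ratio = norm_b[srna] / norm_a[srna]
    print(f"{srna}: fold change B/A (RPM) = {ratio:.2f}")
```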

  3. Automated and interactive fuel management tools: Past, present and future

    International Nuclear Information System (INIS)

    Cook, A.G.; Casadei, A.L.

    1986-01-01

    The past, present and future status of automated and interactive fuel management tools are reviewed. Issues such as who are the customers for these products and what are their needs are addressed. The nature of the fuel management problem is reviewed. The Westinghouse fuel management tools and methods are presented as an example of how the technology has evolved

  4. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Energy Technology Data Exchange (ETDEWEB)

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest like tropical cyclones. Six research themes have been chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project

  5. Child-Computer Interaction SIG: Ethics and Values

    DEFF Research Database (Denmark)

    Hourcade, Juan Pablo; Zeising, Anja; Iversen, Ole Sejer

    2017-01-01

    This SIG will provide child-computer interaction researchers and practitioners an opportunity to discuss topics related to ethical challenges in the design and use of interactive technologies for children. Topics include the role of big data, the impact of technology on children's social and physical ecosystem, and the consideration of ethics in children's participation in the design of technologies and in the conceptualization of technologies for children.

  6. Virtual reality/ augmented reality technology : the next chapter of human-computer interaction

    OpenAIRE

    Huang, Xing

    2015-01-01

    No matter how many different sizes and shapes computers take, their basic components are still the same. If we look at the history of computing from the user's perspective, we find, perhaps surprisingly, that it is the input and output devices that have led the development of the industry; in a word, human-computer interaction has shaped the history of computing. Human-computer interaction has gone through three stages: the first stage relies on the inpu...

  7. Choice of Human-Computer Interaction Mode in Stroke Rehabilitation.

    Science.gov (United States)

    Mousavi Hondori, Hossein; Khademi, Maryam; Dodakian, Lucy; McKenzie, Alison; Lopes, Cristina V; Cramer, Steven C

    2016-03-01

    Advances in technology are providing new forms of human-computer interaction. The current study examined one form of human-computer interaction, augmented reality (AR), whereby subjects train in the real-world workspace with virtual objects projected by the computer. Motor performances were compared with those obtained while subjects used a traditional human-computer interaction, that is, a personal computer (PC) with a mouse. Patients used goal-directed arm movements to play AR and PC versions of the Fruit Ninja video game. The 2 versions required the same arm movements to control the game but had different cognitive demands. With AR, the game was projected onto the desktop, where subjects viewed the game plus their arm movements simultaneously, in the same visual coordinate space. In the PC version, subjects used the same arm movements but viewed the game by looking up at a computer monitor. Among 18 patients with chronic hemiparesis after stroke, the AR game was associated with 21% higher game scores (P = .0001), 19% faster reaching times (P = .0001), and 15% less movement variability (P = .0068), as compared to the PC game. Correlations between game score and arm motor status were stronger with the AR version. Motor performances during the AR game were superior to those during the PC game. This result is due in part to the greater cognitive demands imposed by the PC game, a feature problematic for some patients but clinically useful for others. Mode of human-computer interface influences rehabilitation therapy demands and can be individualized for patients. © The Author(s) 2015.

  8. Common features of microRNA target prediction tools

    Directory of Open Access Journals (Sweden)

    Sarah M. Peterson

    2014-02-01

    Full Text Available The human genome encodes for over 1800 microRNAs, which are short noncoding RNA molecules that function to regulate gene expression post-transcriptionally. Due to the potential for one microRNA to target multiple gene transcripts, microRNAs are recognized as a major mechanism to regulate gene expression and mRNA translation. Computational prediction of microRNA targets is a critical initial step in identifying microRNA:mRNA target interactions for experimental validation. The available tools for microRNA target prediction encompass a range of different computational approaches, from the modeling of physical interactions to the incorporation of machine learning. This review provides an overview of the major computational approaches to microRNA target prediction. Our discussion highlights three tools for their ease of use, reliance on relatively updated versions of miRBase, and range of capabilities, and these are DIANA-microT-CDS, miRanda-mirSVR, and TargetScan. In comparison across all microRNA target prediction tools, four main aspects of the microRNA:mRNA target interaction emerge as common features on which most target prediction is based: seed match, conservation, free energy, and site accessibility. This review explains these features and identifies how they are incorporated into currently available target prediction tools. MicroRNA target prediction is a dynamic field with increasing attention on development of new analysis tools. This review attempts to provide a comprehensive assessment of these tools in a manner that is accessible across disciplines. Understanding the basis of these prediction methodologies will aid in user selection of the appropriate tools and interpretation of the tool output.
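
    Of the four features listed above, the seed match is the simplest to illustrate. The sketch below checks for a canonical site complementary to miRNA positions 2-8 in a 3'UTR; it is a deliberately simplified example, and real tools layer conservation, free energy and site accessibility on top of it.

```python
# Minimal sketch of a canonical seed-match check: look for the reverse
# complement of miRNA seed positions 2-8 (a "7mer" site) in a 3'UTR.
# Sequences are illustrative; real tools add conservation, free energy
# and site-accessibility scoring on top of this.
def revcomp(seq):
    return seq.translate(str.maketrans("ACGU", "UGCA"))[::-1]

def seed_sites(mirna, utr):
    seed = mirna[1:8]                      # positions 2-8 of the miRNA (0-based slice)
    site = revcomp(seed).replace("U", "T") # target written in DNA letters
    return [i for i in range(len(utr) - len(site) + 1) if utr[i:i+len(site)] == site]

mirna = "UGAGGUAGUAGGUUGUAUAGUU"           # a let-7 family sequence (illustrative use)
utr = "AAATTCTACCTCAGGGAACTACCTCAAA"
print("7mer seed sites at positions:", seed_sites(mirna, utr))
```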

  9. Data-base tools for enhanced analysis of TMX-U data

    International Nuclear Information System (INIS)

    Stewart, M.E.; Carter, M.R.; Casper, T.A.; Meyer, W.H.; Perkins, D.E.; Whitney, D.M.

    1986-01-01

    The authors use a commercial data-base software package to create several data-base products that enhance the ability of experimental physicists to analyze data from the TMX-U experiment. This software resides on a DEC-20 computer in M-Division's user service center (USC), where data can be analyzed separately from the main acquisition computers. When these data-base tools are combined with interactive data analysis programs, physicists can perform automated (batch-style) processing or interactive data analysis on the computers in the USC or on the supercomputers of the NMFECC, in addition to the normal processing done on the acquisition system. One data-base tool provides highly reduced data for searching and correlation analysis of several diagnostic signals for a single shot or many shots. A second data-base tool provides retrieval and storage of unreduced data for detailed analysis of one or more diagnostic signals. The authors report how these data-base tools form the core of an evolving off-line data-analysis environment on the USC computers

  10. Natural language processing tools for computer assisted language learning

    Directory of Open Access Journals (Sweden)

    Vandeventer Faltin, Anne

    2003-01-01

    Full Text Available This paper illustrates the usefulness of natural language processing (NLP) tools for computer assisted language learning (CALL) through the presentation of three NLP tools integrated within a CALL software for French. These tools are (i) a sentence structure viewer; (ii) an error diagnosis system; and (iii) a conjugation tool. The sentence structure viewer helps language learners grasp the structure of a sentence, by providing lexical and grammatical information. This information is derived from a deep syntactic analysis. Two different outputs are presented. The error diagnosis system is composed of a spell checker, a grammar checker, and a coherence checker. The spell checker makes use of alpha-codes, phonological reinterpretation, and some ad hoc rules to provide correction proposals. The grammar checker employs constraint relaxation and phonological reinterpretation as diagnosis techniques. The coherence checker compares the underlying "semantic" structures of a stored answer and of the learners' input to detect semantic discrepancies. The conjugation tool is a resource with enhanced capabilities when put on an electronic format, enabling searches from inflected and ambiguous verb forms.

  11. Computational Approaches for Prediction of Pathogen-Host Protein-Protein Interactions

    Directory of Open Access Journals (Sweden)

    Esmaeil eNourani

    2015-02-01

    Full Text Available Infectious diseases are still among the major and prevalent health problems, mostly because of the drug resistance of novel variants of pathogens. Molecular interactions between pathogens and their hosts are the key part of the infection mechanisms. Novel antimicrobial therapeutics to fight drug resistance are only possible with a thorough understanding of pathogen-host interaction (PHI) systems. Existing databases, which contain experimentally verified PHI data, suffer from scarcity of reported interactions due to the technically challenging and time-consuming process of experiments. This has motivated many researchers to address the problem by proposing computational approaches for analysis and prediction of PHIs. The computational methods primarily utilize sequence information, protein structure, and known interactions. Classic machine learning techniques are used when there are sufficient known interactions to be used as training data; in the opposite case, transfer and multi-task learning methods are preferred. Here, we present an overview of these computational approaches for PHI prediction, discussing their weaknesses and strengths, along with future directions.
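
    The following Python sketch illustrates, with invented data and deliberately simple features, the classic machine learning route mentioned above: each pathogen-host protein pair is described by amino-acid composition and a classifier is trained on known interacting and non-interacting pairs. It is not the pipeline of any published PHI predictor.

      # Hypothetical sketch of the "classic machine learning" route: featurize
      # pathogen-host protein pairs (here, simple amino-acid composition) and
      # train a binary classifier on known interacting / non-interacting pairs.
      # All sequences and labels below are invented placeholders.
      from sklearn.ensemble import RandomForestClassifier

      AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

      def composition(seq):
          # Fraction of each amino acid in a protein sequence (20 features).
          return [seq.count(a) / len(seq) for a in AMINO_ACIDS]

      def pair_features(pathogen_seq, host_seq):
          # Concatenate the two composition vectors into one pair descriptor.
          return composition(pathogen_seq) + composition(host_seq)

      # Toy training data: (pathogen protein, host protein, interacts?)
      pairs = [
          ("MKKLLPTAA", "MEEPQSDPSV", 1),
          ("MSTNPKPQRK", "MAAAKKGSEQ", 0),
          ("MGSSHHHHHH", "MEEPQSDPSV", 1),
          ("MKTAYIAKQR", "MAAAKKGSEQ", 0),
      ]
      X = [pair_features(p, h) for p, h, _ in pairs]
      y = [label for _, _, label in pairs]

      model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
      print(model.predict([pair_features("MKKLLPTAA", "MAAAKKGSEQ")]))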

  12. Audio Interaction in Computer Mediated Games

    Directory of Open Access Journals (Sweden)

    J. R. Parker

    2008-01-01

    Full Text Available The use of sound in an interactive media environment has not been advanced, as a technology, as far as graphics or artificial intelligence. This discussion will explore the use of sound as a way to influence the player of a computer game, will show ways that a game can use sound as input, and will describe ways that the player can influence sound in a game. The role of sound in computer games will be explored, and some practical design ideas that can be used to improve the current state of the art will be given.

  13. Vizic: A Jupyter-based interactive visualization tool for astronomical catalogs

    Science.gov (United States)

    Yu, W.; Carrasco Kind, M.; Brunner, R. J.

    2017-07-01

    The ever-growing datasets in observational astronomy have challenged scientists in many aspects, including efficient and interactive data exploration and visualization. Many tools have been developed to confront this challenge. However, they usually focus on displaying the actual images or on visualizing patterns within catalogs in a predefined way. In this paper we introduce Vizic, a Python visualization library that builds the connection between images and catalogs through an interactive map of the sky region. Vizic visualizes catalog data over a custom background canvas using the shape, size and orientation of each object in the catalog. The displayed objects in the map are highly interactive and customizable compared to those in the observation images. These objects can be filtered by or colored by their property values, such as redshift and magnitude. They can also be sub-selected using a lasso-like tool for further analysis with standard Python functions, all from inside a Jupyter notebook. Furthermore, Vizic allows custom overlays to be appended dynamically on top of the sky map. We have initially implemented several overlays, namely Voronoi, Delaunay, Minimum Spanning Tree and HEALPix grid layers, which are helpful for visualizing large-scale structure. All these overlays can be generated, added or removed interactively with just one line of code. The catalog data are stored in a non-relational database, and the interfaces have been developed in JavaScript and Python to work within the Jupyter Notebook, which allows users to create customizable widgets and user-generated scripts to analyze and plot the data selected and displayed in the interactive map. This unique design makes Vizic a very powerful and flexible interactive analysis tool. Vizic can be adopted in a variety of exercises, for example data inspection, clustering analysis, galaxy alignment studies, outlier identification or simply large-scale visualizations.
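
    As a generic illustration of the filter-and-colour workflow described above (and explicitly not Vizic's actual API), the Python sketch below selects catalog objects by magnitude and colours them by redshift with matplotlib; the catalog entries are invented.

      # Generic illustration of filtering catalog objects by one property and
      # colouring them by another. This is not Vizic; it only mimics the idea.
      import matplotlib.pyplot as plt

      catalog = [
          {"ra": 150.10, "dec": 2.20, "redshift": 0.35, "mag": 21.2},
          {"ra": 150.12, "dec": 2.21, "redshift": 0.90, "mag": 23.0},
          {"ra": 150.08, "dec": 2.19, "redshift": 0.12, "mag": 19.8},
      ]

      # Filter by magnitude, colour by redshift.
      selected = [obj for obj in catalog if obj["mag"] < 22.5]
      plt.scatter([o["ra"] for o in selected], [o["dec"] for o in selected],
                  c=[o["redshift"] for o in selected], cmap="viridis")
      plt.xlabel("RA (deg)")
      plt.ylabel("Dec (deg)")
      plt.colorbar(label="redshift")
      plt.savefig("catalog_selection.png")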

  14. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    Science.gov (United States)

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029: Automated and Assistive Tools for Accelerated Code Migration of Scientific Computing on to Heterogeneous MultiCore Systems. Report covering the period 2012 - 01/25/2015.

  15. An interactive water indicator assessment tool to support land use planning

    NARCIS (Netherlands)

    Hellegers, P.J.G.J.; Jansen, H.C.; Bastiaanssen, W.G.M.

    2012-01-01

    This paper presents an interactive web-based rapid assessment tool that generates key water related indicators to support decision making by stakeholders in land use planning. The tool is built on a consistent science based method that combines remote sensing with hydrological and socioeconomic

  16. Salesperson Ethics: An Interactive Computer Simulation

    Science.gov (United States)

    Castleberry, Stephen

    2014-01-01

    A new interactive computer simulation designed to teach sales ethics is described. Simulation learner objectives include gaining a better understanding of legal issues in selling; realizing that ethical dilemmas do arise in selling; realizing the need to be honest when selling; seeing that there are conflicting demands from a salesperson's…

  17. Software Tools: A One-Semester Secondary School Computer Course.

    Science.gov (United States)

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  18. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1991-03-15

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour.

  19. COMPUTERS: Teraflops for Europe; EEC Working Group on High Performance Computing

    International Nuclear Information System (INIS)

    Anon.

    1991-01-01

    In little more than a decade, simulation on high performance computers has become an essential tool for theoretical physics, capable of solving a vast range of crucial problems inaccessible to conventional analytic mathematics. In many ways, computer simulation has become the calculus for interacting many-body systems, a key to the study of transitions from isolated to collective behaviour

  20. Computing paths and cycles in biological interaction graphs

    Directory of Open Access Journals (Sweden)

    von Kamp Axel

    2009-06-01

    Full Text Available Abstract Background Interaction graphs (signed directed graphs provide an important qualitative modeling approach for Systems Biology. They enable the analysis of causal relationships in cellular networks and can even be useful for predicting qualitative aspects of systems dynamics. Fundamental issues in the analysis of interaction graphs are the enumeration of paths and cycles (feedback loops and the calculation of shortest positive/negative paths. These computational problems have been discussed only to a minor extent in the context of Systems Biology and in particular the shortest signed paths problem requires algorithmic developments. Results We first review algorithms for the enumeration of paths and cycles and show that these algorithms are superior to a recently proposed enumeration approach based on elementary-modes computation. The main part of this work deals with the computation of shortest positive/negative paths, an NP-complete problem for which only very few algorithms are described in the literature. We propose extensions and several new algorithm variants for computing either exact results or approximations. Benchmarks with various concrete biological networks show that exact results can sometimes be obtained in networks with several hundred nodes. A class of even larger graphs can still be treated exactly by a new algorithm combining exhaustive and simple search strategies. For graphs, where the computation of exact solutions becomes time-consuming or infeasible, we devised an approximative algorithm with polynomial complexity. Strikingly, in realistic networks (where a comparison with exact results was possible this algorithm delivered results that are very close or equal to the exact values. This phenomenon can probably be attributed to the particular topology of cellular signaling and regulatory networks which contain a relatively low number of negative feedback loops. Conclusion The calculation of shortest positive
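
    The Python sketch below illustrates the notion of signed path lengths discussed above by running breadth-first search on a sign-expanded graph whose states are (vertex, accumulated sign). Note that this finds shortest signed walks, in which vertices may repeat; the NP-complete problem treated in the paper concerns simple paths, so the sketch is a relaxation rather than a reimplementation of the authors' algorithms. The toy network is invented.

      # Shortest positive/negative *walks* in a signed directed graph, via BFS on
      # a sign-expanded graph with states (vertex, sign). Allowing repeated
      # vertices makes this polynomial, unlike the simple-path problem above.
      from collections import deque

      def shortest_signed_walks(edges, source):
          # edges: list of (u, v, sign) with sign in {+1, -1}
          graph = {}
          for u, v, s in edges:
              graph.setdefault(u, []).append((v, s))
          dist = {(source, +1): 0}          # the empty walk counts as positive
          queue = deque([(source, +1)])
          while queue:
              node, sign = queue.popleft()
              for nxt, s in graph.get(node, []):
                  state = (nxt, sign * s)
                  if state not in dist:
                      dist[state] = dist[(node, sign)] + 1
                      queue.append(state)
          return dist                        # (vertex, overall sign) -> walk length

      if __name__ == "__main__":
          # Toy signed network: A activates B, B inhibits C, A inhibits C.
          edges = [("A", "B", +1), ("B", "C", -1), ("A", "C", -1)]
          print(shortest_signed_walks(edges, "A"))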

  1. The Jupyter/IPython architecture: a unified view of computational research, from interactive exploration to communication and publication.

    Science.gov (United States)

    Ragan-Kelley, M.; Perez, F.; Granger, B.; Kluyver, T.; Ivanov, P.; Frederic, J.; Bussonnier, M.

    2014-12-01

    IPython has provided terminal-based tools for interactive computing in Python since 2001. The notebook document format and multi-process architecture introduced in 2011 have expanded the applicable scope of IPython into teaching, presenting, and sharing computational work, in addition to interactive exploration. The new architecture also allows users to work in any language, with implementations in Python, R, Julia, Haskell, and several other languages. The language agnostic parts of IPython have been renamed to Jupyter, to better capture the notion that a cross-language design can encapsulate commonalities present in computational research regardless of the programming language being used. This architecture offers components like the web-based Notebook interface, that supports rich documents that combine code and computational results with text narratives, mathematics, images, video and any media that a modern browser can display. This interface can be used not only in research, but also for publication and education, as notebooks can be converted to a variety of output formats, including HTML and PDF. Recent developments in the Jupyter project include a multi-user environment for hosting notebooks for a class or research group, a live collaboration notebook via Google Docs, and better support for languages other than Python.
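
    The conversion of notebooks to output formats such as HTML, mentioned above, can be scripted with the nbconvert package; the minimal Python sketch below assumes nbconvert is installed and that a notebook named analysis.ipynb exists in the working directory.

      # Convert a notebook to a publishable HTML document programmatically
      # (assumes nbconvert is installed and "analysis.ipynb" is present).
      from nbconvert import HTMLExporter

      exporter = HTMLExporter()
      body, resources = exporter.from_filename("analysis.ipynb")  # HTML text + assets
      with open("analysis.html", "w", encoding="utf-8") as f:
          f.write(body)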

  2. Multimodal Information Presentation for High-Load Human Computer Interaction

    NARCIS (Netherlands)

    Cao, Y.

    2011-01-01

    This dissertation addresses multimodal information presentation in human computer interaction. Information presentation refers to the manner in which computer systems/interfaces present information to human users. More specifically, the focus of our work is not on which information to present, but

  3. Designer Modeling for Personalized Game Content Creation Tools

    DEFF Research Database (Denmark)

    Liapis, Antonios; Yannakakis, Georgios N.; Togelius, Julian

    2013-01-01

    With the growing use of automated content creation and computer-aided design tools in game development, there is potential for enhancing the design process through personalized interactions between the software and the game developer. This paper proposes designer modeling for capturing the designer's preferences, goals and processes from their interaction with a computer-aided design tool, and suggests methods and domains within game development where such a model can be applied. We describe how designer modeling could be integrated with current work on automated and mixed-initiative content creation, and envision future directions which focus on personalizing the processes to a designer's particular wishes.

  4. Aqueduct: an interactive tool to empower global water risk assessment

    Science.gov (United States)

    Reig, Paul; Gassert, Francis

    2013-04-01

    The Aqueduct Water Risk Atlas (Aqueduct) is a publicly available, global database and interactive tool that maps indicators of water related risks for decision makers worldwide. Aqueduct makes use of the latest geo-statistical modeling techniques to compute a composite index and translate the most recently available hydrological data into practical information on water related risks for companies, investors, and governments alike. Twelve global indicators are grouped into a Water Risk Framework designed in response to the growing concerns from private sector actors around water scarcity, water quality, climate change, and increasing demand for freshwater. The Aqueduct framework includes indicators of water stress, variability in supply, storage, flood, drought, groundwater, water quality and social conflict, addressing both spatial and temporal variation in water hazards. It organizes indicators into three categories of risk that bring together multiple dimensions of water related risk into comprehensive aggregated scores, which allow for dynamic weighting to capture users' unique exposure to water hazards. All information is compiled into an online, open access platform, from which decision-makers can view indicators, scores, and maps, conduct global risk assessments, and export data and shape files for further analysis. Companies can use this tool to evaluate their exposure to water risks across operations and supply chains, investors to assess water-related risks in their portfolio, and public-sector actors to better understand water security. Additionally, the open nature of the data and maps allow other organizations to build off of this effort with new research, for example in the areas of water-energy or water-food relationships. This presentation will showcase the Aqueduct Water Risk Atlas online tool and the features and functionalities it offers, as well as explain how it can be used for both private and public sector applications. The session will
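
    To make the dynamic-weighting idea concrete, the Python sketch below combines a handful of indicator scores into a composite risk score using user-supplied weights. The indicator names, scores and weights are invented placeholders and do not reproduce Aqueduct's actual indicators or aggregation rules.

      # Illustrative sketch of dynamic weighting: combine indicator scores
      # (each on a 0-5 scale) into a composite risk score with user weights.

      def composite_risk(scores, weights):
          # Weighted average of indicator scores; weights need not sum to one.
          total_weight = sum(weights[k] for k in scores)
          return sum(scores[k] * weights[k] for k in scores) / total_weight

      scores = {"water_stress": 4.0, "drought": 2.5, "flood": 1.0, "water_quality": 3.0}
      # A user highly exposed to supply risks might weight stress and drought heavily.
      weights = {"water_stress": 4, "drought": 2, "flood": 1, "water_quality": 1}
      print(round(composite_risk(scores, weights), 2))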

  5. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughout performance...

  6. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    Science.gov (United States)

    Computer-Aided Process Engineering has become established in industry as a design tool. With the establishment of the CAPE-OPEN software specifications for process simulation environments, CAPE-OPEN provides a set of "middleware" standards that enable software developers to access...

  7. The MicroGrid: A Scientific Tool for Modeling Computational Grids

    Directory of Open Access Journals (Sweden)

    H.J. Song

    2000-01-01

    Full Text Available The complexity and dynamic nature of the Internet (and the emerging Computational Grid demand that middleware and applications adapt to the changes in configuration and availability of resources. However, to the best of our knowledge there are no simulation tools which support systematic exploration of dynamic Grid software (or Grid resource behavior. We describe our vision and initial efforts to build tools to meet these needs. Our MicroGrid simulation tools enable Globus applications to be run in arbitrary virtual grid resource environments, enabling broad experimentation. We describe the design of these tools, and their validation on micro-benchmarks, the NAS parallel benchmarks, and an entire Grid application. These validation experiments show that the MicroGrid can match actual experiments within a few percent (2% to 4%.

  8. Field-programmable custom computing technology architectures, tools, and applications

    CERN Document Server

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  9. Computational analysis of difenoconazole interaction with soil chitinases

    International Nuclear Information System (INIS)

    Vlǎdoiu, D L; Filimon, M N; Ostafe, V; Isvoran, A

    2015-01-01

    This study focusses on the investigation of the potential binding of the fungicide difenoconazole to soil chitinases using a computational approach. Computational characterization of the substrate binding sites of Serratia marcescens and Bacillus cereus chitinases using Fpocket tool reflects the role of hydrophobic residues for the substrate binding and the high local hydrophobic density of both sites. Molecular docking study reveals that difenoconazole is able to bind to Serratia marcescens and Bacillus cereus chitinases active sites, the binding energies being comparable

  10. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Science.gov (United States)

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  11. Data analysis through interactive computer animation method (DATICAM)

    International Nuclear Information System (INIS)

    Curtis, J.N.; Schwieder, D.H.

    1983-01-01

    DATICAM is an interactive computer animation method designed to aid in the analysis of nuclear research data. DATICAM was developed at the Idaho National Engineering Laboratory (INEL) by EG and G Idaho, Inc. INEL analysts use DATICAM to produce computer codes that are better able to predict the behavior of nuclear power reactors. In addition to increased code accuracy, DATICAM has saved manpower and computer costs. DATICAM has been generalized to assist in the data analysis of virtually any data-producing dynamic process

  12. Computer programing for geosciences: Teach your students how to make tools

    Science.gov (United States)

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack intuitive understanding of programming concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  13. Anvil Forecast Tool in the Advanced Weather Interactive Processing System

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Meteorologists from the 45th Weather Squadron (45 WS) and National Weather Service Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violations of the Lightning Launch Commit Criteria and Space Shuttle Flight Rules. As a result, the Applied Meteorology Unit (AMU) was tasked to create a graphical overlay tool for the Meteorological Interactive Data Display System (MIDDS) that indicates the threat of thunderstorm anvil clouds, using either observed or model forecast winds as input. The tool creates a graphic depicting the potential location of thunderstorm anvils one, two, and three hours into the future. The locations are based on the average of the upper level observed or forecasted winds. The graphic includes 10 and 20 n mi standoff circles centered at the location of interest, as well as one-, two-, and three-hour arcs in the upwind direction. The arcs extend outward across a 30° sector width based on a previous AMU study that determined thunderstorm anvils move in a direction plus or minus 15° of the upper-level wind direction. The AMU was then tasked to transition the tool to the Advanced Weather Interactive Processing System (AWIPS). SMG later requested the tool be updated to provide more flexibility and quicker access to model data. This presentation describes the work performed by the AMU to transition the tool into AWIPS, as well as the subsequent improvements made to the tool.
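
    The geometry of the graphic described above can be summarised in a few lines of Python: for each lead time, the arc radius is the upper-level wind speed multiplied by the time, and the threat sector spans the wind direction plus or minus 15 degrees. The wind values below are illustrative, and the sketch is not the AMU's AWIPS implementation.

      # Geometry sketch for the anvil-threat graphic: arcs lie upwind of the
      # point of interest at distances of speed x time, spanning a 30-degree
      # sector (wind direction +/- 15 degrees). Values are illustrative.

      def anvil_arcs(wind_speed_kt, wind_dir_deg, hours=(1, 2, 3), half_sector_deg=15):
          # Wind direction follows the meteorological convention (direction the
          # wind blows FROM), so the threat sector is centered on that bearing.
          arcs = []
          for h in hours:
              radius_nm = wind_speed_kt * h               # knots x hours = nautical miles
              arcs.append({
                  "hours": h,
                  "radius_nm": radius_nm,
                  "sector_deg": ((wind_dir_deg - half_sector_deg) % 360,
                                 (wind_dir_deg + half_sector_deg) % 360),
              })
          return arcs

      for arc in anvil_arcs(wind_speed_kt=30, wind_dir_deg=250):
          print(arc)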

  14. Computer-aided system design

    Science.gov (United States)

    Walker, Carrie K.

    1991-01-01

    A technique has been developed for combining features of a systems architecture design and assessment tool and a software development tool. This technique reduces simulation development time and expands simulation detail. The Architecture Design and Assessment System (ADAS), developed at the Research Triangle Institute, is a set of computer-assisted engineering tools for the design and analysis of computer systems. The ADAS system is based on directed graph concepts and supports the synthesis and analysis of software algorithms mapped to candidate hardware implementations. Greater simulation detail is provided by the ADAS functional simulator. With the functional simulator, programs written in either Ada or C can be used to provide a detailed description of graph nodes. A Computer-Aided Software Engineering tool developed at the Charles Stark Draper Laboratory (CSDL CASE) automatically generates Ada or C code from engineering block diagram specifications designed with an interactive graphical interface. A technique to use the tools together has been developed, which further automates the design process.

  15. Simulation of Robot Kinematics Using Interactive Computer Graphics.

    Science.gov (United States)

    Leu, M. C.; Mahajan, R.

    1984-01-01

    Development of a robot simulation program based on geometric transformation softwares available in most computer graphics systems and program features are described. The program can be extended to simulate robots coordinating with external devices (such as tools, fixtures, conveyors) using geometric transformations to describe the…

  16. Agent assisted interactive algorithm for computationally demanding multiobjective optimization problems

    OpenAIRE

    Ojalehto, Vesa; Podkopaev, Dmitry; Miettinen, Kaisa

    2015-01-01

    We generalize the applicability of interactive methods for solving computationally demanding, that is, time-consuming, multiobjective optimization problems. For this purpose we propose a new agent assisted interactive algorithm. It employs a computationally inexpensive surrogate problem and four different agents that intelligently update the surrogate based on the preferences specified by a decision maker. In this way, we decrease the waiting times imposed on the decision maker du...

  17. Visualization and Interaction in Research, Teaching, and Scientific Communication

    Science.gov (United States)

    Ammon, C. J.

    2017-12-01

    Modern computing provides many tools for exploring observations, numerical calculations, and theoretical relationships. The number of options is, in fact, almost overwhelming. But the choices provide those with modest programming skills opportunities to create unique views of scientific information and to develop deeper insights into their data, their computations, and the underlying theoretical data-model relationships. I present simple examples of using animation and human-computer interaction to explore scientific data and scientific-analysis approaches. I illustrate how a little programming ability can free scientists from the constraints of existing tools and can facilitate the development of a deeper appreciation of data and models. I present examples from a suite of programming languages ranging from C to JavaScript, including the Wolfram Language. JavaScript is valuable for sharing tools and insight (hopefully) with others because it is integrated into one of the most powerful communication tools in human history, the web browser. Although too much of that power is often spent on distracting advertisements, the underlying computation and graphics engines are efficient, flexible, and almost universally available in desktop and mobile computing platforms. Many are working to fulfill the browser's potential to become the most effective tool for interactive study. Open-source frameworks for visualizing everything from algorithms to data are available, but advance rapidly. One strategy for dealing with swiftly changing tools is to adopt common, open data formats that are easily adapted (often by framework or tool developers). I illustrate the use of animation and interaction in research and teaching with examples from earthquake seismology.

  18. Une approche pragmatique cognitive de l'interaction personne/système informatisé A Cognitive Pragmatic Approach of Human/Computer Interaction

    Directory of Open Access Journals (Sweden)

    Madeleine Saint-Pierre

    1998-06-01

    Full Text Available This article proposes an inferential approach for the study of human/computer interaction. It is by taking into account the user's cognitive activity while working with a system that we propose to understand this type of interaction. This approach leads to a genuine user/interface evaluation and can serve as a guide for interfaces under development. Our analyses describe the inferential process involved in the dynamic context of task performance, through a categorization of the cognitive activity drawn from the verbalizations of users who "think aloud" while working at the computer. We present the methodological instruments developed in our research for the analysis and categorization of the protocols. The results are interpreted within the framework of Sperber and Wilson's (1995) relevance theory, in terms of the cognitive effort involved in processing the objects (linguistic, iconic, graphic, ...) appearing on the screen and of the cognitive effect of these objects. This approach can be generalized to any other human/computer interaction context, such as, for example, distance learning.

  19. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Energy Technology Data Exchange (ETDEWEB)

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
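
    As a minimal example of the "singular objective" era described above, the Python sketch below performs naive codon-usage optimization by choosing, for each residue, the most frequent codon in an abbreviated, illustrative usage table; real tools balance this against many additional objectives such as restriction sites and RNA structure.

      # Naive codon-usage optimization: pick the most frequent codon per residue.
      # The usage table is abbreviated and illustrative, not a full host table.
      USAGE = {
          "M": {"ATG": 1.00},
          "K": {"AAA": 0.43, "AAG": 0.57},
          "F": {"TTT": 0.45, "TTC": 0.55},
          "L": {"CTG": 0.41, "TTA": 0.07, "CTC": 0.20, "CTT": 0.13, "TTG": 0.12, "CTA": 0.07},
          "*": {"TAA": 0.30, "TGA": 0.47, "TAG": 0.23},
      }

      def optimize(protein):
          # Choose the highest-frequency codon for every residue; no secondary
          # objectives (restriction sites, GC content, structure) are considered.
          return "".join(max(USAGE[aa], key=USAGE[aa].get) for aa in protein)

      print(optimize("MKFL*"))   # -> ATGAAGTTCCTGTGA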

  20. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    International Nuclear Information System (INIS)

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  1. An interactive, web-based tool for genealogical entity resolution

    NARCIS (Netherlands)

    Efremova, I.; Ranjbar-Sahraei, B.; Oliehoek, F.A.; Calders, T.G.K.; Tuyls, K.P.

    2013-01-01

    We demonstrate an interactive, web-based tool which helps historians to do Genealogical Entity Resolution. This work has two main goals. First, it uses Machine Learning (ML) algorithms to assist humanities researchers to perform Genealogical Entity Resolution. Second, it facilitates the generation

  2. Computers and the internet: tools for youth empowerment.

    Science.gov (United States)

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to establish the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  3. Computational simulations of the interaction of water waves with pitching flap-type ocean wave energy converters

    Science.gov (United States)

    Pathak, Ashish; Raessi, Mehdi

    2016-11-01

    Using an in-house computational framework, we have studied the interaction of water waves with pitching flap-type ocean wave energy converters (WECs). The computational framework solves the full 3D Navier-Stokes equations and captures important effects, including fluid-solid interaction as well as nonlinear and viscous effects. The results of the computational tool are first compared against experimental data on the response of a flap-type WEC in a wave tank, and excellent agreement is demonstrated. Further simulations at the model and prototype scales are presented to assess the validity of the Froude scaling. The simulations are used to address some important questions, such as the validity range of common WEC modeling approaches that rely heavily on the Froude scaling and the inviscid potential flow theory. Additionally, the simulations examine the role of the Keulegan-Carpenter (KC) number, which is often used as a measure of the relative importance of viscous drag on bodies exposed to oscillating flows. The performance of the flap-type WECs is investigated at various KC numbers to establish the relationship between the viscous drag and the KC number for such a geometry. That is of significant importance because such a relationship exists only for simple geometries, e.g., a cylinder. Support from the National Science Foundation is gratefully acknowledged.
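
    For reference, the Keulegan-Carpenter number referred to above is commonly defined as KC = U T / L, with U the velocity amplitude of the oscillatory flow, T its period, and L a characteristic body dimension; the Python snippet below evaluates it for illustrative values, not for the devices studied in the paper.

      # KC = U * T / L, with U the flow velocity amplitude, T the oscillation
      # period, and L a characteristic body dimension. Values are illustrative.
      def keulegan_carpenter(velocity_amplitude, period, characteristic_length):
          return velocity_amplitude * period / characteristic_length

      # Example: 1.5 m/s amplitude, 8 s wave period, 4 m flap width.
      print(keulegan_carpenter(1.5, 8.0, 4.0))   # KC = 3.0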

  4. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    Science.gov (United States)

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  5. The BioFragment Database (BFDb): An open-data platform for computational chemistry analysis of noncovalent interactions

    Science.gov (United States)

    Burns, Lori A.; Faver, John C.; Zheng, Zheng; Marshall, Michael S.; Smith, Daniel G. A.; Vanommeslaeghe, Kenno; MacKerell, Alexander D.; Merz, Kenneth M.; Sherrill, C. David

    2017-10-01

    Accurate potential energy models are necessary for reliable atomistic simulations of chemical phenomena. In the realm of biomolecular modeling, large systems like proteins comprise very many noncovalent interactions (NCIs) that can contribute to the protein's stability and structure. This work presents two high-quality chemical databases of common fragment interactions in biomolecular systems as extracted from high-resolution Protein DataBank crystal structures: 3380 sidechain-sidechain interactions and 100 backbone-backbone interactions that inaugurate the BioFragment Database (BFDb). Absolute interaction energies are generated with a computationally tractable explicitly correlated coupled cluster with perturbative triples [CCSD(T)-F12] "silver standard" (0.05 kcal/mol average error) for NCI that demands only a fraction of the cost of the conventional "gold standard," CCSD(T) at the complete basis set limit. By sampling extensively from biological environments, BFDb spans the natural diversity of protein NCI motifs and orientations. In addition to supplying a thorough assessment for lower scaling force-field (2), semi-empirical (3), density functional (244), and wavefunction (45) methods (comprising >1M interaction energies), BFDb provides interactive tools for running and manipulating the resulting large datasets and offers a valuable resource for potential energy model development and validation.
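
    The method assessment described above ultimately reduces to error statistics against the coupled-cluster reference; the Python sketch below computes a mean absolute error for an approximate method over a toy set of interaction energies. The values are invented placeholders, not BFDb data.

      # Mean absolute error of an approximate method's interaction energies
      # against a reference set (kcal/mol). All numbers are invented.
      reference = {"pair01": -4.92, "pair02": -1.37, "pair03": -7.10}
      approx    = {"pair01": -4.70, "pair02": -1.55, "pair03": -6.85}

      mae = sum(abs(approx[k] - reference[k]) for k in reference) / len(reference)
      print(f"MAE = {mae:.2f} kcal/mol")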

  6. Cognition beyond the brain computation, interactivity and human artifice

    CERN Document Server

    Cowley, Stephen J

    2013-01-01

    Arguing that a collective dimension has given cognitive flexibility to human intelligence, this book shows that traditional cognitive psychology underplays the role of bodies, dialogue, diagrams, tools, talk, customs, habits, computers and cultural practices.

  7. Animal-Computer Interaction: Animal-Centred, Participatory, and Playful Design

    NARCIS (Netherlands)

    Pons, Patricia; Hirskyj-Douglas, Ilyena; Nijholt, Antinus; Cheok, Adrian D.; Spink, Andrew; Riedel, Gernot; Zhou, Liting; Teekens, Lisanne; Albatal, Rami; Gurrin, Cathal

    In recent years there has been growing interest in developing technology to improve animals' wellbeing and to support the interaction of animals within the digital world. The field of Animal-Computer Interaction (ACI) considers animals as the end-users of the technology being developed, orienting

  8. Proceedings of the Third International Conference on Intelligent Human Computer Interaction

    CERN Document Server

    Pokorný, Jaroslav; Snášel, Václav; Abraham, Ajith

    2013-01-01

    The Third International Conference on Intelligent Human Computer Interaction 2011 (IHCI 2011) was held at Charles University, Prague, Czech Republic from August 29 - August 31, 2011. This conference was third in the series, following IHCI 2009 and IHCI 2010 held in January at IIIT Allahabad, India. Human computer interaction is a fast growing research area and an attractive subject of interest for both academia and industry. There are many interesting and challenging topics that need to be researched and discussed. This book aims to provide excellent opportunities for the dissemination of interesting new research and discussion about presented topics. It can be useful for researchers working on various aspects of human computer interaction. Topics covered in this book include user interface and interaction, theoretical background and applications of HCI and also data mining and knowledge discovery as a support of HCI applications.

  9. Digital Interactive Narrative Tools for Facilitating Communication with Children During Counseling

    DEFF Research Database (Denmark)

    Baceviciute, Sarune; Albæk, Katharina R.R.; Arsovski, Aleksandar

    2012-01-01

    the reconciliation between free-play and narratives afforded by interactive digital tools in order to promote children's engagement. We present a digital interactive narrative application integrated with a “step-by-step” guide to the counselor, which could be adapted to many different situations and contexts where...

  10. Multiyear interactive computer almanac, 1800-2050

    CERN Document Server

    United States. Naval Observatory

    2005-01-01

    The Multiyear Interactive Computer Almanac (MICA Version 2.2.2 ) is a software system that runs on modern versions of Windows and Macintosh computers created by the U.S. Naval Observatory's Astronomical Applications Department, especially for astronomers, surveyors, meteorologists, navigators and others who regularly need accurate information on the positions, motions, and phenomena of celestial objects. MICA produces high-precision astronomical data in tabular form, tailored for the times and locations specified by the user. Unlike traditional almanacs, MICA computes these data in real time, eliminating the need for table look-ups and additional hand calculations. MICA tables can be saved as standard text files, enabling their use in other applications. Several important new features have been added to this edition of MICA, including: extended date coverage from 1800 to 2050; a redesigned user interface; a graphical sky map; a phenomena calculator (eclipses, transits, equinoxes, solstices, conjunctions, oppo...

  11. Enrichment of Human-Computer Interaction in Brain-Computer Interfaces via Virtual Environments

    Directory of Open Access Journals (Sweden)

    Alonso-Valerdi Luz María

    2017-01-01

    Full Text Available Tridimensional representations stimulate cognitive processes that are the core and foundation of human-computer interaction (HCI). Those cognitive processes take place while a user navigates and explores a virtual environment (VE) and are mainly related to spatial memory storage, attention, and perception. VEs have many distinctive features (e.g., involvement, immersion, and presence) that can significantly improve HCI in highly demanding and interactive systems such as brain-computer interfaces (BCI). A BCI is a nonmuscular communication channel that attempts to reestablish the interaction between an individual and his/her environment. Although BCI research started in the sixties, this technology is not efficient or reliable yet for everyone at any time. Over the past few years, researchers have argued that the main BCI flaws could be associated with HCI issues. The evidence presented thus far shows that VEs can (1) set out working environmental conditions, (2) maximize the efficiency of BCI control panels, (3) implement navigation systems based not only on user intentions but also on user emotions, and (4) regulate user mental state to increase the differentiation between control and noncontrol modalities.

  12. GIS-based interactive tool to map the advent of world conquerors

    Science.gov (United States)

    Lakkaraju, Mahesh

    The objective of this thesis is to show the scale and extent of some of the greatest empires the world has ever seen. This is a hybrid project between the GIS based interactive tool and the web-based JavaScript tool. This approach lets the students learn effectively about the emperors themselves while understanding how long and far their empires spread. In the GIS based tool, a map is displayed with various points on it, and when a user clicks on one point, the relevant information of what happened at that particular place is displayed. Apart from this information, users can also select the interactive animation button and can walk through a set of battles in chronological order. As mentioned, this uses Java as the main programming language, and MOJO (Map Objects Java Objects) provided by ESRI. MOJO is very effective as its GIS related features can be included in the application itself. This app. is a simple tool and has been developed for university or high school level students. D3.js is an interactive animation and visualization platform built on the Javascript framework. Though HTML5, CSS3, Javascript and SVG animations can be used to derive custom animations, this tool can help bring out results with less effort and more ease of use. Hence, it has become the most sought after visualization tool for multiple applications. D3.js has provided a map-based visualization feature so that we can easily display text-based data in a map-based interface. To draw the map and the points on it, D3.js uses data rendered in TOPO JSON format. The latitudes and longitudes can be provided, which are interpolated into the Map svg. One of the main advantages of doing it this way is that more information is retained when we use a visual medium.

  13. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    Science.gov (United States)

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  14. PPI finder: a mining tool for human protein-protein interactions.

    Directory of Open Access Journals (Sweden)

    Min He

    Full Text Available BACKGROUND: The exponential increase of published biomedical literature prompts the use of text mining tools to manage the information overload automatically. One of the most common applications is to mine protein-protein interactions (PPIs) from PubMed abstracts. Currently, most tools for mining PPIs from the literature use co-occurrence-based or rule-based approaches. Hybrid methods (frame-based approaches), which combine these two methods, may have better performance in predicting PPIs. However, the predicted PPIs from these methods are rarely evaluated against known PPI databases and co-occurring terms in the Gene Ontology (GO) database. METHODOLOGY/PRINCIPAL FINDINGS: We here developed a web-based tool, PPI Finder, to mine human PPIs from PubMed abstracts based on their co-occurrences and interaction words, followed by evidence in human PPI databases and shared terms in the GO database. Only 28% of the co-occurred pairs in PubMed abstracts appeared in any of the commonly used human PPI databases (HPRD, BioGRID and BIND). On the other hand, of the known PPIs in HPRD, 69% showed co-occurrences in the literature, and 65% shared GO terms. CONCLUSIONS: PPI Finder provides a useful tool for biologists to uncover potential novel PPIs. It is freely accessible at http://liweilab.genetics.ac.cn/tm/.
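
    The co-occurrence idea described above can be caricatured in a few lines of Python: flag a candidate interaction whenever two gene names and an interaction word appear in the same abstract. The gene list, interaction words and abstracts below are invented, and the sketch is not PPI Finder's actual pipeline.

      # Flag candidate PPIs when two gene names plus an interaction word
      # co-occur in one abstract. Gene list and abstracts are invented examples.
      from itertools import combinations

      GENES = {"TP53", "MDM2", "BRCA1"}
      INTERACTION_WORDS = {"binds", "interacts", "phosphorylates", "inhibits"}

      abstracts = [
          "MDM2 binds TP53 and inhibits its transcriptional activity.",
          "BRCA1 expression was measured in tumour samples.",
      ]

      def candidate_pairs(text):
          tokens = {w.strip(".,;").upper() for w in text.split()}
          genes = GENES & tokens
          has_verb = bool(INTERACTION_WORDS & {w.strip(".,;").lower() for w in text.split()})
          return [tuple(sorted(p)) for p in combinations(genes, 2)] if has_verb else []

      for abstract in abstracts:
          print(candidate_pairs(abstract))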

  15. Interactive granular computations in networks and systems engineering a practical perspective

    CERN Document Server

    Jankowski, Andrzej

    2017-01-01

    The book outlines selected projects conducted under the supervision of the author. Moreover, it discusses significant relations between Interactive Granular Computing (IGrC) and numerous dynamically developing scientific domains worldwide, along with features characteristic of the author’s approach to IGrC. The results presented are a continuation and elaboration of various aspects of Wisdom Technology, initiated and developed in cooperation with Professor Andrzej Skowron. Based on the empirical findings from these projects, the author explores the following areas: (a) understanding the causes of the theory and practice gap problem (TPGP) in complex systems engineering (CSE);(b) generalizing computing models of complex adaptive systems (CAS) (in particular, natural computing models) by constructing an interactive granular computing (IGrC) model of networks of interrelated interacting complex granules (c-granules), belonging to a single agent and/or to a group of agents; (c) developing methodologies based ...

  16. Physics vs. computer science

    International Nuclear Information System (INIS)

    Pike, R.

    1982-01-01

    With computers becoming more frequently used in theoretical and experimental physics, physicists can no longer afford to be ignorant of the basic techniques and results of computer science. Computing principles belong in a physicist's tool box, along with experimental methods and applied mathematics, and the easiest way to educate physicists in computing is to provide, as part of the undergraduate curriculum, a computing course designed specifically for physicists. As well, the working physicist should interact with computer scientists, giving them challenging problems in return for their expertise. (orig.)

  17. Computational Tools for RF Structure Design

    CERN Document Server

    Jensen, E

    2004-01-01

    The Finite Differences Method and the Finite Element Method are the two principally employed numerical methods in modern RF field simulation programs. The basic ideas behind these methods are explained, with regard to available simulation programs. We then go through a list of characteristic parameters of RF structures, explaining how they can be calculated using these tools. With the help of these parameters, we introduce the frequency-domain and the time-domain calculations, leading to impedances and wake-fields, respectively. Subsequently, we present some readily available computer programs, which are in use for RF structure design, stressing their distinctive features and limitations. One final example benchmarks the precision of different codes for calculating the eigenfrequency and Q of a simple cavity resonator.

  18. Process Machine Interactions: Prediction and Manipulation of Interactions between Manufacturing Processes and Machine Tool Structures

    CERN Document Server

    Hollmann, Ferdinand

    2013-01-01

    This contributed volume collects the scientific results of the DFG Priority Program 1180 Prediction and Manipulation of Interactions between Structure and Process. The research program has been conducted during the years 2005 and 2012, whereas the primary goal was the analysis of the interactions between processes and structures in modern production facilities. This book presents the findings of the 20 interdisciplinary subprojects, focusing on different manufacturing processes such as high performance milling, tool grinding or metal forming. It contains experimental investigations as well as mathematical modeling of production processes and machine interactions. New experimental advancements and novel simulation approaches are also included.

  19. Computer simulation of spacecraft/environment interaction

    International Nuclear Information System (INIS)

    Krupnikov, K.K.; Makletsov, A.A.; Mileev, V.N.; Novikov, L.S.; Sinolits, V.V.

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impact and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  20. Computer simulation of spacecraft/environment interaction

    CERN Document Server

    Krupnikov, K K; Mileev, V N; Novikov, L S; Sinolits, V V

    1999-01-01

    This report presents some examples of a computer simulation of spacecraft interaction with the space environment. We analysed a set of data on electron and ion fluxes measured in 1991-1994 on the geostationary satellite GORIZONT-35. The influence of spacecraft eclipse and device eclipse by the solar-cell panel on spacecraft charging was investigated. A simple method was developed for an estimation of spacecraft potentials in LEO. Effects of various particle flux impact and spacecraft orientation are discussed. A computer engineering model for a calculation of space radiation is presented. This model is used as a client/server model with a WWW interface, including spacecraft model description and results representation based on the virtual reality markup language.

  1. Interactive exploratory data analysis tool in Alzheimer’s disease

    Directory of Open Access Journals (Sweden)

    Diana Furcila

    2015-04-01

    Thus, MorExAn provides us with the possibility to relate histopathological data with neuropsychological and clinical variables. The aid of this interactive visualization tool gives us the possibility to reach unexpected conclusions beyond the insight provided by simple statistical analysis, as well as to improve neuroscientists’ productivity.

  2. CalSimHydro Tool - A Web-based interactive tool for the CalSim 3.0 Hydrology Preprocessor

    Science.gov (United States)

    Li, P.; Stough, T.; Vu, Q.; Granger, S. L.; Jones, D. J.; Ferreira, I.; Chen, Z.

    2011-12-01

    CalSimHydro, the CalSim 3.0 Hydrology Preprocessor, is an application designed to automate the various steps in the computation of hydrologic inputs for CalSim 3.0, a water resources planning model developed jointly by California State Department of Water Resources and United States Bureau of Reclamation, Mid-Pacific Region. CalSimHydro consists of a five-step FORTRAN based program that runs the individual models in succession passing information from one model to the next and aggregating data as required by each model. The final product of CalSimHydro is an updated CalSim 3.0 state variable (SV) DSS input file. CalSimHydro consists of (1) a Rainfall-Runoff Model to compute monthly infiltration, (2) a Soil moisture and demand calculator (IDC) that estimates surface runoff, deep percolation, and water demands for natural vegetation cover and various crops other than rice, (3) a Rice Water Use Model to compute the water demands, deep percolation, irrigation return flow, and runoff from precipitation for the rice fields, (4) a Refuge Water Use Model that simulates the ponding operations for managed wetlands, and (5) a Data Aggregation and Transfer Module to aggregate the outputs from the above modules and transfer them to the CalSim SV input file. In this presentation, we describe a web-based user interface for CalSimHydro using Google Earth Plug-In. The CalSimHydro tool allows users to - interact with geo-referenced layers of the Water Budget Areas (WBA) and Demand Units (DU) displayed over the Sacramento Valley, - view the input parameters of the hydrology preprocessor for a selected WBA or DU in a time series plot or a tabular form, - edit the values of the input parameters in the table or by downloading a spreadsheet of the selected parameter in a selected time range, - run the CalSimHydro modules in the backend server and notify the user when the job is done, - visualize the model output and compare it with a base run result, - download the output SV file to be
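
    The module-chaining pattern described above, in which each preprocessor step consumes the previous step's output before a final aggregation, can be sketched in Python as below; the step names mirror the text, but the function bodies are placeholders rather than the actual FORTRAN modules.

      # Chain the preprocessor steps in order, feeding each step's output to the
      # next and aggregating at the end. Step bodies are placeholders only.
      def rainfall_runoff(data):      return {**data, "infiltration": 1.0}
      def soil_moisture(data):        return {**data, "runoff": 0.4, "demand": 2.0}
      def rice_water_use(data):       return {**data, "rice_demand": 0.7}
      def refuge_water_use(data):     return {**data, "wetland_ponding": 0.3}
      def aggregate_and_transfer(data):
          # Final step: aggregate module outputs into the SV input record.
          return {k: v for k, v in data.items() if k != "raw"}

      pipeline = [rainfall_runoff, soil_moisture, rice_water_use,
                  refuge_water_use, aggregate_and_transfer]

      state = {"raw": "monthly precipitation and land-use inputs"}
      for step in pipeline:
          state = step(state)
      print(state)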

  3. The Human-Computer Interaction of Cross-Cultural Gaming Strategy

    Science.gov (United States)

    Chakraborty, Joyram; Norcio, Anthony F.; Van Der Veer, Jacob J.; Andre, Charles F.; Miller, Zachary; Regelsberger, Alexander

    2015-01-01

    This article explores the cultural dimensions of the human-computer interaction that underlies gaming strategies. The article is a desktop study of existing literature and is organized into five sections. The first examines the cultural aspects of knowledge processing. The social constructs of technology interaction are then discussed. Following this, the…

  4. Child computer interaction SIG: towards sustainable thinking and being

    NARCIS (Netherlands)

    Read, J.; Hourcade, J.P.; Markopoulos, P.; Iversen, O.S.

    The discipline of Child Computer Interaction (CCI) has been steadily growing and it is now firmly established as a community in its own right, having the annual IDC (Interaction and Design for Children) conference and its own journal and also enjoying its role as a highly recognisable and vibrant

  5. Non-Linear Interactive Stories in Computer Games

    DEFF Research Database (Denmark)

    Bangsø, Olav; Jensen, Ole Guttorm; Kocka, Tomas

    2003-01-01

    The paper introduces non-linear interactive stories (NOLIST) as a means to generate varied and interesting stories for computer games automatically. We give a compact representation of a NOLIST based on the specification of atomic stories, and show how to build an object-oriented Bayesian network...

  6. Interactive computer graphics for bio-stereochemical modelling

    Indian Academy of Sciences (India)

    Robert Rein, Shlomo Nir, Karen Haydock and Robert D. MacElroy, Department of Experimental Pathology, Roswell Park Memorial Institute, 666 Elm ... Interactive computer graphics for bio-stereochemical modelling. Proc. Indian Acad. Sci., Vol. 87 A (Chem. Sci.), No. 4, April 1978, pp. 95-113. Printed in India.

  7. Conceptual design of pipe whip restraints using interactive computer analysis

    International Nuclear Information System (INIS)

    Rigamonti, G.; Dainora, J.

    1975-01-01

    Protection against pipe break effects necessitates a complex interaction between failure mode analysis, piping layout, and structural design. Many iterations are required to finalize structural designs and equipment arrangements. The magnitude of the pipe break loads transmitted by the pipe whip restraints to structural embedments precludes the application of conservative design margins. A simplified analytical formulation of the nonlinear dynamic problems associated with pipe whip has been developed and applied using interactive computer analysis techniques. In the dynamic analysis, the restraint and the associated portion of the piping system are modeled using the finite element lumped mass approach to properly reflect the dynamic characteristics of the piping/restraint system. The analysis is performed as a series of piecewise linear increments, each of which is terminated by either the formation of plastic conditions or the closing/opening of gaps. The stiffness matrix is then modified to reflect the changed stiffness characteristics of the system and the analysis is restarted from the previous boundary conditions. The formation of yield hinges is related to the plastic moment of the section, and unloading paths are automatically considered. The conceptual design of the piping/restraint system is performed using interactive computer analysis. The application of the simplified analytical approach with interactive computer analysis results in an order of magnitude reduction in engineering time and computer cost. (Auth.)
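
    A single-degree-of-freedom sketch of the piecewise-linear idea is given below: the pipe mass is accelerated by a constant blowdown thrust, closes a gap, and then loads an elastic-perfectly-plastic restraint, with the stiffness contribution changing whenever the gap or yield state changes. All parameter values are hypothetical, and the model is far simpler than the finite element lumped-mass formulation described above.

```python
# Illustrative single-degree-of-freedom sketch of piecewise-linear pipe-whip analysis:
# constant thrust on a lumped pipe mass, a gap, then an elastic-perfectly-plastic
# restraint. All parameter values are hypothetical.
m, F = 500.0, 2.0e5                      # lumped pipe mass (kg), blowdown thrust (N)
gap, k, F_yield = 0.05, 4.0e7, 3.0e5     # restraint gap (m), stiffness (N/m), plastic limit (N)

dt, n_steps = 1.0e-5, 20000
u, v = 0.0, 0.0                          # displacement (m), velocity (m/s)
u_plastic = 0.0                          # accumulated plastic offset of the restraint

for _ in range(n_steps):
    # piecewise-linear restraint force: zero until the gap closes, then elastic up to yield
    stretch = u - gap - u_plastic
    if stretch <= 0.0:
        R = 0.0
    else:
        R = min(k * stretch, F_yield)
        if k * stretch > F_yield:        # yield "hinge": track plastic flow
            u_plastic += stretch - F_yield / k
    a = (F - R) / m                      # piecewise-constant acceleration over the increment
    v += a * dt                          # explicit (semi-implicit Euler) update
    u += v * dt

print(f"final displacement: {u*1000:.1f} mm, plastic set: {u_plastic*1000:.1f} mm")
```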

  8. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Directory of Open Access Journals (Sweden)

    Nathan eGould

    2014-10-01

    Full Text Available Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
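
    As a concrete illustration of the simplest, single-objective case mentioned above, the sketch below maximises codon usage by always choosing the most frequent codon for each residue. The frequency table is an invented fragment, not measured usage data for any organism, and real tools combine this objective with others (GC content, restriction sites, secondary structure) using heuristics.

```python
# Minimal sketch of single-objective codon optimization: pick, for each amino acid,
# the codon with the highest usage frequency. Frequencies below are illustrative only.
CODON_USAGE = {
    "M": {"ATG": 1.00},
    "K": {"AAA": 0.74, "AAG": 0.26},
    "L": {"CTG": 0.47, "TTA": 0.14, "CTT": 0.12, "CTC": 0.10, "TTG": 0.13, "CTA": 0.04},
    "S": {"AGC": 0.26, "TCT": 0.17, "TCC": 0.15, "AGT": 0.16, "TCA": 0.14, "TCG": 0.12},
    "*": {"TAA": 0.61, "TGA": 0.30, "TAG": 0.09},
}

def optimize(protein: str) -> str:
    """Return a coding sequence using the most frequent codon for each residue."""
    return "".join(max(CODON_USAGE[aa], key=CODON_USAGE[aa].get) for aa in protein)

print(optimize("MKLLS*"))   # -> ATGAAACTGCTGAGCTAA
```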

  9. The Design of Tools for Sketching Sensor-Based Interaction

    DEFF Research Database (Denmark)

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock

    2012-01-01

    , flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound), and we discuss the general issues facing this category of embodied interaction design tools. We then present the platform in more detail, both regarding hard- ware and software...

  10. Systematic Methods and Tools for Computer Aided Modelling

    DEFF Research Database (Denmark)

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool, that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks....

  11. CADRE-SS, an in Silico Tool for Predicting Skin Sensitization Potential Based on Modeling of Molecular Interactions.

    Science.gov (United States)

    Kostal, Jakub; Voutchkova-Kostal, Adelina

    2016-01-19

    Using computer models to accurately predict toxicity outcomes is considered to be a major challenge. However, state-of-the-art computational chemistry techniques can now be incorporated in predictive models, supported by advances in mechanistic toxicology and the exponential growth of computing resources witnessed over the past decade. The CADRE (Computer-Aided Discovery and REdesign) platform relies on quantum-mechanical modeling of molecular interactions that represent key biochemical triggers in toxicity pathways. Here, we present an external validation exercise for CADRE-SS, a variant developed to predict the skin sensitization potential of commercial chemicals. CADRE-SS is a hybrid model that evaluates skin permeability using Monte Carlo simulations, assigns reactive centers in a molecule and possible biotransformations via expert rules, and determines reactivity with skin proteins via quantum-mechanical modeling. The results were promising, with a very good overall concordance of 93% between experimental and predicted values. Comparison to performance metrics yielded by other tools available for this endpoint suggests that CADRE-SS offers distinct advantages for first-round screenings of chemicals and could be used as an in silico alternative to animal tests where permissible by legislative programs.
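
    The external-validation bookkeeping behind figures such as the reported 93% concordance can be reproduced in a few lines; the sketch below uses made-up labels rather than the CADRE-SS validation set.

```python
# Sketch of external-validation bookkeeping: comparing predicted and experimental
# sensitizer calls. The label lists are invented examples, not CADRE-SS data.
def concordance_stats(experimental, predicted):
    tp = sum(e and p for e, p in zip(experimental, predicted))
    tn = sum((not e) and (not p) for e, p in zip(experimental, predicted))
    fp = sum((not e) and p for e, p in zip(experimental, predicted))
    fn = sum(e and (not p) for e, p in zip(experimental, predicted))
    return {
        "concordance": (tp + tn) / len(experimental),
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
    }

exp  = [True, True, True, False, False, True, False, True]   # experimental sensitizers
pred = [True, True, False, False, False, True, False, True]  # model predictions
print(concordance_stats(exp, pred))
```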

  12. Tools For Interactive Learning And Self-Management Of Diabetes

    Science.gov (United States)

    Capelo, Rita; Baptista, Carla; Figueiredo, Júlia; Carrilho, Francisco; Furtado, Pedro

    2015-05-01

    Diabetes is a widespread disease and its control is dependent upon the patient. Although there is no permanent cure for diabetes, there are several available treatments which, when followed regularly, allow the patient to have a good quality of life. Patient education, especially about eating habits, is key to keeping glucose levels stable both in the short and in the long term. This should include nutritional counselling, physical exercise, and the self-monitoring of glucose levels. The University of Coimbra and the Serviço de Endocrinologia, Diabetes e Metabolismo of Centro Hospitalar e Universitário de Coimbra started a collaboration to develop interactive tools for the learning and improvement of carbohydrate counting by patients. The approach presented in this paper is an interactive multimedia tool, available to patients through either the web or a smartphone. It helps them to learn how to maintain a healthy diet and how to monitor their insulin levels correctly by measuring the carbohydrate “equivalents” in meals. This application will create a more dynamic and interactive way of educating patients, improving the solutions currently used in the Serviço de Endocrinologia, Diabetes e Metabolismo of the Centro Hospitalar e Universitário de Coimbra.
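
    A minimal sketch of the carbohydrate-“equivalent” arithmetic such a tool teaches is shown below; the 12 g-per-equivalent conversion and the per-100 g carbohydrate contents are assumptions for illustration, not the food tables used by the Coimbra service.

```python
# Minimal sketch of carbohydrate-"equivalent" counting for a meal. The 12 g per
# equivalent conversion and the per-100 g carbohydrate contents are assumptions.
GRAMS_PER_EQUIVALENT = 12.0

CARBS_PER_100G = {"bread": 50.0, "rice_cooked": 28.0, "apple": 14.0}   # g carbohydrate / 100 g food

def meal_equivalents(portions):
    """portions: {food: grams eaten} -> total carbohydrate equivalents."""
    carbs = sum(CARBS_PER_100G[f] * g / 100.0 for f, g in portions.items())
    return carbs / GRAMS_PER_EQUIVALENT

print(round(meal_equivalents({"bread": 60, "apple": 150}), 1))   # -> 4.2 (51 g / 12 g per equivalent)
```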

  13. Interactive Rhythm Learning System by Combining Tablet Computers and Robots

    Directory of Open Access Journals (Sweden)

    Chien-Hsing Chou

    2017-03-01

    Full Text Available This study proposes a percussion learning device that combines tablet computers and robots. This device comprises two systems: a rhythm teaching system, in which users can compose and practice rhythms by using a tablet computer, and a robot performance system. First, teachers compose the rhythm training contents on the tablet computer. Then, the learners practice these percussion exercises by using the tablet computer and a small drum set. The teaching system provides a new and user-friendly score editing interface for composing a rhythm exercise. It also provides a rhythm rating function to facilitate percussion training for children and improve the stability of rhythmic beating. To encourage children to practice percussion exercises, a robotic performance system is used to interact with the children; this system can perform percussion exercises for students to listen to and then help them practice the exercise. This interaction enhances children’s interest and motivation to learn and practice rhythm exercises. The results of experimental course and field trials reveal that the proposed system not only increases students’ interest and efficiency in learning but also helps them in understanding musical rhythms through interaction and composing simple rhythms.

  14. Computational Modeling of Arc-Slag Interaction in DC Furnaces

    Science.gov (United States)

    Reynolds, Quinn G.

    2017-02-01

    The plasma arc is central to the operation of the direct-current arc furnace, a unit operation commonly used in high-temperature processing of both primary ores and recycled metals. The arc is a high-velocity, high-temperature jet of ionized gas created and sustained by interactions among the thermal, momentum, and electromagnetic fields resulting from the passage of electric current. In addition to being the primary source of thermal energy, the arc jet also couples mechanically with the bath of molten process material within the furnace, causing substantial splashing and stirring in the region in which it impinges. The arc's interaction with the molten bath inside the furnace is studied through use of a multiphase, multiphysics computational magnetohydrodynamic model developed in the OpenFOAM® framework. Results from the computational solver are compared with empirical correlations that account for arc-slag interaction effects.

  15. Validity of tools used for surveying physicians about their interactions with pharmaceutical company: a systematic review.

    Science.gov (United States)

    Lotfi, Tamara; Morsi, Rami Z; Zmeter, Nada; Godah, Mohammad W; Alkhaled, Lina; Kahale, Lara A; Nass, Hala; Brax, Hneine; Fadlallah, Racha; Akl, Elie A

    2015-11-25

    There is evidence that physicians' prescription behavior is negatively affected by the extent of their interactions with pharmaceutical companies. In order to develop and implement policies and interventions for better management of these interactions, we need to understand physicians' perspectives on this issue. Surveys addressing physicians' interactions with pharmaceutical companies need to use validated tools to ensure the validity of their findings. To assess the validity of tools used in surveys of physicians about the extent and nature of their interactions with pharmaceutical companies, and about their knowledge, beliefs and attitudes towards such interactions; and to identify those tools that have been formally validated. We developed a search strategy with the assistance of a medical librarian. We electronically searched the MEDLINE and EMBASE databases in September 2015. Teams of two reviewers conducted data selection and data abstraction. They identified eligible studies in one table and then abstracted the relevant data from the studies with validated tools in another table. Tables were piloted and standardized. We identified one validated questionnaire out of the 11 assessing the nature and extent of the interaction, and three validated questionnaires out of the 47 assessing knowledge, beliefs and attitudes of physicians toward the interaction. None of these validated questionnaires was used in more than one survey. The available supporting evidence on the issue of physicians' interactions with pharmaceutical companies is of low quality. There is a need for research to develop and validate tools to survey physicians about their interactions with pharmaceutical companies.

  16. Documentation of Appliances & Interaction Devices

    DEFF Research Database (Denmark)

    2004-01-01

    The interaction devices and appliances explored in the WorkSPACE project, address spatial computing in the context of work. We have developed and explored a range of appliances and interaction devices. The scope has been to develop tools for support of collaboration by mixing digital and physical...

  17. Software tools for the particle accelerator designs

    International Nuclear Information System (INIS)

    Sugimoto, Masayoshi

    1988-01-01

    The software tools used for the design of particle accelerators are going to be implemented on small computer systems, such as personal computers or workstations. They are called from an interactive environment, like a window application program. The environment contains a small expert system to make it easy to select the design parameters. (author)

  18. r.avaflow v1, an advanced open-source computational framework for the propagation and interaction of two-phase mass flows

    Science.gov (United States)

    Mergili, Martin; Fischer, Jan-Thomas; Krenn, Julia; Pudasaini, Shiva P.

    2017-02-01

    r.avaflow represents an innovative open-source computational tool for routing rapid mass flows, avalanches, or process chains from a defined release area down an arbitrary topography to a deposition area. In contrast to most existing computational tools, r.avaflow (i) employs a two-phase, interacting solid and fluid mixture model (Pudasaini, 2012); (ii) is suitable for modelling more or less complex process chains and interactions; (iii) explicitly considers both entrainment and stopping with deposition, i.e. the change of the basal topography; (iv) allows for the definition of multiple release masses and/or hydrographs; and (v) provides built-in functionalities for validation, parameter optimization, and sensitivity analysis. r.avaflow is freely available as a raster module of the GRASS GIS software, employing the programming languages Python and C along with the statistical software R. We exemplify the functionalities of r.avaflow by means of two sets of computational experiments: (1) generic process chains consisting of bulk mass and hydrograph release into a reservoir with entrainment of the dam and impact downstream; (2) the prehistoric Acheron rock avalanche, New Zealand. The simulation results are generally plausible for (1) and, after the optimization of two key parameters, reasonably in line with the corresponding observations for (2). However, we identify some potential to enhance the analytic and numerical concepts. Further, thorough parameter studies will be necessary in order to make r.avaflow fit for reliable forward simulations of possible future mass flow events.

  19. Improving Critical Thinking with Interactive Mobile Tools and Apps

    Science.gov (United States)

    Lin, Lin; Widdall, Chris; Ward, Laurie

    2014-01-01

    In this article, the authors describe how integrating interactive mobile tools into elementary pedagogy can generate enthusiasm and critical thinking among students as they learn about the world. The activities described took place over the course of six one-hour periods spanning six days. These activities address three major social studies…

  20. Computer aided systems human engineering: A hypermedia tool

    Science.gov (United States)

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  1. Computer- Aided Design in Power Engineering Application of Software Tools

    CERN Document Server

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using engineering tools such as MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of a power system. In the fourth chapter, the application of software tools in project management in power systems ...

  2. Primary care physicians’ perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study

    Directory of Open Access Journals (Sweden)

    Teja Voruganti

    2015-09-01

    Full Text Available Background Health risk assessment tools compute an individual’s risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought physicians’ awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Methods Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. Results PCPs (n = 30) were aware of several risk assessment tools although only select tools were used routinely. The decision to use a tool depended on how use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical record (EMR) system might allow for health information from the medical record to auto-populate into the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. Conclusions In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite there being differences in the particular tools a clinical practice used, there was general appreciation for the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  3. Interactive computer-enhanced remote viewing system

    Energy Technology Data Exchange (ETDEWEB)

    Tourtellott, J.A.; Wagner, J.F. [Mechanical Technology Incorporated, Latham, NY (United States)

    1995-10-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically.
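
    One of the uses named above, verifying that a selected path is clear of obstacles, can be illustrated with a toy voxel model; ICERVS itself maintains a full geometric description of the task space, so the occupancy grid, resolution, and path endpoints below are assumptions.

```python
# Toy sketch of a path-clearance check: sample points along a straight-line robot
# path and test them against a 3-D occupancy grid. Grid, resolution and endpoints
# are invented values, not ICERVS data structures.
import numpy as np

RES = 0.1                                     # voxel edge length (m)
occupancy = np.zeros((50, 50, 30), dtype=bool)
occupancy[20:25, 10:40, 0:15] = True          # a wall-like obstacle

def path_is_clear(start, end, n_samples=200):
    pts = np.linspace(np.asarray(start), np.asarray(end), n_samples)
    idx = np.floor(pts / RES).astype(int)
    return not occupancy[idx[:, 0], idx[:, 1], idx[:, 2]].any()

print(path_is_clear((0.5, 0.5, 0.5), (4.5, 4.5, 0.5)))   # False: crosses the wall
print(path_is_clear((0.5, 0.5, 2.0), (4.5, 4.5, 2.0)))   # True: passes above it
```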

  4. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    Science.gov (United States)

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400
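
    The diagnosis-driven recommendation can be pictured with the toy scoring sketch below; the questions, weights, and thresholds are invented and do not reproduce the published tool's questionnaire.

```python
# Toy sketch of a diagnosis-driven recommendation, in the spirit of the tool described
# above but with invented questions, weights and thresholds.
QUESTIONS = {
    "data_sensitivity_low":        2,   # weight toward cloud adoption
    "needs_elastic_capacity":      3,
    "limited_it_staff":            2,
    "stable_internet_connection":  1,
}

def cloud_road(answers):
    """answers: {question: bool} -> a coarse SaaS adoption recommendation."""
    score = sum(w for q, w in QUESTIONS.items() if answers.get(q, False))
    if score >= 6:
        return "strong candidate: pilot core business areas as SaaS"
    if score >= 3:
        return "partial candidate: start with non-critical SaaS services"
    return "defer: address connectivity/data constraints first"

print(cloud_road({"needs_elastic_capacity": True, "limited_it_staff": True,
                  "stable_internet_connection": True}))
```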

  5. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    Directory of Open Access Journals (Sweden)

    Iñaki Bildosola

    Full Text Available Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  6. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    Science.gov (United States)

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  7. Twenty-First Century Learning: Communities, Interaction and Ubiquitous Computing

    Science.gov (United States)

    Leh, Amy S.C.; Kouba, Barbara; Davis, Dirk

    2005-01-01

    Advanced technology makes 21st century learning, communities and interactions unique and leads people to an era of ubiquitous computing. The purpose of this article is to contribute to the discussion of learning in the 21st century. The paper will review literature on learning community, community learning, interaction, 21st century learning and…

  8. Enrichr: interactive and collaborative HTML5 gene list enrichment analysis tool.

    Science.gov (United States)

    Chen, Edward Y; Tan, Christopher M; Kou, Yan; Duan, Qiaonan; Wang, Zichen; Meirelles, Gabriela Vaz; Clark, Neil R; Ma'ayan, Avi

    2013-04-15

    System-wide profiling of genes and proteins in mammalian cells produce lists of differentially expressed genes/proteins that need to be further analyzed for their collective functions in order to extract new knowledge. Once unbiased lists of genes or proteins are generated from such experiments, these lists are used as input for computing enrichment with existing lists created from prior knowledge organized into gene-set libraries. While many enrichment analysis tools and gene-set library databases have been developed, there is still room for improvement. Here, we present Enrichr, an integrative web-based and mobile software application that includes new gene-set libraries, an alternative approach to rank enriched terms, and various interactive visualization approaches to display enrichment results using the JavaScript library, Data Driven Documents (D3). The software can also be embedded into any tool that performs gene list analysis. We applied Enrichr to analyze nine cancer cell lines by comparing their enrichment signatures to the enrichment signatures of matched normal tissues. We observed a common pattern of upregulation of the polycomb group PRC2 and enrichment for the histone mark H3K27me3 in many cancer cell lines, as well as alterations in Toll-like receptor and interleukin signaling in K562 cells when compared with normal myeloid CD33+ cells. Such analyses provide global visualization of critical differences between normal tissues and cancer cell lines but can be applied to many other scenarios. Enrichr is an easy-to-use, intuitive enrichment analysis web-based tool providing various types of visualization summaries of collective functions of gene lists. Enrichr is open source and freely available online at: http://amp.pharm.mssm.edu/Enrichr.
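
    The core computation behind such tools, testing the overlap between an input gene list and each library gene set, can be sketched with a Fisher's exact test; the gene names, library, and background size below are placeholders, and Enrichr additionally applies its own alternative ranking of enriched terms on top of the raw p-value.

```python
# Sketch of the core enrichment computation: a Fisher's exact test of the overlap
# between an input gene list and each library gene set. Gene names, library content
# and the background size are placeholders.
from scipy.stats import fisher_exact

BACKGROUND = 20000                      # assumed number of genes in the background
LIBRARY = {
    "PRC2_targets": {"EZH2", "SUZ12", "EED", "JARID2", "HOXA9"},
    "TLR_signaling": {"TLR4", "MYD88", "IRAK4", "NFKB1"},
}

def enrich(gene_list, library, background=BACKGROUND):
    genes = set(gene_list)
    results = []
    for term, members in library.items():
        overlap = len(genes & members)
        table = [[overlap, len(genes) - overlap],
                 [len(members) - overlap, background - len(genes) - len(members) + overlap]]
        _, p = fisher_exact(table, alternative="greater")
        results.append((term, overlap, p))
    return sorted(results, key=lambda r: r[2])   # most enriched terms first

print(enrich(["EZH2", "SUZ12", "EED", "TP53"], LIBRARY))
```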

  9. Using Social Media Sentiment Analysis for Interaction Design Choices

    DEFF Research Database (Denmark)

    McGuire, Mark; Kampf, Constance Elizabeth

    2015-01-01

    Social media analytics is an emerging skill for organizations. Currently, developers are exploring ways to create tools for simplifying social media analysis. These tools tend to focus on gathering data and using systems to make it meaningful. However, we contend that making social media data...... meaningful is by nature a human-computer interaction problem. We examine this problem around the emerging field of sentiment analysis, exploring criteria for designing sentiment analysis systems based in Human-Computer Interaction (HCI). We contend that effective sentiment analysis affects audience analysis...

  10. Computer aided plant engineering: An analysis and suggestions for computer use

    International Nuclear Information System (INIS)

    Leinemann, K.

    1979-09-01

    To obtain indications and boundary conditions for computer use in plant engineering, an analysis of the engineering process was carried out. The structure of plant engineering is represented by a network of subtasks and subsets of data which are to be manipulated. The main tool for the integration of CAD subsystems in plant engineering should be a central database, which is described by its characteristic requirements and a possible simple conceptual schema. The main features of an interactive system for computer-aided plant engineering are briefly illustrated by two examples. The analysis leads to the conclusion that an interactive graphic system for the manipulation of net-like structured data, usable for various subtasks, should be the basis for computer-aided plant engineering. (orig.) [de

  11. The Computer as a Tool for Learning through Reflection. Technical Report No. 376.

    Science.gov (United States)

    Collins, Allan; Brown, John Seely

    Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…

  12. Essential Means for Urban Computing: Specification of Web-Based Computing Platforms for Urban Planning, a Hitchhiker’s Guide

    OpenAIRE

    Pirouz Nourian; Carlos Martinez-Ortiz; Ken Arroyo Ohori

    2018-01-01

    This article provides an overview of the specifications of web-based computing platforms for urban data analytics and computational urban planning practice. There are currently a variety of tools and platforms that can be used in urban computing practices, including scientific computing languages, interactive web languages, data sharing platforms and still many desktop computing environments, e.g., GIS software applications. We have reviewed a list of technologies considering their potential ...

  13. Interactive, technology-enhanced self-regulated learning tools in healthcare education: a literature review.

    Science.gov (United States)

    Petty, Julia

    2013-01-01

    Learning technology is increasingly being implemented for programmes of blended learning within nurse education. With a growing emphasis on self-directed study, particularly in post-basic education, there is a need for learners to be guided in their learning away from practice and limited classroom time. Technology-enabled (TE) tools which engage learners actively can play a part in this. The effectiveness and value of interactive TE learning strategies within healthcare is the focus of this paper. To identify literature that explores the effectiveness of interactive, TE tools on knowledge acquisition and learner satisfaction within healthcare with a view to evaluating their use for post-basic nurse education. A Literature Review was performed focusing on papers exploring the comparative value and perceived benefit of TE tools compared to traditional modes of learning within healthcare. The Databases identified as most suitable due to their relevance to healthcare were accessed through EBSCOhost. Primary, Boolean and advanced searches on key terms were undertaken. Inclusion and exclusion criteria were applied which resulted in a final selection of 11 studies for critique. Analysis of the literature found that knowledge acquisition in most cases was enhanced and measured learner satisfaction was generally positive for interactive, self-regulated TE tools. However, TE education may not suit all learners and this is critiqued in the light of the identified limitations. Interactive self-regulation and/or testing can be a valuable learning strategy that can be incorporated into self-directed programmes of study for post-registration learners. Whilst acknowledging the learning styles not suited to such tools, the concurrent use of self-directed TE tools with those learning strategies necessitating a more social presence can work together to support enhancement of knowledge required to deliver rationale for nursing practice. Copyright © 2012 Elsevier Ltd. All rights reserved.

  14. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1999-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  15. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    1998-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  16. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    National Research Council Canada - National Science Library

    Nodine, Calvin

    2000-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  17. INTERACTIVE LEARNING: ADVANTAGES AND DISADVANTAGES

    Directory of Open Access Journals (Sweden)

    O. Kustovska

    2016-10-01

    Full Text Available The article considers the use of interactive technologies in the educational process of the university, which allows students to develop innovative thinking, move away from stereotypes, and develop imagination, communication skills and expertise, as well as the intellectual, emotional, motivational and other areas of personality. Interactive educational technology implements the principles of technological learning and provides interactive computer learning tools, making the educational process itself interactive, with its basic conceptual provisions defined by training based on interactive communication.

  18. Video analysis of projectile motion using tablet computers as experimental tools

    Science.gov (United States)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
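
    The analysis the apps support amounts to fitting a parabola to the tracked vertical position and reading off g; the sketch below uses synthetic data generated with g = 9.81 m/s² plus noise, rather than real video measurements.

```python
# Sketch of the tablet-style analysis: fit y(t) = a t^2 + b t + c to the tracked
# vertical position of the thrown ball and read off g = -2a. Data are synthetic.
import numpy as np

t = np.linspace(0.0, 0.8, 17)                              # time stamps from video frames (s)
y = 0.9 + 4.0 * t - 0.5 * 9.81 * t**2                      # vertical position (m)
y += np.random.default_rng(1).normal(0, 0.005, t.size)     # simulated tracking noise

a, b, c = np.polyfit(t, y, 2)                              # quadratic fit, highest degree first
print(f"g ≈ {-2 * a:.2f} m/s^2, initial speed ≈ {b:.2f} m/s")
```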

  19. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for evaluating likely threat scenarios. There are several currently available tools that can be used directly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
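
    A toy version of the network idea is sketched below: candidate adversary paths are ranked by the probability that the protection system detects and interrupts them. The facility graph, detection probabilities, and response probability are invented, and the timing aspect (delay versus response time) is deliberately left out.

```python
# Illustrative sketch of ranking adversary paths by interruption probability.
# Each path is a sequence of protection layers with assumed detection probabilities;
# delay/response timing is simplified to a single response probability.
PATHS = {
    "fence -> door -> vault":   [0.5, 0.7, 0.9],
    "fence -> window -> vault": [0.5, 0.3, 0.9],
    "tunnel -> vault":          [0.1, 0.9],
}

def interruption_probability(p_detect, p_response=0.8):
    """P(at least one detection) x P(response arrives in time)."""
    p_no_detection = 1.0
    for p in p_detect:
        p_no_detection *= (1.0 - p)
    return (1.0 - p_no_detection) * p_response

ranked = sorted(PATHS.items(), key=lambda kv: interruption_probability(kv[1]))
worst_path, layers = ranked[0]
print(f"most critical path: {worst_path} "
      f"(P_effectiveness = {interruption_probability(layers):.2f})")
```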

  20. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for evaluating likely threat scenarios. There are several currently available tools that can be used directly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.

  1. Human-Computer Interaction and Information Management Research Needs

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — In a visionary future, Human-Computer Interaction HCI and Information Management IM have the potential to enable humans to better manage their lives through the use...

  2. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Science.gov (United States)

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…
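
    A minimal flavour of such a similarity check is shown below using difflib; the snippets and threshold are arbitrary, and production systems typically normalise code (strip comments, rename identifiers) and use fingerprinting rather than raw string comparison.

```python
# Sketch of pairwise source-code similarity checking with difflib; the submissions
# and the flagging threshold are arbitrary illustrative values.
from difflib import SequenceMatcher
from itertools import combinations

submissions = {
    "student_a": "total = 0\nfor x in values:\n    total += x\nprint(total)",
    "student_b": "s = 0\nfor v in values:\n    s += v\nprint(s)",
    "student_c": "print(sum(values))",
}

THRESHOLD = 0.6
for (n1, c1), (n2, c2) in combinations(submissions.items(), 2):
    ratio = SequenceMatcher(None, c1, c2).ratio()
    flag = "FLAG" if ratio >= THRESHOLD else "ok"
    print(f"{n1} vs {n2}: similarity {ratio:.2f} [{flag}]")
```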

  3. Prediction of surgical view of neurovascular decompression using interactive computer graphics.

    Science.gov (United States)

    Kin, Taichi; Oyama, Hiroshi; Kamada, Kyousuke; Aoki, Shigeki; Ohtomo, Kuni; Saito, Nobuhito

    2009-07-01

    To assess the value of an interactive visualization method for detecting the offending vessels in neurovascular compression syndrome in patients with facial spasm and trigeminal neuralgia. Computer graphics models are created by fusion of fast imaging employing steady-state acquisition and magnetic resonance angiography. High-resolution magnetic resonance angiography and fast imaging employing steady-state acquisition were performed preoperatively in 17 patients with neurovascular compression syndromes (facial spasm, n = 10; trigeminal neuralgia, n = 7) using a 3.0-T magnetic resonance imaging scanner. Computer graphics models were created with computer software and observed interactively for detection of offending vessels by rotation, enlargement, reduction, and retraction on a graphic workstation. Two-dimensional images were reviewed by 2 radiologists blinded to the clinical details, and 2 neurosurgeons predicted the offending vessel with the interactive visualization method before surgery. Predictions from the 2 imaging approaches were compared with surgical findings. The vessels identified during surgery were assumed to be the true offending vessels. Offending vessels were identified correctly in 16 of 17 patients (94%) using the interactive visualization method and in 10 of 17 patients using 2-dimensional images. These data demonstrated a significant difference (P = 0.015 by Fisher's exact method). The interactive visualization method data corresponded well with surgical findings (surgical field, offending vessels, and nerves). Virtual reality 3-dimensional computer graphics using fusion magnetic resonance angiography and fast imaging employing steady-state acquisition may be helpful for preoperative simulation.

  4. 7th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  5. Computer Generated Optical Illusions: A Teaching and Research Tool.

    Science.gov (United States)

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  6. Interactive computer-enhanced remote viewing system

    International Nuclear Information System (INIS)

    Tourtellott, J.A.; Wagner, J.F.

    1995-01-01

    Remediation activities such as decontamination and decommissioning (D&D) typically involve materials and activities hazardous to humans. Robots are an attractive way to conduct such remediation, but for efficiency they need a good three-dimensional (3-D) computer model of the task space where they are to function. This model can be created from engineering plans and architectural drawings and from empirical data gathered by various sensors at the site. The model is used to plan robotic tasks and verify that selected paths are clear of obstacles. This report describes the development of an Interactive Computer-Enhanced Remote Viewing System (ICERVS), a software system to provide a reliable geometric description of a robotic task space, and enable robotic remediation to be conducted more effectively and more economically

  7. Advances in Computational Fluid-Structure Interaction and Flow Simulation Conference

    CERN Document Server

    Takizawa, Kenji

    2016-01-01

    This contributed volume celebrates the work of Tayfun E. Tezduyar on the occasion of his 60th birthday. The articles it contains were born out of the Advances in Computational Fluid-Structure Interaction and Flow Simulation (AFSI 2014) conference, also dedicated to Prof. Tezduyar and held at Waseda University in Tokyo, Japan on March 19-21, 2014. The contributing authors represent a group of international experts in the field who discuss recent trends and new directions in computational fluid dynamics (CFD) and fluid-structure interaction (FSI). Organized into seven distinct parts arranged by thematic topics, the papers included cover basic methods and applications of CFD, flows with moving boundaries and interfaces, phase-field modeling, computer science and high-performance computing (HPC) aspects of flow simulation, mathematical methods, biomedical applications, and FSI. Researchers, practitioners, and advanced graduate students working on CFD, FSI, and related topics will find this collection to be a defi...

  8. Talking with the alien: interaction with computers in the GP consultation.

    Science.gov (United States)

    Dowell, Anthony; Stubbe, Maria; Scott-Dowell, Kathy; Macdonald, Lindsay; Dew, Kevin

    2013-01-01

    This study examines New Zealand GPs' interaction with computers in routine consultations. Twenty-eight video-recorded consultations from 10 GPs were analysed in micro-detail to explore: (i) how doctors divide their time and attention between computer and patient; (ii) the different roles ascribed to the computer; and (iii) how computer use influences the interactional flow of the consultation. All GPs engaged with the computer in some way for at least 20% of each consultation, and on average spent 12% of time totally focussed on the computer. Patterns of use varied; most GPs inputted all or most notes during the consultation, but a few set aside dedicated time afterwards. The computer acted as an additional participant enacting roles like information repository and legitimiser of decisions. Computer use also altered some of the normal 'rules of engagement' between doctor and patient. Long silences and turning away interrupted the smooth flow of conversation, but various 'multitasking' strategies allowed GPs to remain engaged with patients during episodes of computer use (e.g. signposting, online commentary, verbalising while typing, social chat). Conclusions were that use of computers has many benefits but also significantly influences the fine detail of the GP consultation. Doctors must consciously develop strategies to manage this impact.

  9. Programming Interactivity

    CERN Document Server

    Noble, Joshua

    2012-01-01

    Ready to create rich interactive experiences with your artwork, designs, or prototypes? This is the ideal place to start. With this hands-on guide, you'll explore several themes in interactive art and design-including 3D graphics, sound, physical interaction, computer vision, and geolocation-and learn the basic programming and electronics concepts you need to implement them. No previous experience is necessary. You'll get a complete introduction to three free tools created specifically for artists and designers: the Processing programming language, the Arduino microcontroller, and the openFr

  10. The DiaCog: A Prototype Tool for Visualizing Online Dialog Games' Interactions

    Science.gov (United States)

    Yengin, Ilker; Lazarevic, Bojan

    2014-01-01

    This paper proposes and explains the design of a prototype learning tool named the DiaCog. The DiaCog visualizes dialog interactions within an online dialog game by using dynamically created cognitive maps. As a purposefully designed tool for enhancing learning effectiveness the DiaCog might be applicable to dialogs at discussion boards within a…

  11. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    Science.gov (United States)

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  12. A computer tool for daily application of the linear quadratic model

    International Nuclear Information System (INIS)

    Macias Jaen, J.; Galan Montenegro, P.; Bodineau Gil, C.; Wals Zurita, A.; Serradilla Gil, A.M.

    2001-01-01

    The aim of this paper is to indicate the relevance of the A.S.A.R.A. (As Short As Reasonably Achievable) criterion in the optimization of a fractionated radiotherapy schedule, and to present a Windows computer program as an easy tool in order to: evaluate the Biological Equivalent Dose (BED) of a fractionated schedule; compare different treatments; and compensate a treatment when a delay has occurred, using a version of the Linear Quadratic model that takes into account accelerated repopulation. Conclusions: Delays in the normal radiotherapy schedule have to be controlled as much as possible, because they can be a very important factor in the successful delivery of treatment, principally when the tumour is fast growing, and it is necessary to evaluate them. The A.S.A.R.A. criterion is useful to indicate the relevance of this aspect, and computer tools like this one can help to achieve it. (author)
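
    The calculation the program automates follows the standard linear-quadratic BED expression with a repopulation correction, BED = nd(1 + d/(α/β)) − ln2·(T − Tk)/(α·Tpot) for overall time T beyond the kick-off time Tk; the sketch below uses typical textbook parameter values, which are assumptions rather than the program's defaults.

```python
# Sketch of the BED calculation with the standard repopulation correction of the
# linear-quadratic model; alpha/beta, alpha, Tpot and Tk are typical textbook
# figures, not values from the program described above.
import math

def bed(n, d, alpha_beta=10.0, overall_time=None, alpha=0.3, t_kick=28.0, t_pot=3.0):
    """BED = n*d*(1 + d/(alpha/beta)) - ln(2)*(T - Tk)/(alpha*Tpot) for T > Tk."""
    value = n * d * (1.0 + d / alpha_beta)
    if overall_time is not None and overall_time > t_kick:
        value -= math.log(2.0) * (overall_time - t_kick) / (alpha * t_pot)
    return value

planned = bed(30, 2.0, overall_time=39)   # 30 x 2 Gy delivered over 39 days
delayed = bed(30, 2.0, overall_time=46)   # same schedule with a one-week delay
print(f"planned BED = {planned:.1f} Gy, delayed BED = {delayed:.1f} Gy, "
      f"loss = {planned - delayed:.1f} Gy")
```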

  13. Computational analysis of RNA-protein interaction interfaces via the Voronoi diagram.

    Science.gov (United States)

    Mahdavi, Sedigheh; Mohades, Ali; Salehzadeh Yazdi, Ali; Jahandideh, Samad; Masoudi-Nejad, Ali

    2012-01-21

    Cellular functions are mediated by various biological processes, including biomolecular interactions such as protein-protein, DNA-protein and RNA-protein interactions; among these, RNA-protein interactions are indispensable for many biological processes like cell development and viral replication. Unlike protein-protein and protein-DNA interactions, the precise mechanisms and structures of RNA-protein complexes are not fully understood. A large body of theoretical evidence accumulated over the past several years has shown that computational geometry is a first step in understanding binding profiles and plays a key role in the study of intricate biological structures, interactions and complexes. In this paper, the RNA-protein interaction interface surface is computed via the weighted Voronoi diagram of atoms. Using two filter operations provides a natural definition of interface atoms, as in classic methods. Unbounded parts of Voronoi facets that are far from the complex are trimmed using a modified convex hull of the atom centers. The algorithm is applied to a database of RNA-protein complexes extracted from the Protein Data Bank (PDB). Afterwards, the features of the interfaces are computed and compared with the classic method. The results show high correlation coefficients between interface size in the Voronoi model and in the classical model based on solvent accessibility, as well as high accuracy and precision in comparison to the classical model. Copyright © 2011 Elsevier Ltd. All rights reserved.
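
    The Voronoi-neighbour idea can be sketched with an unweighted diagram: two atoms are Voronoi neighbours exactly when they share a Delaunay edge, so cross-molecule Delaunay edges (optionally distance-filtered, standing in for the facet trimming described above) identify interface atoms. The coordinates and the 7 Å cutoff below are placeholders; the paper itself uses a weighted (power) diagram.

```python
# Minimal sketch of a Voronoi-based interface definition (unweighted case).
# Coordinates are random placeholders standing in for protein and RNA atom positions.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
protein_xyz = rng.uniform(0, 10, (40, 3))
rna_xyz     = rng.uniform(8, 18, (30, 3))            # partially overlapping region

points = np.vstack([protein_xyz, rna_xyz])
is_rna = np.arange(len(points)) >= len(protein_xyz)

# Delaunay edges connect exactly those atoms whose Voronoi cells share a facet
tri = Delaunay(points)
interface = set()
for simplex in tri.simplices:
    for i in simplex:
        for j in simplex:
            if is_rna[i] != is_rna[j] and np.linalg.norm(points[i] - points[j]) < 7.0:
                interface.update((i, j))              # distance filter trims far-apart neighbours

print(f"{sum(1 for i in interface if not is_rna[i])} protein interface atoms, "
      f"{sum(1 for i in interface if is_rna[i])} RNA interface atoms")
```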

  14. Computer aided composition by means of interactive GP

    DEFF Research Database (Denmark)

    Ando, Daichi; Dahlstedt, Palle; Nordahl, Mats G.

    2006-01-01

    Research on the application of Interactive Evolutionary Computation (IEC) to the field of musical computation has advanced in recent years, marking an interesting parallel to the current trend of applying human characteristics or sensitivities to computer systems. However, past techniques...... developed for IEC-based composition have not necessarily proven very effective for professional use. This is due to the large difference between the data representation used by IEC and authored classical music composition. To address these difficulties, we propose a new IEC approach to music composition based...... on classical music theory. In this paper, we describe a system built according to the above idea, and detail how a piece was successfully composed with it....

  15. What is needed to implement a computer-assisted health risk assessment tool? An exploratory concept mapping study

    Directory of Open Access Journals (Sweden)

    Ahmad Farah

    2012-12-01

    Full Text Available Abstract Background Emerging eHealth tools could facilitate the delivery of comprehensive care in time-constrained clinical settings. One such tool is interactive computer-assisted health-risk assessments (HRA), which may improve provider-patient communication at the point of care, particularly for psychosocial health concerns, which remain under-detected in clinical encounters. The research team explored the perspectives of healthcare providers representing a variety of disciplines (physicians, nurses, social workers, allied staff) regarding the factors required for implementation of an interactive HRA on psychosocial health. Methods The research team employed a semi-qualitative participatory method known as Concept Mapping, which involved three distinct phases. First, in face-to-face and online brainstorming sessions, participants responded to an open-ended central question: “What factors should be in place within your clinical setting to support an effective computer-assisted screening tool for psychosocial risks?” The brainstormed items were consolidated by the research team. Then, in face-to-face and online sorting sessions, participants grouped the items thematically as ‘it made sense to them’. Participants also rated each item on a 5-point scale for its ‘importance’ and ‘action feasibility’ over the ensuing six-month period. The sorted and rated data were analyzed using multidimensional scaling and hierarchical cluster analyses, which produced visual maps. In the third and final phase, the face-to-face Interpretation sessions, the concept maps were discussed and illuminated by participants collectively. Results Overall, 54 providers participated (emergency care 48%; primary care 52%). Participants brainstormed 196 items thought to be necessary for the implementation of an interactive HRA emphasizing psychosocial health. These were consolidated by the research team into 85 items. After sorting and rating, cluster analysis
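
    The analysis phase described above (dissimilarity from co-sorting, multidimensional scaling, hierarchical clustering) can be sketched as follows; the six items and three sortings are invented, not data from the 54 participants.

```python
# Sketch of the Concept Mapping analysis step: turn participants' sortings into a
# dissimilarity matrix, project it with multidimensional scaling, and cluster the
# items hierarchically. Items and sortings are invented examples.
import numpy as np
from sklearn.manifold import MDS
from scipy.cluster.hierarchy import linkage, fcluster

items = ["EMR integration", "staff training", "patient privacy",
         "waiting-room tablets", "workflow fit", "consent process"]
# each participant's sorting assigns every item to a pile (arbitrary pile labels)
sortings = [[0, 1, 2, 0, 1, 2],
            [0, 0, 1, 0, 0, 1],
            [0, 1, 2, 3, 1, 2]]

n = len(items)
dissim = np.zeros((n, n))
for s in sortings:                      # count how often two items were NOT co-sorted
    for i in range(n):
        for j in range(n):
            dissim[i, j] += (s[i] != s[j])

coords = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(dissim)
clusters = fcluster(linkage(coords, method="ward"), t=3, criterion="maxclust")
for item, c in zip(items, clusters):
    print(f"cluster {c}: {item}")
```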

  16. The Use of Computer Tools in the Design Process of Students’ Architectural Projects. Case Studies in Algeria

    Science.gov (United States)

    Saighi, Ouafa; Salah Zerouala, Mohamed

    2017-12-01

    The paper deals with the way in which computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as case studies to evaluate the impact of such tools on the student design process. The aim is to inspect such use in depth and to sort out its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at these institutions. The results mainly show that computer tools are used chiefly to improve the quality of drawings and rendered images so as to satisfy observers and thereby influence their decisions. Some teachers are not keen on overuse of the computer during the design phase and prefer the "traditional" approach. This is the situation Algerian universities currently face, and it leads to conflict and disagreement between students and teachers. Meanwhile, there is no doubt that computer tools have effectively contributed to raising the competitive level among students.

  17. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    Fischer, U.; Chen, Y.; Pereslavtsev, P.; Simakov, S.P.; Tsige-Tamirat, H.; Loughlin, M.; Perel, R.L.; Petrizzi, L.; Tautges, T.J.; Wilson, P.P.H.

    2005-01-01

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  18. Animal computer interaction (ACI) & designing for animal interaction (AXD)

    DEFF Research Database (Denmark)

    Morrison, Ann Judith; Turner, Jane; Farley, Helen

    2017-01-01

    This workshop invites researchers and practitioners from HCI and related fields who work in some capacity with animals and who recognise the sentient nature of their being. We call for those who want to better understand how to work with animals and learn from them. We are a small team looking...... to build an Australian chapter of the Animal Computer Interaction Community. The workshop will elicit discussion, forge new partnerships and head up a new group on the state of the art within this field in Australia, including comparative international studies. For more information see http://www.ozaci.org/...

  19. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Science.gov (United States)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  20. Learning Music via Tangible and Corporeal Interaction

    DEFF Research Database (Denmark)

    Valente, Andrea; Jensen, Karl Kristoffer

    2008-01-01

    Young music learners face a number of challenges, mostly because musical theory and practice are deeply interrelated. Many musical teaching theories and methodologies exist, and music is taught today from primary school onward, in a variety of ways and with different degrees of success. We propose...... to consider an existing teaching tool from the computer science domain, computational cards, and modify it to cope with the specific problems found in musical education; we re-designed it and simplified and generalized its notation. The new tool, musiCards, also permits corporeal interaction, so children can...... design interactive musical machines, implement them physically, then enact the interaction to generate musical performances. MusiCards enables pupils to explore music-related concepts such as rhythm and polyphonic performance; moreover it supports active involvement, imitation, group learning...

  1. Hydrogen Financial Analysis Scenario Tool (H2FAST). Web Tool User's Manual

    Energy Technology Data Exchange (ETDEWEB)

    Bush, B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Penev, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Melaina, M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Zuboy, J. [Independent Consultant, Golden, CO (United States)

    2015-05-11

    The Hydrogen Financial Analysis Scenario Tool (H2FAST) provides a quick and convenient in-depth financial analysis for hydrogen fueling stations. This manual describes how to use the H2FAST web tool, which is one of three H2FAST formats developed by the National Renewable Energy Laboratory (NREL). Although all of the formats are based on the same financial computations and conform to generally accepted accounting principles (FASAB 2014, Investopedia 2014), each format provides a different level of complexity and user interactivity.
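
    For orientation only, the following minimal sketch shows the kind of discounted-cash-flow calculation such a financial analysis rests on; the cash flows and discount rate are invented and are not H2FAST inputs, outputs, or its actual formulas.

    ```python
    # Minimal discounted-cash-flow sketch; values are placeholders, not H2FAST data.
    def npv(rate, cash_flows):
        """cash_flows[0] is the year-0 (investment) flow, typically negative."""
        return sum(cf / (1.0 + rate) ** year for year, cf in enumerate(cash_flows))

    flows = [-1_500_000] + [220_000] * 10   # capital cost, then 10 years of net revenue
    print(f"NPV at 8%: {npv(0.08, flows):,.0f}")
    ```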

  2. Human-Computer Interaction, Tourism and Cultural Heritage

    Science.gov (United States)

    Cipolla Ficarra, Francisco V.

    We present a state of the art of human-computer interaction aimed at tourism and cultural heritage in several cities of the European Mediterranean. The work analyses the main problems that derive from treating training as a business and that can derail the continuous growth of HCI, the new technologies and the tourism industry. Through a semiotic and epistemological study, we detect the current mistakes in the interrelations of the formal and factual sciences, as well as the human factors that influence the professionals devoted to developing interactive systems to safeguard and promote cultural heritage.

  3. Capturing Order in Social Interactions

    OpenAIRE

    Vinciarelli, Alessandro

    2009-01-01

    As humans appear to be literally wired for social interaction, it is not surprising to observe that social aspects of human behavior and psychology attract interest in the computing community as well. The gap between social animal and unsocial machine was tolerable when computers were nothing else than improved versions of old tools (e.g., word processors replacing typewriters), but nowadays computers go far beyond that simple role. Today, computers are the natural means for a wide spectrum o...

  4. Anvil Forecast Tool in the Advanced Weather Interactive Processing System (AWIPS)

    Science.gov (United States)

    Barrett, Joe H., III; Hood, Doris

    2009-01-01

    Launch Weather Officers (LWOs) from the 45th Weather Squadron (45 WS) and forecasters from the National Weather Service (NWS) Spaceflight Meteorology Group (SMG) have identified anvil forecasting as one of their most challenging tasks when predicting the probability of violating the Lightning Launch Commit Criteria (LLCC) (Krider et al. 2006; Space Shuttle Flight Rules (FR), NASA/JSC 2004). As a result, the Applied Meteorology Unit (AMU) developed a tool that creates an anvil threat corridor graphic that can be overlaid on satellite imagery using the Meteorological Interactive Data Display System (MIDDS, Short and Wheeler, 2002). The tool helps forecasters estimate the locations of thunderstorm anvils at one, two, and three hours into the future. It has been used extensively in launch and landing operations by both the 45 WS and SMG. The Advanced Weather Interactive Processing System (AWIPS) is now used along with MIDDS for weather analysis and display at SMG. In Phase I of this task, SMG tasked the AMU to transition the tool from MIDDS to AWIPS (Barrett et al., 2007). For Phase II, SMG requested the AMU make the Anvil Forecast Tool in AWIPS more configurable by creating the capability to read model gridded data from user-defined model files instead of hard-coded files. An NWS local AWIPS application called AGRID was used to accomplish this. In addition, SMG needed to be able to define the pressure levels for the model data, instead of hard-coding the bottom level as 300 mb and the top level as 150 mb. This paper describes the initial development of the Anvil Forecast Tool for MIDDS, followed by the migration of the tool to AWIPS in Phase I. It then gives a detailed presentation of the Phase II improvements to the AWIPS tool.

  5. A Perspective on Computational Human Performance Models as Design Tools

    Science.gov (United States)

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  6. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    International Nuclear Information System (INIS)

    Naqvi, S

    2014-01-01

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as

  7. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    Energy Technology Data Exchange (ETDEWEB)

    Naqvi, S [Saint Agnes Cancer Institute, Department of Radiation Oncology, Baltimore, MD (United States)

    2014-06-15

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which, therefore, gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system, requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools are developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays trajectories of a collection of electrons under an arbitrary space/time dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
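
    As a hedged illustration of the idea of labelled interactions that can be switched on and off (not the authors' code), the toy one-dimensional photon-transport sketch below samples free paths and tallies interactions by type; the cross-section values are arbitrary placeholders.

    ```python
    # Toy 1-D photon transport in a slab, tallying interactions by type.
    # Cross sections are arbitrary placeholders chosen only to illustrate
    # "labelled" interactions that can be switched off.
    import math, random
    from collections import Counter

    MU = {'photoelectric': 0.02, 'compton': 0.15, 'pair': 0.01}  # 1/cm, fake values

    def simulate(n_photons, thickness_cm, enabled=('photoelectric', 'compton', 'pair')):
        """Follow photons through a homogeneous slab; absorb on the first
        interaction (a deliberate simplification -- no scattering kinematics)."""
        mu_total = sum(MU[k] for k in enabled)
        tally, transmitted = Counter(), 0
        for _ in range(n_photons):
            # Sample the free path from an exponential distribution.
            path = -math.log(1.0 - random.random()) / mu_total
            if path > thickness_cm:
                transmitted += 1
                continue
            # Choose the interaction type in proportion to its cross section.
            r = random.random() * mu_total
            for kind in enabled:
                r -= MU[kind]
                if r <= 0.0:
                    tally[kind] += 1
                    break
        return tally, transmitted

    print(simulate(100_000, 10.0))                        # all interactions "on"
    print(simulate(100_000, 10.0, enabled=('compton',)))  # others switched off
    ```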

  8. Visualization and interaction tools for aerial photograph mosaics

    Science.gov (United States)

    Fernandes, João Pedro; Fonseca, Alexandra; Pereira, Luís; Faria, Adriano; Figueira, Helder; Henriques, Inês; Garção, Rita; Câmara, António

    1997-05-01

    This paper describes the development of a digital spatial library based on mosaics of digital orthophotos, called Interactive Portugal, that will enable users both to retrieve geospatial information existing on the Portuguese National System for Geographic Information World Wide Web server, and to develop local databases connected to the main system. A set of navigation, interaction, and visualization tools is proposed and discussed. They include sketching, dynamic sketching, and navigation capabilities over the digital orthophoto mosaics. Main applications of this digital spatial library are pointed out and discussed, namely for the education, professional, and tourism markets. Future developments are considered. These developments are related to user reactions, technological advancements, and projects that also aim at delivering and exploring digital imagery on the World Wide Web. Future capabilities for site selection and change detection are also considered.

  9. SMARTScience Tools: Interacting With Blazar Data In The Web Browser

    Science.gov (United States)

    Hasan, Imran; Isler, Jedidah; Urry, C. Megan; MacPherson, Emily; Buxton, Michelle; Bailyn, Charles D.; Coppi, Paolo S.

    2014-08-01

    The Yale-SMARTS blazar group has accumulated 6 years of optical-IR photometry of more than 70 blazars, mostly bright enough in gamma-rays to be detected with Fermi. Observations were done with the ANDICAM instrument on the SMARTS 1.3 m telescope at the Cerro Tololo Inter-American Observatory. As a result of this long-term, multiwavelength monitoring, we have produced a calibrated, publicly available data set (see www.astro.yale.edu/smarts/glast/home.php), which we have used to find that (i) optical-IR and gamma-ray light curves are well correlated, supporting inverse-Compton models for gamma-ray production (Bonning et al. 2009, 2012), (ii) at their brightest, blazar jets can contribute significantly to the photoionization of the broad-emission-line region, indicating that gamma-rays are produced within 0.1 pc of the black hole in at least some cases (Isler et al. 2014), and (iii) optical-IR and gamma-ray flares are symmetric, implying the time scales are dominated by light-travel-time effects rather than acceleration or cooling (Chatterjee et al. 2012). The volume of data and diversity of projects for which it is used calls out for an efficient means of visualization. To this end, we have developed a suite of visualization tools called SMARTScience Tools, which allow users to interact dynamically with our dataset. The SMARTScience Tools is publicly available via our webpage and can be used to customize multiwavelength light curves and color magnitude diagrams quickly and intuitively. Users can choose specific bands to construct plots, and the plots include features such as band-by-band panning, dynamic zooming, and direct mouse interaction with individual data points. Human and machine readable tables of the plotted data can be directly printed for the user's convenience and for further independent study. The SMARTScience Tools significantly improves the public’s ability to interact with the Yale-SMARTS 6-year data base of blazar photometry, and should make

  10. A basic tool for computer-aided sail design

    International Nuclear Information System (INIS)

    Thrasher, D.F.; Dunyak, T.J.; Mook, D.T.; Nayfeh, A.H.

    1985-01-01

    Recent developments in modelling lifting surfaces have provided a tool that can also be used to model sails. The simplest of the adequate models is the vortex-lattice method. This method can fully account for the aerodynamic interactions among several lifting surfaces having arbitrary planforms, camber, and twist, as long as separation occurs only along the edges and the phenomenon known as vortex bursting does not occur near the sails. This paper describes this method and how it can be applied to the design of sails.
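
    A minimal two-dimensional lumped-vortex sketch, the simplest relative of the vortex-lattice method, is given below for a flat plate at a small angle of attack; it is not a sail model, and all parameters are illustrative, but it recovers the thin-airfoil result Cl = 2*pi*alpha.

    ```python
    # Minimal 2-D lumped-vortex sketch (the 2-D cousin of the vortex-lattice
    # method) for a flat plate at small angle of attack. Not a sail model.
    import numpy as np

    def flat_plate_cl(alpha_rad, n_panels=50, chord=1.0, u_inf=1.0):
        dx = chord / n_panels
        x_vort = (np.arange(n_panels) + 0.25) * dx   # vortex at panel quarter-chord
        x_ctrl = (np.arange(n_panels) + 0.75) * dx   # control point at 3/4 chord
        # Downwash at control point i due to a unit vortex at panel j.
        A = -1.0 / (2.0 * np.pi * (x_ctrl[:, None] - x_vort[None, :]))
        # Flow tangency: induced downwash cancels the free-stream normal component.
        gamma = np.linalg.solve(A, -u_inf * alpha_rad * np.ones(n_panels))
        lift = 1.0 * u_inf * gamma.sum()             # Kutta-Joukowski, rho = 1
        return lift / (0.5 * u_inf**2 * chord)

    alpha = np.radians(5.0)
    print(flat_plate_cl(alpha), 2 * np.pi * alpha)   # ~0.548 vs 0.548
    ```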

  11. Keep IT Real : On Tools, Emotion, Cognition and Intentionality in Design

    NARCIS (Netherlands)

    Wendrich, R. E.; Kruiper, R.; Marjanović, D.; M., Štorga; Pavković, N.; Bojčetić, N.; Škec, S.

    2016-01-01

    Keep IT real, create, build, and develop design tools that support designers during the early phases of design processing. This paper investigates how, and whether, current technology can afford real-time interaction and affective computing in a HDT(E) design tool that integrates and blends

  12. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    Science.gov (United States)

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
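
    A minimal sketch of the analysis step only (not the apps used in the study): fit a quadratic to the tracked vertical positions to estimate g; the data below are synthetic because the original measurements are not reproduced here.

    ```python
    # Fit y(t) = y0 + v0*t - 0.5*g*t**2 to frame-by-frame positions.
    # The "measurements" below are synthetic, with a little noise added.
    import numpy as np

    fps = 30.0
    t = np.arange(0, 1.0, 1.0 / fps)
    rng = np.random.default_rng(0)
    y = 1.0 + 3.0 * t - 0.5 * 9.81 * t**2 + rng.normal(0, 0.005, t.size)

    a2, a1, a0 = np.polyfit(t, y, deg=2)      # y ~ a2*t^2 + a1*t + a0
    print("g estimate:", -2.0 * a2, "m/s^2")  # ~9.8
    ```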

  13. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a computational tool with unique predictive capabilities for the aerothermodynamic environment around ablation-cooled...

  14. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a predictive computational tool for the aerothermal environment around ablation-cooled hypersonic atmospheric entry...

  15. Computer-aided design in power engineering. Application of software tools

    International Nuclear Information System (INIS)

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through the complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and in the design of power system equipment. Several example design calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  16. Computer-aided design in power engineering. Application of software tools

    Energy Technology Data Exchange (ETDEWEB)

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in the practice of design in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation, guiding through the complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and in the design of power system equipment. Several example design calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of power systems. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  17. TranscriptomeBrowser 3.0: introducing a new compendium of molecular interactions and a new visualization tool for the study of gene regulatory networks.

    Science.gov (United States)

    Lepoivre, Cyrille; Bergon, Aurélie; Lopez, Fabrice; Perumal, Narayanan B; Nguyen, Catherine; Imbert, Jean; Puthier, Denis

    2012-01-31

    Deciphering gene regulatory networks by in silico approaches is a crucial step in the study of the molecular perturbations that occur in diseases. The development of regulatory maps is a tedious process requiring the comprehensive integration of various pieces of evidence scattered over biological databases. Thus, the research community would greatly benefit from having a unified database storing known and predicted molecular interactions. Furthermore, given the intrinsic complexity of the data, the development of new tools offering integrated and meaningful visualizations of molecular interactions is necessary to help users draw new hypotheses without being overwhelmed by the density of the subsequent graph. We extend the previously developed TranscriptomeBrowser database with a set of tables containing 1,594,978 human and mouse molecular interactions. The database includes: (i) predicted regulatory interactions (computed by scanning vertebrate alignments with a set of 1,213 position weight matrices), (ii) potential regulatory interactions inferred from systematic analysis of ChIP-seq experiments, (iii) regulatory interactions curated from the literature, (iv) predicted post-transcriptional regulation by micro-RNA, (v) protein kinase-substrate interactions and (vi) physical protein-protein interactions. In order to easily retrieve and efficiently analyze these interactions, we developed InteractomeBrowser, a graph-based knowledge browser that comes as a plug-in for TranscriptomeBrowser. The first objective of InteractomeBrowser is to provide a user-friendly tool to get new insight into any gene list by providing a context-specific display of putative regulatory and physical interactions. To achieve this, InteractomeBrowser relies on a "cell compartments-based layout" that makes use of a subset of the Gene Ontology to map gene products onto relevant cell compartments. This layout is particularly powerful for visual integration of heterogeneous biological information
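
    As a hedged illustration of the position-weight-matrix scanning mentioned above (not TranscriptomeBrowser code), the sketch below slides a toy 4-bp log-odds matrix along a short sequence; the matrix and sequence are invented.

    ```python
    # Minimal sliding-window PWM scan (log-odds against a uniform background).
    # The matrix and sequence below are toy examples, not TranscriptomeBrowser data.
    import math

    PWM = {  # position -> {base: probability}; a fake 4-bp motif
        0: {'A': 0.7, 'C': 0.1, 'G': 0.1, 'T': 0.1},
        1: {'A': 0.1, 'C': 0.1, 'G': 0.7, 'T': 0.1},
        2: {'A': 0.1, 'C': 0.7, 'G': 0.1, 'T': 0.1},
        3: {'A': 0.1, 'C': 0.1, 'G': 0.1, 'T': 0.7},
    }

    def scan(seq, pwm, background=0.25):
        width = len(pwm)
        hits = []
        for start in range(len(seq) - width + 1):
            score = sum(math.log2(pwm[i][seq[start + i]] / background)
                        for i in range(width))
            hits.append((start, round(score, 2)))
        return hits

    print(scan("TTAGCTAGCTT", PWM))   # higher scores where AGCT occurs
    ```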

  18. Laser Pointers: Low-Cost, Low-Tech Innovative, Interactive Instruction Tool

    Science.gov (United States)

    Zdravkovska, Nevenka; Cech, Maureen; Beygo, Pinar; Kackley, Bob

    2010-01-01

    This paper discusses the use of laser pointers at the Engineering and Physical Sciences Library, University of Maryland, College Park, as a personal response system (PRS) tool to encourage student engagement in and interactivity with one-shot, lecture-based information literacy sessions. Unlike more sophisticated personal response systems like…

  19. Distributed interactive graphics applications in computational fluid dynamics

    International Nuclear Information System (INIS)

    Rogers, S.E.; Buning, P.G.; Merritt, F.J.

    1987-01-01

    Implementation of two distributed graphics programs used in computational fluid dynamics is discussed. Both programs are interactive in nature. They run on a CRAY-2 supercomputer and use a Silicon Graphics Iris workstation as the front-end machine. The hardware and supporting software are from the Numerical Aerodynamic Simulation project. The supercomputer does all numerically intensive work and the workstation, as the front-end machine, allows the user to perform real-time interactive transformations on the displayed data. The first program was written as a distributed program that computes particle traces for fluid flow solutions existing on the supercomputer. The second is an older post-processing and plotting program modified to run in a distributed mode. Both programs have realized a large increase in speed over that obtained using a single machine. By using these programs, one can learn quickly about complex features of a three-dimensional flow field. Some color results are presented

  20. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Science.gov (United States)

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  1. Development of an Interactive Social Media Tool for Parents with Concerns about Vaccines

    Science.gov (United States)

    Shoup, Jo Ann; Wagner, Nicole M.; Kraus, Courtney R.; Narwaney, Komal J.; Goddard, Kristin S.; Glanz, Jason M.

    2015-01-01

    Objective: Describe a process for designing, building, and evaluating a theory-driven social media intervention tool to help reduce parental concerns about vaccination. Method: We developed an interactive web-based tool using quantitative and qualitative methods (e.g., survey, focus groups, individual interviews, and usability testing). Results:…

  2. Computer Aided Methods & Tools for Separation & Purification of Fine Chemical & Pharmaceutical Products

    DEFF Research Database (Denmark)

    Afonso, Maria B.C.; Soni, Vipasha; Mitkowski, Piotr Tomasz

    2006-01-01

    An integrated approach that is particularly suitable for solving problems related to product-process design from the fine chemicals, agrochemicals, food and pharmaceutical industries is presented together with the corresponding methods and tools, which forms the basis for an integrated computer...

  3. Network computing infrastructure to share tools and data in global nuclear energy partnership

    International Nuclear Information System (INIS)

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    2010-01-01

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S. and Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues to apply our information process infrastructure, which are accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For the accessibility issue, we adopted SSL-VPN (Security Socket Layer - Virtual Private Network) technology for the access beyond firewalls. For the security issue, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism to strengthen the security. Also, we set fine access control policy to shared tools and data and used shared key based encryption method to protect tools and data against leakage to third parties. For the usability issue, we chose Web browsers as user interface and developed Web application to provide functions to support sharing tools and data. By using WebDAV (Web-based Distributed Authoring and Versioning) function, users can manipulate shared tools and data through the Windows-like folder environment. We implemented the prototype system in Grid infrastructure for atomic energy research: AEGIS (Atomic Energy Grid Infrastructure) developed by CCSE/JAEA. The prototype system was applied for the trial use in the first period of GNEP. (author)

  4. PREMIM and EMIM: tools for estimation of maternal, imprinting and interaction effects using multinomial modelling

    Directory of Open Access Journals (Sweden)

    Howey Richard

    2012-06-01

    Full Text Available Abstract Background Here we present two new computer tools, PREMIM and EMIM, for the estimation of parental and child genetic effects, based on genotype data from a variety of different child-parent configurations. PREMIM allows the extraction of child-parent genotype data from standard-format pedigree data files, while EMIM uses the extracted genotype data to perform subsequent statistical analysis. The use of genotype data from the parents as well as from the child in question allows the estimation of complex genetic effects such as maternal genotype effects, maternal-foetal interactions and parent-of-origin (imprinting) effects. These effects are estimated by EMIM, incorporating chosen assumptions such as Hardy-Weinberg equilibrium or exchangeability of parental matings as required. Results In application to simulated data, we show that the inference provided by EMIM is essentially equivalent to that provided by alternative (competing) software packages such as MENDEL and LEM. However, PREMIM and EMIM (used in combination) considerably outperform MENDEL and LEM in terms of speed and ease of execution. Conclusions Together, EMIM and PREMIM provide easy-to-use command-line tools for the analysis of pedigree data, giving unbiased estimates of parental and child genotype relative risks.

  5. Patient-oriented interactive E-health tools on U.S. hospital Web sites.

    Science.gov (United States)

    Huang, Edgar; Chang, Chiu-Chi Angela

    2012-01-01

    The purpose of this study is to provide evidence for strategic planning regarding e-health development in U.S. hospitals. A content analysis of a representative sample of U.S. hospital Web sites has revealed how U.S. hospitals have taken advantage of the 21 patient-oriented interactive tools identified in this study. Significant gaps between various types of hospitals have also been found. It is concluded that although the majority of U.S. hospitals have adopted traditional functional tools, they need to make significant inroads in implementing the core e-business tools to serve their patients/users, making their Web sites more efficient marketing tools.

  6. Gender differences in the use of computers, programming, and peer interactions in computer science classrooms

    Science.gov (United States)

    Stoilescu, Dorian; Egodawatte, Gunawardena

    2010-12-01

    Research shows that female and male students in undergraduate computer science programs view computer culture differently. Female students are interested more in the use of computers than in doing programming, whereas male students see computer science mainly as a programming activity. The overall purpose of our research was not to find new definitions for computer science culture but to see how male and female students see themselves involved in computer science practices, how they see computer science as a successful career, and what they like and dislike about current computer science practices. The study took place in a mid-sized university in Ontario. Sixteen students and two instructors were interviewed to get their views. We found that male and female views are different on computer use, programming, and the pattern of student interactions. Female and male students did not have any major issues in using computers. In computing programming, female students were not so involved in computing activities whereas male students were heavily involved. As for the opinions about successful computer science professionals, both female and male students emphasized hard working, detailed oriented approaches, and enjoying playing with computers. The myth of the geek as a typical profile of successful computer science students was not found to be true.

  7. Development of a Planet Tool for an interactive School Atlas as eBook

    Science.gov (United States)

    Wondrak, Stephan

    2018-05-01

    The present thesis describes the development of a planet tool for an interactive school atlas using an eBook format. In particular, the technical and cartographic capabilities of the open standard ePUB 3 are evaluated. An eBook application with interactive and dynamic 2-dimensional visualizations is developed specifically to show whether the real-world dimensions and distances in the solar system can be mapped in a cartographically correct manner that is easy for students to understand. In the first part of the work, the requirements of the planet tool are evaluated in co-operation with experts. The open standards PDF and ePUB 3 are investigated with regard to the requirements for the development of the planet tool. Another chapter describes in detail all significant steps of the development process for a prototype of the planet tool. A graphic file originally created for print production is prepared and enhanced with interactive features to generate one of the eBook pages. This serves to show a potential workflow for the generation of eBook pages in a cross-media atlas production. All sample pages of the prototype show different layouts and contain the entire spectrum of interactive features and multimedia content of modern eBooks. The sample pages are presented and discussed in a separate chapter. The results of the present work aim at answering the question concerning the suitability of the open standard ePUB 3 for the development of a multimedia eBook for high school education.

  8. Illustrated Plant Identification Keys: An Interactive Tool to Learn Botany

    Science.gov (United States)

    Silva, Helena; Pinho, Rosa; Lopes, Lisia; Nogueira, Antonio J. A.; Silveira, Paulo

    2011-01-01

    An Interactive Dichotomous Key (IDK) for 390 "taxa" of vascular plants from the Ria de Aveiro, available on a website, was developed to help teach botany to school and university students. This multimedia tool includes several links to Descriptive and Illustrated Glossaries. Questionnaires answered by high-school and undergraduate students about…

  9. Utilizing of computational tools on the modelling of a simplified problem of neutron shielding

    Energy Technology Data Exchange (ETDEWEB)

    Lessa, Fabio da Silva Rangel; Platt, Gustavo Mendes; Alves Filho, Hermes [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico]. E-mails: fsrlessa@gmail.com; gmplatt@iprj.uerj.br; halves@iprj.uerj.br

    2007-07-01

    At the current level of technology, many problems are investigated through computational simulations whose results are in general satisfactory and much less expensive than conventional forms of investigation (e.g., destructive tests, laboratory measurements, etc.). Almost all modern scientific studies are carried out with computational tools, such as high-capacity computers and the application systems that perform complex calculations, algorithmic iterations, etc. Besides the considerable savings in time and space that computational modelling provides, it also brings financial savings to scientists. Computational modelling is a modern methodology of investigation that calls for a theoretical study of the phenomena identified in the problem, a coherent mathematical representation of those phenomena, the generation of a numerical algorithm comprehensible to the computer, and finally the analysis of the obtained solution, possibly making use of pre-existing systems that facilitate the visualization of the results (editors of Cartesian graphs, for instance). In this work, we intended to use several computational tools, the implementation of numerical methods and a deterministic model in the study and analysis of a well-known, simplified problem of nuclear engineering (neutron transport), simulating a theoretical neutron shielding problem with hypothetical physical-material parameters and computing the neutron flux at each spatial node, programmed in Scilab version 4.0. (author)

  10. Utilizing of computational tools on the modelling of a simplified problem of neutron shielding

    International Nuclear Information System (INIS)

    Lessa, Fabio da Silva Rangel; Platt, Gustavo Mendes; Alves Filho, Hermes

    2007-01-01

    At the current level of technology, many problems are investigated through computational simulations whose results are in general satisfactory and much less expensive than conventional forms of investigation (e.g., destructive tests, laboratory measurements, etc.). Almost all modern scientific studies are carried out with computational tools, such as high-capacity computers and the application systems that perform complex calculations, algorithmic iterations, etc. Besides the considerable savings in time and space that computational modelling provides, it also brings financial savings to scientists. Computational modelling is a modern methodology of investigation that calls for a theoretical study of the phenomena identified in the problem, a coherent mathematical representation of those phenomena, the generation of a numerical algorithm comprehensible to the computer, and finally the analysis of the obtained solution, possibly making use of pre-existing systems that facilitate the visualization of the results (editors of Cartesian graphs, for instance). In this work, we intended to use several computational tools, the implementation of numerical methods and a deterministic model in the study and analysis of a well-known, simplified problem of nuclear engineering (neutron transport), simulating a theoretical neutron shielding problem with hypothetical physical-material parameters and computing the neutron flux at each spatial node, programmed in Scilab version 4.0. (author)
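
    The abstracts above do not give the discretization details, so the following is only a hedged sketch of a much simpler shielding estimate: the uncollided flux behind a purely absorbing one-dimensional slab, with placeholder cross-section and source values (written in Python rather than Scilab).

    ```python
    # Simplified 1-D shielding estimate: uncollided neutron flux behind a purely
    # absorbing slab, evaluated on a spatial grid. The cross section and source
    # strength are placeholders, not values from the original work.
    import math

    SIGMA_T = 0.25     # total macroscopic cross section, 1/cm (placeholder)
    PHI_0 = 1.0e6      # incident flux, n/cm^2/s (placeholder)

    def uncollided_flux(thickness_cm, n_nodes=11):
        dx = thickness_cm / (n_nodes - 1)
        return [(i * dx, PHI_0 * math.exp(-SIGMA_T * i * dx)) for i in range(n_nodes)]

    for x, phi in uncollided_flux(20.0):
        print(f"x = {x:5.1f} cm   phi = {phi:10.3e}")
    ```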

  11. Use of online interactive tools in an open distance learning context ...

    African Journals Online (AJOL)

    Kefiloe Adolphina Maboe

    Objective: The purpose of this study was to determine how the discussion forum, as an online interactive tool, can be used in ... and study material, and students and the ODL institution. ... The questionnaire was developed in English. Its develop...

  12. Issues on the Development and Application of Computer Tools to Support Product Structuring and Configuring

    DEFF Research Database (Denmark)

    Hansen, Claus Thorp; Riitahuhta, A.

    2001-01-01

    The aim of this article is to take stock of the results and challenges in the efforts to develop computer tools to support product structuring and configuring in product development projects. The balance will be made in two dimensions, a design science dimension and an industrial dimension. The design ...... that there are large positive effects to be gained for industrial companies by consciously implementing computer tools based on the results of design science. The positive effects will be measured by e.g. predictable product quality, reduced lead time, and reuse of design solutions....

  13. The Status of Interactivity in Computer Art: Formal Apories

    Directory of Open Access Journals (Sweden)

    João Castro Pinto

    2011-12-01

    Full Text Available Contemporary art, particularly that which is produced by computer technologies capable of receiving data input via interactive devices (sensors and controllers), constitutes an emerging expressive medium of interdisciplinary nature, which implies the need for a critical look at its constitution and artistic functions. To consider interactive art as a form of artistic expression that falls under the present categorization implies accepting the participation of the spectator in the production of the work of art, supposedly at the time of its origin or during its creation. When we examine the significance of the formal status of interactivity, assuming the above premises as a theoretical starting point and reducing it to a phenomenological point of view of artistic creation, we quickly fall into difficulties of conceptual definition and structural apories [1]. The fundamental aim of this research is to formally define the status of interactive art by performing a phenomenological examination of the creative process of this specific art, establishing crucial distinctions in order to develop a hermeneutics in favor of the creation of new perspectives and aesthetic frameworks. What is interactive creation? Is interactivity, from the point of view of computational artistic creativity, the exponentiation of the concept of the open work of art (ECO 2009)? Does interactive art correspond to an a priori projective and unachievable meta-art? What is the status of the artist and of the spectator in relation to an interactive work of art? What ontic and factical conditions are postulated as necessary in order to determine an artistic product as co-created? What apories do we find along the progressive process of reaching a clarifying conceptual definition? This brief investigation will seek to contribute to the study of this issue, intending ultimately, and above all, to expose pertinent lines of inquiry rather than to provide definite scientific and

  14. New Frontiers in Analyzing Dynamic Group Interactions: Bridging Social and Computer Science.

    Science.gov (United States)

    Lehmann-Willenbrock, Nale; Hung, Hayley; Keyton, Joann

    2017-10-01

    This special issue on advancing interdisciplinary collaboration between computer scientists and social scientists documents the joint results of the international Lorentz workshop, "Interdisciplinary Insights into Group and Team Dynamics," which took place in Leiden, The Netherlands, July 2016. An equal number of scholars from social and computer science participated in the workshop and contributed to the papers included in this special issue. In this introduction, we first identify interaction dynamics as the core of group and team models and review how scholars in social and computer science have typically approached behavioral interactions in groups and teams. Next, we identify key challenges for interdisciplinary collaboration between social and computer scientists, and we provide an overview of the different articles in this special issue aimed at addressing these challenges.

  15. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools to engage undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API -- an Internet-based tool combining DHTML and AJAX -- which allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which is then mapped in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting these coordinates into equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures for the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how are faults identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
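
    As a small illustration of the coordinate conversion mentioned above, the sketch below applies the standard forward spherical Mercator formulas; the Mars radius value and the latitude clipping are assumptions of this sketch, not values taken from the MARTIAN utilities.

    ```python
    # Standard forward (spherical) Mercator projection, the form the Google Maps
    # API expects. The Mars radius and the latitude clipping are assumptions.
    import math

    R_MARS_M = 3_389_500.0   # mean radius in metres (approximate)

    def to_mercator(lat_deg, lon_deg, radius=R_MARS_M):
        lat = math.radians(max(min(lat_deg, 85.0), -85.0))  # Mercator blows up at the poles
        lon = math.radians(lon_deg)
        x = radius * lon
        y = radius * math.log(math.tan(math.pi / 4.0 + lat / 2.0))
        return x, y

    print(to_mercator(18.65, 226.2))   # e.g. the Olympus Mons region
    ```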

  16. An Evaluation of the Webquest as a Computer-Based Learning Tool

    Science.gov (United States)

    Hassanien, Ahmed

    2006-01-01

    This paper explores the preparation and use of an internet activity for undergraduate learners in higher education (HE). It evaluates the effectiveness of using webquest as a computer-based learning (CBL) tool to support students to learn in HE. The evaluation undertaken offers insights into learner perceptions concerning the ease of use of the…

  17. DockingShop: A Tool for Interactive Molecular Docking

    Energy Technology Data Exchange (ETDEWEB)

    Lu, Ting-Cheng; Max, Nelson L.; Ding, Jinhui; Bethel, E. Wes; Crivelli, Silvia N.

    2005-04-24

    Given two independently determined molecular structures, the molecular docking problem predicts the bound association, or best fit between them, while allowing for conformational changes of the individual molecules during construction of a molecular complex. DockingShop is an integrated environment that permits interactive molecular docking by navigating a ligand or protein to an estimated binding site of a receptor with real-time graphical feedback of scoring factors as visual guides. Our program can be used to create initial configurations for a protein docking prediction process. Its output -- the structure of a protein-ligand or protein-protein complex -- may serve as an input for a protein docking algorithm or an optimization process. This tool provides molecular graphics interfaces for structure modeling, interactive manipulation, navigation, optimization, and dynamic visualization to help users steer the prediction process using their biological knowledge.

  18. A tool for monitoring lecturers’ interactions with Learning Management Systems

    Directory of Open Access Journals (Sweden)

    Magdalena Cantabella

    2016-12-01

    Full Text Available Learning Management Systems’ (LMS) interaction mechanisms are mainly focused on the improvement of students’ experiences and academic results. However, special attention should also be given to the interaction between these LMS and other actors involved in the educational process. This paper specifically targets the interaction of degree coordinators with LMS when monitoring lecturers’ performance, especially in an online mode. The methodology is guided by the following three objectives: (1) analysis of the limitations of monitoring lecturers in current LMS; (2) development of a software program to overcome such limitations; and (3) empirical evaluation of the proposed program. The results show that this type of tool helps coordinators to intuitively and efficiently analyze the status of the subjects taught in their degree programs.

  19. Sign Language for K-8 Mathematics by 3D Interactive Animation

    Science.gov (United States)

    Adamo-Villani, Nicoletta; Doublestein, John; Martin, Zachary

    2005-01-01

    We present a new highly interactive computer animation tool to increase the mathematical skills of deaf children. We aim at increasing the effectiveness of (hearing) parents in teaching arithmetic to their deaf children, and the opportunity of deaf children to learn arithmetic via interactive media. Using state-of-the-art computer animation…

  20. Data management of protein interaction networks

    CERN Document Server

    Cannataro, Mario

    2012-01-01

    Interactomics: a complete survey from data generation to knowledge extraction With the increasing use of high-throughput experimental assays, more and more protein interaction databases are becoming available. As a result, computational analysis of protein-to-protein interaction (PPI) data and networks, now known as interactomics, has become an essential tool to determine functionally associated proteins. From wet lab technologies to data management to knowledge extraction, this timely book guides readers through the new science of interactomics, giving them the tools needed to: Generate

  1. Exploration of Metagenome Assemblies with an Interactive Visualization Tool

    Energy Technology Data Exchange (ETDEWEB)

    Cantor, Michael; Nordberg, Henrik; Smirnova, Tatyana; Andersen, Evan; Tringe, Susannah; Hess, Matthias; Dubchak, Inna

    2014-07-09

    Metagenomics, one of the fastest growing areas of modern genomic science, is the genetic profiling of the entire community of microbial organisms present in an environmental sample. Elviz is a web-based tool for the interactive exploration of metagenome assemblies. Elviz can be used with publicly available data sets from the Joint Genome Institute or with custom user-loaded assemblies. Elviz is available at genome.jgi.doe.gov/viz

  2. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)

  3. Twenty Years of Creativity Research in Human-Computer Interaction: Current State and Future Directions

    DEFF Research Database (Denmark)

    Frich Pedersen, Jonas; Biskjaer, Michael Mose; Dalsgaard, Peter

    2018-01-01

    Creativity has been a growing topic within the ACM community since the 1990s. However, no clear overview of this trend has been offered. We present a thorough survey of 998 creativity-related publications in the ACM Digital Library collected using keyword search to determine prevailing approaches......, topics, and characteristics of creativity-oriented Human-Computer Interaction (HCI) research. A selected sample based on yearly citations yielded 221 publications, which were analyzed using constant comparison analysis. We found that HCI is almost exclusively responsible for creativity......-oriented publications; they focus on collaborative creativity rather than individual creativity; there is a general lack of definition of the term ‘creativity’; empirically based contributions are prevalent; and many publications focus on new tools, often developed by researchers. On this basis, we present three...

  4. Cognitive engineering models: A prerequisite to the design of human-computer interaction in complex dynamic systems

    Science.gov (United States)

    Mitchell, Christine M.

    1993-01-01

    This chapter examines a class of human-computer interaction applications, specifically the design of human-computer interaction for the operators of complex systems. Such systems include space systems (e.g., manned systems such as the Shuttle or space station, and unmanned systems such as NASA scientific satellites), aviation systems (e.g., the flight deck of 'glass cockpit' airplanes or air traffic control) and industrial systems (e.g., power plants, telephone networks, and sophisticated, e.g., 'lights out,' manufacturing facilities). The main body of human-computer interaction (HCI) research complements but does not directly address the primary issues involved in human-computer interaction design for operators of complex systems. Interfaces to complex systems are somewhat special. The 'user' in such systems - i.e., the human operator responsible for safe and effective system operation - is highly skilled, someone who in human-machine systems engineering is sometimes characterized as 'well trained, well motivated'. The 'job' or task context is paramount and, thus, human-computer interaction is subordinate to human job interaction. The design of human interaction with complex systems, i.e., the design of human job interaction, is sometimes called cognitive engineering.

  5. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  6. Interaction and control in wearable computing

    International Nuclear Information System (INIS)

    Strand, Ole Morten; Johansen, Paal; Droeivoldsmo, Asgeir; Reigstad, Magnus; Olsen, Asle; Helgar, Stein

    2004-03-01

    This report presents the status of Halden Virtual Reality Centre (HVRC) work with technological solutions for wearable computing to support operations where interaction and control of wearable information and communication systems for plant floor personnel are of importance. The report describes a framework and system prototype developed for testing technology, usability and applicability of eye movements and speech for controlling wearable equipment while having both hands free. Potentially interesting areas for further development are discussed with regard to the effect they have on the work situation for plant floor personnel using computerised wearable systems. (Author)

  7. Interactive computer graphics and its role in control system design of large space structures

    Science.gov (United States)

    Reddy, A. S. S. R.

    1985-01-01

    This paper shows the relevance of interactive computer graphics to the design of control systems that maintain the attitude and shape of large space structures in order to accomplish the required mission objectives. The typical phases of control system design, starting from the physical model and including modeling of the dynamics, modal analysis, and the control system design methodology, are reviewed, and the need for interactive computer graphics is demonstrated. Typical constituent parts of large space structures, such as free-free beams and free-free plates, are used to illustrate the complexity of the control system design and the effectiveness of interactive computer graphics.

  8. ASTEC: Controls analysis for personal computers

    Science.gov (United States)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  9. Ssecrett and neuroTrace: Interactive visualization and analysis tools for large-scale neuroscience data sets

    KAUST Repository

    Jeong, Wonki; Beyer, Johanna; Hadwiger, Markus; Blue, Rusty; Law, Charles; Vá zquez Reina, Amelio; Reid, Rollie Clay; Lichtman, Jeff W M D; Pfister, Hanspeter

    2010-01-01

    Recent advances in optical and electron microscopy let scientists acquire extremely high-resolution images for neuroscience research. Data sets imaged with modern electron microscopes can range between tens of terabytes to about one petabyte. These large data sizes and the high complexity of the underlying neural structures make it very challenging to handle the data at reasonably interactive rates. To provide neuroscientists flexible, interactive tools, the authors introduce Ssecrett and NeuroTrace, two tools they designed for interactive exploration and analysis of large-scale optical- and electron-microscopy images to reconstruct complex neural circuits of the mammalian nervous system. © 2010 IEEE.

  11. Graphics processing units in bioinformatics, computational biology and systems biology.

    Science.gov (United States)

    Nobile, Marco S; Cazzaniga, Paolo; Tangherloni, Andrea; Besozzi, Daniela

    2017-09-01

    Several studies in Bioinformatics, Computational Biology and Systems Biology rely on the definition of physico-chemical or mathematical models of biological systems at different scales and levels of complexity, ranging from the interaction of atoms in single molecules up to genome-wide interaction networks. Traditional computational methods and software tools developed in these research fields share a common trait: they can be computationally demanding on Central Processing Units (CPUs), therefore limiting their applicability in many circumstances. To overcome this issue, general-purpose Graphics Processing Units (GPUs) are gaining increasing attention from the scientific community, as they can considerably reduce the running time required by standard CPU-based software, and allow more intensive investigations of biological systems. In this review, we present a collection of GPU tools recently developed to perform computational analyses in life science disciplines, emphasizing the advantages and the drawbacks in the use of these parallel architectures. The complete list of GPU-powered tools here reviewed is available at http://bit.ly/gputools. © The Author 2016. Published by Oxford University Press.

  12. An overview of interactive computer graphics and its application to computer-aided engineering and design

    International Nuclear Information System (INIS)

    Van Dam, A.

    1983-01-01

    The purpose of this brief bird's-eye view of interactive graphics is to list the key ideas and to show how one of the most important application areas, Computer Aided Engineering/Design, takes advantage of it. (orig.)

  13. Computational methods, tools and data for nuclear analyses of fusion technology systems

    International Nuclear Information System (INIS)

    Fischer, U.

    2006-01-01

    An overview is presented of the Research and Development work conducted at Forschungszentrum Karlsruhe in co-operation with other associations in the framework of the European Fusion Technology Programme on the development and qualification of computational tools and data for nuclear analyses of Fusion Technology systems. The focus is on the development of advanced methods and tools based on the Monte Carlo technique for particle transport simulations, and the evaluation and qualification of dedicated nuclear data to satisfy the needs of the ITER and the IFMIF projects. (author)

  14. Mobile computing device as tools for college student education: a case on flashcards application

    Science.gov (United States)

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to remember large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology has enabled flashcards to be viewed on computers, for example as slides in PowerPoint, serving as channels for drilling and feedback for learners. The current generation of students is more proficient with information technology and mobile computing devices; for example, they use their mobile phones far more intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flash cards.

  15. GeneWiz browser: An Interactive Tool for Visualizing Sequenced Chromosomes

    DEFF Research Database (Denmark)

    Hallin, Peter Fischer; Stærfeldt, Hans Henrik; Rotenberg, Eva

    2009-01-01

    …readability and increased functionality compared to other browsers. The tool allows the user to select the display of various genomic features, color settings and data ranges. Custom numerical data can be added to the plot, allowing, for example, visualization of gene expression and regulation data. Further, standard atlases are pre-generated for all prokaryotic genomes available in GenBank, providing a fast overview of all available genomes, including recently deposited genome sequences. The tool is available online from http://www.cbs.dtu.dk/services/gwBrowser. [Supplemental material including interactive atlases is available online at http://www.cbs.dtu.dk/services/gwBrowser/suppl/.]

  16. Implementations of the CC'01 Human-Computer Interaction Guidelines Using Bloom's Taxonomy

    Science.gov (United States)

    Manaris, Bill; Wainer, Michael; Kirkpatrick, Arthur E.; Stalvey, RoxAnn H.; Shannon, Christine; Leventhal, Laura; Barnes, Julie; Wright, John; Schafer, J. Ben; Sanders, Dean

    2007-01-01

    In today's technology-laden society human-computer interaction (HCI) is an important knowledge area for computer scientists and software engineers. This paper surveys existing approaches to incorporate HCI into computer science (CS) and such related issues as the perceived gap between the interests of the HCI community and the needs of CS…

  17. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Science.gov (United States)

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.
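
    The following Python sketch illustrates the kind of static check such an automated assessment framework might run over a Scratch project. It is not Hairball's actual API, and it assumes the Scratch 3 project.json layout (targets, blocks, opcodes), which differs from the Scratch versions Hairball originally targeted.

    ```python
    # Illustrative static check over a Scratch project: flag sprites/stage targets
    # that contain blocks but no event "hat" block to start them. Assumes the
    # Scratch 3 project.json layout; NOT Hairball's actual API.
    import json

    def targets_missing_event_hats(project_json_path: str) -> list:
        """Return names of targets that have blocks but no event hat block."""
        with open(project_json_path) as fh:
            project = json.load(fh)

        offenders = []
        for target in project.get("targets", []):
            blocks = target.get("blocks", {})
            has_blocks = any(isinstance(b, dict) for b in blocks.values())
            has_hat = any(
                isinstance(b, dict) and str(b.get("opcode", "")).startswith("event_when")
                for b in blocks.values()
            )
            if has_blocks and not has_hat:
                offenders.append(target.get("name", "<unnamed>"))
        return offenders

    # Example usage (hypothetical file name):
    # print(targets_missing_event_hats("project.json"))
    ```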

  18. System capacity and economic modeling computer tool for satellite mobile communications systems

    Science.gov (United States)

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain if a given system with a finite satellite resource is capable of supporting itself financially and to determine what services can be supported. Personal computer techniques using Lotus 123 are used for the model in order to provide as universal an application as possible such that the model can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.
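
    A minimal Python sketch of the coupling described above, in which an engineering capacity estimate feeds a simple revenue calculation; all figures (transponder bandwidth, channel width, fill factor, tariff) are invented placeholders rather than values from the Lotus 123 model.

    ```python
    # Toy coupling of an engineering capacity estimate to a revenue estimate,
    # in the spirit of the combined engineering/financial model described above.
    # All numeric values are invented placeholders.

    def channel_capacity(transponder_bw_hz: float, channel_bw_hz: float, guard_fraction: float = 0.1) -> int:
        """Number of voice channels that fit in one transponder, allowing for guard bands."""
        usable = transponder_bw_hz * (1.0 - guard_fraction)
        return int(usable // channel_bw_hz)

    def annual_revenue(channels: int, fill_factor: float, price_per_channel_year: float) -> float:
        """Gross yearly revenue for the channels actually sold."""
        return channels * fill_factor * price_per_channel_year

    if __name__ == "__main__":
        n = channel_capacity(transponder_bw_hz=36e6, channel_bw_hz=30e3)
        rev = annual_revenue(n, fill_factor=0.6, price_per_channel_year=4_000.0)
        print(f"{n} channels -> estimated revenue ${rev:,.0f}/year")
    ```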

  19. Kinesthetic Interaction

    DEFF Research Database (Denmark)

    Fogtmann, Maiken Hillerup; Fritsch, Jonas; Kortbek, Karen Johanne

    2008-01-01

    Within the Human-Computer Interaction community there is a growing interest in designing for the whole body in interaction design. The attempts aimed at addressing the body have very different outcomes spanning from theoretical arguments for understanding the body in the design process, to more...... practical examples of designing for bodily potential. This paper presents Kinesthetic Interaction as a unifying concept for describing the body in motion as a foundation for designing interactive systems. Based on the theoretical foundation for Kinesthetic Interaction, a conceptual framework is introduced...... to reveal bodily potential in relation to three design themes – kinesthetic development, kinesthetic means and kinesthetic disorder; and seven design parameters – engagement, sociality, movability, explicit motivation, implicit motivation, expressive meaning and kinesthetic empathy. The framework is a tool...

  20. Social interaction in type 2 diabetes computer-mediated environments: How inherent features of the channels influence peer-to-peer interaction.

    Science.gov (United States)

    Lewinski, Allison A; Fisher, Edwin B

    2016-06-01

    Interventions via the internet provide support to individuals managing chronic illness. The purpose of this integrative review was to determine how the features of a computer-mediated environment influence social interactions among individuals with type 2 diabetes. A combination of MeSH and keyword terms, based on the cognates of three broad groupings: social interaction, computer-mediated environments, and chronic illness, was used to search the PubMed, PsychInfo, Sociology Research Database, and Cumulative Index to Nursing and Allied Health Literature databases. Eleven articles met the inclusion criteria. Computer-mediated environments enhance an individual's ability to interact with peers while increasing the convenience of obtaining personalized support. A matrix, focused on social interaction among peers, identified themes across all articles, and five characteristics emerged: (1) the presence of synchronous and asynchronous communication, (2) the ability to connect with similar peers, (3) the presence or absence of a moderator, (4) personalization of feedback regarding individual progress and self-management, and (5) the ability of individuals to maintain choice during participation. Individuals interact with peers to obtain relevant, situation-specific information and knowledge about managing their own care. Computer-mediated environments facilitate the ability of individuals to exchange this information despite temporal or geographical barriers that may be present, thus improving T2D self-management. © The Author(s) 2015.

  1. Planning Computer-Aided Distance Learning

    Directory of Open Access Journals (Sweden)

    Nadja Dobnik

    1996-12-01

    Full Text Available Didactics of autonomous learning is changing under the influence of new technologies. Computer technology can cover all the functions that a teacher performs in personal contact with the learner. People organizing distance learning must realize all the possibilities offered by computers. Computers can take over and also combine the functions of many tools and systems, e.g. typewriter, video, telephone. Thus the contents can be offered in the form of classic media by means of text, speech, picture, etc. Computers take over data processing and function as study materials. A computer included in a computer network can also function as a medium for interactive communication.

  2. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    OpenAIRE

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  3. Tools for studying dry-cured ham processing by using computed tomography.

    Science.gov (United States)

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    An accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), which comprise the whole elaboration process, were developed. These predictive models were used to develop analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w) in terms of content but also distribution throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
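
    The following Python/NumPy sketch illustrates two of the CT analytical tools mentioned in the record (a line profile and region-of-interest statistics) together with a linear calibration step; the synthetic CT slice and the calibration coefficients are placeholders, not the predictive models reported in the paper.

    ```python
    # Sketch of CT analysis primitives: line profile, ROI statistics, and a
    # (placeholder) linear calibration from mean attenuation to salt content.
    import numpy as np

    def line_profile(ct_slice: np.ndarray, row: int) -> np.ndarray:
        """Attenuation values along one horizontal line of the slice."""
        return ct_slice[row, :]

    def roi_mean_hu(ct_slice: np.ndarray, r0: int, r1: int, c0: int, c1: int) -> float:
        """Mean attenuation inside a rectangular region of interest."""
        return float(ct_slice[r0:r1, c0:c1].mean())

    def predict_salt_content(mean_hu: float, slope: float = 0.01, intercept: float = 1.5) -> float:
        """Apply a placeholder linear calibration from mean HU to salt content (%)."""
        return slope * mean_hu + intercept

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        fake_slice = rng.normal(loc=40.0, scale=15.0, size=(256, 256))  # synthetic HU values
        print(predict_salt_content(roi_mean_hu(fake_slice, 100, 140, 100, 140)))
    ```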

  4. Relationships among Learning Styles and Motivation with Computer-Aided Instruction in an Agronomy Course

    Science.gov (United States)

    McAndrews, Gina M.; Mullen, Russell E.; Chadwick, Scott A.

    2005-01-01

    Multi-media learning tools were developed to enhance student learning for an introductory agronomy course at Iowa State University. During fall 2002, the new interactive computer program, called Computer Interactive Multimedia Program for Learning Enhancement (CIMPLE) was incorporated into the teaching, learning, and assessment processes of the…

  5. Design Science in Human-Computer Interaction: A Model and Three Examples

    Science.gov (United States)

    Prestopnik, Nathan R.

    2013-01-01

    Humanity has entered an era where computing technology is virtually ubiquitous. From websites and mobile devices to computers embedded in appliances on our kitchen counters and automobiles parked in our driveways, information and communication technologies (ICTs) and IT artifacts are fundamentally changing the ways we interact with our world.…

  6. Development of a Computational Tool for Measuring Organizational Competitiveness in the Photovoltaic Power Plants

    Directory of Open Access Journals (Sweden)

    Carmen B. Rosa

    2018-04-01

    Full Text Available Photovoltaic (PV) power generation is embedded in a globally competitive environment. This characteristic forces PV power plants to perform the processes most relevant to their competitiveness with maximum efficiency. From the managers’ point of view, the evaluation of solar energy performance of installed plants is justified because it indicates their level of organizational competitiveness, which supports the decision-making process. This manuscript proposes a computational tool that graphically presents the level of competitiveness of PV power plant units based on performance indicators. The tool was developed using the Key Performance Indicators (KPIs) concept, which represents a set of measures focusing on the aspects most critical for the success of the organization. The KPIs encompass four Fundamental Viewpoints (FVs): Strategic Alliances, Solar Energy Monitoring, Management and Strategic Processes, and Power Generation Innovations. These four FVs were deployed into 26 Critical Success Factors (CSFs) and 39 KPIs. The tool was then applied in four solar generation plants, three of which presented a global organizational competitiveness level of “potentially competitive”. The proposed computational tool allows managers to assess the degree of organizational competitiveness and aids in the prospecting of future scenarios and in decision-making.
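
    A minimal Python sketch of the roll-up such a tool performs, aggregating KPI scores into Fundamental Viewpoints and then into a single qualitative competitiveness level; the weights, scores and thresholds below are invented for illustration and are not those of the published tool.

    ```python
    # Toy roll-up: KPI scores -> Fundamental Viewpoint (FV) levels -> global label.
    # All scores and thresholds are invented placeholders.

    FV_SCORES = {
        "Strategic Alliances": [0.7, 0.8, 0.6],
        "Solar Energy Monitoring": [0.9, 0.85],
        "Management and Strategic Processes": [0.5, 0.6, 0.7, 0.65],
        "Power Generation Innovations": [0.4, 0.55],
    }

    def fv_level(kpi_scores):
        """Average the normalized KPI scores of one Fundamental Viewpoint."""
        return sum(kpi_scores) / len(kpi_scores)

    def global_level(fv_scores: dict) -> str:
        """Map the mean of the FV levels onto a qualitative competitiveness label."""
        mean = sum(fv_level(v) for v in fv_scores.values()) / len(fv_scores)
        if mean >= 0.8:
            return "competitive"
        if mean >= 0.6:
            return "potentially competitive"
        return "not competitive"

    print(global_level(FV_SCORES))  # e.g. "potentially competitive"
    ```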

  7. Decomposition recovery extension to the Computer Aided Prototyping System (CAPS) change-merge tool.

    OpenAIRE

    Keesling, William Ronald

    1997-01-01

    Approved for public release; distribution is unlimited. A promising use of Computer Aided Prototyping System (CAPS) is to support concurrent design. Key to success in this context is the ability to automatically and reliably combine and integrate the prototypes produced in concurrent efforts. Thus, to be of practical use in this as well as most prototyping contexts, a CAPS tool must have a fast, automated, reliable prototype integration capability. The current CAPS Change Merge Tool is fast...

  8. TranscriptomeBrowser 3.0: introducing a new compendium of molecular interactions and a new visualization tool for the study of gene regulatory networks

    Directory of Open Access Journals (Sweden)

    Lepoivre Cyrille

    2012-01-01

    Full Text Available Abstract Background Deciphering gene regulatory networks by in silico approaches is a crucial step in the study of the molecular perturbations that occur in diseases. The development of regulatory maps is a tedious process requiring the comprehensive integration of various evidence scattered over biological databases. Thus, the research community would greatly benefit from having a unified database storing known and predicted molecular interactions. Furthermore, given the intrinsic complexity of the data, the development of new tools offering integrated and meaningful visualizations of molecular interactions is necessary to help users draw new hypotheses without being overwhelmed by the density of the resulting graph. Results We extend the previously developed TranscriptomeBrowser database with a set of tables containing 1,594,978 human and mouse molecular interactions. The database includes: (i) predicted regulatory interactions (computed by scanning vertebrate alignments with a set of 1,213 position weight matrices), (ii) potential regulatory interactions inferred from systematic analysis of ChIP-seq experiments, (iii) regulatory interactions curated from the literature, (iv) predicted post-transcriptional regulation by micro-RNA, (v) protein kinase-substrate interactions and (vi) physical protein-protein interactions. In order to easily retrieve and efficiently analyze these interactions, we developed InteractomeBrowser, a graph-based knowledge browser that comes as a plug-in for TranscriptomeBrowser. The first objective of InteractomeBrowser is to provide a user-friendly tool to get new insight into any gene list by providing a context-specific display of putative regulatory and physical interactions. To achieve this, InteractomeBrowser relies on a "cell compartments-based layout" that makes use of a subset of the Gene Ontology to map gene products onto relevant cell compartments. This layout is particularly powerful for visual integration...
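
    As a rough illustration of the "cell compartments-based layout" idea, the Python sketch below stores a few annotated interactions in a graph and groups the nodes by the compartment their gene product maps to; the gene names, compartments and interaction types are invented examples, and this is not the InteractomeBrowser plug-in itself.

    ```python
    # Group interaction-network nodes by cell compartment for a compartment-based layout.
    # Node names, compartments and edge annotations are invented examples.
    from collections import defaultdict
    import networkx as nx

    G = nx.DiGraph()
    G.add_node("TF_A", compartment="nucleus")
    G.add_node("GENE_B", compartment="nucleus")
    G.add_node("KINASE_C", compartment="cytoplasm")
    G.add_node("RECEPTOR_D", compartment="membrane")

    G.add_edge("TF_A", "GENE_B", kind="predicted regulatory (PWM scan)")
    G.add_edge("KINASE_C", "TF_A", kind="kinase-substrate")
    G.add_edge("RECEPTOR_D", "KINASE_C", kind="physical protein-protein")

    def group_by_compartment(graph):
        """Return {compartment: [node, ...]} for laying nodes out by cell compartment."""
        groups = defaultdict(list)
        for node, data in graph.nodes(data=True):
            groups[data.get("compartment", "unknown")].append(node)
        return dict(groups)

    for compartment, members in group_by_compartment(G).items():
        print(compartment, "->", members)
    ```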

  9. Tools for controlling protein interactions with light

    Science.gov (United States)

    Tucker, Chandra L.; Vrana, Justin D.; Kennedy, Matthew J.

    2014-01-01

    Genetically-encoded actuators that allow control of protein-protein interactions with light, termed ‘optical dimerizers’, are emerging as new tools for experimental biology. In recent years, numerous new and versatile dimerizer systems have been developed. Here we discuss the design of optical dimerizer experiments, including choice of a dimerizer system, photoexcitation sources, and coordinate use of imaging reporters. We provide detailed protocols for experiments using two dimerization systems we previously developed, CRY2/CIB and UVR8/UVR8, for use controlling transcription, protein localization, and protein secretion with light. Additionally, we provide instructions and software for constructing a pulse-controlled LED light device for use in experiments requiring extended light treatments. PMID:25181301

  10. Computer assisted audit tools and techniques in real world: CAATT's applications and approaches in context

    OpenAIRE

    Pedrosa, I.; Costa, C. J.

    2012-01-01

    Nowadays, Computer Aided Audit Tools (and Techniques) support almost all audit processes concerning data extraction and analysis. These tools were initially aimed at supporting financial auditing processes. However, their scope goes beyond this; therefore, we present case studies and good practices in an academic context. Although audit tools for data extraction and analysis are very common in large auditing companies and applied in several contexts, we realized that it is not easy to find practical...

  11. A network integration approach for drug-target interaction prediction and computational drug repositioning from heterogeneous information.

    Science.gov (United States)

    Luo, Yunan; Zhao, Xinbin; Zhou, Jingtian; Yang, Jinglin; Zhang, Yanqing; Kuang, Wenhua; Peng, Jian; Chen, Ligong; Zeng, Jianyang

    2017-09-18

    The emergence of large-scale genomic, chemical and pharmacological data provides new opportunities for drug discovery and repositioning. In this work, we develop a computational pipeline, called DTINet, to predict novel drug-target interactions from a constructed heterogeneous network, which integrates diverse drug-related information. DTINet focuses on learning a low-dimensional vector representation of features, which accurately explains the topological properties of individual nodes in the heterogeneous network, and then makes prediction based on these representations via a vector space projection scheme. DTINet achieves substantial performance improvement over other state-of-the-art methods for drug-target interaction prediction. Moreover, we experimentally validate the novel interactions between three drugs and the cyclooxygenase proteins predicted by DTINet, and demonstrate the new potential applications of these identified cyclooxygenase inhibitors in preventing inflammatory diseases. These results indicate that DTINet can provide a practically useful tool for integrating heterogeneous information to predict new drug-target interactions and repurpose existing drugs. Network-based data integration for drug-target prediction is a promising avenue for drug repositioning, but performance is wanting. Here, the authors introduce DTINet, whose performance is enhanced in the face of noisy, incomplete and high-dimensional biological data by learning low-dimensional vector representations.
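
    A minimal NumPy illustration of the scoring idea summarized above: low-dimensional drug and target vectors plus a learned projection matrix yield a ranked list of candidate interactions. The embeddings and projection below are random placeholders, not the DTINet pipeline or its learned parameters.

    ```python
    # Score drug-target pairs from low-dimensional representations via a projection
    # matrix. Vectors and projection are random placeholders for illustration only.
    import numpy as np

    rng = np.random.default_rng(42)
    n_drugs, n_targets, dim = 5, 8, 16

    drug_vecs = rng.normal(size=(n_drugs, dim))      # learned drug representations
    target_vecs = rng.normal(size=(n_targets, dim))  # learned target representations
    projection = rng.normal(size=(dim, dim))         # learned drug-to-target-space mapping

    scores = drug_vecs @ projection @ target_vecs.T  # higher score = more likely interaction

    best_target_per_drug = scores.argmax(axis=1)
    for d, t in enumerate(best_target_per_drug):
        print(f"drug {d}: top-ranked target {t} (score {scores[d, t]:.2f})")
    ```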

  12. Review. Supporting problem structuring with computer-based tools in participatory forest planning

    Energy Technology Data Exchange (ETDEWEB)

    Hujala, T.; Khadka, C.; Wolfslehner, B.; Vacik, H.

    2013-09-01

    Aim of study: This review presents the state-of-art of using computerized techniques for problem structuring (PS) in participatory forest planning. Frequency and modes of using different computerized tool types and their contribution for planning processes as well as critical observations are described, followed by recommendations on how to better integrate PS with the use of forest decision support systems. Area of study: The reviewed research cases are from Asia, Europe, North-America, Africa and Australia. Material and methods: Via Scopus search and screening of abstracts, 32 research articles from years 2002-2011 were selected for review. Explicit and implicit evidence of using computerized tools for PS was recorded and assessed with content-driven qualitative analysis. Main results: GIS and forest-specific simulation tools were the most prevalent software types whereas cognitive modelling software and spreadsheet and calculation tools were less frequently used, followed by multi-criteria and interactive tools. The typical use type was to provide outputs of simulation–optimization or spatial analysis to negotiation situations or to compile summaries or illustrations afterwards; using software during group negotiation to foster interaction was observed only in a few cases. Research highlights: Expertise in both decision support systems and group learning is needed to better integrate PS and computerized decision analysis. From the knowledge management perspective, it is recommended to consider how the results of PS —e.g. conceptual models— could be stored into a problem perception database, and how PS and decision making could be streamlined by retrievals from such systems. (Author)

  13. HIFiRE-1 Turbulent Shock Boundary Layer Interaction - Flight Data and Computations

    Science.gov (United States)

    Kimmel, Roger L.; Prabhu, Dinesh

    2015-01-01

    The Hypersonic International Flight Research Experimentation (HIFiRE) program is a hypersonic flight test program executed by the Air Force Research Laboratory (AFRL) and Australian Defence Science and Technology Organisation (DSTO). This flight contained a cylinder-flare induced shock boundary layer interaction (SBLI). Computations of the interaction were conducted for a number of times during the ascent. The DPLR code used for predictions was calibrated against ground test data prior to exercising the code at flight conditions. Generally, the computations predicted the upstream influence and interaction pressures very well. Plateau pressures on the cylinder were predicted well at all conditions. Although the experimental heat transfer showed a large amount of scatter, especially at low heating levels, the measured heat transfer agreed well with computations. The primary discrepancy between the experiment and computation occurred in the pressures measured on the flare during second stage burn. Measured pressures exhibited large overshoots late in the second stage burn, the mechanism of which is unknown. The good agreement between flight measurements and CFD helps validate the philosophy of calibrating CFD against ground test, prior to exercising it at flight conditions.

  14. Structural analysis of magnetic fusion energy systems in a combined interactive/batch computer environment

    International Nuclear Information System (INIS)

    Johnson, N.E.; Singhal, M.K.; Walls, J.C.; Gray, W.H.

    1979-01-01

    A system of computer programs has been developed to aid in the preparation of input data for and the evaluation of output data from finite element structural analyses of magnetic fusion energy devices. The system utilizes the NASTRAN structural analysis computer program and a special set of interactive pre- and post-processor computer programs, and has been designed for use in an environment wherein a time-share computer system is linked to a batch computer system. In such an environment, the analyst must only enter, review and/or manipulate data through interactive terminals linked to the time-share computer system. The primary pre-processor programs include NASDAT, NASERR and TORMAC. NASDAT and TORMAC are used to generate NASTRAN input data. NASERR performs routine error checks on this data. The NASTRAN program is run on a batch computer system using data generated by NASDAT and TORMAC. The primary post-processing programs include NASCMP and NASPOP. NASCMP is used to compress the data initially stored on magnetic tape by NASTRAN so as to facilitate interactive use of the data. NASPOP reads the data stored by NASCMP and reproduces NASTRAN output for selected grid points, elements and/or data types

  15. Individual versus Interactive Task-Based Performance through Voice-Based Computer-Mediated Communication

    Science.gov (United States)

    Granena, Gisela

    2016-01-01

    Interaction is a necessary condition for second language (L2) learning (Long, 1980, 1996). Research in computer-mediated communication has shown that interaction opportunities make learners pay attention to form in a variety of ways that promote L2 learning. This research has mostly investigated text-based rather than voice-based interaction. The…

  16. Interactive virtual simulation using a 3D computer graphics model for microvascular decompression surgery.

    Science.gov (United States)

    Oishi, Makoto; Fukuda, Masafumi; Hiraishi, Tetsuya; Yajima, Naoki; Sato, Yosuke; Fujii, Yukihiko

    2012-09-01

    The purpose of this paper is to report on the authors' advanced presurgical interactive virtual simulation technique using a 3D computer graphics model for microvascular decompression (MVD) surgery. The authors performed interactive virtual simulation prior to surgery in 26 patients with trigeminal neuralgia or hemifacial spasm. The 3D computer graphics models for interactive virtual simulation were composed of the brainstem, cerebellum, cranial nerves, vessels, and skull individually created by image analysis, including segmentation, surface rendering, and data fusion for data collected by 3-T MRI and 64-row multidetector CT systems. Interactive virtual simulation was performed by employing novel computer-aided design software with manipulation of a haptic device to imitate the surgical procedures of bone drilling and retraction of the cerebellum. The findings were compared with intraoperative findings. In all patients, interactive virtual simulation provided detailed and realistic surgical perspectives, of sufficient quality, representing the lateral suboccipital route. The causes of trigeminal neuralgia or hemifacial spasm determined by observing 3D computer graphics models were concordant with those identified intraoperatively in 25 (96%) of 26 patients, which was a significantly higher rate than the 73% concordance rate (concordance in 19 of 26 patients) obtained by review of 2D images only. The 3D computer graphics model provided a realistic environment for performing virtual simulations prior to MVD surgery and enabled us to ascertain the complex microsurgical anatomy.

  17. Neurosurgical simulation by interactive computer graphics on iPad.

    Science.gov (United States)

    Maruyama, Keisuke; Kin, Taichi; Saito, Toki; Suematsu, Shinya; Gomyo, Miho; Noguchi, Akio; Nagane, Motoo; Shiokawa, Yoshiaki

    2014-11-01

    Presurgical simulation before complicated neurosurgery is a state-of-the-art technique, and its usefulness has recently become well known. However, simulation requires complex image processing, which hinders its widespread application. We explored handling the results of interactive computer graphics on the iPad tablet, which can easily be controlled anywhere. Data from preneurosurgical simulations from 12 patients (4 men, 8 women) who underwent complex brain surgery were loaded onto an iPad. First, DICOM data were loaded using Amira visualization software to create interactive computer graphics, and ParaView, another free visualization software package, was used to convert the results of the simulation to be loaded using the free iPad software KiwiViewer. The interactive computer graphics created prior to neurosurgery were successfully displayed and smoothly controlled on the iPad in all patients. The number of elements ranged from 3 to 13 (mean 7). The mean original data size was 233 MB, which was reduced to 10.4 MB (4.4% of original size) after image processing by ParaView. This was increased to 46.6 MB (19.9%) after decompression in KiwiViewer. Controlling the magnification, transfer, rotation, and selection of translucence in 10 levels of each element was smoothly and easily performed using one or two fingers. The requisite skill to smoothly control the iPad software was acquired within 1.8 trials on average by 12 medical students and 6 neurosurgical residents. Using an iPad to handle the result of preneurosurgical simulation was extremely useful because it could easily be handled anywhere.

  18. Identifying the Computer Competency Levels of Recreation Department Undergraduates

    Science.gov (United States)

    Zorba, Erdal

    2011-01-01

    Computer-based and web-based applications are major instructional tools for increasing undergraduates' motivation at school. In the recreation field, the usage of computer- and internet-based recreational applications has become more prevalent in order to present visual and interactive entertainment activities. Recreation department undergraduates…

  19. New technologies of the information: tools for the education

    Directory of Open Access Journals (Sweden)

    Vicenta BUSTILLO PORRO

    2016-05-01

    Full Text Available We live in a society in which computing, telecommunications and audio-visual communication are gaining great importance. We live in an interactive audio-visual society in which more and more tasks will be carried out through a computer connected to the Internet: work, medicine, electronic news, interactive films and, of course, teaching. We face new situations that demand the use of the Internet as a didactic tool and an essential working tool. At the same time, these new situations make it possible to review the contents of the different curricula, opening up a channel for daily communication with the teaching staff, classmates and the rest of the world. The new multimedia developments are the main available infrastructure with the capacity to support media applications.

  20. Interacting orientations and instrumentalities to adapt a learning tool for health professionals

    Directory of Open Access Journals (Sweden)

    Kathrine L. Nygård

    2015-09-01

    Full Text Available Web-based instructional software offers new opportunities for collaborative, task-oriented in-service training. Planning and negotiation of content to adapt a web-based learning resource for nursing is the topic of this paper. We draw from Cultural Historical Activity Theory to elaborate the dialectical relationship of changing and stabilizing organizational practice. Local adaptation to create a domain-specific resource plays out as interactions of orientations and instrumentalities. Our analysis traces how orientations, i.e., in situ selection of knowledge and mobilization of experiences, and instrumentality, i.e., interpreted affordances of available cultural tools, interact. The adaptation processes are mediated by a set of new and current tools that interact with multiple orientations to ensure stability and promote change. Practice and project are introduced as intermediate, analytic concepts to assess tensions in the observed activity. Our analysis shows three central tensions and how they are introduced, addressed and subsequently resolved. Considering these opportunities helps to understand how engagement with technology can lead to new representations for introduction to a local knowledge domain.

  1. The Strategy Blueprint: A Strategy Process Computer-Aided Design Tool

    OpenAIRE

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based tool, known as ‘the Strategy Blueprint’, consisting of a combination of nine strategy techniques, which can help organizations define the most suitable strategy, based on the internal and external f...

  2. Atomdroid: a computational chemistry tool for mobile platforms.

    Science.gov (United States)

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
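
    To illustrate the kind of routine such a mobile molecular-mechanics code runs, the Python sketch below performs a few Metropolis Monte Carlo steps on the separation of a Lennard-Jones pair; units and parameters are reduced and arbitrary, and this is not Atomdroid's implementation.

    ```python
    # Compact Metropolis Monte Carlo sketch for a Lennard-Jones pair.
    # Reduced/arbitrary units; for illustration only.
    import numpy as np

    def lj_energy(r: float, epsilon: float = 1.0, sigma: float = 1.0) -> float:
        """12-6 Lennard-Jones pair energy."""
        sr6 = (sigma / r) ** 6
        return 4.0 * epsilon * (sr6 ** 2 - sr6)

    def metropolis(n_steps: int = 10_000, beta: float = 2.0, step: float = 0.1) -> float:
        """Sample the pair separation r and return its average."""
        rng = np.random.default_rng(1)
        r, e = 1.2, lj_energy(1.2)
        samples = []
        for _ in range(n_steps):
            r_new = abs(r + rng.uniform(-step, step)) + 1e-9  # keep separation positive
            e_new = lj_energy(r_new)
            if e_new <= e or rng.random() < np.exp(-beta * (e_new - e)):
                r, e = r_new, e_new  # accept the trial move
            samples.append(r)
        return float(np.mean(samples))

    print(f"<r> ~ {metropolis():.3f}")
    ```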

  3. Social Interaction in a Cooperative Brain-computer Interface Game

    NARCIS (Netherlands)

    Obbink, Michel; Gürkök, Hayrettin; Plass - Oude Bos, D.; Hakvoort, Gido; Poel, Mannes; Nijholt, Antinus; Camurri, Antonio; Costa, Cristina

    Does using a brain-computer interface (BCI) influence the social interaction between people when playing a cooperative game? By measuring the amount of speech, utterances, instrumental gestures and empathic gestures during a cooperative game where two participants had to reach a certain goal, and

  4. The Importance of Human-Computer Interaction in Radiology E-learning

    NARCIS (Netherlands)

    den Harder, Annemarie M; Frijlingh, Marissa; Ravesloot, Cécile J; Oosterbaan, Anne E; van der Gijp, Anouk

    2016-01-01

    With the development of cross-sectional imaging techniques and transformation to digital reading of radiological imaging, e-learning might be a promising tool in undergraduate radiology education. In this systematic review of the literature, we evaluate the emergence of image interaction

  5. Interactive computer graphics for stereotactic neurosurgery

    International Nuclear Information System (INIS)

    Goodman, J.H.; Davis, J.R.; Gahbauer, R.A.

    1986-01-01

    A microcomputer (IBM PC/AT) system has been developed to incorporate multiple image sources for stereotactic neurosurgery. Hard copy data is calibrated and captured with a video camera and frame grabber. Line contours are generated automatically on the basis of gray scale density or digitized manually. Interactive computer graphics provide a format with acceptable speed and accuracy for stereotactic neurosurgery. The ability to dimensionally integrate existing image data from multiple sources for target selection makes preoperative scans and scanner-compatible head holders unnecessary. The system description and examples of use for brain tumor biopsy and brachytherapy are presented.

  6. Integrated computer-aided design using minicomputers

    Science.gov (United States)

    Storaasli, O. O.

    1980-01-01

    Computer-Aided Design/Computer-Aided Manufacturing (CAD/CAM), a highly interactive software package, has been implemented on minicomputers at the NASA Langley Research Center. CAD/CAM software integrates many formerly fragmented programs and procedures into one cohesive system; it also includes finite element modeling and analysis, and has been interfaced via a computer network to a relational data base management system and offline plotting devices on mainframe computers. The CAD/CAM software system requires interactive graphics terminals operating at a minimum of 4800 bits/sec transfer rate to a computer. The system is portable and introduces 'interactive graphics', which permits the creation and modification of models interactively. The CAD/CAM system has already produced designs for a large area space platform, a national transonic facility fan blade, and a laminar flow control wind tunnel model. Besides the design/drafting and finite element analysis capability, CAD/CAM provides options to produce automatic tooling code to drive a numerically controlled (N/C) machine. Reductions in time for design, engineering, drawing, finite element modeling, and N/C machining will benefit productivity through reduced costs, fewer errors, and a wider range of configurations.

  7. Drug-Target Interactions: Prediction Methods and Applications.

    Science.gov (United States)

    Anusuya, Shanmugam; Kesherwani, Manish; Priya, K Vishnu; Vimala, Antonydhason; Shanmugam, Gnanendra; Velmurugan, Devadasan; Gromiha, M Michael

    2018-01-01

    Identifying the interactions between drugs and target proteins is a key step in drug discovery. This not only aids to understand the disease mechanism, but also helps to identify unexpected therapeutic activity or adverse side effects of drugs. Hence, drug-target interaction prediction becomes an essential tool in the field of drug repurposing. The availability of heterogeneous biological data on known drug-target interactions enabled many researchers to develop various computational methods to decipher unknown drug-target interactions. This review provides an overview on these computational methods for predicting drug-target interactions along with available webservers and databases for drug-target interactions. Further, the applicability of drug-target interactions in various diseases for identifying lead compounds has been outlined. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  8. High-Performance Data Analysis Tools for Sun-Earth Connection Missions

    Science.gov (United States)

    Messmer, Peter

    2011-01-01

    The data analysis tool of choice for many Sun-Earth Connection missions is the Interactive Data Language (IDL) by ITT VIS. The increasing amount of data produced by these missions and the increasing complexity of image processing algorithms requires access to higher computing power. Parallel computing is a cost-effective way to increase the speed of computation, but algorithms oftentimes have to be modified to take advantage of parallel systems. Enhancing IDL to work on clusters gives scientists access to increased performance in a familiar programming environment. The goal of this project was to enable IDL applications to benefit from both computing clusters as well as graphics processing units (GPUs) for accelerating data analysis tasks. The tool suite developed in this project enables scientists now to solve demanding data analysis problems in IDL that previously required specialized software, and it allows them to be solved orders of magnitude faster than on conventional PCs. The tool suite consists of three components: (1) TaskDL, a software tool that simplifies the creation and management of task farms, collections of tasks that can be processed independently and require only small amounts of data communication; (2) mpiDL, a tool that allows IDL developers to use the Message Passing Interface (MPI) inside IDL for problems that require large amounts of data to be exchanged among multiple processors; and (3) GPULib, a tool that simplifies the use of GPUs as mathematical coprocessors from within IDL. mpiDL is unique in its support for the full MPI standard and its support of a broad range of MPI implementations. GPULib is unique in enabling users to take advantage of an inexpensive piece of hardware, possibly already installed in their computer, and achieve orders of magnitude faster execution time for numerically complex algorithms. TaskDL enables the simple setup and management of task farms on compute clusters. The products developed in this project have the

  9. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Science.gov (United States)

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest for these very promising new systems in the field of materials science, biomedical research, or energy sustainability. Ionization energy (IE) is one of the most important parameters to approach the electronic structure of molecules. It can be theoretically estimated, but in order to evaluate their persistence and propose the most reliable tools for the evaluation of different electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental values of ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in gas phase—the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been merged for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the Δ SCF approach, and compared with electron propagator theory such as outer valence Green's function (OVGF, P3) and symmetry adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) Δ SCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient way for a good agreement with UV-PES values, (ii) a CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for the purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a
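
    The statistics mentioned in the record (average deviation and root mean square deviation between computed and experimental ionization energies) can be tallied as in the short Python sketch below; the IE values used here are invented placeholders, not data from the study.

    ```python
    # Mean and RMS deviation between computed and experimental ionization energies.
    # The IE values are invented placeholders for illustration only.
    import numpy as np

    ie_experimental = np.array([8.62, 9.10, 9.85, 10.40])  # eV (placeholder)
    ie_computed = np.array([8.55, 9.22, 9.78, 10.55])       # eV (placeholder)

    deviation = ie_computed - ie_experimental
    mean_dev = deviation.mean()
    rms_dev = np.sqrt((deviation ** 2).mean())

    print(f"mean deviation = {mean_dev:+.2f} eV, RMS deviation = {rms_dev:.2f} eV")
    ```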

  10. Digital interactive learning of oral radiographic anatomy.

    Science.gov (United States)

    Vuchkova, J; Maybury, T; Farah, C S

    2012-02-01

    Studies reporting high number of diagnostic errors made from radiographs suggest the need to improve the learning of radiographic interpretation in the dental curriculum. Given studies that show student preference for computer-assisted or digital technologies, the purpose of this study was to develop an interactive digital tool and to determine whether it was more successful than a conventional radiology textbook in assisting dental students with the learning of radiographic anatomy. Eighty-eight dental students underwent a learning phase of radiographic anatomy using an interactive digital tool alongside a conventional radiology textbook. The success of the digital tool, when compared to the textbook, was assessed by quantitative means using a radiographic interpretation test and by qualitative means using a structured Likert scale survey, asking students to evaluate their own learning outcomes from the digital tool. Student evaluations of the digital tool showed that almost all participants (95%) indicated that the tool positively enhanced their learning of radiographic anatomy and interpretation. The success of the digital tool in assisting the learning of radiographic interpretation is discussed in the broader context of learning and teaching curricula, and preference (by students) for the use of this digital form when compared to the conventional literate form of the textbook. Whilst traditional textbooks are still valued in the dental curriculum, it is evident that the preference for computer-assisted learning of oral radiographic anatomy enhances the learning experience by enabling students to interact and better engage with the course material. © 2011 John Wiley & Sons A/S.

  11. Interactive and Approachable Web-Based Tools for Exploring Global Geophysical Data Records

    Science.gov (United States)

    Croteau, M. J.; Nerem, R. S.; Merrifield, M. A.; Thompson, P. R.; Loomis, B. D.; Wiese, D. N.; Zlotnicki, V.; Larson, J.; Talpe, M.; Hardy, R. A.

    2017-12-01

    Making global and regional data accessible and understandable for non-experts can be both challenging and hazardous. While data products are often developed with end users in mind, the ease of use of these data can vary greatly. Scientists must take care to provide detailed guides for how to use data products to ensure users are not incorrectly applying data to their problem. For example, terrestrial water storage data from the Gravity Recovery and Climate Experiment (GRACE) satellite mission is notoriously difficult for non-experts to access and correctly use. However, allowing these data to be easily accessible to scientists outside the GRACE community is desirable because this would allow that data to see much wider-spread use. We have developed a web-based interactive mapping and plotting tool that provides easy access to geophysical data. This work presents an intuitive method for making such data widely accessible to experts and non-experts alike, making the data approachable and ensuring proper use of the data. This tool has proven helpful to experts by providing fast and detailed access to the data. Simultaneously, the tool allows non-experts to gain familiarity with the information contained in the data and access to that information for both scientific studies and public use. In this presentation, we discuss the development of this tool and application to both GRACE and ocean altimetry satellite missions, and demonstrate the capabilities of the tool. Focusing on the data visualization aspects of the tool, we showcase our integrations of the Mapbox API and the D3.js data-driven web document framework. We then explore the potential of these tools in other web-based visualization projects, and how incorporation of such tools into science can improve the presentation of research results. We demonstrate how the development of an interactive and exploratory resource can enable further layers of exploratory and scientific discovery.

  12. An HTML Tool for Production of Interactive Stereoscopic Compositions.

    Science.gov (United States)

    Chistyakov, Alexey; Soto, Maria Teresa; Martí, Enric; Carrabina, Jordi

    2016-12-01

    The benefits of stereoscopic vision in medical applications have been appreciated and thoroughly studied for more than a century. The usage of stereoscopic displays has a proven positive impact on performance in various medical tasks. At the same time, the market for 3D-enabled technologies is blooming. New high-resolution stereo cameras, TVs, projectors, monitors, and head-mounted displays are becoming available. This equipment, complemented with a corresponding application program interface (API), could be implemented in a system relatively easily. Such setups could open new possibilities for medical applications exploiting stereoscopic depth. This work proposes a tool for the production of interactive stereoscopic graphical user interfaces, which could serve as a software layer for web-based medical systems facilitating the stereoscopic effect. Further, the tool's operation mode and the results of the subjective and objective performance tests conducted are presented.

  13. Smartphone qualification & linux-based tools for CubeSat computing payloads

    Science.gov (United States)

    Bridges, C. P.; Yeomans, B.; Iacopino, C.; Frame, T. E.; Schofield, A.; Kenyon, S.; Sweeting, M. N.

    Modern computers are now far in advance of satellite systems and leveraging of these technologies for space applications could lead to cheaper and more capable spacecraft. Together with NASA AMES's PhoneSat, the STRaND-1 nanosatellite team has been developing and designing new ways to include smart-phone technologies to the popular CubeSat platform whilst mitigating numerous risks. Surrey Space Centre (SSC) and Surrey Satellite Technology Ltd. (SSTL) have led in qualifying state-of-the-art COTS technologies and capabilities - contributing to numerous low-cost satellite missions. The focus of this paper is to answer if 1) modern smart-phone software is compatible for fast and low-cost development as required by CubeSats, and 2) if the components utilised are robust to the space environment. The STRaND-1 smart-phone payload software explored in this paper is united using various open-source Linux tools and generic interfaces found in terrestrial systems. A major result from our developments is that many existing software and hardware processes are more than sufficient to provide autonomous and operational payload object-to-object and file-based management solutions. The paper will provide methodologies on the software chains and tools used for the STRaND-1 smartphone computing platform, the hardware built with space qualification results (thermal, thermal vacuum, and TID radiation), and how they can be implemented in future missions.

  14. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    Science.gov (United States)

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  15. Engageability: a new sub-principle of the learnability principle in human-computer interaction

    Directory of Open Access Journals (Sweden)

    B Chimbo

    2011-12-01

    Full Text Available The learnability principle relates to improving the usability of software, as well as users’ performance and productivity. A gap has been identified, as the current definition of the principle does not distinguish between users of different ages. To determine the extent of the gap, this article compares the ways in which two user groups, adults and children, learn how to use an unfamiliar software application. In doing this, we bring together the research areas of human-computer interaction (HCI), adult and child learning, learning theories and strategies, usability evaluation and interaction design. A literature survey conducted on learnability and learning processes considered the meaning of learnability of software applications across generations. In an empirical investigation, users aged from 9 to 12 and from 35 to 50 were observed in a usability laboratory while learning to use educational software applications. Insights that emerged from data analysis showed different tactics and approaches that children and adults use when learning unfamiliar software. Eye tracking data was also recorded. Findings indicated that subtle re-interpretation of the learnability principle and its associated sub-principles was required. An additional sub-principle, namely engageability, was proposed to incorporate aspects of learnability that are not covered by the existing sub-principles. Our re-interpretation of the learnability principle and the resulting design recommendations should help designers to fulfill the varying needs of different-aged users, and improve the learnability of their designs. Keywords: Child computer interaction, Design principles, Eye tracking, Generational differences, Human-computer interaction, Learning theories, Learnability, Engageability, Software applications, Usability Disciplines: Human-Computer Interaction (HCI) Studies, Computer science, Observational Studies

  16. Multi-step EMG Classification Algorithm for Human-Computer Interaction

    Science.gov (United States)

    Ren, Peng; Barreto, Armando; Adjouadi, Malek

    A three-electrode human-computer interaction system, based on digital processing of the Electromyogram (EMG) signal, is presented. This system can effectively help disabled individuals paralyzed from the neck down to interact with computers or communicate with people through computers using point-and-click graphic interfaces. The three electrodes are placed on the right frontalis, the left temporalis and the right temporalis muscles in the head, respectively. The signal processing algorithm used translates the EMG signals during five kinds of facial movements (left jaw clenching, right jaw clenching, eyebrows up, eyebrows down, simultaneous left & right jaw clenching) into five corresponding types of cursor movements (left, right, up, down and left-click), to provide basic mouse control. The classification strategy is based on three principles: the EMG energy of one channel is typically larger than the others during one specific muscle contraction; the spectral characteristics of the EMG signals produced by the frontalis and temporalis muscles during different movements are different; the EMG signals from adjacent channels typically have correlated energy profiles. The algorithm is evaluated on 20 pre-recorded EMG signal sets, using Matlab simulations. The results show that this method provides improvements and is more robust than other previous approaches.
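
    The energy-comparison rule described above can be illustrated with a short sketch. This is not the authors' algorithm: the window length, threshold and movement labels are simplified assumptions, and the spectral and correlation criteria are collapsed into a single dominance test.

    ```python
    # Minimal sketch of channel-energy-based EMG classification (illustrative
    # assumptions: 3 channels = frontalis, left temporalis, right temporalis;
    # 256-sample windows; an ad hoc dominance threshold).
    import numpy as np

    def classify_window(window, ratio_threshold=2.0):
        """window: array of shape (3, n_samples); returns a coarse movement label."""
        energy = np.sum(window.astype(float) ** 2, axis=1)
        order = np.argsort(energy)[::-1]
        dominant, second = order[0], order[1]
        if energy[dominant] < ratio_threshold * energy[second]:
            # No clearly dominant channel: treat as a simultaneous contraction.
            return "left_click"
        return {0: "eyebrow_move", 1: "left", 2: "right"}[int(dominant)]

    rng = np.random.default_rng(0)
    demo = rng.normal(size=(3, 256))
    demo[2] *= 5.0                       # simulate a strong right-temporalis burst
    print(classify_window(demo))         # -> "right"
    ```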

  17. 3D data processing with advanced computer graphics tools

    Science.gov (United States)

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make the data difficult to analyze and difficult to store because of their enormously large size. This paper addresses these two issues by substantially reducing the spiky noise of the 3-D raw data and by rapidly re-sampling the raw data onto regular grids at any pixel size and any orientation with advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
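
    A minimal sketch of the two processing steps, spike suppression followed by re-sampling onto a regular grid, is shown below using SciPy routines in place of the authors' graphics-hardware implementation; the synthetic data, filter size and threshold are illustrative assumptions.

    ```python
    # Sketch: despike with a median filter, then regrid scattered samples.
    import numpy as np
    from scipy.ndimage import median_filter
    from scipy.interpolate import griddata

    def despike(z, size=3, thresh=3.0):
        """Replace points that deviate strongly from a local median."""
        med = median_filter(z, size=size)
        mad = np.median(np.abs(z - med)) + 1e-12
        bad = np.abs(z - med) > thresh * mad
        out = z.copy()
        out[bad] = med[bad]
        return out

    def regrid(x, y, z, pixel=0.5):
        """Resample scattered (x, y, z) samples onto a regular grid."""
        xi = np.arange(x.min(), x.max(), pixel)
        yi = np.arange(y.min(), y.max(), pixel)
        XI, YI = np.meshgrid(xi, yi)
        ZI = griddata((x, y), z, (XI, YI), method="linear")
        return XI, YI, ZI

    # Synthetic profilometer-like data with a few injected spikes.
    x, y = np.random.uniform(0, 10, 5000), np.random.uniform(0, 10, 5000)
    z = np.sin(x) + 0.05 * np.random.randn(5000)
    z[::500] += 50
    XI, YI, ZI = regrid(x, y, despike(z), pixel=0.25)
    ```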

  18. Computational and mathematical methods in brain atlasing.

    Science.gov (United States)

    Nowinski, Wieslaw L

    2017-12-01

    Brain atlases have a wide range of use from education to research to clinical applications. Mathematical methods as well as computational methods and tools play a major role in the process of brain atlas building and developing atlas-based applications. Computational methods and tools cover three areas: dedicated editors for brain model creation, brain navigators supporting multiple platforms, and atlas-assisted specific applications. Mathematical methods in atlas building and developing atlas-aided applications deal with problems in image segmentation, geometric body modelling, physical modelling, atlas-to-scan registration, visualisation, interaction and virtual reality. Here I overview computational and mathematical methods in atlas building and developing atlas-assisted applications, and share my contribution to and experience in this field.

  19. Hardware replacements and software tools for digital control computers

    International Nuclear Information System (INIS)

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  20. ReACT!: An Interactive Educational Tool for AI Planning for Robotics

    Science.gov (United States)

    Dogmus, Zeynep; Erdem, Esra; Patogulu, Volkan

    2015-01-01

    This paper presents ReAct!, an interactive educational tool for artificial intelligence (AI) planning for robotics. ReAct! enables students to describe robots' actions and change in dynamic domains without first having to know about the syntactic and semantic details of the underlying formalism, and to solve planning problems using…

  1. SOSPEX, an interactive tool to explore SOFIA spectral cubes

    Science.gov (United States)

    Fadda, Dario; Chambers, Edward T.

    2018-01-01

    We present SOSPEX (SOFIA SPectral EXplorer), an interactive tool to visualize and analyze spectral cubes obtained with the FIFI-LS and GREAT instruments onboard the SOFIA Infrared Observatory. This software package is written in Python 3 and is available through either GitHub or Anaconda. Through this GUI it is possible to explore directly the spectral cubes produced by the SOFIA pipeline and archived in the SOFIA Science Archive. Spectral cubes are visualized showing their spatial and spectral dimensions in two different windows. By selecting a part of the spectrum, the flux from the corresponding slice of the cube is visualized in the spatial window. On the other hand, it is possible to define apertures on the spatial window to show the corresponding spectral energy distribution in the spectral window. Flux isocontours can be overlaid on external images in the spatial window, while line names, atmospheric transmission, or external spectra can be overplotted on the spectral window. Atmospheric models with specific parameters can be retrieved, compared to the spectra, and applied to the uncorrected FIFI-LS cubes in the cases where the standard values give unsatisfactory results. Subcubes can be selected and saved as FITS files by cropping or cutting the original cubes. Lines and continuum can be fitted in the spectral window, saving the results in JSON files which can be reloaded later. Finally, in the case of spatially extended observations, it is possible to compute spectral moments as a function of position to obtain velocity dispersion maps or velocity diagrams.
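
    The two complementary views described above, a spectral slice collapsed into a map and an aperture spectrum, can be illustrated with plain NumPy, independent of the SOSPEX GUI; the cube dimensions and the absence of any WCS handling are simplifying assumptions.

    ```python
    # Sketch: collapse a wavelength range into a map, extract an aperture spectrum.
    import numpy as np

    def spectral_slice(cube, wave, wmin, wmax):
        """Sum the cube over a wavelength interval -> 2-D flux map."""
        sel = (wave >= wmin) & (wave <= wmax)
        return np.nansum(cube[sel], axis=0)

    def aperture_spectrum(cube, x0, y0, radius):
        """Sum spatial pixels inside a circular aperture -> 1-D spectrum."""
        ny, nx = cube.shape[1:]
        yy, xx = np.mgrid[0:ny, 0:nx]
        mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
        return np.nansum(cube[:, mask], axis=1)

    wave = np.linspace(51.0, 52.0, 200)           # microns, illustrative
    cube = np.random.rand(200, 40, 40)            # (wavelength, y, x)
    flux_map = spectral_slice(cube, wave, 51.4, 51.6)
    spec = aperture_spectrum(cube, x0=20, y0=20, radius=5)
    ```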

  2. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    International Nuclear Information System (INIS)

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment called PPExe has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependence among tasks (programs) visually as a data flow diagram and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe such as the Meta-scheduler, RIM (Resource Information Monitor), and EMS (Execution Management System) according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)

  3. Aligator: A computational tool for optimizing total chemical synthesis of large proteins.

    Science.gov (United States)

    Jacobsen, Michael T; Erickson, Patrick W; Kay, Michael S

    2017-09-15

    The scope of chemical protein synthesis (CPS) continues to expand, driven primarily by advances in chemical ligation tools (e.g., reversible solubilizing groups and novel ligation chemistries). However, the design of an optimal synthesis route can be an arduous and fickle task due to the large number of theoretically possible, and in many cases problematic, synthetic strategies. In this perspective, we highlight recent CPS tool advances and then introduce a new and easy-to-use program, Aligator (Automated Ligator), for analyzing and designing the most efficient strategies for constructing large targets using CPS. As a model set, we selected the E. coli ribosomal proteins and associated factors for computational analysis. Aligator systematically scores and ranks all feasible synthetic strategies for a particular CPS target. The Aligator script methodically evaluates potential peptide segments for a target using a scoring function that includes solubility, ligation site quality, segment lengths, and number of ligations to provide a ranked list of potential synthetic strategies. We demonstrate the utility of Aligator by analyzing three recent CPS projects from our lab: TNFα (157 aa), GroES (97 aa), and DapA (312 aa). As the limits of CPS are extended, we expect that computational tools will play an increasingly important role in the efficient execution of ambitious CPS projects such as production of a mirror-image ribosome. Copyright © 2017 Elsevier Ltd. All rights reserved.
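
    The following sketch illustrates the general idea of enumerating and scoring candidate segmentation strategies; the scoring weights, ideal segment length and the rule for candidate ligation sites are illustrative assumptions, not the published Aligator scoring function.

    ```python
    # Sketch: enumerate cut-site combinations and rank them with a toy score
    # (hypothetical rule: a segment may start at Cys or Ala residues).
    from itertools import combinations

    def score_strategy(seq, cuts, ideal_len=40):
        segments = [seq[i:j] for i, j in zip((0, *cuts), (*cuts, len(seq)))]
        length_penalty = sum(abs(len(s) - ideal_len) for s in segments)
        ligation_penalty = 10 * len(cuts)          # fewer ligations is better
        return length_penalty + ligation_penalty, segments

    def rank_strategies(seq, max_ligations=3):
        sites = [i for i, aa in enumerate(seq) if aa in "CA" and i > 0]
        ranked = []
        for n in range(1, max_ligations + 1):
            for cuts in combinations(sites, n):
                ranked.append(score_strategy(seq, cuts))
        return sorted(ranked, key=lambda t: t[0])

    demo = "MKT" + "LVNQ" * 20 + "C" + "GDESA" * 15 + "A" + "WFRK" * 10
    best_score, best_segments = rank_strategies(demo)[0]
    print(best_score, [len(s) for s in best_segments])
    ```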

  4. Generating Computational Models for Serious Gaming

    NARCIS (Netherlands)

    Westera, Wim

    2018-01-01

    Many serious games include computational models that simulate dynamic systems. These models promote enhanced interaction and responsiveness. Under the social web paradigm more and more usable game authoring tools become available that enable prosumers to create their own games, but the inclusion of

  5. CyberTEAM Interactive Epicenter Locator Tool

    Science.gov (United States)

    Ouyang, Y.; Hayden, K.; Lehmann, M.; Kilb, D.

    2008-12-01

    News coverage showing collapsed buildings, broken bridges and smashed cars help middle school students visualize the hazardous nature of earthquakes. However, few students understand how scientists investigate earthquakes through analysis of data collected using technology devices from around the world. The important findings by Muawia Barazangi and James Dorman in 1969 revealed how earthquakes charted between 1961 and 1967 delineated narrow belts of seismicity. This important discovery prompted additional research that eventually led to the theory of plate tectonics. When a large earthquake occurs, people from distances near and far can feel it to varying degrees. But how do scientists examine data to identify the locations of earthquake epicenters? The scientific definition of an earthquake: "a movement within the Earth's crust or mantle, caused by the sudden rupture or repositioning of underground material as they release stress" can be confusing for students first studying Earth science in 6th grade. Students struggle with understanding how scientists can tell when and where a rupture occurs, when the inner crust and mantle are not visible to us. Our CyberTEAM project provides 6th grade teachers with the opportunity to engage adolescents in activities that make textbooks come alive as students manipulate the same data that today's scientists use. We have developed an Earthquake Epicenter Location Tool that includes two Flash-based interactive learning objects that can be used to study basic seismology concepts and lets the user determine earthquake epicenters from current data. Through the Wilber II system maintained at the IRIS (Incorporated Research Institutions for Seismology) Web site, this project retrieves seismic data of recent earthquakes and makes them available to the public. Students choose an earthquake to perform further explorations. For each earthquake, a selection of USArray seismic stations are marked on a Google Map. Picking a station on the
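
    The core classroom calculation behind such an epicenter locator can be sketched as follows; the crustal velocities, station coordinates and brute-force grid search are illustrative assumptions, not the CyberTEAM implementation.

    ```python
    # Sketch: convert S-minus-P arrival times to distances, then grid-search
    # for the point whose distances best match those circles.
    import numpy as np

    VP, VS = 6.0, 3.5                               # assumed crustal velocities, km/s

    def sp_to_distance(sp_seconds):
        """Distance from S-P time: d = dt * Vp*Vs / (Vp - Vs)."""
        return sp_seconds * VP * VS / (VP - VS)

    def locate(stations, sp_times, grid_step=1.0):
        radii = sp_to_distance(np.asarray(sp_times))
        best, best_err = None, np.inf
        for x in np.arange(0.0, 200.0, grid_step):
            for y in np.arange(0.0, 200.0, grid_step):
                d = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
                err = np.sum((d - radii) ** 2)
                if err < best_err:
                    best, best_err = (x, y), err
        return best

    stations = np.array([[10.0, 20.0], [150.0, 40.0], [80.0, 160.0]])   # km
    true_epi = np.array([90.0, 70.0])
    sp = np.hypot(*(stations - true_epi).T) * (VP - VS) / (VP * VS)     # synthetic S-P times
    print(locate(stations, sp))   # close to (90, 70)
    ```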

  6. Interactive computer graphics displays for hierarchical data structures

    International Nuclear Information System (INIS)

    Cahn, D.F.; Murano, C.V.

    1980-05-01

    An interactive computer graphical display program was developed as an aid to user visualization and manipulation of hierarchically structured data systems such as thesauri. In the present configuration, a thesaurus term and its primary and secondary conceptual neighbors are presented to the user in tree graph form on a CRT; the user then designates, via light pen or keyboard, any of the neighbors as the next term of interest and receives a new display centered on this term. By successive specification of broader, narrower, and related terms, the user can course rapidly through the thesaurus space and refine his search file. At any stage, he deals with a term-centered, conceptually meaningful picture of a localized portion of the thesaurus, and is freed from the artificial difficulties of handling the traditional alphabetized thesaurus. Intentional limitation of the associative range of each display frame, and the use of color, case, and interconnecting vectors to encode relationships among terms, enhance interpretability of the display. Facile movement through the term space, provided by interactive computation, allows the display to remain simple, and is an essential element of the system. 3 figures

  7. ReliefSeq: a gene-wise adaptive-K nearest-neighbor feature selection tool for finding gene-gene interactions and main effects in mRNA-Seq gene expression data.

    Directory of Open Access Journals (Sweden)

    Brett A McKinney

    Full Text Available Relief-F is a nonparametric, nearest-neighbor machine learning method that has been successfully used to identify relevant variables that may interact in complex multivariate models to explain phenotypic variation. While several tools have been developed for assessing differential expression in sequence-based transcriptomics, the detection of statistical interactions between transcripts has received less attention in the area of RNA-seq analysis. We describe a new extension and assessment of Relief-F for feature selection in RNA-seq data. The ReliefSeq implementation adapts the number of nearest neighbors (k) for each gene to optimize the Relief-F test statistics (importance scores) for finding both main effects and interactions. We compare this gene-wise adaptive-k (gwak) Relief-F method with standard RNA-seq feature selection tools, such as DESeq and edgeR, and with the popular machine learning method Random Forests. We demonstrate performance on a panel of simulated data that have a range of distributional properties reflected in real mRNA-seq data including multiple transcripts with varying sizes of main effects and interaction effects. For simulated main effects, gwak-Relief-F feature selection performs comparably to standard tools DESeq and edgeR for ranking relevant transcripts. For gene-gene interactions, gwak-Relief-F outperforms all comparison methods at ranking relevant genes in all but the highest fold change/highest signal situations where it performs similarly. The gwak-Relief-F algorithm outperforms Random Forests for detecting relevant genes in all simulation experiments. In addition, Relief-F is comparable to the other methods based on computational time. We also apply ReliefSeq to an RNA-Seq study of smallpox vaccine to identify gene expression changes between vaccinia virus-stimulated and unstimulated samples. ReliefSeq is an attractive tool for inclusion in the suite of tools used for analysis of mRNA-Seq data; it has power to
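
    The gene-wise adaptive-k idea can be sketched in a few lines; the distance metric, the k grid and the scoring details below are simplified assumptions, not the published ReliefSeq code.

    ```python
    # Sketch: Relief-F style importance with a per-gene adaptive choice of k.
    import numpy as np

    def relief_scores(X, y, k):
        """X: (samples, genes) expression matrix; y: binary labels."""
        n, p = X.shape
        scores = np.zeros(p)
        dist = np.abs(X[:, None, :] - X[None, :, :]).sum(axis=2)   # Manhattan distances
        np.fill_diagonal(dist, np.inf)
        for i in range(n):
            hits = np.where(y == y[i])[0]
            hits = hits[hits != i]
            misses = np.where(y != y[i])[0]
            nh = hits[np.argsort(dist[i, hits])[:k]]
            nm = misses[np.argsort(dist[i, misses])[:k]]
            scores += np.abs(X[i] - X[nm]).mean(axis=0) - np.abs(X[i] - X[nh]).mean(axis=0)
        return scores / n

    def adaptive_k_scores(X, y, k_grid=(3, 5, 7, 10)):
        """Per-gene maximum over the k grid, mimicking the gene-wise adaptive k."""
        return np.max([relief_scores(X, y, k) for k in k_grid], axis=0)

    rng = np.random.default_rng(1)
    y = np.repeat([0, 1], 30)
    X = rng.normal(size=(60, 50))
    X[:, 0] += y * 2.0                       # gene 0 carries a main effect
    print(np.argsort(adaptive_k_scores(X, y))[::-1][:5])   # gene 0 ranks near the top
    ```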

  8. A Tool for Interactive Data Visualization: Application to Over 10,000 Brain Imaging and Phantom MRI Data Sets

    Directory of Open Access Journals (Sweden)

    Sandeep R Panta

    2016-03-01

    Full Text Available In this paper we propose a web-based approach for quick visualization of big data from brain magnetic resonance imaging (MRI) scans using a combination of an automated image capture and processing system, nonlinear embedding, and interactive data visualization tools. We draw upon thousands of MRI scans captured via the COllaborative Imaging and Neuroinformatics Suite (COINS). We then interface the output of several analysis pipelines based on structural and functional data to a t-distributed stochastic neighbor embedding (t-SNE) algorithm which reduces the number of dimensions for each scan in the input data set to two dimensions while preserving the local structure of data sets. Finally, we interactively display the output of this approach via a web-page, based on the data driven documents (D3) JavaScript library. Two distinct approaches were used to visualize the data. In the first approach, we computed multiple quality control (QC) values from pre-processed data, which were used as inputs to the t-SNE algorithm. This approach helps in assessing the quality of each data set relative to others. In the second case, computed variables of interest (e.g. brain volume or voxel values from segmented gray matter images) were used as inputs to the t-SNE algorithm. This approach helps in identifying interesting patterns in the data sets. We demonstrate these approaches using multiple examples including 1) quality control measures calculated from phantom data over time, 2) quality control data from human functional MRI data across various studies, scanners, and sites, 3) volumetric and density measures from human structural MRI data across various studies, scanners and sites. Results from (1) and (2) show the potential of our approach to combine t-SNE data reduction with interactive color coding of variables of interest to quickly identify visually unique clusters of data (i.e. data sets with poor QC, clustering of data by site). Results from (3) demonstrate
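
    The embedding step is straightforward to sketch with scikit-learn: reduce a table of per-scan QC metrics to two dimensions and write the coordinates out for a D3 scatter plot. The metric names, sample sizes and file names below are hypothetical.

    ```python
    # Sketch: t-SNE on per-scan QC metrics, coordinates dumped as JSON for D3.
    import json
    import numpy as np
    from sklearn.manifold import TSNE

    rng = np.random.default_rng(42)
    n_scans = 500
    qc = np.column_stack([
        rng.normal(0.8, 0.05, n_scans),     # e.g. a signal-to-noise style metric
        rng.normal(0.3, 0.1, n_scans),      # e.g. mean framewise displacement
        rng.normal(1200, 150, n_scans),     # e.g. a brain volume estimate
    ])

    xy = TSNE(n_components=2, perplexity=30, init="pca", random_state=0).fit_transform(qc)

    records = [{"x": float(x), "y": float(y), "scan": f"scan_{i:04d}"}
               for i, (x, y) in enumerate(xy)]
    with open("tsne_points.json", "w") as fh:
        json.dump(records, fh)              # ready to be bound to circles in D3
    ```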

  9. Review. Supporting problem structuring with computer-based tools in participatory forest planning

    Directory of Open Access Journals (Sweden)

    T. Hujala

    2013-07-01

    Full Text Available Aim of study: This review presents the state-of-art of using computerized techniques for problem structuring (PS) in participatory forest planning. Frequency and modes of using different computerized tool types and their contribution for planning processes as well as critical observations are described, followed by recommendations on how to better integrate PS with the use of forest decision support systems. Area of study: The reviewed research cases are from Asia, Europe, North-America, Africa and Australia. Materials and methods: Via Scopus search and screening of abstracts, 32 research articles from years 2002–2011 were selected for review. Explicit and implicit evidence of using computerized tools for PS was recorded and assessed with content-driven qualitative analysis. Main results: GIS and forest-specific simulation tools were the most prevalent software types whereas cognitive modelling software and spreadsheet and calculation tools were less frequently used, followed by multi-criteria and interactive tools. The typical use type was to provide outputs of simulation–optimization or spatial analysis to negotiation situations or to compile summaries or illustrations afterwards; using software during group negotiation to foster interaction was observed only in a few cases. Research highlights: Expertise in both decision support systems and group learning is needed to better integrate PS and computerized decision analysis. From the knowledge management perspective, it is recommended to consider how the results of PS – e.g. conceptual models – could be stored into a problem perception database, and how PS and decision making could be streamlined by retrievals from such systems. Keywords: facilitated modeling; group negotiation; knowledge management; natural resource management; PSM; soft OR; stakeholders.

  10. Creating science simulations through Computational Thinking Patterns

    Science.gov (United States)

    Basawapatna, Ashok Ram

    Computational thinking aims to outline fundamental skills from computer science that everyone should learn. As currently defined, with help from the National Science Foundation (NSF), these skills include problem formulation, logically organizing data, automating solutions through algorithmic thinking, and representing data through abstraction. One aim of the NSF is to integrate these and other computational thinking concepts into the classroom. End-user programming tools offer a unique opportunity to accomplish this goal. An end-user programming tool that allows students with little or no prior experience the ability to create simulations based on phenomena they see in-class could be a first step towards meeting most, if not all, of the above computational thinking goals. This thesis describes the creation, implementation and initial testing of a programming tool, called the Simulation Creation Toolkit, with which users apply high-level agent interactions called Computational Thinking Patterns (CTPs) to create simulations. Employing Computational Thinking Patterns obviates lower behavior-level programming and allows users to directly create agent interactions in a simulation by making an analogy with real world phenomena they are trying to represent. Data collected from 21 sixth grade students with no prior programming experience and 45 seventh grade students with minimal programming experience indicates that this is an effective first step towards enabling students to create simulations in the classroom environment. Furthermore, an analogical reasoning study that looked at how users might apply patterns to create simulations from high- level descriptions with little guidance shows promising results. These initial results indicate that the high level strategy employed by the Simulation Creation Toolkit is a promising strategy towards incorporating Computational Thinking concepts in the classroom environment.

  11. Seamless interaction with scrolling contents on eyewear computers using optokinetic nystagmus eye movements

    DEFF Research Database (Denmark)

    Jalaliniya, Shahram; Mardanbegi, Diako

    2016-01-01

    In this paper we investigate the utility of an eye-based interaction technique (EyeGrip) for seamless interaction with scrolling contents on eyewear computers. EyeGrip uses Optokinetic Nystagmus (OKN) eye movements to detect the object of interest among a set of scrolling contents and automatically stops scrolling for the user. We empirically evaluated the usability of EyeGrip in two different applications for eyewear computers: 1) a menu scroll viewer and 2) a Facebook newsfeed reader. The results of our study showed that the EyeGrip technique performs as well as the keyboard, which has long been a well-known input device. Moreover, the accuracy of the EyeGrip method for menu item selection was higher, while in the Facebook study participants found the keyboard more accurate.

  12. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Directory of Open Access Journals (Sweden)

    Demeter Lisa

    2010-05-01

    Full Text Available Abstract Background The replication rate (or fitness) between viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments of viral fitness. Results Based on a mathematical model and several statistical methods (least-squares approach and measurement error models), a Web-based computing tool has been developed for improving estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to more accurately estimate relative viral fitness parameters. The dilution factor is introduced for making the computational tool more flexible to accommodate various experimental conditions. This Web-based tool is implemented in C# language with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
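
    The regression idea, estimating relative fitness from the slope of the log ratio of the two variants across all time points rather than from two points, can be sketched as follows; the counts are invented and the dilution-factor handling is omitted.

    ```python
    # Sketch: relative fitness as the slope of ln(mutant/wildtype) over time.
    import numpy as np

    def relative_fitness(t, mutant, wildtype):
        """Slope of ln(mutant/wildtype) vs time, using every observed time point."""
        log_ratio = np.log(np.asarray(mutant, float) / np.asarray(wildtype, float))
        slope, _ = np.polyfit(t, log_ratio, 1)
        return slope                      # > 0 means the mutant out-competes wild type

    t = np.array([0, 2, 4, 6, 8])                         # days
    wildtype = np.array([1e4, 3e4, 9e4, 2.7e5, 8.1e5])
    mutant = np.array([1e4, 2e4, 4e4, 8e4, 1.6e5])
    print(relative_fitness(t, mutant, wildtype))          # negative: mutant is less fit
    ```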

  13. Cloud Computing Techniques for Space Mission Design

    Science.gov (United States)

    Arrieta, Juan; Senent, Juan

    2014-01-01

    The overarching objective of space mission design is to tackle complex problems producing better results, and faster. In developing the methods and tools to fulfill this objective, the user interacts with the different layers of a computing system.

  14. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Directory of Open Access Journals (Sweden)

    Quaggiotto Marco

    2011-02-01

    Full Text Available Abstract Background Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years in the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users including policy makers and health institutions. Results We present "GLEaMviz", a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine. The latter two components constitute the GLEaMviz server. The simulation engine leverages on the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including: compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions The user-friendly graphical interface of the GLEaMviz tool, along with its high level
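
    The stochastic compartmental machinery underlying a GLEaM-style engine can be sketched for a single population; the rates and population size are illustrative, and the real framework adds age structure, subpopulations and mobility coupling on top of this.

    ```python
    # Sketch: one chain-binomial SIR population stepped forward day by day.
    import numpy as np

    rng = np.random.default_rng(0)

    def sir_step(S, I, R, beta=0.3, gamma=0.1):
        """One stochastic day: infections and recoveries drawn at random."""
        N = S + I + R
        p_inf = 1.0 - np.exp(-beta * I / N)          # per-susceptible infection prob.
        new_inf = rng.binomial(S, p_inf)
        new_rec = rng.binomial(I, 1.0 - np.exp(-gamma))
        return S - new_inf, I + new_inf - new_rec, R + new_rec

    S, I, R = 999_000, 1_000, 0
    trajectory = []
    for day in range(180):
        S, I, R = sir_step(S, I, R)
        trajectory.append((day, S, I, R))
    peak_day, _, peak_I, _ = max(trajectory, key=lambda row: row[2])
    print(f"epidemic peak: day {peak_day} with {peak_I} infectious")
    ```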

  15. Computer and internet use in vascular outpatients--ready for interactive applications?

    Science.gov (United States)

    Richter, J G; Schneider, M; Klein-Weigel, P

    2009-11-01

    Exploring patients' computer and internet use, their expectations and attitudes is mandatory for the successful introduction of interactive online health-care applications in angiology. We included 165 outpatients suffering from peripheral arterial disease (PAD; n = 62) and chronic venous and/or lymphatic disease (CVLD; n = 103) in a cross-sectional study. Patients answered a paper-based questionnaire. Patients were predominantly female (54.5%). 142 (86.1%) reported regular computer use for 9.7 +/- 5.8 years and 134 (81.2%) used the internet for 6.2 +/- 3.6 years. CVLD patients and internet users were younger and more highly educated, resulting in a significant difference in computer and internet use between the disease groups […] internet users without significant differences between the groups. The topics retrieved from the internet covered a wide spectrum and searches for health information were mentioned by 41.2%. Although confidence in the internet (3.3 +/- 1.1 on a 1-6 Likert scale) and reliability of information retrieved from the internet (3.1 +/- 1.1) were rated relatively low, health-related issues were of high current and future interest. 42.8% of the patients were even interested in interactive applications like health educational programs, 37.4% in self-reported assessments and outcome questionnaires and 26.9% in chat forums; 50% demanded access to their medical data on an internet server. Compared to older participants, younger ones used the internet more often for shopping, chatting, and e-mailing, but not for health information retrieval and interactive applications. Computers are commonly used and the internet has been adopted as an important source of information by patients suffering from PAD and CVLD. Moreover, the internet offers great potential and new opportunities for interactive disease (self-)management in angiology. To increase confidence and reliability in the medium, a careful introduction and evaluation of these new online applications is mandatory.

  16. Data Analysis Tools and Methods for Improving the Interaction Design in E-Learning

    Science.gov (United States)

    Popescu, Paul Stefan

    2015-01-01

    In this digital era, learning from data gathered from different software systems may have a great impact on the quality of the interaction experience. There are two main directions that come to enhance this emerging research domain, Intelligent Data Analysis (IDA) and Human Computer Interaction (HCI). HCI specific research methodologies can be…

  17. A computational tool to characterize particle tracking measurements in optical tweezers

    International Nuclear Information System (INIS)

    Taylor, Michael A; Bowen, Warwick P

    2013-01-01

    Here, we present a computational tool for optical tweezers which calculates the particle tracking signal measured with a quadrant detector and the shot-noise limit to position resolution. The tool is a piece of Matlab code which functions within the freely available Optical Tweezers Toolbox. It allows the measurements performed in most optical tweezer experiments to be theoretically characterized in a fast and easy manner. The code supports particles with arbitrary size, any optical fields and any combination of objective and condenser, and performs a full vector calculation of the relevant fields. Example calculations are presented which show the tracking signals for different particles, and the shot-noise limit to position sensitivity as a function of the effective condenser NA. (paper)

  18. Interactive Whiteboards and Implications for Use in Education

    Science.gov (United States)

    Gibson, Danita C.

    2013-01-01

    Interactive whiteboards (IWBs) have increasingly become a technology tool used in the educational field. IWBs are touch-sensitive screens that work in conjunction with a computer and a projector, and which are used to display information from a computer. As a qualitative case study, this study investigated the SMART Board-infused instructional…

  19. Nsite, NsiteH and NsiteM Computer Tools for Studying Tran-scription Regulatory Elements

    KAUST Repository

    Shahmuradov, Ilham; Solovyev, Victor

    2015-01-01

    regions. Computer methods for identification of REs remain a widely used tool for studying and understanding transcriptional regulation mechanisms. The Nsite, NsiteH and NsiteM programs perform searches for statistically significant (non-random) motifs

  20. First Studies for the Development of Computational Tools for the Design of Liquid Metal Electromagnetic Pumps

    Directory of Open Access Journals (Sweden)

    Carlos O. Maidana

    2017-02-01

    Full Text Available Liquid alloy systems have a high degree of thermal conductivity, far superior to ordinary nonmetallic liquids and inherent high densities and electrical conductivities. This results in the use of these materials for specific heat conducting and dissipation applications for the nuclear and space sectors. Uniquely, they can be used to conduct heat and electricity between nonmetallic and metallic surfaces. The motion of liquid metals in strong magnetic fields generally induces electric currents, which, while interacting with the magnetic field, produce electromagnetic forces. Electromagnetic pumps exploit the fact that liquid metals are conducting fluids capable of carrying currents, which is a source of electromagnetic fields useful for pumping and diagnostics. The coupling between the electromagnetics and thermo-fluid mechanical phenomena and the determination of its geometry and electrical configuration, gives rise to complex engineering magnetohydrodynamics problems. The development of tools to model, characterize, design, and build liquid metal thermo-magnetic systems for space, nuclear, and industrial applications are of primordial importance and represent a cross-cutting technology that can provide unique design and development capabilities as well as a better understanding of the physics behind the magneto-hydrodynamics of liquid metals. First studies for the development of computational tools for the design of liquid metal electromagnetic pumps are discussed.

  1. First studies for the development of computational tools for the design of liquid metal electromagnetic pumps

    Energy Technology Data Exchange (ETDEWEB)

    Maidana, Carlos O.; Nieminen, Juha E. [Maidana Research, Grandville (United States)

    2017-02-15

    Liquid alloy systems have a high degree of thermal conductivity, far superior to ordinary nonmetallic liquids and inherent high densities and electrical conductivities. This results in the use of these materials for specific heat conducting and dissipation applications for the nuclear and space sectors. Uniquely, they can be used to conduct heat and electricity between nonmetallic and metallic surfaces. The motion of liquid metals in strong magnetic fields generally induces electric currents, which, while interacting with the magnetic field, produce electromagnetic forces. Electromagnetic pumps exploit the fact that liquid metals are conducting fluids capable of carrying currents, which is a source of electromagnetic fields useful for pumping and diagnostics. The coupling between the electromagnetics and thermo-fluid mechanical phenomena and the determination of its geometry and electrical configuration, gives rise to complex engineering magnetohydrodynamics problems. The development of tools to model, characterize, design, and build liquid metal thermo-magnetic systems for space, nuclear, and industrial applications are of primordial importance and represent a cross-cutting technology that can provide unique design and development capabilities as well as a better understanding of the physics behind the magneto-hydrodynamics of liquid metals. First studies for the development of computational tools for the design of liquid metal electromagnetic pumps are discussed.

  2. Engineering computer graphics in gas turbine engine design, analysis and manufacture

    Science.gov (United States)

    Lopatka, R. S.

    1975-01-01

    A time-sharing and computer graphics facility designed to provide effective interactive tools to a large number of engineering users with varied requirements is described. The application of computer graphics displays at several levels of hardware complexity and capability is discussed, with examples of graphics systems tracing gas turbine product development from preliminary design through manufacture. Highlights of an operating system tailored for interactive engineering graphics are also described.

  3. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    Science.gov (United States)

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  4. The Importance of Human-Computer Interaction in Radiology E-learning.

    Science.gov (United States)

    den Harder, Annemarie M; Frijlingh, Marissa; Ravesloot, Cécile J; Oosterbaan, Anne E; van der Gijp, Anouk

    2016-04-01

    With the development of cross-sectional imaging techniques and transformation to digital reading of radiological imaging, e-learning might be a promising tool in undergraduate radiology education. In this systematic review of the literature, we evaluate the emergence of image interaction possibilities in radiology e-learning programs and evidence for effects of radiology e-learning on learning outcomes and perspectives of medical students and teachers. A systematic search in PubMed, EMBASE, Cochrane, ERIC, and PsycInfo was performed. Articles were screened by two authors and included when they concerned the evaluation of radiological e-learning tools for undergraduate medical students. Nineteen articles were included. Seven studies evaluated e-learning programs with image interaction possibilities. Students perceived e-learning with image interaction possibilities to be a useful addition to learning with hard copy images and to be effective for learning 3D anatomy. Both e-learning programs with and without image interaction possibilities were found to improve radiological knowledge and skills. In general, students found e-learning programs easy to use, rated image quality high, and found the difficulty level of the courses appropriate. Furthermore, they felt that their knowledge and understanding of radiology improved by using e-learning. In conclusion, the addition of radiology e-learning in undergraduate medical education can improve radiological knowledge and image interpretation skills. Differences between the effect of e-learning with and without image interpretation possibilities on learning outcomes are unknown and should be subject to future research.

  5. Interactive Multimedia for Classroom and Web Use

    National Research Council Canada - National Science Library

    Harvey, Darren

    1999-01-01

    Today's advances in technology have provided new tools that allow us to be more creative. The focus of this research is on interactive lesson plans and a web-based tutorial for a graduate level computer architecture class...

  6. Analysis of Interactive Conflict Resolution Tool Usage in a Mixed Equipage Environment

    Science.gov (United States)

    Homola, Jeffrey; Morey, Susan; Cabrall, Christopher; Martin, Lynne; Mercer, Joey; Prevot, Thomas

    2013-01-01

    A human-in-the-loop simulation was conducted that examined separation assurance concepts in varying levels of traffic density with mixtures of aircraft equipage and automation. This paper's analysis focuses on one of the experimental conditions in which traffic levels were approximately fifty percent higher than today, and approximately fifty percent of the traffic within the test area was equipped with data communications (data comm) capabilities. The other fifty percent of the aircraft required control by voice, much like today. Within this environment, the air traffic controller participants were provided access to tools and automation designed to support the primary task of separation assurance that are currently unavailable. Two tools were selected for analysis in this paper: 1) a pre-probed altitude fly-out menu that provided instant feedback of conflict probe results for a range of altitudes, and 2) an interactive auto resolver that provided on-demand access to an automation-generated conflict resolution trajectory. Although encouraged, use of the support tools was not required; the participants were free to use the tools as they saw fit, and they were also free to accept, reject, or modify the resolutions offered by the automation. This mode of interaction provided a unique opportunity to examine exactly when and how these tools were used, as well as how acceptable the resolutions were. Results showed that the participants used the pre-probed altitude fly-out menu in 14% of conflict cases and preferred to use it in a strategic timeframe on data comm equipped and level flight aircraft. The interactive auto resolver was also used in a primarily strategic timeframe, on 22% of conflicts, and participants likewise preferred to use it on conflicts involving data comm equipped aircraft. Of the 258 resolutions displayed, 46% were implemented and 54% were not. The auto resolver was rated highly by participants in terms of confidence and preference. Factors such as

  7. An Interactive Learning Environment for Information and Communication Theory

    Science.gov (United States)

    Hamada, Mohamed; Hassan, Mohammed

    2017-01-01

    Interactive learning tools are emerging as effective educational materials in the area of computer science and engineering. It is a research domain that is rapidly expanding because of its positive impacts on motivating and improving students' performance during the learning process. This paper introduces an interactive learning environment for…

  8. PROACT user's guide: how to use the pallet recovery opportunity analysis computer tool

    Science.gov (United States)

    E. Bradley Hager; A.L. Hammett; Philip A. Araman

    2003-01-01

    Pallet recovery projects are environmentally responsible and offer promising business opportunities. The Pallet Recovery Opportunity Analysis Computer Tool (PROACT) assesses the operational and financial feasibility of potential pallet recovery projects. The use of project specific information supplied by the user increases the accuracy and the validity of the...

  9. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Science.gov (United States)

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models, particularly on high performance computers, and with the advent of ubiquitous multicore processor systems, practically on every system, has been accomplished with basic software tools, typically, command-line based compilers, debuggers, performance tools that have not changed substantially from the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as openMP and MPI) to be able to take full advantage of high performance computers with an increasing core count per shared memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC) seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project to improve Eclipse PTP takes an application-centric view to improve PTP. We are using a set of scientific applications, each with a variety of challenges, and using PTP to drive further improvements to both the scientific application, as well as to understand shortcomings in Eclipse PTP from an application developer perspective, to drive our list of improvements we seek to make. We are also partnering with performance tool providers, to drive higher quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  10. Integrative Metabolism: An Interactive Learning Tool for Nutrition, Biochemistry, and Physiology

    Science.gov (United States)

    Carey, Gale

    2010-01-01

    Metabolism is a dynamic, simultaneous, and integrative science that cuts across nutrition, biochemistry, and physiology. Teaching this science can be a challenge. The use of a scenario-based, visually appealing, interactive, computer-animated CD may overcome the limitations of learning "one pathway at a time" and engage two- and…

  11. Fluid/Structure Interaction Studies of Aircraft Using High Fidelity Equations on Parallel Computers

    Science.gov (United States)

    Guruswamy, Guru; VanDalsem, William (Technical Monitor)

    1994-01-01

    Aeroelasticity, which involves strong coupling of fluids, structures and controls, is an important element in designing an aircraft. Computational aeroelasticity using low fidelity methods, such as the linear aerodynamic flow equations coupled with the modal structural equations, is well advanced. Though these low fidelity approaches are computationally less intensive, they are not adequate for the analysis of modern aircraft such as the High Speed Civil Transport (HSCT) and the Advanced Subsonic Transport (AST), which can experience complex flow/structure interactions. HSCT can experience vortex-induced aeroelastic oscillations, whereas AST can experience transonic buffet associated structural oscillations. Both aircraft may experience a dip in the flutter speed in the transonic regime. For accurate aeroelastic computations in these complex fluid/structure interaction situations, high fidelity equations such as the Navier-Stokes for fluids and the finite elements for structures are needed. Computations using these high fidelity equations require large computational resources both in memory and speed. Current conventional supercomputers have reached their limitations both in memory and speed. As a result, parallel computers have evolved to overcome the limitations of conventional computers. This paper will address the transition that is taking place in computational aeroelasticity from conventional computers to parallel computers. The paper will address special techniques needed to take advantage of the architecture of new parallel computers. Results will be illustrated from computations made on the iPSC/860 and IBM SP2 computers using the ENSAERO code that directly couples the Euler/Navier-Stokes flow equations with high resolution finite-element structural equations.

  12. General, Interactive Computer Program for the Solution of the Schrodinger Equation

    Science.gov (United States)

    Griffin, Donald C.; McGhie, James B.

    1973-01-01

    Discusses an interactive computer algorithm which allows beginning students to solve one- and three-dimensional quantum problems. Included is an example of the Thomas-Fermi-Dirac central field approximation. (CC)
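
    A minimal sketch of the kind of computation such a program performs is the one-dimensional time-independent Schrodinger equation discretized by finite differences and solved as a matrix eigenvalue problem (harmonic oscillator in atomic units); the grid and potential are illustrative choices.

    ```python
    # Sketch: -1/2 psi'' + V(x) psi = E psi on a uniform grid via numpy.linalg.eigh.
    import numpy as np

    def schrodinger_1d(potential, x):
        """Return eigenvalues/eigenvectors of -1/2 d^2/dx^2 + V(x) on grid x."""
        n, h = x.size, x[1] - x[0]
        main = 1.0 / h**2 + potential(x)              # kinetic + potential diagonal
        off = -0.5 / h**2 * np.ones(n - 1)
        H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)
        return np.linalg.eigh(H)

    x = np.linspace(-10, 10, 1000)
    E, psi = schrodinger_1d(lambda x: 0.5 * x**2, x)
    print(E[:4])     # approx 0.5, 1.5, 2.5, 3.5 in units of hbar*omega
    ```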

  13. Computer tomography guided lung biopsy using interactive breath-hold control

    DEFF Research Database (Denmark)

    Ashraf, Haseem; Krag-Andersen, Shella; Naqibullah, Matiullah

    2017-01-01

    Background: Interactive breath-hold control (IBC) may improve the accuracy and decrease the complication rate of computed tomography (CT)-guided lung biopsy, but this presumption has not been proven in a randomized study. Methods: Patients admitted for CT-guided lung biopsy were randomized...

  14. Exploring Biomolecular Interactions Through Single-Molecule Force Spectroscopy and Computational Simulation

    OpenAIRE

    Yang, Darren

    2016-01-01

    Molecular interactions between cellular components such as proteins and nucleic acids govern the fundamental processes of living systems. Technological advancements in the past decade have allowed the characterization of these molecular interactions at the single-molecule level with high temporal and spatial resolution. Simultaneously, progress in computer simulation has enabled theoretical research at the atomistic level, assisting in the interpretation of experimental results. This thesi...

  15. Computational protein design-the next generation tool to expand synthetic biology applications.

    Science.gov (United States)

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near-future, computational protein design will vastly expand the functional capabilities of synthetic cells. Copyright © 2018. Published by Elsevier Ltd.

  16. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.

    2012-11-02

    MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulation role in many organisms. MicroRNAs bind a seed sequence to the 3'-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their priorities, advantages and disadvantages. Malas and Ravasi.
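
    The seed-matching rule mentioned above can be sketched directly; the sequences below are invented (the miRNA is only miR-21-like), and real prediction tools add conservation, pairing-energy and context scoring on top of this simple complementarity search.

    ```python
    # Sketch: find 3' UTR positions complementary to the miRNA seed (nt 2-8).
    def reverse_complement(rna):
        return rna.translate(str.maketrans("AUGC", "UACG"))[::-1]

    def seed_sites(mirna, utr, seed_span=(1, 8)):
        """Return 0-based UTR positions matching the reverse complement of the seed."""
        seed = mirna[seed_span[0]:seed_span[1]]
        target = reverse_complement(seed)
        hits, start = [], utr.find(target)
        while start != -1:
            hits.append(start)
            start = utr.find(target, start + 1)
        return hits

    mirna = "UAGCUUAUCAGACUGAUGUUGA"                  # miR-21-like sequence
    utr = "AAAGCUAAGCUAUAAGCUAACCCAUAAGCUAGGG"        # invented 3' UTR fragment
    print(seed_sites(mirna, utr))                     # -> [11, 23]
    ```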

  17. Computational tools for genome-wide miRNA prediction and study

    KAUST Repository

    Malas, T.B.; Ravasi, Timothy

    2012-01-01

    MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulation role in many organisms. MicroRNAs bind a seed sequence to the 3'-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their priorities, advantages and disadvantages. Malas and Ravasi.

  18. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program and the CERNLIB software package. It is part of a set of tools being developed to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, running SPICE simultaneously in several different processes on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for larger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape for large amounts of data. HEP tools also help with circuit or component modeling, such as the MINUIT program from CERNLIB, which implements the Nelder and Mead simplex and gradient (with or without derivatives) algorithms and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)
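
    The statistical idea behind such a Monte Carlo circuit simulation can be sketched with a closed-form voltage divider standing in for a SPICE job; the circuit, tolerances and worker count are illustrative assumptions, not the CPSPICE setup.

    ```python
    # Sketch: scatter component values within tolerance across parallel workers
    # and collect the output-voltage distribution.
    import numpy as np
    from multiprocessing import Pool

    NOMINAL_R1, NOMINAL_R2, VIN, TOL = 10_000.0, 4_700.0, 5.0, 0.05

    def one_trial(seed):
        rng = np.random.default_rng(seed)
        r1 = NOMINAL_R1 * (1 + TOL * rng.uniform(-1, 1))
        r2 = NOMINAL_R2 * (1 + TOL * rng.uniform(-1, 1))
        return VIN * r2 / (r1 + r2)          # in CPSPICE this would be a SPICE run

    if __name__ == "__main__":
        with Pool(4) as pool:
            vout = np.array(pool.map(one_trial, range(10_000)))
        print(f"Vout = {vout.mean():.4f} V +/- {vout.std():.4f} V")
    ```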

  19. Interactive Graphics Analysis for Aircraft Design

    Science.gov (United States)

    Townsend, J. C.

    1983-01-01

    The WDES/WDEM computer program is a preliminary aerodynamic design tool for one or two interacting, subsonic lifting surfaces. The subcritical wing design code employs a higher-order far-field drag minimization technique based on linearized aerodynamic theory. The program is written in FORTRAN IV.

  20. Granular computing and decision-making interactive and iterative approaches

    CERN Document Server

    Chen, Shyi-Ming

    2015-01-01

    This volume is devoted to interactive and iterative processes of decision-making – I2 Fuzzy Decision Making, in brief. Decision-making is inherently interactive. Fuzzy sets help realize human-machine communication in an efficient way by facilitating a two-way interaction in a friendly and transparent manner. Human-centric interaction is of paramount relevance as a leading guiding design principle of decision support systems.   The volume provides the reader with updated and in-depth material on the conceptually appealing and practically sound methodology and practice of I2 Fuzzy Decision Making. The book engages a wealth of methods from fuzzy sets and Granular Computing, and brings new concepts, architectures and practices of fuzzy decision-making, providing the reader with various application studies.   The book is aimed at a broad audience of researchers and practitioners in numerous disciplines in which decision-making processes play a pivotal role and serve as a vehicle to produce solutions to existing prob...