WorldWideScience

Sample records for computer tool lca-land

  1. Hypercard: Another Computer Tool.

    Geske, Joel

    1991-01-01

    Describes "Hypercard," a computer application package usable in all three modes of instructional computing: tutor, tool, and tutee. Suggests using Hypercard in scholastic journalism programs to teach such topics as news, headlines, design, photography, and advertising. Argues that the ability to access, organize, manipulate, and comprehend…

  2. Tools for computational finance

    Seydel, Rüdiger U

    2017-01-01

    Computational and numerical methods are used in a number of ways across the field of finance. It is the aim of this book to explain how such methods work in financial engineering. By concentrating on the field of option pricing, a core task of financial engineering and risk analysis, this book explores a wide range of computational tools in a coherent and focused manner and will be of use to anyone working in computational finance. Starting with an introductory chapter that presents the financial and stochastic background, the book goes on to detail computational methods using both stochastic and deterministic approaches. Now in its sixth edition, Tools for Computational Finance has been significantly revised and contains: several new parts, such as a section on extended applications of tree methods, including multidimensional trees, trinomial trees, and the handling of dividends; additional material in the field of generating normal variates with acceptance-rejection methods, and on Monte Carlo methods...
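
    A minimal sketch (my own illustration, not the book's code) of the simplest tree method the blurb mentions: a Cox-Ross-Rubinstein binomial tree pricing a European call.

      import math

      def crr_european_call(S0, K, r, sigma, T, steps):
          """Price a European call on a Cox-Ross-Rubinstein binomial tree."""
          dt = T / steps
          u = math.exp(sigma * math.sqrt(dt))   # up factor
          d = 1.0 / u                           # down factor
          p = (math.exp(r * dt) - d) / (u - d)  # risk-neutral up probability
          disc = math.exp(-r * dt)
          # payoffs at the terminal nodes (j = number of up moves)
          v = [max(S0 * u**j * d**(steps - j) - K, 0.0) for j in range(steps + 1)]
          # roll the tree back by discounted risk-neutral expectation
          for _ in range(steps):
              v = [disc * (p * v[j + 1] + (1 - p) * v[j]) for j in range(len(v) - 1)]
          return v[0]

      print(crr_european_call(S0=100, K=100, r=0.05, sigma=0.2, T=1.0, steps=200))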

  3. Computer-aided translation tools

    Christensen, Tina Paulsen; Schjoldager, Anne

    2016-01-01

    The paper reports on a questionnaire survey from 2013 of the uptake and use of computer-aided translation (CAT) tools by Danish translation service providers (TSPs) and discusses how these tools appear to have impacted on the Danish translation industry. According to our results, the uptake in Denmark is rather high in general, but limited in the case of machine translation (MT) tools: While most TSPs use translation-memory (TM) software, often in combination with a terminology management system (TMS), only very few have implemented MT, which is criticised for its low quality output, especially...

  4. Computer Assisted Advising Tool (CAAT).

    Matsen, Marie E.

    Lane Community College's Computer Assisted Advising Tool (CAAT) is used by counselors to assist students in developing a plan for the completion of a degree or certificate. CAAT was designed to facilitate student advisement from matriculation to graduation by comparing degree requirements with the courses completed by students. Three major sources…

  5. Foundational Tools for Petascale Computing

    Miller, Barton [Univ. of Wisconsin, Madison, WI (United States)]

    2014-05-19

    The Paradyn project has a history of developing algorithms, techniques, and software that push the cutting edge of tool technology for high-end computing systems. Under this funding, we are working on a three-year agenda to make substantial new advances in support of new and emerging Petascale systems. The overall goal for this work is to address the steady increase in complexity of these petascale systems. Our work covers two key areas: (1) The analysis, instrumentation and control of binary programs. Work in this area falls under the general framework of the Dyninst API tool kits. (2) Infrastructure for building tools and applications at extreme scale. Work in this area falls under the general framework of the MRNet scalability framework. Note that work done under this funding is closely related to work done under a contemporaneous grant, “High-Performance Energy Applications and Systems”, SC0004061/FG02-10ER25972, UW PRJ36WV.

  6. Visualization Tools for Teaching Computer Security

    Yuan, Xiaohong; Vega, Percy; Qadah, Yaseen; Archer, Ricky; Yu, Huiming; Xu, Jinsheng

    2010-01-01

    Using animated visualization tools has been an important teaching approach in computer science education. We have developed three visualization and animation tools that demonstrate various information security concepts and actively engage learners. The information security concepts illustrated include: packet sniffer and related computer network…

  7. Tools for Embedded Computing Systems Software

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  8. Computer-Aided Modelling Methods and Tools

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  9. HPCToolkit: performance tools for scientific computing

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M [Department of Computer Science, Rice University, Houston, TX 77005 (United States)]

    2008-07-15

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei.

  10. HPCToolkit: performance tools for scientific computing

    Tallent, N; Mellor-Crummey, J; Adhianto, L; Fagan, M; Krentel, M

    2008-01-01

    As part of the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC) program, science teams are tackling problems that require simulation and modeling on petascale computers. As part of activities associated with the SciDAC Center for Scalable Application Development Software (CScADS) and the Performance Engineering Research Institute (PERI), Rice University is building software tools for performance analysis of scientific applications on the leadership-class platforms. In this poster abstract, we briefly describe the HPCToolkit performance tools and how they can be used to pinpoint bottlenecks in SPMD and multi-threaded parallel codes. We demonstrate HPCToolkit's utility by applying it to two SciDAC applications: the S3D code for simulation of turbulent combustion and the MFDn code for ab initio calculations of microscopic structure of nuclei

  11. Final Report: Correctness Tools for Petascale Computing

    Mellor-Crummey, John [Rice Univ., Houston, TX (United States)

    2014-10-27

    In the course of developing parallel programs for leadership computing systems, subtle programming errors often arise that are extremely difficult to diagnose without tools. To meet this challenge, University of Maryland, the University of Wisconsin—Madison, and Rice University worked to develop lightweight tools to help code developers pinpoint a variety of program correctness errors that plague parallel scientific codes. The aim of this project was to develop software tools that help diagnose program errors including memory leaks, memory access errors, round-off errors, and data races. Research at Rice University focused on developing algorithms and data structures to support efficient monitoring of multithreaded programs for memory access errors and data races. This is a final report about research and development work at Rice University as part of this project.

  12. An integrated computational tool for precipitation simulation

    Cao, W.; Zhang, F.; Chen, S.-L.; Zhang, C.; Chang, Y. A.

    2011-07-01

    Computer aided materials design is of increasing interest because the conventional approach solely relying on experimentation is no longer viable within the constraint of available resources. Modeling of microstructure and mechanical properties during precipitation plays a critical role in understanding the behavior of materials and thus accelerating the development of materials. Nevertheless, an integrated computational tool coupling reliable thermodynamic calculation, kinetic simulation, and property prediction of multi-component systems for industrial applications is rarely available. In this regard, we are developing a software package, PanPrecipitation, under the framework of integrated computational materials engineering to simulate precipitation kinetics. It is seamlessly integrated with the thermodynamic calculation engine, PanEngine, to obtain accurate thermodynamic properties and atomic mobility data necessary for precipitation simulation.

  13. Computational Tools for RF Structure Design

    Jensen, E

    2004-01-01

    The Finite Differences Method and the Finite Element Method are the two principally employed numerical methods in modern RF field simulation programs. The basic ideas behind these methods are explained, with regard to available simulation programs. We then go through a list of characteristic parameters of RF structures, explaining how they can be calculated using these tools. With the help of these parameters, we introduce the frequency-domain and the time-domain calculations, leading to impedances and wake-fields, respectively. Subsequently, we present some readily available computer programs, which are in use for RF structure design, stressing their distinctive features and limitations. One final example benchmarks the precision of different codes for calculating the eigenfrequency and Q of a simple cavity resonator.
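
    The final benchmark mentioned has a closed-form reference: for an ideal pillbox cavity the TM010 eigenfrequency depends only on the radius. A short sketch (my own illustration) of that textbook formula:

      import math

      C = 299_792_458.0            # speed of light [m/s]
      CHI_01 = 2.404825557695773   # first zero of the Bessel function J0

      def pillbox_tm010_freq(radius_m):
          """Analytic TM010 eigenfrequency of an ideal pillbox cavity,
          f = chi_01 * c / (2 * pi * R); independent of the cavity length."""
          return CHI_01 * C / (2 * math.pi * radius_m)

      # e.g. a 230 mm radius pillbox resonates near 500 MHz
      print(f"{pillbox_tm010_freq(0.23) / 1e6:.1f} MHz")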

  14. VISTA - computational tools for comparative genomics

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna

    2004-01-01

    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server at http://www-gsd.lbl.gov/VISTA/ was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate the capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes the kinesin family member 3A (KIF3A) protein.

  15. Electronic circuit design with HEP computational tools

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for bigger simulation output files. Simulation results are written to a HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphics, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape for large amounts of data. HEP tools also help with circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)
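
    A toy sketch (my own illustration; CPSPICE's actual interfaces are not shown in the abstract) of the Monte Carlo idea it describes: sample component values within their tolerances, evaluate the circuit response many times, and study the statistics of the result.

      import random, statistics
      from math import pi

      def cutoff_hz(r_ohm, c_farad):
          """Stand-in 'circuit response': -3 dB corner of an RC low-pass."""
          return 1.0 / (2 * pi * r_ohm * c_farad)

      def monte_carlo(n_trials=10_000, r_nom=1e3, c_nom=100e-9, tol=0.05):
          """Gaussian component spread with the tolerance band at 3 sigma."""
          runs = []
          for _ in range(n_trials):
              r = random.gauss(r_nom, tol * r_nom / 3)
              c = random.gauss(c_nom, tol * c_nom / 3)
              runs.append(cutoff_hz(r, c))
          return statistics.mean(runs), statistics.stdev(runs)

      mean, sigma = monte_carlo()
      print(f"f_c = {mean:.1f} Hz +/- {sigma:.1f} Hz")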

  16. Tools for remote computing in accelerator control

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)
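
    A toy sketch (my own illustration; the real network compiler's grammar and output language are not shown in the abstract) of the idea of generating communication stubs from a single interface description file:

      import re

      IDL = """
      get_magnet_current(name: str) -> float
      set_magnet_current(name: str, amps: float) -> bool
      """

      SIG = re.compile(r"(\w+)\((.*)\) -> (\w+)")

      def generate_stubs(idl_text):
          """Emit a caller-side stub for each remote procedure declared."""
          out = []
          for line in idl_text.strip().splitlines():
              name, args, ret = SIG.match(line.strip()).groups()
              params = [a.split(":")[0].strip() for a in args.split(",") if a.strip()]
              out.append(f"def {name}({', '.join(params)}):")
              out.append(f"    return _rpc_call('{name}', [{', '.join(params)}])  # returns {ret}")
          return "\n".join(out)

      print(generate_stubs(IDL))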

  17. A least-squares computational "tool kit"

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications.
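
    A minimal numpy sketch (my own illustration, not the LSIOD/GLSIOD codes) of the generalized least-squares solution and the covariance matrix of the fitted parameters:

      import numpy as np

      def generalized_least_squares(A, b, V):
          """Minimize (b - A x)^T V^-1 (b - A x); return estimates and their covariance."""
          Vinv = np.linalg.inv(V)
          cov = np.linalg.inv(A.T @ Vinv @ A)   # covariance matrix of the estimates
          x = cov @ A.T @ Vinv @ b
          return x, cov

      # fit a line y = x0 + x1*t to three correlated measurements
      t = np.array([0.0, 1.0, 2.0])
      A = np.column_stack([np.ones_like(t), t])
      b = np.array([1.1, 2.9, 5.2])
      V = np.array([[0.04, 0.01, 0.00],
                    [0.01, 0.04, 0.01],
                    [0.00, 0.01, 0.04]])   # measurement covariance matrix
      x, cov = generalized_least_squares(A, b, V)
      print("estimates:", x, "uncertainties:", np.sqrt(np.diag(cov)))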

  18. Integrating Computational Science Tools into a Thermodynamics Course

    Vieira, Camilo; Magana, Alejandra J.; García, R. Edwin; Jana, Aniruddha; Krafcik, Matthew

    2018-01-01

    Computational tools and methods have permeated multiple science and engineering disciplines, because they enable scientists and engineers to process large amounts of data, represent abstract phenomena, and model and simulate complex concepts. In order to prepare future engineers with the ability to use computational tools in the context of their disciplines, some universities have started to integrate these tools within core courses. This paper evaluates the effect of introducing three computational modules within a thermodynamics course on student disciplinary learning and self-beliefs about computation. The results suggest that using worked examples paired with computer simulations to implement these modules has a positive effect on (1) student disciplinary learning, (2) student perceived ability to do scientific computing, and (3) student perceived ability to do computer programming. These effects were identified regardless of the students' prior experiences with computer programming.

  19. Computing tools for accelerator design calculations

    Fischler, M.; Nash, T.

    1984-01-01

    This note is intended as a brief, summary guide for accelerator designers to the new generation of commercial and special processors that allow great increases in computing cost effectiveness. New thinking is required to take best advantage of these computing opportunities, in particular, when moving from analytical approaches to tracking simulations. In this paper, we outline the relevant considerations

  20. Computational Design Tools for Integrated Design

    Holst, Malene Kirstine; Kirkegaard, Poul Henning

    2010-01-01

    In an architectural conceptual sketching process, where an architect is working with the initial ideas for a design, the process is characterized by three phases: sketching, evaluation and modification. Basically the architect needs to address three areas in the conceptual sketching phase......: aesthetical, functional and technical requirements. The aim of the present paper is to address the problem of a vague or not existing link between digital conceptual design tools used by architects and designers and engineering analysis and simulation tools. Based on an analysis of the architectural design...... process different digital design methods are related to tasks in an integrated design process....

  1. Computational Tools for Stem Cell Biology.

    Bian, Qin; Cahan, Patrick

    2016-12-01

    For over half a century, the field of developmental biology has leveraged computation to explore mechanisms of developmental processes. More recently, computational approaches have been critical in the translation of high throughput data into knowledge of both developmental and stem cell biology. In the past several years, a new subdiscipline of computational stem cell biology has emerged that synthesizes the modeling of systems-level aspects of stem cells with high-throughput molecular data. In this review, we provide an overview of this new field and pay particular attention to the impact that single cell transcriptomics is expected to have on our understanding of development and our ability to engineer cell fate. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Workshop on Software Development Tools for Petascale Computing

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)]

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  3. AI tools in computer based problem solving

    Beane, Arthur J.

    1988-01-01

    The use of computers to solve value oriented, deterministic, algorithmic problems has evolved a structured life cycle model of the software process. The symbolic processing techniques used, primarily in research, for solving nondeterministic problems, and those for which an algorithmic solution is unknown, have evolved a different, much less structured model. Traditionally, the two approaches have been used completely independently. With the advent of low cost, high performance 32 bit workstations executing software identical to that of large minicomputers and mainframes, it became possible to begin to merge both models into a single extended model of computer problem solving. The implementation of such an extended model on a VAX family of micro/mini/mainframe systems is described. Examples in both development and deployment of applications involving a blending of AI and traditional techniques are given.

  4. Computational Tools applied to Urban Engineering

    Filho, Armando Carlos de Pina; Lima, Fernando Rodrigues; Amaral, Renato Dias Calado do

    2010-01-01

    This chapter seeks to present the main details of three technologies widely used in Urban Engineering: CAD (Computer-Aided Design); GIS (Geographic Information System); and BIM (Building Information Modelling). As can be seen, each of them has specific characteristics and diverse applications in urban projects, providing better results in relation to the planning, management and maintenance of the systems. In relation to the software presented, it is important to note that the...

  5. Computer Aided Design Tools for Extreme Environment Electronics, Phase I

    National Aeronautics and Space Administration — This project aims to provide Computer Aided Design (CAD) tools for radiation-tolerant, wide-temperature-range digital, analog, mixed-signal, and radio-frequency...

  6. Computational Tool for Aerothermal Environment Around Transatmospheric Vehicles, Phase I

    National Aeronautics and Space Administration — The goal of this Project is to develop a high-fidelity computational tool for accurate prediction of aerothermal environment on transatmospheric vehicles. This...

  7. Applications of computational tools in biosciences and medical engineering

    Altenbach, Holm

    2015-01-01

    This book presents the latest developments and applications of computational tools related to the biosciences and medical engineering. It also reports the findings of different multi-disciplinary research projects, for example, from the areas of scaffolds and synthetic bones, implants and medical devices, and medical materials. It is also shown that the application of computational tools often requires mathematical and experimental methods. Computational tools such as the finite element methods, computer-aided design and optimization as well as visualization techniques such as computed axial tomography open up completely new research fields that combine engineering with the bio/medical sciences. Nevertheless, hurdles remain, since the two disciplines are based on quite different educational traditions. Often even the “language” can vary from discipline to discipline.

  8. Scratch as a Computational Modelling Tool for Teaching Physics

    Lopez, Victor; Hernandez, Maria Isabel

    2015-01-01

    The Scratch online authoring tool, which features a simple programming language that has been adapted to primary and secondary students, is being used more and more in schools as it offers students and teachers the opportunity to use a tool to build scientific models and evaluate their behaviour, just as can be done with computational modelling…

  9. Caesy: A software tool for computer-aided engineering

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  10. Computer Tools for Construction, Modification and Analysis of Petri Nets

    Jensen, Kurt

    1987-01-01

    The practical use of Petri nets is — just as any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction of nets...

  11. Computational tool for postoperative evaluation of cochlear implant patients

    Giacomini, Guilherme; Pavan, Ana Luiza M.; Pina, Diana R. de; Altemani, Joao M.C.; Castilho, Arthur M.

    2016-01-01

    The aim of this study was to develop a tool to calculate the insertion depth angle of cochlear implants from computed tomography exams. The tool uses different image processing techniques, such as thresholding and active contour. Then, we compared the average insertion depth angle of three different implant manufacturers. The developed tool can be used, in the future, to compare the insertion depth angle of the cochlear implant with the patient's postoperative hearing response. (author)
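
    A toy sketch (my own; the published tool works on segmented CT data, and the geometry here is simplified to 2-D) of computing an insertion depth angle once the electrode contacts and the cochlear center are known:

      import numpy as np

      def insertion_depth_angle(contacts_xy, center_xy, reference_xy):
          """Cumulative angle (degrees) swept by the electrode array around the
          cochlear center, measured from a reference direction (e.g. round window)."""
          c = np.asarray(contacts_xy, float) - center_xy
          r = np.asarray(reference_xy, float) - center_xy
          angles = np.arctan2(c[:, 1], c[:, 0]) - np.arctan2(r[1], r[0])
          swept = np.unwrap(angles)   # remove 360-degree jumps
          return abs(np.degrees(swept[-1]))

      # synthetic spiral of 12 contacts winding ~400 degrees around the origin
      theta = np.radians(np.linspace(0, 400, 12))
      radius = np.linspace(5, 2, 12)
      contacts = np.column_stack([radius * np.cos(theta), radius * np.sin(theta)])
      print(insertion_depth_angle(contacts, center_xy=(0, 0), reference_xy=(10, 0)))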

  12. Advanced Computing Tools and Models for Accelerator Physics

    Ryne, Robert; Ryne, Robert D.

    2008-01-01

    This paper is based on a transcript of my EPAC'08 presentation on advanced computing tools for accelerator physics. Following an introduction I present several examples, provide a history of the development of beam dynamics capabilities, and conclude with thoughts on the future of large scale computing in accelerator physics

  13. Computational tools for high-throughput discovery in biology

    Jones, Neil Christopher

    2007-01-01

    High throughput data acquisition technology has inarguably transformed the landscape of the life sciences, in part by making possible---and necessary---the computational disciplines of bioinformatics and biomedical informatics. These fields focus primarily on developing tools for analyzing data and generating hypotheses about objects in nature, and it is in this context that we address three pressing problems in the fields of the computational life sciences which each require computing capaci...

  14. New tools to aid in scientific computing and visualization

    Wallace, M.G.; Christian-Frear, T.L.

    1992-01-01

    In this paper, two computer programs are described which aid in the pre- and post-processing of computer generated data. CoMeT (Computational Mechanics Toolkit) is a customizable, interactive, graphical, menu-driven program that provides the analyst with a consistent user-friendly interface to analysis codes. Trans Vol (Transparent Volume Visualization) is a specialized tool for the scientific three-dimensional visualization of complex solids by the technique of volume rendering. Both tools are described in basic detail along with an application example concerning the simulation of contaminant migration from an underground nuclear repository

  15. Computer tools for systems engineering at LaRC

    Walters, J. Milam

    1994-01-01

    The Systems Engineering Office (SEO) has been established to provide life cycle systems engineering support to Langley Research Center projects. Over the last two years, the computing market has been reviewed for tools which could enhance the effectiveness and efficiency of activities directed towards this mission. A group of interrelated applications has been procured or is under development, including a requirements management tool, a system design and simulation tool, and a project and engineering database. This paper will review the current configuration of these tools and provide information on future milestones and directions.

  16. A computer tool to support in design of industrial Ethernet.

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues

    2009-04-01

    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply under the POE "Power over Ethernet" concept, and wireless) and the occupation rate (amount of information transmitted to the network versus the controller network scan time). These functions are accomplished without a single physical element installed in the network, using only simulation. The tool's software presents a detailed view of the network to the user, points out possible problems in the network, and offers an extremely friendly environment.

  17. Modeling with data tools and techniques for scientific computing

    Klemens, Ben

    2009-01-01

    Modeling with Data fully explains how to execute computationally intensive analyses on very large data sets, showing readers how to determine the best methods for solving a variety of different problems, how to create and debug statistical models, and how to run an analysis and evaluate the results. Ben Klemens introduces a set of open and unlimited tools, and uses them to demonstrate data management, analysis, and simulation techniques essential for dealing with large data sets and computationally intensive procedures. He then demonstrates how to easily apply these tools to the many threads of statistical technique, including classical, Bayesian, maximum likelihood, and Monte Carlo methods

  18. Scalable space-time adaptive simulation tools for computational electrocardiology

    Krause, Dorian; Krause, Rolf

    2013-01-01

    This work is concerned with the development of computational tools for the solution of reaction-diffusion equations from the field of computational electrocardiology. We designed lightweight spatially and space-time adaptive schemes for large-scale parallel simulations. We propose two different adaptive schemes based on locally structured meshes, managed either via a conforming coarse tessellation or a forest of shallow trees. A crucial ingredient of our approach is a non-conforming morta...

  19. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
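
    A toy sketch (my own illustration, not OPTHYLIC's code) of a hybrid frequentist-Bayesian CLs computation for a single counting channel, with the background marginalized over a truncated Gaussian prior:

      import numpy as np

      rng = np.random.default_rng(1)

      def cls(n_obs, s, b_nom, b_sigma, n_toys=200_000):
          """CLs = CL_{s+b} / CL_b from toy experiments."""
          b = np.clip(rng.normal(b_nom, b_sigma, n_toys), 0, None)
          p_sb = np.mean(rng.poisson(s + b) <= n_obs)   # CL_{s+b}
          p_b = np.mean(rng.poisson(b) <= n_obs)        # CL_b
          return p_sb / p_b

      # scan the signal rate: the 95% CL upper limit is where CLs crosses 0.05
      for s in np.arange(0.5, 15.0, 0.5):
          if cls(n_obs=4, s=s, b_nom=3.0, b_sigma=1.0) < 0.05:
              print(f"95% CL upper limit on s is roughly {s}")
              break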

  20. A portable software tool for computing digitally reconstructed radiographs

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
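
    A toy sketch (my own; production DRR code casts divergent rays from the source point through the CT volume, while this parallel-beam version simply integrates along one volume axis) of the core computation:

      import numpy as np

      def simple_drr(ct_hu, axis=0, mu_water=0.02, spacing_mm=1.0):
          """Convert HU to linear attenuation, integrate along one axis,
          and map to a film-like intensity I = exp(-sum(mu * dx))."""
          mu = np.clip(mu_water * (1.0 + np.asarray(ct_hu, float) / 1000.0), 0, None)
          return np.exp(-mu.sum(axis=axis) * spacing_mm)   # 2-D image in (0, 1]

      # toy 64^3 water phantom (0 HU) with a bone-like cube inside
      vol = np.zeros((64, 64, 64))
      vol[24:40, 24:40, 24:40] = 1000.0
      drr = simple_drr(vol)
      print(drr.shape, float(drr.min()), float(drr.max()))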

  1. On Computational Fluid Dynamics Tools in Architectural Design

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    engineering computational fluid dynamics (CFD) simulation program ANSYS CFX and a CFD based representative program RealFlow are investigated. These two programs represent two types of CFD based tools available for use during phases of an architectural design process. However, as outlined in two case studies...

  2. Development of Desktop Computing Applications and Engineering Tools on GPUs

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughout performance...

  3. Software Tools: A One-Semester Secondary School Computer Course.

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  4. Cloud Computing as a Tool for Improving Business Competitiveness

    Wišniewski Michał

    2014-08-01

    This article organizes knowledge on cloud computing, presenting the classification of deployment models, characteristics and service models. The author, looking at the problem from the entrepreneur’s perspective, draws attention to the differences in the benefits depending on the cloud computing deployment models and considers an effective way of selecting cloud computing services according to the specificity of the organization. Within this work, the thesis that in economic terms cloud computing is not always the best solution for an organization is considered. This raises the question, "What kind of tools should be used to estimate the usefulness of the cloud computing service model in the enterprise?"

  5. Computing tools for implementing standards for single-case designs.

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook-the WWC standards. These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/Macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were tested to be accurate. When missing data were present, most tools either gave an error message or conducted analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
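
    As an illustration of the kind of computation these tools automate, a minimal sketch (my own; not taken from any of the reviewed tools) of the Nonoverlap of All Pairs (NAP) effect size for single-case data:

      import itertools

      def nap(baseline, treatment):
          """Share of (baseline, treatment) pairs in which the treatment
          observation is higher; ties count one half."""
          pairs = list(itertools.product(baseline, treatment))
          score = sum(1.0 if t > b else 0.5 if t == b else 0.0 for b, t in pairs)
          return score / len(pairs)

      print(nap(baseline=[2, 3, 5, 3], treatment=[4, 5, 5, 7, 6]))  # -> 0.9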

  6. Computer- Aided Design in Power Engineering Application of Software Tools

    Stojkovic, Zlatan

    2012-01-01

    This textbook demonstrates the application of software tools in solving a series of problems from the field of designing power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer aided modeling and simulation. It guides the reader through complex problems, presented on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several design example calculations are carried out using standard engineering tools like MATLAB, EMTP/ATP, Excel & Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable the complete automation of the development of graphical documentation of a power system. In the fourth chapter, the application of software tools in the project management in power systems ...

  7. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Danli Wang

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity.

  8. A Tangible Programming Tool for Children to Cultivate Computational Thinking

    Wang, Danli; Liu, Zhen

    2014-01-01

    Game and creation are activities which have good potential for computational thinking skills. In this paper we present T-Maze, an economical tangible programming tool for children aged 5–9 to build computer programs in maze games by placing wooden blocks. Through the use of computer vision technology, T-Maze provides a live programming interface with real-time graphical and voice feedback. We conducted a user study with 7 children using T-Maze to play two levels of maze-escape games and create their own mazes. The results show that T-Maze is not only easy to use, but also has the potential to help children cultivate computational thinking like abstraction, problem decomposition, and creativity. PMID:24719575

  9. Elementary mathematical and computational tools for electrical and computer engineers using Matlab

    Manassah, Jamal T

    2013-01-01

    Ideal for use as a short-course textbook and for self-study, Elementary Mathematical and Computational Tools for Electrical and Computer Engineers Using MATLAB fills that gap. Accessible after just one semester of calculus, it introduces the many practical analytical and numerical tools that are essential to success both in future studies and in professional life. Sharply focused on the needs of the electrical and computer engineering communities, the text provides a wealth of relevant exercises and design problems. Changes in MATLAB's version 6.0 are included in a special addendum.

  10. Field-programmable custom computing technology architectures, tools, and applications

    Luk, Wayne; Pocek, Ken

    2000-01-01

    Field-Programmable Custom Computing Technology: Architectures, Tools, and Applications brings together in one place important contributions and up-to-date research results in this fast-moving area. In seven selected chapters, the book describes the latest advances in architectures, design methods, and applications of field-programmable devices for high-performance reconfigurable systems. The contributors to this work were selected from the leading researchers and practitioners in the field. It will be valuable to anyone working or researching in the field of custom computing technology. It serves as an excellent reference, providing insight into some of the most challenging issues being examined today.

  11. Understanding organometallic reaction mechanisms and catalysis: experimental and computational tools

    Ananikov, Valentin P

    2014-01-01

    Exploring and highlighting the new horizons in the studies of reaction mechanisms that open joint application of experimental studies and theoretical calculations is the goal of this book. The latest insights and developments in the mechanistic studies of organometallic reactions and catalytic processes are presented and reviewed. The book adopts a unique approach, exemplifying how to use experiments, spectroscopy measurements, and computational methods to reveal reaction pathways and molecular structures of catalysts, rather than concentrating solely on one discipline. The result is a deeper

  12. Computational Tools To Model Halogen Bonds in Medicinal Chemistry.

    Ford, Melissa Coates; Ho, P Shing

    2016-03-10

    The use of halogens in therapeutics dates back to the earliest days of medicine when seaweed was used as a source of iodine to treat goiters. The incorporation of halogens to improve the potency of drugs is now fairly standard in medicinal chemistry. In the past decade, halogens have been recognized as direct participants in defining the affinity of inhibitors through a noncovalent interaction called the halogen bond or X-bond. Incorporating X-bonding into structure-based drug design requires computational models for the anisotropic distribution of charge and the nonspherical shape of halogens, which lead to their highly directional geometries and stabilizing energies. We review here current successes and challenges in developing computational methods to introduce X-bonding into lead compound discovery and optimization during drug development. This fast-growing field will push further development of more accurate and efficient computational tools to accelerate the exploitation of halogens in medicinal chemistry.

  13. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Nathan eGould

    2014-10-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de-novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  14. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Gould, Nathan [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States); Hendy, Oliver [Department of Biology, The College of New Jersey, Ewing, NJ (United States); Papamichail, Dimitris, E-mail: papamicd@tcnj.edu [Department of Computer Science, The College of New Jersey, Ewing, NJ (United States)]

    2014-10-06

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.

  15. Computational Tools and Algorithms for Designing Customized Synthetic Genes

    Gould, Nathan; Hendy, Oliver; Papamichail, Dimitris

    2014-01-01

    Advances in DNA synthesis have enabled the construction of artificial genes, gene circuits, and genomes of bacterial scale. Freedom in de novo design of synthetic constructs provides significant power in studying the impact of mutations in sequence features, and verifying hypotheses on the functional information that is encoded in nucleic and amino acids. To aid this goal, a large number of software tools of variable sophistication have been implemented, enabling the design of synthetic genes for sequence optimization based on rationally defined properties. The first generation of tools dealt predominantly with singular objectives such as codon usage optimization and unique restriction site incorporation. Recent years have seen the emergence of sequence design tools that aim to evolve sequences toward combinations of objectives. The design of optimal protein-coding sequences adhering to multiple objectives is computationally hard, and most tools rely on heuristics to sample the vast sequence design space. In this review, we study some of the algorithmic issues behind gene optimization and the approaches that different tools have adopted to redesign genes and optimize desired coding features. We utilize test cases to demonstrate the efficiency of each approach, as well as identify their strengths and limitations.
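
    As a concrete instance of the singular objectives mentioned above, a toy sketch (my own; the codon table is an illustrative excerpt, not a real host's full usage table) of most-frequent-codon optimization:

      # "one amino acid, one codon" back-translation with a toy usage table
      TOP_CODON = {"M": "ATG", "F": "TTC", "L": "CTG", "S": "AGC", "*": "TAA"}

      def codon_optimize(protein):
          """Back-translate a protein using the most frequent codon per residue."""
          return "".join(TOP_CODON[aa] for aa in protein)

      print(codon_optimize("MFLS*"))  # -> ATGTTCCTGAGCTAA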

  16. Natural language processing tools for computer assisted language learning

    Vandeventer Faltin, Anne

    2003-01-01

    This paper illustrates the usefulness of natural language processing (NLP) tools for computer assisted language learning (CALL) through the presentation of three NLP tools integrated within a CALL software for French. These tools are (i) a sentence structure viewer; (ii) an error diagnosis system; and (iii) a conjugation tool. The sentence structure viewer helps language learners grasp the structure of a sentence, by providing lexical and grammatical information. This information is derived from a deep syntactic analysis. Two different outputs are presented. The error diagnosis system is composed of a spell checker, a grammar checker, and a coherence checker. The spell checker makes use of alpha-codes, phonological reinterpretation, and some ad hoc rules to provide correction proposals. The grammar checker employs constraint relaxation and phonological reinterpretation as diagnosis techniques. The coherence checker compares the underlying "semantic" structures of a stored answer and of the learners' input to detect semantic discrepancies. The conjugation tool is a resource with enhanced capabilities when put on an electronic format, enabling searches from inflected and ambiguous verb forms.

  17. Computer-Based Tools for Evaluating Graphical User Interfaces

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; rather, it should be part of an iterative design cycle with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  18. RATIO_TOOL - SOFTWARE FOR COMPUTING IMAGE RATIOS

    Yates, G. L.

    1994-01-01

    Geological studies analyze spectral data in order to gain information on surface materials. RATIO_TOOL is an interactive program for viewing and analyzing large multispectral image data sets that have been created by an imaging spectrometer. While the standard approach to classification of multispectral data is to match the spectrum for each input pixel against a library of known mineral spectra, RATIO_TOOL uses ratios of spectral bands in order to spot significant areas of interest within a multispectral image. Each image band can be viewed iteratively, or a selected image band of the data set can be requested and displayed. When the image ratios are computed, the result is displayed as a gray scale image. At this point a histogram option helps in viewing the distribution of values. A thresholding option can then be used to segment the ratio image result into two to four classes. The segmented image is then color coded to indicate threshold classes and displayed alongside the gray scale image. RATIO_TOOL is written in C language for Sun series computers running SunOS 4.0 and later. It requires the XView toolkit and the OpenWindows window manager (version 2.0 or 3.0). The XView toolkit is distributed with Open Windows. A color monitor is also required. The standard distribution medium for RATIO_TOOL is a .25 inch streaming magnetic tape cartridge in UNIX tar format. An electronic copy of the documentation is included on the program media. RATIO_TOOL was developed in 1992 and is a copyrighted work with all copyright vested in NASA. Sun, SunOS, and OpenWindows are trademarks of Sun Microsystems, Inc. UNIX is a registered trademark of AT&T Bell Laboratories.
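
    A minimal numpy sketch (my own illustration; RATIO_TOOL itself is a C/XView program) of the band-ratio-then-threshold workflow described above:

      import numpy as np

      def band_ratio(cube, num_band, den_band, eps=1e-6):
          """Ratio of two spectral bands of a (bands, rows, cols) image cube."""
          return cube[num_band] / (cube[den_band] + eps)

      def segment(ratio_img, thresholds):
          """Split the gray-scale ratio image into 2-4 classes."""
          return np.digitize(ratio_img, bins=sorted(thresholds))

      cube = np.random.rand(8, 128, 128).astype(np.float32)  # stand-in for real data
      classes = segment(band_ratio(cube, 5, 2), thresholds=[0.8, 1.2, 2.0])
      print(np.bincount(classes.ravel()))  # pixel count per threshold class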

  19. Computers and the internet: tools for youth empowerment.

    Valaitis, Ruta K

    2005-10-04

    Youth are often disenfranchised in their communities and may feel they have little voice. Since computers are an important aspect of youth culture, they may offer solutions to increasing youth participation in communities. This qualitative case study investigated the perceptions of 19 (predominantly female) inner-city school youth about their use of computers and the Internet in a school-based community development project. Youth working with public health nurses in a school-based community development project communicated with local community members using computer-mediated communication, surveyed peers online, built websites, searched for information online, and prepared project materials using computers and the Internet. Participant observation, semistructured interviews, analysis of online messages, and online- and paper-based surveys were used to gather data about youth's and adults' perceptions and use of the technologies. The constant comparison method and between-method triangulation were used in the analysis to establish the existence of themes. Not all youth were interested in working with computers. Some electronic messages from adults were perceived to be critical, and writing to adults was intimidating for some youth. In addition, technical problems were experienced. Despite these barriers, most youth perceived that using computers and the Internet reduced their anxiety concerning communication with adults, increased their control when dealing with adults, raised their perception of their social status, increased participation within the community, supported reflective thought, increased efficiency, and improved their access to resources. Overall, youth perceived computers and the Internet to be empowering tools, and they should be encouraged to use such technology to support them in community initiatives.

  20. Hardware replacements and software tools for digital control computers

    Walker, R.A.P.; Wang, B-C.; Fung, J.

    1996-01-01

    computers which use 'Varian' technology. A new software program, Desk Top Tools, permits the designer greater flexibility in digital control computer software design and testing. This software development allows the user to emulate control of the CANDU reactor system by system. All discussions will highlight the ability of the replacements and the new developments to enhance the operation of the existing and 'repeat' plant digital control computers and will explore future applications of these developments. Examples of current use of all replacement components and software are provided. (author)

  1. Systematic Methods and Tools for Computer Aided Modelling

    Fedorova, Marina

    and processes can be faster, cheaper and very efficient. The developed modelling framework involves five main elements: 1) a modelling tool that includes algorithms for model generation; 2) a template library, which provides building blocks for the templates (generic models previously developed); 3) computer......-format and COM-objects, are incorporated to allow the export and import of mathematical models; 5) a user interface that provides the work-flow and data-flow to guide the user through the different modelling tasks.

  2. 3D data processing with advanced computer graphics tools

    Zhang, Song; Ekstrand, Laura; Grieve, Taylor; Eisenmann, David J.; Chumbley, L. Scott

    2012-09-01

    Often, the 3-D raw data coming from an optical profilometer contain spiky noise and an irregular grid, which make the data difficult to analyze and, because of their enormously large size, difficult to store. This paper addresses these two issues by substantially reducing the spiky noise of the 3-D raw data from an optical profilometer, and by rapidly re-sampling the raw data onto regular grids at any pixel size and any orientation with advanced computer graphics tools. Experimental results are presented to demonstrate the effectiveness of the proposed approach.
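
    The two operations the abstract names, despiking and re-sampling onto a regular grid, can be sketched with standard SciPy calls; this illustrates the idea only and is not the paper's graphics-pipeline implementation:

        import numpy as np
        from scipy.interpolate import griddata
        from scipy.ndimage import median_filter

        def despike_and_regrid(x, y, z, pixel_size):
            """Resample scattered profilometer samples (x, y, z) onto a
            regular grid and suppress isolated spikes."""
            # Regular grid covering the measured area at the requested pixel size.
            xi = np.arange(x.min(), x.max(), pixel_size)
            yi = np.arange(y.min(), y.max(), pixel_size)
            XI, YI = np.meshgrid(xi, yi)
            # Linear interpolation of the irregular samples onto the grid.
            ZI = griddata((x, y), z, (XI, YI), method="linear")
            # Fill gaps outside the convex hull before filtering.
            ZI = np.nan_to_num(ZI, nan=float(np.nanmedian(z)))
            # A small median filter removes isolated spiky outliers.
            return XI, YI, median_filter(ZI, size=3)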

  3. A novel framework for diagnosing automatic tool changer and tool life based on cloud computing

    Shang-Liang Chen

    2016-03-01

    Full Text Available Tool change is one of the most frequently performed machining processes, and improper percussion as the tool's position is changed can damage the spindle bearing. A spindle malfunction can cause problems such as a dropped knife or bias in a machined hole. The measures currently available on machine tools to avoid such issues only involve determining whether the clamping knife's state is correct using the spindle and the air adhesion method, which is also used to satisfy the high precision required of mechanical components. These measures therefore cannot be used with every type of machine tool, and improper tapping of the spindle during an automatic tool change cannot be detected. This study thus proposes a new type of diagnostic framework that combines cloud computing and vibration sensors, in which tool changes are automatically diagnosed using an architecture that identifies abnormalities, thereby enhancing the reliability and productivity of the machine and equipment.
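
    As a rough illustration of vibration-based diagnosis of a tool change, a detector can flag traces whose RMS or peak amplitude exceeds a limit; the paper's cloud architecture and thresholds are not specified here, so the limits and signal below are invented for the example:

        import numpy as np

        def tool_change_ok(signal, rms_limit, peak_limit):
            """Flag improper percussion during an automatic tool change from
            an accelerometer trace (a minimal sketch; real systems use far
            richer features than RMS and peak amplitude)."""
            rms = float(np.sqrt(np.mean(signal ** 2)))
            peak = float(np.max(np.abs(signal)))
            return rms <= rms_limit and peak <= peak_limit, {"rms": rms, "peak": peak}

        # Example: a short burst ten times the normal amplitude trips the peak limit.
        t = np.linspace(0.0, 1.0, 1000)
        trace = 0.1 * np.sin(2 * np.pi * 50 * t)
        trace[500:520] += 1.0                 # simulated improper percussion
        ok, stats = tool_change_ok(trace, rms_limit=0.2, peak_limit=0.5)
        print(ok, stats)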

  4. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    Schovancová, J.; Campana, S.; Di Girolamo, A.; Jézéquel, S.; Ueda, I.; Wenaus, T.; Atlas Collaboration

    2014-06-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I, a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements and the re-usability of the visualization bits across the different tools. A rich family of filtering and searching options enhancing the available user interfaces comes naturally with the separation of the data and visualization layers. With a variety of reliable monitoring data accessible through standardized interfaces, automating actions under well-defined conditions correlating multiple data sources has become feasible. We also discuss the automated exclusion of degraded resources and their automated recovery in various activities.

  5. HEP Computing Tools, Grid and Supercomputers for Genome Sequencing Studies

    De, K.; Klimentov, A.; Maeno, T.; Mashinistov, R.; Novikov, A.; Poyda, A.; Tertychnyy, I.; Wenaus, T.

    2017-10-01

    PanDA, the Production and Distributed Analysis workload management system, was developed to address the data processing and analysis challenges of the ATLAS experiment at the LHC. Recently PanDA has been extended to run HEP scientific applications on Leadership Class Facilities and supercomputers. The success of the projects using PanDA beyond HEP and the Grid has drawn attention from other compute-intensive sciences such as bioinformatics. Recent advances in Next Generation Genome Sequencing (NGS) technology have led to increasing streams of sequencing data that need to be processed, analysed and made available for bioinformaticians worldwide. Analysis of genome sequencing data using the popular software pipeline PALEOMIX can take a month even when run on a powerful computing resource. In this paper we describe the adaptation of the PALEOMIX pipeline to a distributed computing environment powered by PanDA. To run the pipeline we split the input files into chunks which are processed separately on different nodes as separate PALEOMIX inputs, and finally merge the output files; this is very similar to how ATLAS processes and simulates data. We dramatically decreased the total walltime through automated job (re)submission and brokering within PanDA. Using software tools developed initially for HEP and the Grid reduced payload execution time for mammoth DNA samples from weeks to days.
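
    The split-process-merge pattern the authors describe can be sketched in a few lines of Python. The helpers below are hypothetical stand-ins: the real chunks are brokered by PanDA across Grid nodes and the per-chunk work is the PALEOMIX pipeline itself:

        from concurrent.futures import ProcessPoolExecutor

        def split_reads(path, n_chunks):
            # Split a read file into chunks on 4-line FASTQ record boundaries
            # (hypothetical helper; chunk naming is invented for the example).
            with open(path) as f:
                lines = f.readlines()
            records = [lines[i:i + 4] for i in range(0, len(lines), 4)]
            per_chunk = -(-len(records) // n_chunks)       # ceiling division
            chunk_files = []
            for c in range(0, len(records), per_chunk):
                name = f"{path}.chunk{c // per_chunk}"
                with open(name, "w") as out:
                    for rec in records[c:c + per_chunk]:
                        out.writelines(rec)
                chunk_files.append(name)
            return chunk_files

        def run_paleomix(chunk):
            # Stand-in for invoking the real PALEOMIX pipeline on one node.
            return chunk + ".out"

        def scatter_gather(path, n_chunks, merged="merged.out"):
            chunks = split_reads(path, n_chunks)
            with ProcessPoolExecutor() as pool:            # one job per chunk
                outputs = list(pool.map(run_paleomix, chunks))
            with open(merged, "w") as out:                 # merge partial outputs
                out.write("\n".join(outputs) + "\n")
            return merged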

  6. Atomdroid: a computational chemistry tool for mobile platforms.

    Feldt, Jonas; Mata, Ricardo A; Dieterich, Johannes M

    2012-04-23

    We present the implementation of a new molecular mechanics program designed for use in mobile platforms, the first specifically built for these devices. The software is designed to run on Android operating systems and is compatible with several modern tablet-PCs and smartphones available in the market. It includes molecular viewer/builder capabilities with integrated routines for geometry optimizations and Monte Carlo simulations. These functionalities allow it to work as a stand-alone tool. We discuss some particular development aspects, as well as the overall feasibility of using computational chemistry software packages in mobile platforms. Benchmark calculations show that through efficient implementation techniques even hand-held devices can be used to simulate midsized systems using force fields.
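
    For flavor, here is a minimal Metropolis Monte Carlo loop of the kind such a molecular mechanics code runs, applied to a single Lennard-Jones pair distance. Atomdroid itself is an Android application; this Python sketch only illustrates the algorithm, with invented parameters:

        import math, random

        def lj_energy(r, epsilon=1.0, sigma=1.0):
            """Lennard-Jones pair energy (an illustrative force-field term)."""
            sr6 = (sigma / r) ** 6
            return 4.0 * epsilon * (sr6 ** 2 - sr6)

        def metropolis(r=1.5, steps=10000, beta=5.0, delta=0.1):
            """Minimal Metropolis Monte Carlo on one interatomic distance."""
            e = lj_energy(r)
            for _ in range(steps):
                r_new = max(0.5, r + random.uniform(-delta, delta))
                e_new = lj_energy(r_new)
                # Accept downhill moves always, uphill moves with Boltzmann weight.
                if e_new <= e or random.random() < math.exp(-beta * (e_new - e)):
                    r, e = r_new, e_new
            return r, e

        # Should settle near the LJ minimum at r = 2**(1/6) * sigma ~ 1.122.
        print(metropolis())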

  7. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    Falivene, Laura; Kozlov, Sergey M.; Cavallo, Luigi

    2018-01-01

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potentials and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.
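
    The energetic span model mentioned in point 1 reduces a whole catalytic cycle to one effective barrier; in its simplest form the turnover frequency follows an Eyring-like expression, sketched below (the 80 kJ/mol input is an invented example, not a value from the paper):

        import math

        KB = 1.380649e-23    # Boltzmann constant, J/K
        H  = 6.62607015e-34  # Planck constant, J*s
        R  = 8.314462618     # gas constant, J/(mol*K)

        def tof_from_energetic_span(delta_e_kj_mol, temperature=298.15):
            """Turnover frequency estimate from the energetic span dE (kJ/mol):
            TOF ~ (kB*T/h) * exp(-dE / (R*T)), the simplest form of the model."""
            return (KB * temperature / H) * math.exp(
                -delta_e_kj_mol * 1e3 / (R * temperature))

        print(f"{tof_from_energetic_span(80.0):.3e} 1/s")   # ~0.06 turnovers per second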

  8. Development of tools and models for computational fracture assessment

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically more sound methods for fracture mechanics analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases. Examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of ductile failure behaviour of cracked structures. (author)

  9. Translation Memory and Computer Assisted Translation Tool for Medieval Texts

    Törcsvári Attila

    2013-05-01

    Full Text Available Translation memories (TMs), as part of Computer Assisted Translation (CAT) tools, support translators in reusing portions of formerly translated text. Fencing books are good candidates for using TMs due to the high number of repeated terms. Medieval texts suffer from a number of drawbacks that make even "simple" rewording into the modern version of the same language hard. The difficulties analyzed are: lack of systematic spelling, unusual word order, and typos in the original. A hypothesis is made and verified that even simple modernization increases legibility and is feasible, and that it is worthwhile to apply translation memories due to the numerous and even extremely long repeated terms. Therefore, methods and algorithms are presented 1. for automated transcription of medieval texts (when a limited training set is available), and 2. for the collection of repeated patterns. The efficiency of the algorithms is analyzed in terms of recall and precision.
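
    Collecting repeated patterns (point 2) can be illustrated with a simple word n-gram counter; the real algorithms also cope with the spelling variation the abstract mentions, which this sketch ignores:

        from collections import Counter

        def repeated_ngrams(words, n_min=2, n_max=6, min_count=2):
            """Collect repeated word n-grams, longest candidates first --
            the spans a translation memory could reuse (a simple sketch)."""
            found = {}
            for n in range(n_max, n_min - 1, -1):
                grams = Counter(tuple(words[i:i + n])
                                for i in range(len(words) - n + 1))
                for gram, count in grams.items():
                    if count >= min_count:
                        found.setdefault(gram, count)
            return found

        text = "the king of the realm spoke to the king of the realm".split()
        for gram, count in repeated_ngrams(text).items():
            print(" ".join(gram), count)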

  10. Constructing Bridges between Computational Tools in Heterogeneous and Homogeneous Catalysis

    Falivene, Laura

    2018-05-08

    Better catalysts are needed to address numerous challenges faced by humanity. In this perspective, we review concepts and tools in theoretical and computational chemistry that can help to accelerate the rational design of homogeneous and heterogeneous catalysts. In particular, we focus on the following three topics: 1) identification of key intermediates and transition states in a reaction using the energetic span model, 2) disentanglement of factors influencing the relative stability of the key species using energy decomposition analysis and the activation strain model, and 3) discovery of new catalysts using volcano relationships. To facilitate wider use of these techniques across different areas, we illustrate their potentials and pitfalls when applied to the study of homogeneous and heterogeneous catalysts.

  11. TRAC, a collaborative computer tool for tracer-test interpretation

    Fécamp C.

    2013-05-01

    Full Text Available Artificial tracer tests are widely used by consulting engineers for demonstrating water circulation, proving the existence of leakage, or estimating groundwater velocity. However, the interpretation of such tests is often very basic, with the result that decision makers and professionals commonly face unreliable results from hasty and empirical interpretation. There is thus an increasing need for a reliable interpretation tool, compatible with the latest operating systems and available in several languages. BRGM, the French Geological Survey, has developed a project together with hydrogeologists from various other organizations to build software assembling several analytical solutions in order to cope with various field contexts. This computer program, called TRAC, is very light and simple, allowing the user to add his own analytical solution if the formula is not yet included. It aims at collaborative improvement by sharing the tool and the solutions. TRAC can be used for interpreting data recovered from a tracer test as well as for simulating the transport of a tracer in the saturated zone (for the time being). Calibration of a site operation is based on the hydrodynamic and hydrodispersive features of groundwater flow as well as the amount, nature and injection mode of the artificial tracer. The software is available in French, English and Spanish, and the latest version can be downloaded from the web site http://trac.brgm.fr.
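
    One of the classical analytical solutions such a tool assembles is the 1-D advection-dispersion response to an instantaneous injection; a sketch follows, with parameter values chosen for illustration rather than taken from TRAC:

        import math

        def tracer_breakthrough(x, t, v, D, mass=1.0, area=1.0):
            """1-D advection-dispersion solution for an instantaneous injection:
            C(x,t) = M / (A * sqrt(4*pi*D*t)) * exp(-(x - v*t)**2 / (4*D*t)),
            with velocity v [m/d] and dispersion coefficient D [m^2/d]."""
            if t <= 0:
                return 0.0
            spread = math.sqrt(4.0 * math.pi * D * t)
            return (mass / (area * spread)) * math.exp(
                -(x - v * t) ** 2 / (4.0 * D * t))

        # Concentration history 100 m downstream for v = 10 m/d, D = 5 m^2/d:
        for day in (5, 8, 10, 12, 15):
            print(day, f"{tracer_breakthrough(100.0, day, v=10.0, D=5.0):.4f}")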

  12. Integrated modeling tool for performance engineering of complex computer systems

    Wright, Gary; Ball, Duane; Hoyt, Susan; Steele, Oscar

    1989-01-01

    This report summarizes Advanced System Technologies' accomplishments on the Phase 2 SBIR contract NAS7-995. The technical objectives of the report are: (1) to develop an evaluation version of a graphical, integrated modeling language according to the specification resulting from the Phase 2 research; and (2) to determine the degree to which the language meets its objectives by evaluating ease of use, utility of two sets of performance predictions, and the power of the language constructs. The technical approach followed to meet these objectives was to design, develop, and test an evaluation prototype of a graphical, performance prediction tool. The utility of the prototype was then evaluated by applying it to a variety of test cases found in the literature and in AST case histories. Numerous models were constructed and successfully tested. The major conclusion of this Phase 2 SBIR research and development effort is that complex, real-time computer systems can be specified in a non-procedural manner using combinations of icons, windows, menus, and dialogs. Such a specification technique provides an interface that system designers and architects find natural and easy to use. In addition, PEDESTAL's multiview approach provides system engineers with the capability to perform the trade-offs necessary to produce a design that meets timing performance requirements. Sample system designs analyzed during the development effort showed that models could be constructed in a fraction of the time required by non-visual system design capture tools.

  13. REMOD: a computational tool for remodeling neuronal dendrites

    Panagiotis Bozelos

    2014-05-01

    Full Text Available In recent years, several modeling studies have indicated that dendritic morphology is a key determinant of how individual neurons acquire a unique signal processing profile. The highly branched dendritic structure that originates from the cell body explores the surrounding 3D space in a fractal-like manner until it reaches a certain amount of complexity. Its shape undergoes significant alterations not only in various neuropathological conditions, but in physiological ones, too. Yet, despite the profound effect that these alterations can have on neuronal function, the causal relationship between structure and function remains largely elusive. The lack of a systematic approach for remodeling neuronal cells and their dendritic trees is a key limitation that contributes to this problem. In this context, we developed a computational tool that allows the remodeling of any type of neuron, given a set of exemplar morphologies. The tool is written in Python and provides a simple GUI that guides the user through various options to manipulate selected neuronal morphologies. It provides the ability to load one or more morphology files (.swc or .hoc) and choose specific dendrites on which to perform one of the following actions: shrink, remove, extend or branch (as shown in Figure 1). The user retains complete control over the extent of each alteration, and if a chosen action is not possible due to pre-existing structural constraints, appropriate warnings are produced. Importantly, the tool can also be used to extract morphology statistics for one or multiple morphologies, including features such as the total dendritic length, path length to the root, branch order, diameter tapering, etc. Finally, an experimental utility enables the user to remodel entire dendritic trees based on preloaded statistics from a database of cell-type specific neuronal morphologies. To our knowledge, this is the first tool that allows (a) the remodeling of existing –as opposed to the de novo
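
    One of the morphology statistics mentioned, total dendritic length, is easy to compute from an .swc file, whose rows hold a node id, type, x, y, z, radius and parent id. A minimal sketch (REMOD's own implementation may differ):

        import math

        def total_dendritic_length(swc_path):
            """Sum of parent-child segment lengths in an SWC morphology file --
            one of the summary statistics a remodeling tool reports."""
            nodes = {}
            with open(swc_path) as f:
                for line in f:
                    if line.startswith("#") or not line.strip():
                        continue
                    nid, _ntype, x, y, z, _radius, parent = line.split()[:7]
                    nodes[int(nid)] = (float(x), float(y), float(z), int(parent))
            length = 0.0
            for x, y, z, parent in nodes.values():
                if parent in nodes:                  # the root's parent is -1
                    px, py, pz, _ = nodes[parent]
                    length += math.dist((x, y, z), (px, py, pz))
            return length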

  14. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Olena V. Semenikhina; Maryna H. Drushliak

    2014-01-01

    The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks based on a limited number of tools, and fast automated checking are specified. Some methodological comments on the application of computer tools and ...

  15. Cloud computing: An innovative tool for library services

    Sahu, R.

    2015-01-01

    Cloud computing is a new technique of information and communication technology offering potential benefits such as reduced cost, accessibility anywhere at any time, and elasticity and flexibility. This paper defines cloud computing and describes its essential characteristics, models of cloud computing, components of the cloud, and the advantages and drawbacks of cloud computing, and it also describes cloud computing in libraries.

  16. CRISIS2012: An Updated Tool to Compute Seismic Hazard

    Ordaz, M.; Martinelli, F.; Meletti, C.; D'Amico, V.

    2013-05-01

    CRISIS is a computer tool for probabilistic seismic hazard analysis (PSHA), whose development started in the late 1980s at the Instituto de Ingeniería, UNAM, Mexico. It started circulating outside the Mexican borders at the beginning of the 1990s, when it was first distributed as part of the SEISAN tools. Throughout the years, CRISIS has been used for seismic hazard studies in several countries in Latin America (Mexico, Guatemala, Belize, El Salvador, Honduras, Nicaragua, Costa Rica, Panama, Colombia, Venezuela, Ecuador, Peru, Argentina and Chile), and in many other countries of the world. CRISIS has always circulated free of charge for non-commercial applications. It is worth noting that CRISIS has been written mainly by people who are, at the same time, PSHA practitioners. Therefore, the development loop has been relatively short, and most of the modifications and improvements have been made to satisfy the needs of the developers themselves. CRISIS has evolved from a rather simple FORTRAN code to a relatively complex program with a friendly graphical interface, able to handle a variety of modeling possibilities for source geometries, seismicity descriptions and ground motion prediction models (GMPM). We describe some of the improvements made for the newest version of the code, CRISIS 2012. These improvements, some of which were made in the frame of the Italian research project INGV-DPC S2 (http://nuovoprogettoesse2.stru.polimi.it/), funded by the Dipartimento della Protezione Civile (DPC; National Civil Protection Department), include: a wider variety of source geometries; a wider variety of seismicity models, including the ability to handle non-Poissonian occurrence models and Poissonian smoothed-seismicity descriptions; and enhanced capabilities for using different kinds of GMPM: attenuation tables, built-in models and generalized attenuation models. In the case of built-in models there is, by default, a set ready to use in CRISIS, but additional custom GMPMs
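
    Under the Poissonian occurrence model that PSHA tools like CRISIS support by default, combining the annual exceedance rates of the contributing sources into a probability of exceedance over a design life is a one-line formula; a sketch with invented rates:

        import math

        def exceedance_probability(annual_rates, t_years=50.0):
            """Probability of at least one exceedance in t_years under a
            Poissonian occurrence model: P = 1 - exp(-lambda * t), where
            lambda is the summed annual rate over all sources whose ground
            motion exceeds the target level."""
            lam = sum(annual_rates)
            return 1.0 - math.exp(-lam * t_years)

        # Two sources contribute exceedance rates of 1/475 and 1/1000 per year:
        print(f"{exceedance_probability([1/475, 1/1000]):.3f}")   # ~0.14 in 50 years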

  17. Computer games as a pedagogical tool in education

    Maher, Ken

    1997-01-01

    Designing computer based environments is never easy, especially when considering young learners. Traditionally, computer gaming has been seen as lacking in educational value, but rating highly in satisfaction and motivation. The objective of this dissertation is to look at elements of computer based learning and to ascertain how computer games can be included as a means of improving learning. Various theories are drawn together from psychology, instructional technology and computer gaming, to...

  18. Improvement of Computer Software Quality through Software Automated Tools.

    1986-08-30

    Describes the information returned from the tools to the human user and the forms in which these outputs are presented. Output features provide links from the tool to both the human user and the target machine (where applicable); they describe the types... [The remainder of this record is fragmentary text from the original report, including page headers and pieces of an appendix on an Automated Software Tool Monitoring System.]

  19. Pharmacokinetic study with computational tools in the medicinal chemistry course

    Monique Araújo de Brito

    2011-12-01

    Full Text Available To improve the teaching-learning process in the Medicinal Chemistry course, new strategies have been incorporated into practical classes of this fundamental discipline of the pharmaceutical curriculum. Many changes and improvements have been made in the area of medicinal chemistry so far, and students should be prepared for these new approaches with the use of technological resources in this field. Practical activities using computational techniques have been directed to the evaluation of chemical and physicochemical properties that affect the pharmacokinetics of drugs. Their objectives were to allow students to know these tools, to learn how to access them, to search for the structures of drugs and to analyze results. To the best of our knowledge, this is the first study in Brazil to demonstrate the use of computational practices in teaching pharmacokinetics. Practical classes using Osiris and Molinspiration were attractive to students, who developed the activities easily and acquired better theoretical knowledge.
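
    The properties students inspect on such servers -- molecular weight, logP, hydrogen-bond donors and acceptors -- can also be computed locally. The sketch below uses RDKit as a stand-in (an assumption for illustration; the course itself used the Osiris and Molinspiration web servers):

        from rdkit import Chem
        from rdkit.Chem import Descriptors, Lipinski

        def drug_likeness(smiles):
            """Lipinski-style properties relevant to pharmacokinetics."""
            mol = Chem.MolFromSmiles(smiles)
            return {
                "MW":          Descriptors.MolWt(mol),     # molecular weight
                "logP":        Descriptors.MolLogP(mol),   # lipophilicity
                "H donors":    Lipinski.NumHDonors(mol),
                "H acceptors": Lipinski.NumHAcceptors(mol),
            }

        print(drug_likeness("CC(=O)Oc1ccccc1C(=O)O"))      # aspirin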

  20. COMPUTER TOOLS OF DYNAMIC MATHEMATIC SOFTWARE AND METHODICAL PROBLEMS OF THEIR USE

    Olena V. Semenikhina

    2014-08-01

    Full Text Available The article presents the results of an analysis of the standard computer tools of dynamic mathematics software which are used in solving tasks, and of the tools on which the teacher can rely in teaching mathematics. The possibility of organizing experimental investigation of mathematical objects on the basis of these tools, the formulation of new tasks based on a limited number of tools, and fast automated checking are specified. Some methodological comments on the application of computer tools and methodological features of the use of interactive mathematical environments are presented. Problems arising from the use of computer tools are identified, among them the teacher's need to rethink the forms and methods of training, the search for creative problems, the rational choice of environment, the checking of e-solutions, and common mistakes in the use of computer tools.

  1. Gear cutting tools fundamentals of design and computation

    Radzevich, Stephen P

    2010-01-01

    Presents the DG/K-based method of surface generation, a novel and practical mathematical method for designing gear cutting tools with optimal parameters. This book proposes a scientific classification for the various kinds of the gear machining meshes, discussing optimal designs of gear cutting tools.

  2. Computer-mediated-communication and social networking tools at work

    Ou, C.X.J.; Sia, C.L.; Hui, C.K.

    2013-01-01

    Purpose – Advances in information technology (IT) have resulted in the development of various computer‐mediated communication (CMC) and social networking tools. However, quantifying the benefits of utilizing these tools in the organizational context remains a challenge. In this study, the authors

  3. Effects of Attitudes and Behaviours on Learning Mathematics with Computer Tools

    Reed, Helen C.; Drijvers, Paul; Kirschner, Paul A.

    2010-01-01

    This mixed-methods study investigates the effects of student attitudes and behaviours on the outcomes of learning mathematics with computer tools. A computer tool was used to help students develop the mathematical concept of function. In the whole sample (N = 521), student attitudes could account for a 3.4 point difference in test scores between…

  4. Managing Laboratory Data Using Cloud Computing as an Organizational Tool

    Bennett, Jacqueline; Pence, Harry E.

    2011-01-01

    One of the most significant difficulties encountered when directing undergraduate research and developing new laboratory experiments is how to efficiently manage the data generated by a number of students. Cloud computing, where both software and computer files reside online, offers a solution to this data-management problem and allows researchers…

  5. Computer Art--A New Tool in Advertising Graphics.

    Wassmuth, Birgit L.

    Using computers to produce art began with scientists, mathematicians, and individuals with strong technical backgrounds who used the graphic material as visualizations of data in technical fields. People are using computer art in advertising, as well as in painting; sculpture; music; textile, product, industrial, and interior design; architecture;…

  6. Computational tools for cyclotron design, commissioning, and operation

    Kost, C.J.

    1989-05-01

    Many support systems are required in the design, commissioning, and normal operation of a modern cyclotron. Presented is an overview of the computing environment developed during these various stages at TRIUMF. The current computing environment is also discussed, with emphasis on how one can provide an integrated system which is user-friendly

  7. iTools: a framework for classification, categorization and integration of computational biology resources.

    Ivo D Dinov

    2008-05-01

    Full Text Available The advancement of the computational biology field hinges on progress in three fundamental directions--the development of new computational algorithms, the availability of informatics resource management infrastructures and the capability of tools to interoperate and synergize. There is an explosion in algorithms and tools for computational biology, which makes it difficult for biologists to find, compare and integrate such resources. We describe a new infrastructure, iTools, for managing the query, traversal and comparison of diverse computational biology resources. Specifically, iTools stores information about three types of resources--data, software tools and web-services. The iTools design, implementation and resource meta-data content reflect the broad research, computational, applied and scientific expertise available at the seven National Centers for Biomedical Computing. iTools provides a system for classification, categorization and integration of different computational biology resources across space-and-time scales, biomedical problems, computational infrastructures and mathematical foundations. A large number of resources are already iTools-accessible to the community and this infrastructure is rapidly growing. iTools includes human and machine interfaces to its resource meta-data repository. Investigators or computer programs may utilize these interfaces to search, compare, expand, revise and mine meta-data descriptions of existent computational biology resources. We propose two ways to browse and display the iTools dynamic collection of resources. The first one is based on an ontology of computational biology resources, and the second one is derived from hyperbolic projections of manifolds or complex structures onto planar discs. iTools is an open source project both in terms of the source code development as well as its meta-data content. iTools employs a decentralized, portable, scalable and lightweight framework for long

  8. A tool for computing diversity and consideration on differences between diversity indices

    Palaghianu, Ciprian

    2016-01-01

    Diversity represents a key concept in ecology, and there are various methods of assessing it. The multitude of diversity indices is quite puzzling, and the indices are sometimes difficult to compute for a large volume of data. This paper promotes a computational tool for assessing the diversity of different entities. The BIODIV software is a user-friendly tool, developed using Microsoft Visual Basic. It is capable of computing several diversity indices, such as: Shannon, Simpson, Pielou, Brillouin, Berger-Park...
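
    Several of the indices BIODIV computes are short formulas; illustrative Python versions follow (BIODIV itself is written in Visual Basic, and these are textbook definitions rather than the tool's code):

        import math

        def shannon(counts):
            """Shannon index H' = -sum(p_i * ln p_i)."""
            n = sum(counts)
            return -sum((c / n) * math.log(c / n) for c in counts if c > 0)

        def simpson(counts):
            """Simpson's index of diversity, 1 - sum(p_i**2)."""
            n = sum(counts)
            return 1.0 - sum((c / n) ** 2 for c in counts)

        def pielou(counts):
            """Pielou's evenness J = H' / ln(S), with S the species count."""
            s = sum(1 for c in counts if c > 0)
            return shannon(counts) / math.log(s) if s > 1 else 0.0

        abundances = [50, 30, 15, 5]       # individuals per species in one plot
        print(shannon(abundances), simpson(abundances), pielou(abundances))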

  9. IHT: Tools for Computing Insolation Absorption by Particle Laden Flows

    Grout, R. W.

    2013-10-01

    This report describes IHT, a toolkit for computing radiative heat exchange between particles. Well suited for insolation absorption computations, it also has potential applications in combustion (sooting flames), biomass gasification processes and similar processes. The algorithm is based on the 'Photon Monte Carlo' approach and is implemented in a library that can be interfaced with a variety of computational fluid dynamics codes to analyze radiative heat transfer in particle-laden flows. The emphasis in this report is on the data structures and organization of IHT for developers seeking to use the IHT toolkit to add Photon Monte Carlo capabilities to their own codes.
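
    The core of the Photon Monte Carlo approach is sampling photon free paths from the Beer-Lambert distribution; a minimal absorption estimate in a uniform medium illustrates the sampling idea only, not IHT's data structures:

        import math, random

        def photon_absorbed_fraction(kappa, depth, n_photons=100_000):
            """Monte Carlo estimate of absorption in a uniform particle cloud
            with absorption coefficient kappa [1/m] over a given depth [m].
            Free paths are sampled as s = -ln(U)/kappa; the analytic answer
            is 1 - exp(-kappa * depth)."""
            absorbed = 0
            for _ in range(n_photons):
                if -math.log(random.random()) / kappa < depth:
                    absorbed += 1
            return absorbed / n_photons

        print(photon_absorbed_fraction(kappa=2.0, depth=0.5))  # ~1 - e**-1 = 0.632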

  10. Risks and benefits of social computing as a healthcare tool

    Mxoli, Avuya

    2016-03-01

    Full Text Available Cybercitizen describes a frequent user of the Internet or in other terms, a member of an online community (cybercommunity). This digital space can be used to participate in educational, economical and cultural activities. Social computing...

  11. Computers in the Classroom: From Tool to Medium.

    Perrone, Corrina; Repenning, Alexander; Spencer, Sarah; Ambach, James

    1996-01-01

    Discusses the computer as a communication medium to support learning. Illustrates the benefits of this reconceptualization in the context of having students author and play interactive simulation games and exchange them over the Internet. (RS)

  12. Computer-generated movies as an analytic tool

    Elliott, R.L.

    1978-01-01

    One of the problems faced by the users of large, sophisticated modeling programs at the Los Alamos Scientific Laboratory (LASL) is the analysis of the results of their calculations. One of the more productive and frequently spectacular methods is the production of computer-generated movies. An overview of the generation of computer movies at LASL is presented. The hardware, software, and generation techniques are briefly discussed

  13. Evaluating tablet computers as a survey tool in rural communities.

    Newell, Steve M; Logan, Henrietta L; Guo, Yi; Marks, John G; Shepperd, James A

    2015-01-01

    Although tablet computers offer advantages in data collection over traditional paper-and-pencil methods, little research has examined whether the 2 formats yield similar responses, especially with underserved populations. We compared the 2 survey formats and tested whether participants' responses to common health questionnaires or perceptions of usability differed by survey format. We also tested whether we could replicate established paper-and-pencil findings via tablet computer. We recruited a sample of low-income community members living in the rural southern United States. Participants were 170 residents (black = 49%; white = 36%; other races and missing data = 15%) drawn from 2 counties meeting Florida's state statutory definition of rural with 100 persons or fewer per square mile. We randomly assigned participants to complete scales (Center for Epidemiologic Studies Depression Inventory and Regulatory Focus Questionnaire) along with survey format usability ratings via paper-and-pencil or tablet computer. All participants rated a series of previously validated posters using a tablet computer. Finally, participants completed comparisons of the survey formats and reported survey format preferences. Participants preferred using the tablet computer and showed no significant differences between formats in mean responses, scale reliabilities, or in participants' usability ratings. Overall, participants reported similar scales responses and usability ratings between formats. However, participants reported both preferring and enjoying responding via tablet computer more. Collectively, these findings are among the first data to show that tablet computers represent a suitable substitute among an underrepresented rural sample for paper-and-pencil methodology in survey research. Published 2014. This article is a U.S. Government work and is in the public domain in the USA.

  14. Safe manning of merchant ships: an approach and computer tool

    Alapetite, Alexandre; Kozin, Igor

    2017-01-01

    In the shipping industry, staffing expenses have become a vital competition parameter. In this paper, an approach and a software tool are presented to support decisions on the staffing of merchant ships. The tool is implemented in the form of a Web user interface that makes use of discrete......-event simulation and allows estimation of the workload and of whether different scenarios are successfully performed taking account of the number of crewmembers, watch schedules, distribution of competencies, and others. The software library ‘SimManning’ at the core of the project is provided as open source...
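
    A toy version of the discrete-event idea -- tasks arrive, crewmembers are seized and released -- can be written with a heap of "free at" times. SimManning models watch schedules and competency distributions on top of this; the numbers below are invented:

        import heapq

        def simulate_watch(tasks, n_crew):
            """Minimal discrete-event sketch of crew workload: tasks is a
            list of (start_time, duration); a task is missed if no
            crewmember is free when it starts."""
            free_at = [0.0] * n_crew      # earliest time each crewmember is free
            heapq.heapify(free_at)
            missed, busy_time = 0, 0.0
            for start, duration in sorted(tasks):
                earliest = heapq.heappop(free_at)
                if earliest <= start:
                    heapq.heappush(free_at, start + duration)
                    busy_time += duration
                else:                      # nobody available: task not performed
                    heapq.heappush(free_at, earliest)
                    missed += 1
            return missed, busy_time

        print(simulate_watch([(0, 4), (1, 3), (2, 2), (3, 1)], n_crew=2))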

  15. Automated planning target volume generation: an evaluation pitting a computer-based tool against human experts

    Ketting, Case H.; Austin-Seymour, Mary; Kalet, Ira; Jacky, Jon; Kromhout-Schiro, Sharon; Hummel, Sharon; Unger, Jonathan; Fagan, Lawrence M.; Griffin, Tom

    1997-01-01

    Purpose: Software tools are seeing increased use in three-dimensional treatment planning. However, the development of these tools frequently omits careful evaluation before placing them in clinical use. This study demonstrates the application of a rigorous evaluation methodology using blinded peer review to an automated software tool that produces ICRU-50 planning target volumes (PTVs). Methods and Materials: Seven physicians from three different institutions involved in three-dimensional treatment planning participated in the evaluation. Four physicians drew partial PTVs on nine test cases, consisting of four nasopharynx and five lung primaries. Using the same information provided to the human experts, the computer tool generated PTVs for comparison. The remaining three physicians, designated evaluators, individually reviewed the PTVs for acceptability. To exclude bias, the evaluators were blinded to the source (human or computer) of the PTVs they reviewed. Their scorings of the PTVs were statistically examined to determine if the computer tool performed as well as the human experts. Results: The computer tool was as successful as the human experts in generating PTVs. Failures were primarily attributable to insufficient margins around the clinical target volume and to encroachment upon critical structures. In a qualitative analysis, the human and computer experts displayed similar types and distributions of errors. Conclusions: Rigorous evaluation of computer-based radiotherapy tools requires comparison to current practice and can reveal areas for improvement before the tool enters clinical practice

  16. Improvement of Computer Software Quality through Software Automated Tools.

    1986-08-31

    ...maintain an effective and economical monitoring program that includes both processes and products and makes data available to the Government... Presented to help the AFPRO understand what a software tool is and how it works. There are many ways in which one can view the characteristics of soft...

  17. Implementing iRound: A Computer-Based Auditing Tool.

    Brady, Darcie

    Many hospitals use rounding or auditing as a tool to help identify gaps and needs in quality and process performance. Some hospitals are also using rounding to help improve the patient experience. It is known that purposeful rounding helps improve Hospital Consumer Assessment of Healthcare Providers and Systems scores by helping manage patient expectations, providing service recovery, and recognizing quality caregivers. Rounding works when a standard method is used across the facility, where data are comparable and trustworthy. This facility had a pen-and-paper process in place that made data reporting difficult and created a silo culture between departments, and most audits and rounds were completed differently on each unit. It was recognized that this facility needed to standardize the rounding and auditing process. The tool created by the Advisory Board called iRound was chosen as the tool this facility would use for patient experience rounds as well as process and quality rounding. The success of the iRound tool in this facility depended on several factors that started many months before implementation and continue through everyday usage.

  18. Computational tool for simulation of power and refrigeration cycles

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    Small improvements in the thermal efficiency of power cycles bring huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles allows modeling the optimal changes for best performance. There is also a research boom in the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, and in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and to select the working fluid would be helpful, because heat sources for cogeneration are very diverse and each case calls for a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in C++ with a graphical interface developed in the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the type of fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch, and finally generates a very educational report in PDF format via LaTeX.
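
    Since the tool computes thermodynamic properties through the CoolProp library, a minimal ideal-Rankine efficiency calculation shows the kind of evaluation involved; the state points and pressures below are illustrative, not a case from the paper:

        from CoolProp.CoolProp import PropsSI

        def rankine_efficiency(fluid="Water", p_boiler=8e6, p_cond=10e3):
            """Thermal efficiency of an ideal Rankine cycle with a saturated
            turbine inlet (pressures in Pa, enthalpies in J/kg)."""
            h1 = PropsSI("H", "P", p_cond, "Q", 0, fluid)        # condenser exit, sat. liquid
            v1 = 1.0 / PropsSI("D", "P", p_cond, "Q", 0, fluid)  # specific volume
            h2 = h1 + v1 * (p_boiler - p_cond)                   # isentropic pump work
            h3 = PropsSI("H", "P", p_boiler, "Q", 1, fluid)      # boiler exit, sat. vapor
            s3 = PropsSI("S", "P", p_boiler, "Q", 1, fluid)
            h4 = PropsSI("H", "P", p_cond, "S", s3, fluid)       # isentropic turbine exit
            w_net = (h3 - h4) - (h2 - h1)
            return w_net / (h3 - h2)

        print(f"{rankine_efficiency():.1%}")    # roughly 36% for these pressures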

  19. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2013-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I, a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and the re-usability of the visua...

  20. ATLAS Distributed Computing Monitoring tools during the LHC Run I

    Schovancova, J; The ATLAS collaboration; Di Girolamo, A; Jezequel, S; Ueda, I; Wenaus, T

    2014-01-01

    This contribution summarizes the evolution of the ATLAS Distributed Computing (ADC) Monitoring project during LHC Run I. ADC Monitoring targets three groups of customers: the ADC Operations team, to identify malfunctions early and escalate issues to an activity or service expert; ATLAS national contacts and sites, for real-time monitoring and long-term measurement of the performance of the provided computing resources; and ATLAS Management, for long-term trends and accounting information about ATLAS Distributed Computing resources. During LHC Run I, a significant development effort was invested in standardization of the monitoring and accounting applications in order to provide an extensive monitoring and accounting suite. ADC Monitoring applications separate the data layer and the visualization layer. The data layer exposes data in a predefined format. The visualization layer is designed bearing in mind the visual identity of the provided graphical elements, and the re-usability of the visua...

  1. Computer algebra as a research tool in physics

    Drouffe, J.M.

    1985-04-01

    The progress of computer algebra observed in recent years has certainly had an impact on physics. I want to clarify the role of these new techniques in this application domain and to analyze their present limitations. In Section 1, I briefly describe the use of algebraic manipulation programs at the elementary level. The numerical and symbolic solutions of problems are compared in Section 2. Section 3 is devoted to a prospective view of the use of computer algebra at the highest level, as an ''intelligent'' system. I recall in Section 4 what is required of a system to be used in physics

  2. A Perspective on Computational Human Performance Models as Design Tools

    Jones, Patricia M.

    2010-01-01

    The design of interactive systems, including levels of automation, displays, and controls, is usually based on design guidelines and iterative empirical prototyping. A complementary approach is to use computational human performance models to evaluate designs. An integrated strategy of model-based and empirical test and evaluation activities is particularly attractive as a methodology for verification and validation of human-rated systems for commercial space. This talk will review several computational human performance modeling approaches and their applicability to design of display and control requirements.

  3. Enhancing interest in statistics among computer science students using computer tool entrepreneur role play

    Judi, Hairulliza Mohamad; Sahari @ Ashari, Noraidah; Eksan, Zanaton Hj

    2017-04-01

    Previous research in Malaysia indicates that there is a problem regarding attitudes towards statistics among students. Students did not show positive attitudes in the affective, cognitive, capability, value, interest and effort aspects, although they did well in the difficulty aspect. This issue should be given substantial attention because students' attitudes towards statistics may affect the teaching and learning process of the subject. Teaching statistics using role play is an appropriate attempt to improve attitudes to statistics, to enhance the learning of statistical techniques and statistical thinking, and to increase generic skills. The objectives of the paper are to give an overview of role play in statistics learning and to assess the effect of these activities on students' attitudes and learning within an action research framework. The computer tool entrepreneur role play was conducted in a two-hour tutorial class session of first-year students in the Faculty of Information Sciences and Technology (FTSM), Universiti Kebangsaan Malaysia, enrolled in the Probability and Statistics course. The results show that most students felt that they had an enjoyable and great time in the role play. Furthermore, the benefits and disadvantages of the role play activities are highlighted to complete the review. Role play is expected to serve as an important activity that takes into account students' experience, emotions and responses, providing useful information on how to modify students' thinking or behavior to improve learning.

  4. Computer modelling as a tool for understanding language evolution

    de Boer, Bart; Gontier, N; VanBendegem, JP; Aerts, D

    2006-01-01

    This paper describes the uses of computer models in studying the evolution of language. Language is a complex dynamic system that can be studied at the level of the individual and at the level of the population. Much of the dynamics of language evolution and language change occur because of the

  5. The ZAP Project: Designing Interactive Computer Tools for Learning Psychology

    Hulshof, Casper; Eysink, Tessa; de Jong, Ton

    2006-01-01

    In the ZAP project, a set of interactive computer programs called "ZAPs" was developed. The programs were designed in such a way that first-year students experience psychological phenomena in a vivid and self-explanatory way. Students can either take the role of participant in a psychological experiment, they can experience phenomena themselves,…

  6. Coordinated computer-supported collaborative learning: Awareness and awareness tools

    Janssen, J.J.H.M.; Bodermer, D.

    2013-01-01

    Traditionally, research on awareness during online collaboration focused on topics such as the effects of spatial information about group members’ activities on the collaborative process. When the concept of awareness was introduced to computer-supported collaborative learning, this focus shifted to

  7. Computers, Laptops and Tools. ACER Research Monograph No. 56.

    Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian

    In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…

  8. Computer Generated Optical Illusions: A Teaching and Research Tool.

    Bailey, Bruce; Harman, Wade

    Interactive computer-generated simulations that highlight psychological principles were investigated in this study in which 33 female and 19 male undergraduate college student volunteers of median age 21 matched line and circle sizes in six variations of Ponzo's illusion. Prior to working with the illusions, data were collected based on subjects'…

  9. Uranus: a rapid prototyping tool for FPGA embedded computer vision

    Rosales-Hernández, Victor; Castillo-Jimenez, Liz; Viveros-Velez, Gilberto; Zuñiga-Grajeda, Virgilio; Treviño Torres, Abel; Arias-Estrada, M.

    2007-01-01

    The starting point for all successful system development is simulation. Performing high-level simulation of a system can help to identify, isolate and fix design problems. This work presents Uranus, a software tool for the simulation and evaluation of image processing algorithms, with support for migrating them to an FPGA environment for algorithm acceleration and embedded processing purposes. The tool includes an integrated library of previously coded operators in software and provides the necessary support to read and display image sequences as well as video files. The user can use the previously compiled soft-operators in a high-level process chain and code his own operators. In addition to the prototyping tool, Uranus offers an FPGA-based hardware architecture with the same organization as the software prototyping part. The hardware architecture contains a library of FPGA IP cores for image processing that are connected with a PowerPC-based system. The Uranus environment is intended for rapid prototyping of machine vision and migration to an FPGA accelerator platform, and it is distributed for academic purposes.

  10. A sampler of useful computational tools for applied geometry, computer graphics, and image processing foundations for computer graphics, vision, and image processing

    Cohen-Or, Daniel; Ju, Tao; Mitra, Niloy J; Shamir, Ariel; Sorkine-Hornung, Olga; Zhang, Hao (Richard)

    2015-01-01

    A Sampler of Useful Computational Tools for Applied Geometry, Computer Graphics, and Image Processing shows how to use a collection of mathematical techniques to solve important problems in applied mathematics and computer science areas. The book discusses fundamental tools in analytical geometry and linear algebra. It covers a wide range of topics, from matrix decomposition to curvature analysis and principal component analysis to dimensionality reduction.Written by a team of highly respected professors, the book can be used in a one-semester, intermediate-level course in computer science. It

  11. PCE: web tools to compute protein continuum electrostatics

    Miteva, Maria A.; Tufféry, Pierre; Villoutreix, Bruno O.

    2005-01-01

    PCE (protein continuum electrostatics) is an online service for protein electrostatic computations presently based on the MEAD (macroscopic electrostatics with atomic detail) package initially developed by D. Bashford [(2004) Front Biosci., 9, 1082–1099]. This computer method uses a macroscopic electrostatic model for the calculation of protein electrostatic properties, such as pKa values of titratable groups and electrostatic potentials. The MEAD package generates electrostatic energies via finite difference solution to the Poisson–Boltzmann equation. Users submit a PDB file and PCE returns potentials and pKa values as well as color (static or animated) figures displaying electrostatic potentials mapped on the molecular surface. This service is intended to facilitate electrostatics analyses of proteins and thereby broaden the accessibility to continuum electrostatics to the biological community. PCE can be accessed at . PMID:15980492
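
    A typical downstream use of the pKa values a continuum-electrostatics server like PCE returns is estimating a group's protonation state at a given pH via the Henderson-Hasselbalch relation; a sketch, where the shifted pKa of 6.2 is an invented example:

        def protonated_fraction(pka, ph):
            """Fraction of an acidic group still protonated at a given pH,
            from the Henderson-Hasselbalch relation."""
            return 1.0 / (1.0 + 10.0 ** (ph - pka))

        # An Asp side chain whose computed pKa is shifted up to 6.2:
        for ph in (4.0, 6.2, 7.4):
            print(ph, f"{protonated_fraction(6.2, ph):.2f}")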

  12. Computer aided systems human engineering: A hypermedia tool

    Boff, Kenneth R.; Monk, Donald L.; Cody, William J.

    1992-01-01

    The Computer Aided Systems Human Engineering (CASHE) system, Version 1.0, is a multimedia ergonomics database on CD-ROM for the Apple Macintosh II computer, being developed for use by human system designers, educators, and researchers. It will initially be available on CD-ROM and will allow users to access ergonomics data and models stored electronically as text, graphics, and audio. The CASHE CD-ROM, Version 1.0 will contain the Boff and Lincoln (1988) Engineering Data Compendium, MIL-STD-1472D and a unique, interactive simulation capability, the Perception and Performance Prototyper. Its features also include a specialized data retrieval, scaling, and analysis capability and the state of the art in information retrieval, browsing, and navigation.

  13. Present status of computational tools for maglev development

    Wang, Z.; Chen, S.S.; Rote, D.M.

    1991-10-01

    High-speed vehicles that employ magnetic levitation (maglev) have received great attention worldwide as a means of relieving both highway and air-traffic congestion. At this time, Japan and Germany are leading the development of maglev. After fifteen years of inactivity that is attributed to technical policy decisions, the federal government of the United States has reconsidered the possibility of using maglev in the United States. The National Maglev Initiative (NMI) was established in May 1990 to assess the potential of maglev in the United States. One of the tasks of the NMI, which is also the objective of this report, is to determine the status of existing computer software that can be applied to maglev-related problems. The computational problems involved in maglev assessment, research, and development can be classified into two categories: electromagnetic and mechanical. Because most maglev problems are complicated and difficult to solve analytically, proper numerical methods are needed to find solutions. To determine the status of maglev-related software, developers and users of computer codes were surveyed. The results of the survey are described in this report. 25 refs.

  14. Tools for Analyzing Computing Resource Management Strategies and Algorithms for SDR Clouds

    Marojevic, Vuk; Gomez-Miguelez, Ismael; Gelonch, Antoni

    2012-09-01

    Software defined radio (SDR) clouds centralize the computing resources of base stations. The computing resource pool is shared between radio operators and dynamically loads and unloads digital signal processing chains for providing wireless communications services on demand. Each new user session request requires, in particular, the allocation of computing resources for executing the corresponding SDR transceivers. The huge amount of computing resources in SDR cloud data centers and the numerous session requests at certain hours of the day require efficient computing resource management. We propose a hierarchical approach, where the data center is divided into clusters that are managed in a distributed way. This paper presents a set of computing resource management tools for analyzing computing resource management strategies and algorithms for SDR clouds. We use the tools for evaluating different strategies and algorithms. The results show that more sophisticated algorithms can achieve higher resource occupation and that a tradeoff exists between cluster size and algorithm complexity.
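
    A minimal example of the kind of allocation strategy such tools compare is first-fit placement of transceiver computing demands onto clusters; the capacities and demands below are invented for illustration:

        def first_fit(requests, clusters):
            """Assign each session's computing demand to the first cluster
            with spare capacity -- the simplest strategy to benchmark.

            requests : list of (session_id, demanded_ops)
            clusters : list of remaining capacities (mutated in place)
            """
            placement, rejected = {}, []
            for session, demand in requests:
                for i, free in enumerate(clusters):
                    if demand <= free:
                        clusters[i] -= demand
                        placement[session] = i
                        break
                else:
                    rejected.append(session)  # no cluster can host this transceiver
            return placement, rejected

        print(first_fit([("u1", 40), ("u2", 70), ("u3", 50)], [100, 60]))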

  15. A basic tool for computer-aided sail design

    Thrasher, D.F.; Dunyak, T.J.; Mook, D.T.; Nayfeh, A.H.

    1985-01-01

    Recent developments in modelling lifting surfaces have provided a tool that can also be used to model sails. The simplest of the adequate models is the vortex-lattice method. This method can fully account for the aerodynamic interactions among several lifting surfaces having arbitrary planforms, camber, and twist, as long as separation occurs only along the edges and the phenomenon known as vortex bursting does not occur near the sails. This paper describes the method and how it can be applied to the design of sails
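
    In two dimensions the vortex-lattice idea reduces to the lumped-vortex method: one bound vortex per panel at its quarter-chord, with flow tangency enforced at the three-quarter-chord control points. The sketch below recovers thin-airfoil theory's CL = 2*pi*alpha for a flat plate; it is a 1-D illustration of the scheme, not a sail model:

        import math
        import numpy as np

        def flat_plate_cl(alpha_deg, n_panels=50, chord=1.0, u_inf=1.0):
            """Lift coefficient of a flat plate by the lumped-vortex method."""
            alpha = math.radians(alpha_deg)
            dx = chord / n_panels
            xv = (np.arange(n_panels) + 0.25) * dx   # bound vortices at quarter chord
            xc = (np.arange(n_panels) + 0.75) * dx   # control points at 3/4 chord
            # Influence matrix of 2-D point vortices, and no-penetration RHS.
            a = 1.0 / (2.0 * math.pi * (xc[:, None] - xv[None, :]))
            gamma = np.linalg.solve(a, np.full(n_panels, u_inf * alpha))
            return 2.0 * gamma.sum() / (u_inf * chord)

        # Both values come out near 0.548 at 5 degrees of incidence:
        print(flat_plate_cl(5.0), 2 * math.pi * math.radians(5.0))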

  16. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    Nodine, Calvin

    2000-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  17. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    Nodine, Calvin

    1999-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  18. Computer-Assisted Visual Search/Decision Aids as a Training Tool for Mammography

    Nodine, Calvin

    1998-01-01

    The primary goal of the project is to develop a computer-assisted visual search (CAVS) mammography training tool that will improve the perceptual and cognitive skills of trainees leading to mammographic expertise...

  19. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase I

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a computational tool with unique predictive capabilities for the aerothermodynamic environment around ablation-cooled...

  20. Computational Tool for Coupled Simulation of Nonequilibrium Hypersonic Flows with Ablation, Phase II

    National Aeronautics and Space Administration — The goal of this SBIR project is to develop a predictive computational tool for the aerothermal environment around ablation-cooled hypersonic atmospheric entry...

  1. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    Laganà, Alessandro; Shasha, Dennis; Croce, Carlo Maria

    2014-01-01

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  2. Synthetic RNAs for Gene Regulation: Design Principles and Computational Tools

    Laganà, Alessandro [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States); Shasha, Dennis [Courant Institute of Mathematical Sciences, New York University, New York, NY (United States); Croce, Carlo Maria [Department of Molecular Virology, Immunology and Medical Genetics, Comprehensive Cancer Center, The Ohio State University, Columbus, OH (United States)

    2014-12-11

    The use of synthetic non-coding RNAs for post-transcriptional regulation of gene expression has not only become a standard laboratory tool for gene functional studies but it has also opened up new perspectives in the design of new and potentially promising therapeutic strategies. Bioinformatics has provided researchers with a variety of tools for the design, the analysis, and the evaluation of RNAi agents such as small-interfering RNA (siRNA), short-hairpin RNA (shRNA), artificial microRNA (a-miR), and microRNA sponges. More recently, a new system for genome engineering based on the bacterial CRISPR-Cas9 system (Clustered Regularly Interspaced Short Palindromic Repeats), was shown to have the potential to also regulate gene expression at both transcriptional and post-transcriptional level in a more specific way. In this mini review, we present RNAi and CRISPRi design principles and discuss the advantages and limitations of the current design approaches.

  3. Towards early software reliability prediction for computer forensic tools (case study).

    Abu Talib, Manar

    2016-01-01

    Versatility, flexibility and robustness are essential requirements for software forensic tools. Researchers and practitioners need to put more effort into assessing this type of tool. A Markov model is a robust means for analyzing and anticipating the functioning of an advanced component-based system. It is used, for instance, to analyze the reliability of the state machines of real-time reactive systems. This research extends the architecture-based software reliability prediction model for computer forensic tools, which is based on Markov chains and COSMIC-FFP. Basically, every part of the computer forensic tool is linked to a discrete-time Markov chain. If this can be done, then a probabilistic analysis by Markov chains can be performed to analyze the reliability of the components and of the whole tool. The purposes of the proposed reliability assessment method are to evaluate the tool's reliability in the early phases of its development, to improve the reliability assessment process for large computer forensic tools over time, and to compare alternative tool designs. The reliability analysis can assist designers in choosing the most reliable topology for the components, which can maximize the reliability of the tool and meet the expected reliability level specified by the end-user. The approach of assessing component-based tool reliability in the COSMIC-FFP context is illustrated with the Forensic Toolkit Imager case study.
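
    As a concrete illustration of the kind of calculation such a Markov-chain model enables, the sketch below implements a Cheung-style architecture-based reliability estimate; the transition matrix, component reliabilities and the exact formulation are assumptions for illustration, not the paper's data.

      import numpy as np

      def system_reliability(P: np.ndarray, R: np.ndarray) -> float:
          """P[i, j]: probability that control transfers from component i to j.
          R[i]: reliability of component i. Component 0 is the entry point,
          component n-1 the exit; returns the overall tool reliability."""
          n = len(R)
          Q = np.diag(R) @ P                   # a transfer succeeds only if the component works
          S = np.linalg.inv(np.eye(n) - Q)     # expected successful transitions between components
          return S[0, n - 1] * R[n - 1]        # reach the exit component, which must also work

      # Hypothetical three-component tool with a 10% retry loop on component 1.
      P = np.array([[0.0, 1.0, 0.0],
                    [0.0, 0.1, 0.9],
                    [0.0, 0.0, 0.0]])
      R = np.array([0.99, 0.98, 0.995])
      print(round(system_reliability(P, R), 4))   # -> 0.9632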

  4. The Use of Computers as a Design Tool.

    1980-01-01

    sponsor of numerical computing engines for defense needs; there is no such driving sponsorship today. It is concluded that as a result of these changes… objectives may be either difficult to attain, impossible, or too easy to attain; for this, the person responsible for the project must define… [remainder of record: garbled OCR and residue of a machine-comparison chart including the CRAY 1, BURROUGHS 6700, CYBER, AMDAHL, IBM 360/78, and UNIVAC]

  5. Configuration monitoring tool for large-scale distributed computing

    Wu, Y.; Graham, G.; Lu, X.; Afaq, A.; Kim, B.J.; Fisk, I.

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing need. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have in place a configuration monitor. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, are used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources

  6. Soft computing simulation tools for nuclear energy systems

    Kannan Balasubramanian, S.

    2012-01-01

    This chapter deals with simulation, a very powerful tool in designing, constructing and operating nuclear power generating facilities. There are very different types of power plants, and the examples mentioned in this chapter originate from experience with water-cooled and water-moderated thermal reactors, based on fission of uranium-235. Nevertheless, the methodological achievements in simulation mentioned below can definitely be used not only for this particular type of nuclear power generating reactor. Simulation means investigation of processes in the time domain. We can calculate the characteristics and properties of different systems; for example, we can design a bridge over a river, but whether its movement under a thunderstorm with high winds would, after a certain time, evolve into destructive oscillation can only be determined by calculating its response over time. This type of calculation is called simulation.

  7. Platformation: Cloud Computing Tools at the Service of Social Change

    Anil Patel

    2012-07-01

    Full Text Available The following article establishes some context and definitions for what is termed the “sharing imperative” – a movement or tendency towards sharing information online and in real time that has rapidly transformed several industries. As internet-enabled devices proliferate to all corners of the globe, ways of working and accessing information have changed. Users now expect to be able to access the products, services, and information that they want from anywhere, at any time, on any device. This article addresses how the nonprofit sector might respond to those demands by embracing the sharing imperative. It suggests that how well an organization shares has become one of the most pressing governance questions a nonprofit organization must tackle. Finally, the article introduces Platformation, a project whereby tools that enable better inter and intra-organizational sharing are tested for scalability, affordability, interoperability, and security, all with a non-profit lens.

  8. Configuration monitoring tool for large-scale distributed computing

    Wu, Y; Fisk, I; Graham, G; Kim, B J; Lü, X

    2004-01-01

    The Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC) at CERN will likely use a grid system to achieve much of its offline processing need. Given the heterogeneous and dynamic nature of grid systems, it is desirable to have in place a configuration monitor. The configuration monitoring tool is built using the Globus toolkit and web services. It consists of an information provider for the Globus MDS, a relational database for keeping track of the current and old configurations, and client interfaces to query and administer the configuration system. The Grid Security Infrastructure (GSI), together with EDG Java Security packages, are used for secure authentication and transparent access to the configuration information across the CMS grid. This work has been prototyped and tested using US-CMS grid resources.

  9. HCI^2 Workbench: A Development Tool for Multimodal Human-Computer Interaction Systems

    Shen, Jie; Shi, Wenzhe; Pantic, Maja

    In this paper, we present a novel software tool designed and implemented to simplify the development process of Multimodal Human-Computer Interaction (MHCI) systems. This tool, which is called the HCI^2 Workbench, exploits a Publish / Subscribe (P/S) architecture [13] [14] to facilitate efficient

  10. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools in the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning of science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  11. DEVELOPMENT AND USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOLS FOR POLLUTION PREVENTION

    The use of Computer-Aided Process Engineering (CAPE) and process simulation tools has become established industry practice to predict… With simulation software, new opportunities are available for the creation of a wide range of ancillary tools that can be used from within multiple sim...

  12. Teachers' Use of Computational Tools to Construct and Explore Dynamic Mathematical Models

    Santos-Trigo, Manuel; Reyes-Rodriguez, Aaron

    2011-01-01

    To what extent does the use of computational tools offer teachers the possibility of constructing dynamic models to identify and explore diverse mathematical relations? What ways of reasoning or thinking about the problems emerge during the model construction process that involves the use of the tools? These research questions guided the…

  13. Computer assisted audit tools and techniques in real world: CAATT's applications and approaches in context

    Pedrosa, I.; Costa, C. J.

    2012-01-01

    Nowadays, Computer Assisted Audit Tools and Techniques (CAATTs) support almost all audit processes concerning data extraction and analysis. These tools were initially aimed at supporting financial auditing processes. However, their scope goes beyond this; therefore, we present case studies and good practices in an academic context. Although audit tools for data extraction and analysis are very common in large auditing companies and applied in several contexts, we realized that it is not easy to find practical...

  14. Towards a Tool for Computer Supported Structuring of Products

    Hansen, Claus Thorp

    1997-01-01

    … However, a product possesses not only a component structure but also various organ structures which are superimposed on the component structure. The organ structures carry behaviour and make the product suited for its life phases. Our long-term research goal is to develop a computer-based system that is capable of supporting synthesis activities in engineering design, and thereby also support handling of various organ structures. Such a system must contain a product model, in which it is possible to describe and manipulate both various organ structures and the component structure. In this paper we focus on the relationships between organ structures and the component structure. By an analysis of an existing product it is shown that a component may contribute to more than one organ. A set of organ structures is identified and their influence on the component structure is illustrated.

  15. Assessing Affordances of Selected Cloud Computing Tools for Language Teacher Education in Nigeria

    Ofemile, Abdulmalik Yusuf

    2015-01-01

    This paper reports part of a study that hoped to understand Teacher Educators' (TE) assessment of the affordances of selected cloud computing tools ranked among the top 100 for the year 2010. Research has shown that ICT and by extension cloud computing has positive impacts on daily life and this informed the Nigerian government's policy to…

  16. Development and Assessment of a Chemistry-Based Computer Video Game as a Learning Tool

    Martinez-Hernandez, Kermin Joel

    2010-01-01

    The chemistry-based computer video game is a multidisciplinary collaboration between chemistry and computer graphics and technology fields developed to explore the use of video games as a possible learning tool. This innovative approach aims to integrate elements of commercial video game and authentic chemistry context environments into a learning…

  17. The Computer as a Tool for Learning through Reflection. Technical Report No. 376.

    Collins, Allan; Brown, John Seely

    Because of its ability to record and represent process, the computer can provide a powerful, motivating, and as yet untapped tool for focusing the students' attention directly on their own thought processes and learning through reflection. Properly abstracted and structured, the computational medium can capture the processes by which a novice or…

  18. Automatic Parallelization Tool: Classification of Program Code for Parallel Computing

    Mustafa Basthikodi

    2016-04-01

    Full Text Available Performance growth of single-core processors came to a halt in the past decade, but was re-enabled by the introduction of parallelism in processors. Multicore frameworks, along with Graphical Processing Units, have broadly empowered parallelism. Compilers are being updated to address the emerging challenges of synchronization and threading. Appropriate program and algorithm classification will greatly help software engineers find opportunities for effective parallelization. In the present work we investigate current species for the classification of algorithms; related work on classification is discussed, along with a comparison of the issues that challenge classification. A set of algorithms is chosen that matches the structure of the different issues and performs the given tasks. We tested these algorithms using existing automatic species-extraction tools along with the Bones compiler. We added functionality to the existing tool, providing a more detailed characterization. The contributions of our work include support for pointer arithmetic, conditional and incremental statements, user-defined types, constants, and mathematical functions. With this, we can retain significant data which is not captured by the original species of algorithms. We implemented the new features into the tool, enabling automatic characterization of program code.

  19. Cost-effective cloud computing: a case study using the comparative genomics tool, roundup.

    Kudtarkar, Parul; Deluca, Todd F; Fusaro, Vincent A; Tonellato, Peter J; Wall, Dennis P

    2010-12-22

    Comparative genomics resources, such as ortholog detection tools and repositories are rapidly increasing in scale and complexity. Cloud computing is an emerging technological paradigm that enables researchers to dynamically build a dedicated virtual cluster and may represent a valuable alternative for large computational tools in bioinformatics. In the present manuscript, we optimize the computation of a large-scale comparative genomics resource-Roundup-using cloud computing, describe the proper operating principles required to achieve computational efficiency on the cloud, and detail important procedures for improving cost-effectiveness to ensure maximal computation at minimal costs. Utilizing the comparative genomics tool, Roundup, as a case study, we computed orthologs among 902 fully sequenced genomes on Amazon's Elastic Compute Cloud. For managing the ortholog processes, we designed a strategy to deploy the web service, Elastic MapReduce, and maximize the use of the cloud while simultaneously minimizing costs. Specifically, we created a model to estimate cloud runtime based on the size and complexity of the genomes being compared that determines in advance the optimal order of the jobs to be submitted. We computed orthologous relationships for 245,323 genome-to-genome comparisons on Amazon's computing cloud, a computation that required just over 200 hours and cost $8,000 USD, at least 40% less than expected under a strategy in which genome comparisons were submitted to the cloud randomly with respect to runtime. Our cost savings projections were based on a model that not only demonstrates the optimal strategy for deploying RSD to the cloud, but also finds the optimal cluster size to minimize waste and maximize usage. Our cost-reduction model is readily adaptable for other comparative genomics tools and potentially of significant benefit to labs seeking to take advantage of the cloud as an alternative to local computing infrastructure.
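
    The fitted cost model itself is not reproduced in the record, but the scheduling idea (predict each job's runtime in advance, then order submissions so that no instance idles behind stragglers) can be sketched as a longest-processing-time heuristic; the linear runtime model and the numbers below are placeholders, not Roundup's actual model.

      import heapq

      def predicted_runtime(size_a: int, size_b: int) -> float:
          return 0.001 * (size_a + size_b)        # hypothetical stand-in for the fitted cost model

      def schedule(jobs: list, n_instances: int) -> list:
          """jobs: (genome_size_a, genome_size_b) pairs; returns job indices per instance."""
          order = sorted(range(len(jobs)),
                         key=lambda j: predicted_runtime(*jobs[j]), reverse=True)
          heap = [(0.0, i) for i in range(n_instances)]       # (accumulated busy time, instance)
          assignment = [[] for _ in range(n_instances)]
          for j in order:                                     # longest jobs first
              busy, i = heapq.heappop(heap)                   # least-loaded instance so far
              assignment[i].append(j)
              heapq.heappush(heap, (busy + predicted_runtime(*jobs[j]), i))
          return assignment

      jobs = [(4000, 3000), (900, 800), (12000, 7000), (2500, 2500)]
      print(schedule(jobs, n_instances=2))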

  20. 8th International Workshop on Parallel Tools for High Performance Computing

    Gracia, José; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2015-01-01

    Numerical simulation and modelling using High Performance Computing has evolved into an established technique in academic and industrial research. At the same time, the High Performance Computing infrastructure is becoming ever more complex. For instance, most of the current top systems around the world use thousands of nodes in which classical CPUs are combined with accelerator cards in order to enhance their compute power and energy efficiency. This complexity can only be mastered with adequate development and optimization tools. Key topics addressed by these tools include parallelization on heterogeneous systems, performance optimization for CPUs and accelerators, debugging of increasingly complex scientific applications, and optimization of energy usage in the spirit of green IT. This book represents the proceedings of the 8th International Parallel Tools Workshop, held October 1-2, 2014 in Stuttgart, Germany – a forum to discuss the latest advancements in parallel tools.

  1. A review of computer tools for analysing the integration of renewable energy into various energy systems

    Connolly, D.; Lund, Henrik; Mathiesen, Brian Vad

    2010-01-01

    This paper includes a review of the different computer tools that can be used to analyse the integration of renewable energy. Initially 68 tools were considered, but 37 were included in the final analysis, which was carried out in collaboration with the tool developers or recommended points of contact. The results in this paper provide the information necessary to identify a suitable energy tool for analysing the integration of renewable energy into various energy-systems under different objectives. It is evident from this paper that there is no energy tool that addresses all issues related to integrating renewable energy; instead, the ‘ideal’ energy tool is highly dependent on the specific objectives that must be fulfilled. The typical applications for the 37 tools reviewed (from analysing single-building systems to national energy-systems), combined with numerous other factors…

  2. Problems and Issues in Using Computer- Based Support Tools to Enhance 'Soft' Systems Methodologies

    Mark Stansfield

    2001-11-01

    Full Text Available This paper explores the issue of whether computer-based support tools can enhance the use of 'soft' systems methodologies as applied to real-world problem situations. Although work has been carried out by a number of researchers in applying computer-based technology to concepts and methodologies relating to 'soft' systems thinking such as Soft Systems Methodology (SSM), such attempts appear to be still in their infancy and have not been applied widely to real-world problem situations. This paper will highlight some of the problems that may be encountered in attempting to develop computer-based support tools for 'soft' systems methodologies. Particular attention will be paid to an attempt by the author to develop a computer-based support tool for a particular 'soft' systems method of inquiry known as the Appreciative Inquiry Method that is based upon Vickers' notion of 'appreciation' (Vickers, 1965) and Checkland's SSM (Checkland, 1981). The final part of the paper will explore some of the lessons learnt from developing and applying the computer-based support tool to a real world problem situation, as well as considering the feasibility of developing computer-based support tools for 'soft' systems methodologies. This paper will put forward the point that a mixture of manual and computer-based tools should be employed to allow a methodology to be used in an unconstrained manner, but the benefits provided by computer-based technology should be utilised in supporting and enhancing the more mundane and structured tasks.

  3. 9th International Workshop on Parallel Tools for High Performance Computing

    Hilbrich, Tobias; Niethammer, Christoph; Gracia, José; Nagel, Wolfgang; Resch, Michael

    2016-01-01

    High Performance Computing (HPC) remains a driver that offers huge potentials and benefits for science and society. However, a profound understanding of the computational matters and specialized software is needed to arrive at effective and efficient simulations. Dedicated software tools are important parts of the HPC software landscape, and support application developers. Even though a tool is by definition not a part of an application, but rather a supplemental piece of software, it can make a fundamental difference during the development of an application. Such tools aid application developers in the context of debugging, performance analysis, and code optimization, and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 9th International Parallel Tools Workshop held in Dresden, Germany, September 2-3, 2015, which offered an established forum for discussing the latest advances in paral...

  4. System-level tools and reconfigurable computing for next-generation HWIL systems

    Stark, Derek; McAulay, Derek; Cantle, Allan J.; Devlin, Malachy

    2001-08-01

    Previous work has been presented on the creation of computing architectures called DIME, which addressed the particular computing demands of hardware-in-the-loop systems. These demands include low latency, high data rates and interfacing. While it is essential to have a capable platform for handling and processing of the data streams, the tools must also complement this so that a systems engineer is able to construct the final system. The paper will present the work in the area of integration of system-level design tools, such as MATLAB and SIMULINK, with a reconfigurable computing platform. This will demonstrate how algorithms can be implemented and simulated in a familiar rapid application development environment before they are automatically transposed for downloading directly to the computing platform. This complements the established control tools, which handle the configuration and control of the processing systems, leading to a tool suite for system development and implementation. As the development tools have evolved, the core processing platform has also been enhanced. These improved platforms are based on dynamically reconfigurable computing, utilizing FPGA technologies, and parallel processing methods that more than double the performance and data bandwidth capabilities. This offers support for the processing of images in Infrared Scene Projectors with 1024 × 1024 resolution at 400 Hz frame rates. The processing elements will be using the latest generation of FPGAs, which implies that the presented systems will be rated in terms of Tera (10^12) operations per second.

  5. Informed public choices for low-carbon electricity portfolios using a computer decision tool.

    Mayer, Lauren A Fleishman; Bruine de Bruin, Wändi; Morgan, M Granger

    2014-04-01

    Reducing CO2 emissions from the electricity sector will likely require policies that encourage the widespread deployment of a diverse mix of low-carbon electricity generation technologies. Public discourse informs such policies. To make informed decisions and to productively engage in public discourse, citizens need to understand the trade-offs between electricity technologies proposed for widespread deployment. Building on previous paper-and-pencil studies, we developed a computer tool that aimed to help nonexperts make informed decisions about the challenges faced in achieving a low-carbon energy future. We report on an initial usability study of this interactive computer tool. After providing participants with comparative and balanced information about 10 electricity technologies, we asked them to design a low-carbon electricity portfolio. Participants used the interactive computer tool, which constrained portfolio designs to be realistic and yield low CO2 emissions. As they changed their portfolios, the tool updated information about projected CO2 emissions, electricity costs, and specific environmental impacts. As in the previous paper-and-pencil studies, most participants designed diverse portfolios that included energy efficiency, nuclear, coal with carbon capture and sequestration, natural gas, and wind. Our results suggest that participants understood the tool and used it consistently. The tool may be downloaded from http://cedmcenter.org/tools-for-cedm/informing-the-public-about-low-carbon-technologies/ .
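
    The arithmetic behind such a tool's live feedback is a demand-weighted sum over the chosen technologies. The sketch below shows the bookkeeping only; the per-technology emission and cost factors are placeholder values, not those used in the actual tool.

      FACTORS = {  # technology: (tCO2 per MWh, $ per MWh) -- illustrative placeholders
          "wind": (0.01, 60.0), "nuclear": (0.01, 90.0),
          "gas": (0.40, 50.0), "coal_ccs": (0.10, 95.0),
      }

      def portfolio_summary(shares: dict, demand_mwh: float):
          """shares: fraction of demand met by each technology (must sum to 1)."""
          assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must sum to 1"
          co2 = demand_mwh * sum(s * FACTORS[t][0] for t, s in shares.items())
          cost = demand_mwh * sum(s * FACTORS[t][1] for t, s in shares.items())
          return co2, cost

      print(portfolio_summary({"wind": 0.3, "nuclear": 0.3, "gas": 0.2, "coal_ccs": 0.2},
                              demand_mwh=1.0e6))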

  6. A computer-aided software-tool for sustainable process synthesis-intensification

    Kumar Tula, Anjan; Babi, Deenesh K.; Bottlaender, Jack

    2017-01-01

    …operations, as well as reported hybrid/intensified unit operations, is large and can be difficult to manually navigate in order to determine the best process flowsheet for the production of a desired chemical product. Therefore, it is beneficial to utilize computer-aided methods and tools to enumerate, analyze and determine within the design space the more sustainable processes. In this paper, an integrated computer-aided software-tool that searches the design space for hybrid/intensified more sustainable process options is presented. Embedded within the software architecture are process synthesis… constraints while also matching the design targets; they are therefore more sustainable than the base case. The application of the software-tool to the production of biodiesel is presented, highlighting the main features of the computer-aided, multi-stage, multi-scale methods that are able to determine more…

  7. 10th International Workshop on Parallel Tools for High Performance Computing

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany – a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role for numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists. Some tools have been commercialized, while others are operated as open source by a growing research community.

  8. The classification and evaluation of Computer-Aided Software Engineering tools

    Manley, Gary W.

    1990-01-01

    Approved for public release; distribution unlimited. The use of Computer-Aided Software Engineering (CASE) tools has been viewed as a remedy for the software development crisis by achieving improved productivity and system quality via the automation of all or part of the software engineering process. The proliferation and tremendous variety of tools available have stretched the understanding of experienced practitioners and have had a profound impact on the software engineering process itse...

  9. Decomposition recovery extension to the Computer Aided Prototyping System (CAPS) change-merge tool.

    Keesling, William Ronald

    1997-01-01

    Approved for public release; distribution is unlimited A promising use of Computer Aided Prototyping System (CAPS) is to support concurrent design. Key to success in this context is the ability to automatically and reliably combine and integrate the prototypes produced in concurrent efforts. Thus, to be of practical use in this as well as most prototyping contexts, a CAPS tool must have a fast, automated, reliable prototype integration capability. The current CAPS Change Merge Tool is fast...

  10. Issues on the Development and Application of Computer Tools to Support Product Structuring and Configuring

    Hansen, Claus Thorp; Riitahuhta, A.

    2001-01-01

    The aim of this article is to take stock of the results and challenges in the efforts to develop computer tools to support product structuring and configuring in product development projects. The balance will be made in two dimensions, a design science dimension and an industrial dimension. The design … that there are large positive effects to be gained for industrial companies by consciously implementing computer tools based on the results of design science. The positive effects will be measured by e.g. predictable product quality, reduced lead time, and reuse of design solutions.

  11. Design tools for computer-generated display of information to operators

    O'Brien, J.F.; Cain, D.G.; Sun, B.K.H.

    1985-01-01

    More and more computers are being used to process and display information to operators who control nuclear power plants. Implementation of computer-generated displays in power plant control rooms represents a considerable design challenge for industry designers. Over the last several years, the EPRI has conducted research aimed at providing industry designers tools to meet this new design challenge. These tools provide guidance in defining more 'intelligent' information for plant control and in developing effective displays to communicate this information to the operators. (orig./HP)

  12. Application of computer tools to the diagnosis of the combustion in motors

    Agudelo S, John R; Delgado M, Alvaro; Gutierrez V, Elkin

    2001-01-01

    This paper describes the fundamental topics concerning the analysis of the combustion process in internal combustion engines when latest-generation computational tools are employed. To achieve this, DIATERM has been developed using graphical programming languages. The thermodynamic model on which DIATERM is based is also described. The potential of this computational tool is likewise shown when it is applied to the analysis of pressure data from the combustion chamber of a turbocharged diesel engine, changing the load while the rotational speed is kept constant.
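
    The record does not reproduce DIATERM's equations. A single-zone, first-law heat-release analysis of measured cylinder pressure, the standard basis for diagnostic tools of this kind, computes the apparent heat-release rate as

      \[
      \frac{dQ}{d\theta} \;=\; \frac{\gamma}{\gamma - 1}\, p\, \frac{dV}{d\theta}
      \;+\; \frac{1}{\gamma - 1}\, V\, \frac{dp}{d\theta},
      \]

    where p(\theta) is the measured in-cylinder pressure as a function of crank angle, V(\theta) the cylinder volume given by the slider-crank geometry, and \gamma the ratio of specific heats. That DIATERM uses exactly this formulation is an assumption here, inferred from the abstract's description.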

  13. On Biblical Hebrew and Computer Science: Inspiration, Models, Tools, And Cross-fertilization

    Sandborg-Petersen, Ulrik

    2011-01-01

    Eep Talstra's work has been an inspiration to many researchers, both within and outside the field of Old Testament scholarship. Among others, Crist-Jan Doedens and the present author have been heavily influenced by Talstra in their own work within the field of computer science. The present… of the present author. In addition, the tools surrounding Emdros, including SESB, Libronix, and the Emdros Query Tool, are described. Examples… Biblical Hebrew scholar. Thus the inspiration from Talstra comes full circle: from Biblical Hebrew databases to computer science and back into Biblical Hebrew scholarship.

  14. Intelligent cloud computing security using genetic algorithm as a computational tools

    Razuky AL-Shaikhly, Mazin H.

    2018-05-01

    An essential change has occurred in the field of Information Technology, represented by cloud computing: the cloud provides virtual assets by means of the web, yet poses great difficulties in the fields of information security and privacy assurance. Currently the main problem with cloud computing is how to improve its privacy and security ('cloud security is critical'). This paper attempts to address cloud security by using an intelligent system with a genetic algorithm as a wall to keep cloud data secure; every service provided by the cloud must detect who receives it and register this, in order to create a list of trusted or un-trusted users depending on their behavior. The execution of the present proposal has shown good results.
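
    The abstract gives no algorithmic detail, so the skeleton below only illustrates the generic genetic-algorithm loop it invokes: evolving a threshold vector that separates trusted from un-trusted behaviour. The features, fitness function and data are hypothetical placeholders, not the paper's system.

      import random

      random.seed(1)
      N_FEATURES = 4            # e.g. request rate, failed logins, ... (hypothetical features)
      POP, GENS = 30, 50

      def fitness(thresholds, samples):
          # samples: list of (feature_vector, is_trusted); score = classification accuracy
          correct = 0
          for features, trusted in samples:
              predicted = all(f <= t for f, t in zip(features, thresholds))
              correct += predicted == trusted
          return correct / len(samples)

      def evolve(samples):
          pop = [[random.random() for _ in range(N_FEATURES)] for _ in range(POP)]
          for _ in range(GENS):
              pop.sort(key=lambda ind: fitness(ind, samples), reverse=True)
              survivors = pop[:POP // 2]                      # truncation selection
              children = []
              while len(children) < POP - len(survivors):
                  a, b = random.sample(survivors, 2)
                  cut = random.randrange(1, N_FEATURES)       # one-point crossover
                  child = a[:cut] + b[cut:]
                  i = random.randrange(N_FEATURES)            # point mutation
                  child[i] = min(1.0, max(0.0, child[i] + random.gauss(0, 0.1)))
                  children.append(child)
              pop = survivors + children
          return max(pop, key=lambda ind: fitness(ind, samples))

      samples = [([0.1, 0.2, 0.1, 0.3], True), ([0.9, 0.8, 0.7, 0.9], False)]
      print(fitness(evolve(samples), samples))                # typically 1.0 on this toy data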

  15. Enabling Wide-Scale Computer Science Education through Improved Automated Assessment Tools

    Boe, Bryce A.

    There is a proliferating demand for newly trained computer scientists as the number of computer science related jobs continues to increase. University programs will only be able to train enough new computer scientists to meet this demand when two things happen: when there are more primary and secondary school students interested in computer science, and when university departments have the resources to handle the resulting increase in enrollment. To meet these goals, significant effort is being made to both incorporate computational thinking into existing primary school education, and to support larger university computer science class sizes. We contribute to this effort through the creation and use of improved automated assessment tools. To enable wide-scale computer science education we do two things. First, we create a framework called Hairball to support the static analysis of Scratch programs targeted for fourth, fifth, and sixth grade students. Scratch is a popular building-block language utilized to pique interest in and teach the basics of computer science. We observe that Hairball allows for rapid curriculum alterations and thus contributes to wide-scale deployment of computer science curriculum. Second, we create a real-time feedback and assessment system utilized in university computer science classes to provide better feedback to students while reducing assessment time. Insights from our analysis of student submission data show that modifications to the system configuration support the way students learn and progress through course material, making it possible for instructors to tailor assignments to optimize learning in growing computer science classes.

  16. A COMPUTATIONAL FRAMEWORK INVOLVING CFD AND DATA MINING TOOLS FOR ANALYZING DISEASE IN CAROTID ARTERY BIFURCATION

    Tabib, Mandar; Rasheed, Adil; Fonn, Eivind

    2017-01-01

    Cardiovascular diseases, like Carotid Artery Disease and Coronary Artery Disease (CAD) are associated with the narrowing of artery due to build-up of fatty substances and cholesterol deposits (called plaque). Carotid Artery Disease increases the chances of brain stroke. Hence, the main objective of this work is to apply computational tools to help differentiate between the healthy and unhealthy artery (with 25% stenosis) using a combination of Computational Fluid Dynamics (CFD) and data minin...

  17. Computer-based tools for decision support at the Hanford Site

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the 'glue' or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  18. Computer-based tools for decision support at the Hanford Site

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the 'glue' or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  19. Computer-based tools for decision support at the Hanford Site

    Doctor, P.G.; Mahaffey, J.A.; Cowley, P.J.; Freshley, M.D.; Hassig, N.L.; Brothers, J.W.; Glantz, C.S.; Strachan, D.M.

    1992-11-01

    To help integrate activities in the environmental restoration and waste management mission of the Hanford Site, the Hanford Integrated Planning Project (HIPP) was established and funded by the US Department of Energy. The project is divided into three key program elements, the first focusing on an explicit, defensible and comprehensive method for evaluating technical options. Based on the premise that computer technology can be used to support the decision-making process and facilitate integration among programs and activities, the Decision Support Tools Task was charged with assessing the status of computer technology for those purposes at the Site. The task addressed two types of tools: tools needed to provide technical information, and management support tools. Technical tools include performance and risk assessment models, information management systems, data, and the computer infrastructure to support models, data, and information management systems. Management decision support tools are used to synthesize information at a high level to assist with making decisions. The major conclusions resulting from the assessment are that there is much technical information available, but it is not reaching the decision-makers in a form to be used. Many existing tools provide components that are needed to integrate site activities; however, some components are missing and, more importantly, the 'glue' or connections to tie the components together to answer decision-makers' questions is largely absent. Top priority should be given to decision support tools that support activities given in the TPA. Other decision tools are needed to facilitate and support the environmental restoration and waste management mission.

  20. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  1. Advanced computational tools and methods for nuclear analyses of fusion technology systems

    Fischer, U.; Chen, Y.; Pereslavtsev, P.; Simakov, S.P.; Tsige-Tamirat, H.; Loughlin, M.; Perel, R.L.; Petrizzi, L.; Tautges, T.J.; Wilson, P.P.H.

    2005-01-01

    An overview is presented of advanced computational tools and methods developed recently for nuclear analyses of Fusion Technology systems such as the experimental device ITER ('International Thermonuclear Experimental Reactor') and the intense neutron source IFMIF ('International Fusion Material Irradiation Facility'). These include Monte Carlo based computational schemes for the calculation of three-dimensional shut-down dose rate distributions, methods, codes and interfaces for the use of CAD geometry models in Monte Carlo transport calculations, algorithms for Monte Carlo based sensitivity/uncertainty calculations, as well as computational techniques and data for IFMIF neutronics and activation calculations. (author)

  2. Utilizing of computational tools on the modelling of a simplified problem of neutron shielding

    Lessa, Fabio da Silva Rangel; Platt, Gustavo Mendes; Alves Filho, Hermes [Universidade do Estado do Rio de Janeiro (UERJ), Nova Friburgo, RJ (Brazil). Inst. Politecnico]. E-mails: fsrlessa@gmail.com; gmplatt@iprj.uerj.br; halves@iprj.uerj.br

    2007-07-01

    At the current level of technology, many problems are investigated through computational simulation, whose results are in general satisfactory and much less expensive than conventional forms of investigation (e.g., destructive tests, laboratory measurements, etc.). Almost all modern scientific studies are carried out using computational tools, such as computers of superior capacity and their application systems, to perform complex calculations, algorithmic iterations, etc. Besides the considerable economy in time and in space that computational modelling provides, there is a financial economy for the scientists. Computational modelling is a modern methodology of investigation that calls for the theoretical study of the phenomena identified in the problem, a coherent mathematical representation of those phenomena, the generation of a numerical algorithmic system comprehensible to the computer, and finally the analysis of the acquired solution, possibly making use of pre-existing systems that facilitate the visualization of these results (editors of Cartesian graphs, for instance). In this work, the intention was to use several computational tools, implementing numerical methods and a deterministic model, in the study and analysis of a well-known and simplified problem of nuclear engineering (neutron transport), simulating a theoretical problem of neutron shielding with hypothetical physical-material parameters and computing the neutron flux at each spatial node, programmed in Scilab version 4.0. (author)
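
    As a toy illustration of the kind of shielding calculation involved (not the authors' Scilab code, whose model the record only outlines), the uncollided neutron flux behind a layered slab shield follows an exponential attenuation law:

      import math

      def uncollided_flux(phi0: float, layers: list) -> float:
          """layers: (Sigma_t [1/cm], thickness [cm]) per slab; returns transmitted flux."""
          optical_depth = sum(sigma * d for sigma, d in layers)
          return phi0 * math.exp(-optical_depth)     # simple Beer-Lambert-type attenuation

      # Hypothetical two-layer shield: 10 cm of material A, then 5 cm of material B.
      print(uncollided_flux(1.0e8, [(0.25, 10.0), (0.45, 5.0)]))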

  3. Utilizing of computational tools on the modelling of a simplified problem of neutron shielding

    Lessa, Fabio da Silva Rangel; Platt, Gustavo Mendes; Alves Filho, Hermes

    2007-01-01

    At the current level of technology, many problems are investigated through computational simulation, whose results are in general satisfactory and much less expensive than conventional forms of investigation (e.g., destructive tests, laboratory measurements, etc.). Almost all modern scientific studies are carried out using computational tools, such as computers of superior capacity and their application systems, to perform complex calculations, algorithmic iterations, etc. Besides the considerable economy in time and in space that computational modelling provides, there is a financial economy for the scientists. Computational modelling is a modern methodology of investigation that calls for the theoretical study of the phenomena identified in the problem, a coherent mathematical representation of those phenomena, the generation of a numerical algorithmic system comprehensible to the computer, and finally the analysis of the acquired solution, possibly making use of pre-existing systems that facilitate the visualization of these results (editors of Cartesian graphs, for instance). In this work, the intention was to use several computational tools, implementing numerical methods and a deterministic model, in the study and analysis of a well-known and simplified problem of nuclear engineering (neutron transport), simulating a theoretical problem of neutron shielding with hypothetical physical-material parameters and computing the neutron flux at each spatial node, programmed in Scilab version 4.0. (author)

  4. PROACT user's guide: how to use the pallet recovery opportunity analysis computer tool

    E. Bradley Hager; A.L. Hammett; Philip A. Araman

    2003-01-01

    Pallet recovery projects are environmentally responsible and offer promising business opportunities. The Pallet Recovery Opportunity Analysis Computer Tool (PROACT) assesses the operational and financial feasibility of potential pallet recovery projects. The use of project specific information supplied by the user increases the accuracy and the validity of the...

  5. Physics Education through Computational Tools: The Case of Geometrical and Physical Optics

    Rodríguez, Y.; Santana, A.; Mendoza, L. M.

    2013-01-01

    Recently, with the development of more powerful and accurate computational tools, the inclusion of new didactic materials in the classroom is known to have increased. However, the form in which these materials can be used to enhance the learning process is still under debate. Many different methodologies have been suggested for constructing new…

  6. Computer Aided Methods & Tools for Separation & Purification of Fine Chemical & Pharmaceutical Products

    Afonso, Maria B.C.; Soni, Vipasha; Mitkowski, Piotr Tomasz

    2006-01-01

    An integrated approach that is particularly suitable for solving problems related to product-process design from the fine chemicals, agrochemicals, food and pharmaceutical industries is presented together with the corresponding methods and tools, which forms the basis for an integrated computer...

  7. Video Analysis of Projectile Motion Using Tablet Computers as Experimental Tools

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and "g" in order to explore the underlying laws of motion. This experiment…
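
    The analysis the record describes amounts to fitting a parabola to the tracked vertical positions and reading off g and the initial conditions; a minimal sketch with synthetic data (not the study's measurements):

      import numpy as np

      t = np.linspace(0.0, 1.0, 21)                             # frame time stamps [s]
      y = 0.5 + 3.0 * t - 0.5 * 9.81 * t**2                     # ideal trajectory [m]
      y += np.random.default_rng(0).normal(0, 0.005, t.size)    # simulated tracking noise

      a, v0, y0 = np.polyfit(t, y, deg=2)                       # fit y = a t^2 + v0 t + y0
      print(f"g = {-2 * a:.2f} m/s^2, v0 = {v0:.2f} m/s, y0 = {y0:.2f} m")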

  8. A Real-Time Plagiarism Detection Tool for Computer-Based Assessments

    Jeske, Heimo J.; Lall, Manoj; Kogeda, Okuthe P.

    2018-01-01

    Aim/Purpose: The aim of this article is to develop a tool to detect plagiarism in real time amongst students being evaluated for learning in a computer-based assessment setting. Background: Cheating or copying all or part of source code of a program is a serious concern to academic institutions. Many academic institutions apply a combination of…
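
    The article's own algorithm is not reproduced in the record; the core of any such detector is a pairwise similarity score over submissions, sketched below with Python's standard difflib (real tools would first normalise identifiers and whitespace):

      import difflib, itertools

      def similarity(code_a: str, code_b: str) -> float:
          return difflib.SequenceMatcher(None, code_a, code_b).ratio()

      def flag_pairs(submissions: dict, threshold: float = 0.85):
          """Yield (student_a, student_b, score) for suspiciously similar pairs."""
          for (sa, ca), (sb, cb) in itertools.combinations(submissions.items(), 2):
              score = similarity(ca, cb)
              if score >= threshold:
                  yield sa, sb, score

      subs = {"alice": "for i in range(10): print(i)",
              "bob":   "for j in range(10): print(j)",
              "carol": "print(sum(range(10)))"}
      print(list(flag_pairs(subs, threshold=0.8)))   # flags the alice/bob pair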

  9. An Evaluation of the Webquest as a Computer-Based Learning Tool

    Hassanien, Ahmed

    2006-01-01

    This paper explores the preparation and use of an internet activity for undergraduate learners in higher education (HE). It evaluates the effectiveness of using webquest as a computer-based learning (CBL) tool to support students to learn in HE. The evaluation undertaken offers insights into learner perceptions concerning the ease of use of the…

  10. Using brain-computer interfaces and brain-state dependent stimulation as tools in cognitive neuroscience

    Jensen, O.; Bahramisharif, A.; Oostenveld, R.; Klanke, S.; Hadjipapas, A.; Okazaki, Y.O.; Gerven, M.A.J. van

    2011-01-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain-computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds the promise as a tool for

  11. Using brain-computer interfaces and brain-state dependent stimulation as tools in cognitive neuroscience

    Jensen, O.; Bahramisharif, A.; Oostenveld, R.; Klanke, S.; Hadjipapas, A.; Okazaki, Y.O.; Gerven, M.A.J. van

    2011-01-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain–computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds the promise as a tool for

  12. Instruction of Statistics via Computer-Based Tools: Effects on Statistics' Anxiety, Attitude, and Achievement

    Ciftci, S. Koza; Karadag, Engin; Akdal, Pinar

    2014-01-01

    The purpose of this study was to determine the effect of statistics instruction using computer-based tools, on statistics anxiety, attitude, and achievement. This study was designed as quasi-experimental research and the pattern used was a matched pre-test/post-test with control group design. Data was collected using three scales: a Statistics…

  13. Multimedia Instructional Tools' Impact on Student Motivation and Learning Strategies in Computer Applications Courses

    Chapman, Debra; Wang, Shuyan

    2015-01-01

    Multimedia instructional tools (MMITs) have been identified as a way to present instructional material effectively and economically. MMITs are commonly used in introductory computer applications courses, as they should be effective in increasing student knowledge and positively impact motivation and learning strategies, without increasing costs. This…

  14. USE OF COMPUTER-AIDED PROCESS ENGINEERING TOOL IN POLLUTION PREVENTION

    Computer-Aided Process Engineering has become established in industry as a design tool. With the establishment of the CAPE-OPEN software specifications for process simulation environments, CAPE-OPEN provides a set of "middleware" standards that enable software developers to acces...

  15. Technology and Jobs: Computer-Aided Design. Numerical-Control Machine-Tool Operators. Office Automation.

    Stanton, Michael; And Others

    1985-01-01

    Three reports on the effects of high technology on the nature of work include (1) Stanton on applications and implications of computer-aided design for engineers, drafters, and architects; (2) Nardone on the outlook and training of numerical-control machine tool operators; and (3) Austin and Drake on the future of clerical occupations in automated…

  16. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on “on-demand payment” for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible. PMID:26230400

  17. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    Iñaki Bildosola

    Full Text Available Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  18. Design and Implementation of a Cloud Computing Adoption Decision Tool: Generating a Cloud Road.

    Bildosola, Iñaki; Río-Belver, Rosa; Cilleruelo, Ernesto; Garechana, Gaizka

    2015-01-01

    Migrating to cloud computing is one of the current enterprise challenges. This technology provides a new paradigm based on "on-demand payment" for information and communication technologies. In this sense, the small and medium enterprise is supposed to be the most interested, since initial investments are avoided and the technology allows gradual implementation. However, even if the characteristics and capacities have been widely discussed, entry into the cloud is still lacking in terms of practical, real frameworks. This paper aims at filling this gap, presenting a real tool already implemented and tested, which can be used as a cloud computing adoption decision tool. This tool uses diagnosis based on specific questions to gather the required information and subsequently provide the user with valuable information to deploy the business within the cloud, specifically in the form of Software as a Service (SaaS) solutions. This information allows the decision makers to generate their particular Cloud Road. A pilot study has been carried out with enterprises at a local level with a two-fold objective: to ascertain the degree of knowledge on cloud computing and to identify the most interesting business areas and their related tools for this technology. As expected, the results show high interest and low knowledge on this subject and the tool presented aims to readdress this mismatch, insofar as possible.

  17. Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems

    2017-04-13

    AFRL-AFOSR-UK-TR-2017-0029: report documentation for the effort “Automated and Assistive Tools for Accelerated Code migration of Scientific Computing on to Heterogeneous MultiCore Systems,” covering the performance period 2012 – 01/25/2015.

  18. ESHOPPS: A COMPUTATIONAL TOOL TO AID THE TEACHING OF SHORTEST PATH ALGORITHMS

    S. J. de A. LIMA

    2015-07-01

    The development of a computational tool called EShoPPS – Environment for Shortest Path Problem Solving – which is used to assist students in understanding the workings of the Dijkstra, greedy search and A* (star) algorithms, is presented in this paper. Such algorithms are commonly taught in graduate and undergraduate courses in Engineering and Informatics, and are used to solve many optimization problems that can be characterized as shortest path problems. EShoPPS is an interactive tool that allows students to create a graph representing the problem and also helps in developing their knowledge of each specific algorithm. Experiments performed with 155 students of undergraduate and graduate courses such as Industrial Engineering, Computer Science and Information Systems have shown that by using the EShoPPS tool students were able to improve their interpretation of the investigated algorithms.
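
    As a quick illustration of the kind of algorithm EShoPPS animates (this sketch is generic Python, not code from the tool itself, and the example graph is invented), Dijkstra's algorithm can be written in a few lines:

      import heapq

      def dijkstra(graph, source):
          # graph: dict mapping node -> list of (neighbor, non-negative edge weight)
          dist = {source: 0}
          pq = [(0, source)]                      # min-heap of (distance so far, node)
          while pq:
              d, u = heapq.heappop(pq)
              if d > dist.get(u, float("inf")):   # stale queue entry, skip
                  continue
              for v, w in graph.get(u, []):
                  nd = d + w
                  if nd < dist.get(v, float("inf")):
                      dist[v] = nd
                      heapq.heappush(pq, (nd, v))
          return dist

      # Invented example graph
      graph = {"A": [("B", 2), ("C", 5)], "B": [("C", 1), ("D", 4)], "C": [("D", 1)], "D": []}
      print(dijkstra(graph, "A"))                 # {'A': 0, 'B': 2, 'C': 3, 'D': 4}

    A* differs only in adding a heuristic estimate to the queue priority, while greedy search orders the queue by the heuristic alone.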

  1. Approach and tool for computer animation of fields in electrical apparatus

    Miltchev, Radoslav; Yatchev, Ivan S.; Ritchie, Ewen

    2002-01-01

    The paper presents a technical approach and post-processing tool for creating and displaying computer animation. The approach enables handling of two- and three-dimensional physical field results obtained from finite element software, and the display of movement processes in electrical apparatus simulations. The main goal of this work is to extend the auxiliary features built into general-purpose CAD software working in the Windows environment. Different storage techniques were examined and the one employing image capturing was chosen. The developed tool provides the benefits of independent visualisation, creation of scenarios, and facilities for exporting animations in common file formats for distribution on different computer platforms. It also provides a valuable educational tool. (Author)

  2. Development of computer-aided software engineering tool for sequential control of JT-60U

    Shimono, M.; Akasaka, H.; Kurihara, K.; Kimura, T.

    1995-01-01

    Discharge sequential control (DSC) is an essential control function for the intermittent and pulsed discharge operation of a tokamak device, so that many subsystems may work with each other in the correct order and/or synchronously. In the development of the DSC program, block diagrams of the logical operations for sequential control are first drawn up during design. The logical operators and I/Os involved in the block diagrams are then compiled and converted to a particular form. Since the block diagrams for sequential control amount to about 50 sheets in the case of the JT-60 upgrade tokamak (JT-60U) high power discharge, and these development steps had so far been performed manually, a great effort was required for program development. In order to remove inefficiency from these development processes, a computer-aided software engineering (CASE) tool has been developed on a UNIX workstation. This paper reports how the authors designed it for the development of the sequential control programs. The CASE tool is composed of the following three tools: (1) an automatic drawing tool, (2) an editing tool, and (3) a trace tool. This CASE tool, an object-oriented programming tool with graphical formalism, can powerfully accelerate the development cycle for the sequential control function commonly associated with pulsed discharges in a tokamak fusion device.

  3. The MicroGrid: A Scientific Tool for Modeling Computational Grids

    H.J. Song

    2000-01-01

    The complexity and dynamic nature of the Internet (and the emerging Computational Grid) demand that middleware and applications adapt to changes in the configuration and availability of resources. However, to the best of our knowledge there are no simulation tools which support systematic exploration of dynamic Grid software (or Grid resource) behavior. We describe our vision and initial efforts to build tools to meet these needs. Our MicroGrid simulation tools enable Globus applications to be run in arbitrary virtual grid resource environments, enabling broad experimentation. We describe the design of these tools, and their validation on micro-benchmarks, the NAS parallel benchmarks, and an entire Grid application. These validation experiments show that the MicroGrid can match actual experiments within a few percent (2% to 4%).

  4. 7th International Workshop on Parallel Tools for High Performance Computing

    Gracia, José; Nagel, Wolfgang; Resch, Michael

    2014-01-01

    Current advances in High Performance Computing (HPC) increasingly impact efficient software development workflows. Programmers for HPC applications need to consider trends such as increased core counts, multiple levels of parallelism, reduced memory per core, and I/O system challenges in order to derive well performing and highly scalable codes. At the same time, the increasing complexity adds further sources of program defects. While novel programming paradigms and advanced system libraries provide solutions for some of these challenges, appropriate supporting tools are indispensable. Such tools aid application developers in debugging, performance analysis, or code optimization and therefore make a major contribution to the development of robust and efficient parallel software. This book introduces a selection of the tools presented and discussed at the 7th International Parallel Tools Workshop, held in Dresden, Germany, September 3-4, 2013.  

  5. Network computing infrastructure to share tools and data in global nuclear energy partnership

    Kim, Guehee; Suzuki, Yoshio; Teshima, Naoya

    2010-01-01

    CCSE/JAEA (Center for Computational Science and e-Systems/Japan Atomic Energy Agency) integrated a prototype system of a network computing infrastructure for sharing tools and data to support the U.S.-Japan collaboration in GNEP (Global Nuclear Energy Partnership). We focused on three technical issues in applying our information process infrastructure: accessibility, security, and usability. In designing the prototype system, we integrated and improved both network and Web technologies. For accessibility, we adopted SSL-VPN (Secure Sockets Layer - Virtual Private Network) technology for access beyond firewalls. For security, we developed an authentication gateway based on the PKI (Public Key Infrastructure) authentication mechanism, set a fine-grained access control policy for shared tools and data, and used a shared-key encryption method to protect tools and data against leakage to third parties. For usability, we chose Web browsers as the user interface and developed a Web application providing functions to support the sharing of tools and data. By using the WebDAV (Web-based Distributed Authoring and Versioning) protocol, users can manipulate shared tools and data through a Windows-like folder environment. We implemented the prototype system on the Grid infrastructure for atomic energy research, AEGIS (Atomic Energy Grid Infrastructure), developed by CCSE/JAEA. The prototype system was applied for trial use in the first period of GNEP. (author)

  6. Development of computer-based analytical tool for assessing physical protection system

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way of approaching likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purpose it was more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool is able to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
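
    The record does not publish the tool's code, but the network idea it describes can be sketched: give each path segment a detection probability and find the adversary path that minimizes overall detection. A minimal Python illustration with an invented facility graph (networkx assumed available); note that, unlike EASI, this toy ignores response timing:

      import math
      import networkx as nx

      # Invented facility model: each edge carries the probability that the
      # adversary is detected while traversing that segment.
      G = nx.DiGraph()
      G.add_edge("outside", "fence", p_detect=0.3)
      G.add_edge("fence", "door", p_detect=0.5)
      G.add_edge("fence", "roof", p_detect=0.2)
      G.add_edge("door", "vault", p_detect=0.8)
      G.add_edge("roof", "vault", p_detect=0.6)

      # The probability of traversing a path undetected is the product of
      # (1 - p_detect) over its edges; maximizing it is a shortest-path
      # problem on weights -log(1 - p_detect).
      for u, v, d in G.edges(data=True):
          d["w"] = -math.log(1.0 - d["p_detect"])

      critical = nx.shortest_path(G, "outside", "vault", weight="w")
      p_undetected = math.exp(-nx.shortest_path_length(G, "outside", "vault", weight="w"))
      print("most critical path:", critical, "P(effectiveness) =", round(1.0 - p_undetected, 3))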

  7. Video analysis of projectile motion using tablet computers as experimental tools

    Klein, P.; Gröber, S.; Kuhn, J.; Müller, A.

    2014-01-01

    Tablet computers were used as experimental tools to record and analyse the motion of a ball thrown vertically from a moving skateboard. Special applications plotted the measurement data component by component, allowing a simple determination of initial conditions and g in order to explore the underlying laws of motion. This experiment can easily be performed by students themselves, providing more autonomy in their problem-solving processes than traditional learning approaches. We believe that this autonomy and the authenticity of the experimental tool both foster their motivation.
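
    As a sketch of the analysis step described above (not taken from the apps the authors used; the samples are invented), the vertical positions y(t) read off the video frames can be fit with a quadratic whose leading coefficient gives g/2:

      import numpy as np

      # Invented (t, y) samples from video tracking of a vertical throw
      t = np.array([0.00, 0.10, 0.20, 0.30, 0.40, 0.50])   # s
      y = np.array([0.00, 0.44, 0.78, 1.02, 1.17, 1.22])   # m

      # y(t) = y0 + v0*t - (g/2)*t^2, so fit a2*t^2 + a1*t + a0 and read g = -2*a2
      a2, a1, a0 = np.polyfit(t, y, 2)
      print(f"v0 = {a1:.2f} m/s, g = {-2.0 * a2:.2f} m/s^2")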

  8. Computational methods, tools and data for nuclear analyses of fusion technology systems

    Fischer, U.

    2006-01-01

    An overview is presented of the Research and Development work conducted at Forschungszentrum Karlsruhe in co-operation with other associations in the framework of the European Fusion Technology Programme on the development and qualification of computational tools and data for nuclear analyses of Fusion Technology systems. The focus is on the development of advanced methods and tools based on the Monte Carlo technique for particle transport simulations, and the evaluation and qualification of dedicated nuclear data to satisfy the needs of the ITER and the IFMIF projects. (author)

  9. System capacity and economic modeling computer tool for satellite mobile communications systems

    Wiedeman, Robert A.; Wen, Doong; Mccracken, Albert G.

    1988-01-01

    A unique computer modeling tool that combines an engineering tool with a financial analysis program is described. The resulting combination yields a flexible economic model that can predict the cost effectiveness of various mobile systems. Cost modeling is necessary in order to ascertain whether a given system with a finite satellite resource is capable of supporting itself financially, and to determine what services can be supported. The model is implemented in Lotus 1-2-3 on a personal computer to make it as widely applicable as possible, so that it can be used and modified to fit many situations and conditions. The output of the engineering portion of the model consists of a channel capacity analysis and link calculations for several qualities of service using up to 16 types of earth terminal configurations. The outputs of the financial model are a revenue analysis, an income statement, and a cost model validation section.

  10. Mobile computing device as tools for college student education: a case on flashcards application

    Kang, Congying

    2012-04-01

    Traditionally, college students have used flash cards as a tool to memorize large bodies of knowledge, such as nomenclature, structures, and reactions in chemistry. Educational and information technology have enabled flashcards to be viewed on computers, for example as slides in PowerPoint, working as channels of drill and feedback for learners. The current generation of students is more adept with information technology and mobile computing devices; for example, they use their mobile phones much more intensively every day. Trends in using the mobile phone as an educational tool are analyzed, and an educational technology initiative is proposed that uses mobile phone flashcard applications to help students learn biology and chemistry. Experiments show that users responded positively to these mobile flash cards.

  11. Computer Assessed Design – A Vehicle of Architectural Communication and a Design Tool

    Petrovici, Liliana-Mihaela

    2012-01-01

    In comparison with the limits of traditional representation tools, the development of computer graphics constitutes an opportunity to assert architectural values. The differences between the communication codes of architects and the public are diminished; architectural ideas can be represented in a coherent, intelligible and attractive way, so that they have a better chance of being materialized according to the thinking of their creator. Concurrently, graphic software has been improving ...

  12. Auditors’ Usage of Computer Assisted Audit Tools and Techniques: Empirical Evidence from Nigeria

    Appah Ebimobowei; G.N. Ogbonna; Zuokemefa P. Enebraye

    2013-01-01

    This study examines the use of computer-assisted audit tools and techniques in audit practice in the Niger Delta of Nigeria. To achieve this objective, data were collected from primary and secondary sources. The secondary sources were scholarly books and journals, while the primary source was a well-structured questionnaire of three sections comprising thirty-seven items, with an average reliability of 0.838. The data collected from the questionnaire were analyzed using relevant descriptive statist...

  13. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
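
    ATRAN's layered water-vapor and ozone models are not reproduced in this record; the sketch below only illustrates the Beer-Lambert relation T(lambda) = exp(-tau(lambda)) that any transmittance code ultimately evaluates, with an invented optical-depth profile:

      import numpy as np

      # Invented optical depths tau for a few infrared wavelengths (microns);
      # a code like ATRAN derives tau from atmospheric constituent models.
      wavelengths = np.array([0.8, 1.4, 1.9, 2.7, 10.0])
      tau = np.array([0.02, 1.5, 2.3, 3.0, 0.4])

      transmittance = np.exp(-tau)   # Beer-Lambert law
      for wl, t in zip(wavelengths, transmittance):
          print(f"{wl:5.1f} um  T = {t:.3f}")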

  14. The Strategy Blueprint: A Strategy Process Computer-Aided Design Tool

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based tool, known as ‘the Strategy Blueprint’, consisting of a combination of nine strategy techniques, which can help organizations define the most suitable strategy, based on the internal and external f...

  15. Benchmarking therapeutic drug monitoring software: a review of available computer tools.

    Fuchs, Aline; Csajka, Chantal; Thoma, Yann; Buclin, Thierry; Widmer, Nicolas

    2013-01-01

    Therapeutic drug monitoring (TDM) aims to optimize treatments by individualizing dosage regimens based on the measurement of blood concentrations. Dosage individualization to maintain concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculations currently represent the gold standard TDM approach but require computation assistance. In recent decades computer programs have been developed to assist clinicians in this assignment. The aim of this survey was to assess and compare computer tools designed to support TDM clinical activities. The literature and the Internet were searched to identify software. All programs were tested on personal computers. Each program was scored against a standardized grid covering pharmacokinetic relevance, user friendliness, computing aspects, interfacing and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were processed through each of them. Altogether, 12 software tools were identified, tested and ranked, representing a comprehensive review of the available software. Numbers of drugs handled by the software vary widely (from two to 180), and eight programs offer users the possibility of adding new drug models based on population pharmacokinetic analyses. Bayesian computation to predict dosage adaptation from blood concentration (a posteriori adjustment) is performed by ten tools, while nine are also able to propose a priori dosage regimens, based only on individual patient covariates such as age, sex and bodyweight. Among those applying Bayesian calculation, MM-USC*PACK© uses the non-parametric approach. The top two programs emerging from this benchmark were MwPharm© and TCIWorks. Most other programs evaluated had good potential while being less sophisticated or less user friendly. Programs vary in complexity and might not fit all healthcare
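
    None of the benchmarked packages is reproduced in this record; the toy Python sketch below only shows the kind of Bayesian (maximum a posteriori) reasoning such TDM programs implement for a one-compartment model: combine a population prior on clearance with one measured concentration, then adjust the dose. All numbers are invented.

      import numpy as np

      # One-compartment IV bolus (toy): C(t) = (D/V) * exp(-(CL/V) * t)
      D, V, t_obs, C_obs = 500.0, 50.0, 12.0, 3.2   # mg, L, h, mg/L (invented)

      # Log-normal population prior on clearance CL; ~20% residual error assumed
      CL_grid = np.linspace(0.5, 10.0, 2000)        # L/h
      prior = np.exp(-0.5 * ((np.log(CL_grid) - np.log(3.0)) / 0.4) ** 2)
      C_pred = (D / V) * np.exp(-(CL_grid / V) * t_obs)
      likelihood = np.exp(-0.5 * ((C_obs - C_pred) / (0.2 * C_obs)) ** 2)

      posterior = prior * likelihood
      CL_map = CL_grid[np.argmax(posterior)]        # a posteriori (MAP) clearance

      # Dose adjustment to reach a target concentration at the same time point
      C_target = 4.0
      dose = D * C_target / ((D / V) * np.exp(-(CL_map / V) * t_obs))
      print(f"MAP clearance = {CL_map:.2f} L/h, adjusted dose = {dose:.0f} mg")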

  16. Development of a Computational Tool for Measuring Organizational Competitiveness in the Photovoltaic Power Plants

    Carmen B. Rosa

    2018-04-01

    Photovoltaic (PV) power generation is embedded in a globally competitive environment. This characteristic forces PV power plants to perform the processes most relevant to their competitiveness with maximum efficiency. From the managers' point of view, evaluation of the solar energy performance of installed plants is justified as an indicator of their level of organizational competitiveness, which supports the decision-making process. This manuscript proposes a computational tool that graphically presents the level of competitiveness of PV power plant units based on performance indicators. The tool was developed using the Key Performance Indicators (KPIs) concept, which represents a set of measures focusing on the aspects most critical for the success of organizations. The KPIs encompass four Fundamental Viewpoints (FVs): Strategic Alliances, Solar Energy Monitoring, Management and Strategic Processes, and Power Generation Innovations. These four FVs were deployed into 26 Critical Success Factors (CSFs) and 39 KPIs. The tool was then applied to four solar generation plants, of which three presented a global organizational competitiveness level of “potentially competitive”. The proposed computational tool allows managers to assess the degree of organizational competitiveness as well as aiding the prospecting of future scenarios and decision-making.
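
    The record gives the FV/CSF/KPI hierarchy but not its aggregation rule; a common minimal reading (assumed here, with invented weights and scores) is a weighted roll-up of KPI scores into a global competitiveness level:

      # Roll up KPI scores (0-10) into Fundamental Viewpoints and a global level.
      # Weights and scores are invented; the published tool defines its own.
      fvs = {
          "Strategic Alliances":                {"weight": 0.20, "kpi_scores": [7, 6, 8]},
          "Solar Energy Monitoring":            {"weight": 0.30, "kpi_scores": [9, 7]},
          "Management and Strategic Processes": {"weight": 0.25, "kpi_scores": [5, 6, 6, 7]},
          "Power Generation Innovations":       {"weight": 0.25, "kpi_scores": [4, 5]},
      }

      def mean(xs):
          return sum(xs) / len(xs)

      global_score = sum(fv["weight"] * mean(fv["kpi_scores"]) for fv in fvs.values())
      label = "potentially competitive" if global_score >= 6 else "not yet competitive"
      print(f"global level = {global_score:.2f} -> {label}")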

  17. Computer programing for geosciences: Teach your students how to make tools

    Grapenthin, Ronni

    2011-12-01

    When I announced my intention to pursue a Ph.D. in geophysics, some people gave me confused looks, because I was working on a master's degree in computer science at the time. My friends, like many incoming geoscience graduate students, have trouble linking these two fields. From my perspective, it is pretty straightforward: Much of geoscience revolves around novel analyses of large data sets that require custom tools—computer programs—to minimize the drudgery of manual data handling; other disciplines share this characteristic. While most faculty adapted to the need for tool development quite naturally, as they grew up around computer terminal interfaces, incoming graduate students lack an intuitive understanding of programing concepts such as generalization and automation. I believe the major cause is the intuitive graphical user interfaces of modern operating systems and applications, which isolate the user from all technical details. Generally, current curricula do not recognize this gap between user and machine. For students to operate effectively, they require specialized courses teaching them the skills they need to make tools that operate on particular data sets and solve their specific problems. Courses in computer science departments are aimed at a different audience and are of limited help.

  18. A benchmarking tool to evaluate computer tomography perfusion infarct core predictions against a DWI standard.

    Cereda, Carlo W; Christensen, Søren; Campbell, Bruce Cv; Mishra, Nishant K; Mlynash, Michael; Levi, Christopher; Straka, Matus; Wintermark, Max; Bammer, Roland; Albers, Gregory W; Parsons, Mark W; Lansberg, Maarten G

    2016-10-01

    Differences in research methodology have hampered the optimization of Computed Tomography Perfusion (CTP) for identification of the ischemic core. We aim to optimize CTP core identification using a novel benchmarking tool. The benchmarking tool consists of an imaging library and a statistical analysis algorithm to evaluate the performance of CTP. The tool was used to optimize and evaluate an in-house developed CTP software algorithm. Imaging data of 103 acute stroke patients were included in the benchmarking tool. Median time from stroke onset to CT was 185 min (IQR 180-238), and the median time between completion of CT and start of MRI was 36 min (IQR 25-79). Volumetric accuracy of the CTP-ROIs was optimal at an rCBF threshold of […]. The benchmarking tool can play an important role in optimizing CTP software, as it provides investigators with a novel method to directly compare the performance of alternative CTP software packages.
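
    The record names volumetric accuracy as the optimization target; a standard way to quantify agreement between CTP-predicted core volumes and the DWI reference is the mean difference with 95% limits of agreement (Bland-Altman). A small Python sketch with invented volumes:

      import numpy as np

      # Invented core volumes (mL): CTP prediction vs DWI reference, per patient
      ctp = np.array([12.0, 35.0, 8.0, 60.0, 22.0, 41.0])
      dwi = np.array([15.0, 30.0, 10.0, 66.0, 25.0, 38.0])

      diff = ctp - dwi
      bias = diff.mean()
      half_width = 1.96 * diff.std(ddof=1)   # 95% limits of agreement
      print(f"bias = {bias:.1f} mL, LoA = [{bias - half_width:.1f}, {bias + half_width:.1f}] mL")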

  19. Computer-aided design in power engineering. Application of software tools

    Stojkovic, Zlatan

    2012-01-01

    Demonstrates the use of software tools in design practice in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several example design calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  1. Development of computer-based analytical tool for assessing physical protection system

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a practical way of approaching likely threat scenarios. Several tools, such as EASI and SAPE, are readily available, but for our research purpose it was more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool that uses a network methodology to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response functions. The tool is able to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.

  2. Computer-aided design in power engineering. Application of software tools

    Stojkovic, Zlatan

    2012-07-01

    Demonstrates the use of software tools in design practice in the field of power systems. Presents many applications of design in the field of power systems. Useful for educational purposes and practical work. This textbook demonstrates the application of software tools in solving a series of problems in the design of power system structures and systems. It contains four chapters: The first chapter leads the reader through all the phases necessary in the procedures of computer-aided modeling and simulation. It guides the reader through complex problems on the basis of eleven original examples. The second chapter presents the application of software tools in power system calculations and power system equipment design. Several example design calculations are carried out using standard engineering software such as MATLAB, EMTP/ATP, Excel and Access, AutoCAD and Simulink. The third chapter focuses on graphical documentation, using a collection of software tools (AutoCAD, EPLAN, SIMARIS SIVACON, SIMARIS DESIGN) which enable complete automation of the development of the graphical documentation of a power system. In the fourth chapter, the application of software tools to project management in power systems is discussed. Here, the emphasis is put on the standard software MS Excel and MS Project.

  3. Primary care physicians’ perspectives on computer-based health risk assessment tools for chronic diseases: a mixed methods study

    Teja Voruganti

    2015-09-01

    Background: Health risk assessment tools compute an individual's risk of developing a disease. Routine use of such tools by primary care physicians (PCPs) is potentially useful in chronic disease prevention. We sought to ascertain physicians' awareness and perceptions of the usefulness, usability and feasibility of performing assessments with computer-based risk assessment tools in primary care settings. Methods: Focus groups and usability testing with a computer-based risk assessment tool were conducted with PCPs from both university-affiliated and community-based practices. Analysis was derived from grounded theory methodology. Results: PCPs (n = 30) were aware of several risk assessment tools, although only select tools were used routinely. The decision to use a tool depended on how its use impacted practice workflow and whether the tool had credibility. Participants felt that embedding tools in the electronic medical record (EMR) system might allow health information from the medical record to auto-populate the tool. User comprehension of risk could also be improved with computer-based interfaces that present risk in different formats. Conclusions: In this study, PCPs chose to use certain tools more regularly because of usability and credibility. Despite differences in the particular tools a clinical practice used, there was general appreciation of the usefulness of tools for different clinical situations. Participants characterised particular features of an ideal tool, feeling strongly that embedding risk assessment tools in the EMR would maximise accessibility and use of the tool for chronic disease management. However, appropriate practice workflow integration and features that facilitate patient understanding at point-of-care are also essential.

  4. Computational protein design-the next generation tool to expand synthetic biology applications.

    Gainza-Cirauqui, Pablo; Correia, Bruno Emanuel

    2018-05-02

    One powerful approach to engineer synthetic biology pathways is the assembly of proteins sourced from one or more natural organisms. However, synthetic pathways often require custom functions or biophysical properties not displayed by natural proteins, limitations that could be overcome through modern protein engineering techniques. Structure-based computational protein design is a powerful tool to engineer new functional capabilities in proteins, and it is beginning to have a profound impact in synthetic biology. Here, we review efforts to increase the capabilities of synthetic biology using computational protein design. We focus primarily on computationally designed proteins not only validated in vitro, but also shown to modulate different activities in living cells. Efforts made to validate computational designs in cells can illustrate both the challenges and opportunities in the intersection of protein design and synthetic biology. We also highlight protein design approaches, which although not validated as conveyors of new cellular function in situ, may have rapid and innovative applications in synthetic biology. We foresee that in the near future, computational protein design will vastly expand the functional capabilities of synthetic cells.

  5. Aligator: A computational tool for optimizing total chemical synthesis of large proteins.

    Jacobsen, Michael T; Erickson, Patrick W; Kay, Michael S

    2017-09-15

    The scope of chemical protein synthesis (CPS) continues to expand, driven primarily by advances in chemical ligation tools (e.g., reversible solubilizing groups and novel ligation chemistries). However, the design of an optimal synthesis route can be an arduous and fickle task due to the large number of theoretically possible, and in many cases problematic, synthetic strategies. In this perspective, we highlight recent CPS tool advances and then introduce a new and easy-to-use program, Aligator (Automated Ligator), for analyzing and designing the most efficient strategies for constructing large targets using CPS. As a model set, we selected the E. coli ribosomal proteins and associated factors for computational analysis. Aligator systematically scores and ranks all feasible synthetic strategies for a particular CPS target. The Aligator script methodically evaluates potential peptide segments for a target using a scoring function that includes solubility, ligation site quality, segment lengths, and number of ligations to provide a ranked list of potential synthetic strategies. We demonstrate the utility of Aligator by analyzing three recent CPS projects from our lab: TNFα (157 aa), GroES (97 aa), and DapA (312 aa). As the limits of CPS are extended, we expect that computational tools will play an increasingly important role in the efficient execution of ambitious CPS projects such as production of a mirror-image ribosome.

  6. Comparison of High-Fidelity Computational Tools for Wing Design of a Distributed Electric Propulsion Aircraft

    Deere, Karen A.; Viken, Sally A.; Carter, Melissa B.; Viken, Jeffrey K.; Derlaga, Joseph M.; Stoll, Alex M.

    2017-01-01

    A variety of tools, from fundamental to high order, have been used to better understand applications of distributed electric propulsion to aid the wing and propulsion system design of the Leading Edge Asynchronous Propulsion Technology (LEAPTech) project and the X-57 Maxwell airplane. Three high-fidelity, Navier-Stokes computational fluid dynamics codes used during the project with results presented here are FUN3D, STAR-CCM+, and OVERFLOW. These codes employ various turbulence models to predict fully turbulent and transitional flow. Results from these codes are compared for two distributed electric propulsion configurations: the wing tested at NASA Armstrong on the Hybrid-Electric Integrated Systems Testbed truck, and the wing designed for the X-57 Maxwell airplane. Results from these computational tools for the high-lift wing tested on the Hybrid-Electric Integrated Systems Testbed truck and the X-57 high-lift wing compare reasonably well. The goal of the X-57 wing and distributed electric propulsion system design - achieving or exceeding the required lift coefficient C_L = 3.95 for the stall speed - was confirmed with all of the computational codes.

  7. Basic data, computer codes and integral experiments: The tools for modelling in nuclear technology

    Sartori, E.

    2001-01-01

    When studying applications in nuclear technology, we need to understand and be able to predict the behavior of systems manufactured by human enterprise. First, the underlying basic physical and chemical phenomena need to be understood. We then have to predict the results of the interplay of a large number of different basic events: i.e. the macroscopic effects. In order to build confidence in our modelling capability, we need to compare these results against measurements carried out on such systems. The different levels of modelling require the solution of different types of equations using different types of parameters. The tools required for carrying out a complete validated analysis are: the basic nuclear or chemical data; the computer codes; and the integral experiments. This article describes the role each component plays in a computational scheme designed for modelling purposes. It also describes which tools have been developed and are internationally available. The roles that the OECD/NEA Data Bank, the Radiation Shielding Information Computational Center (RSICC), and the IAEA Nuclear Data Section play in making these elements available to the community of scientists and engineers are described. (author)

  8. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    Smith, P.R.; Sarfaty, R.

    1993-01-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan are the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper describes and defines CM elements, and discusses how CM integrates a facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer-aided software engineering (CASE) tool. However, the use of a CASE tool provides a methodology for consistency in approach, graphics, and database capability, combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than stated above. For example, they can support a joint application development (JAD) group in preparing a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses the characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  9. Tools for studying dry-cured ham processing by using computed tomography.

    Santos-Garcés, Eva; Muñoz, Israel; Gou, Pere; Sala, Xavier; Fulladosa, Elena

    2012-01-11

    Accurate knowledge and optimization of dry-cured ham elaboration processes could help to reduce operating costs and maximize product quality. The development of nondestructive tools to characterize chemical parameters such as salt and water contents and a(w) during processing is of special interest. In this paper, predictive models for salt content (R(2) = 0.960 and RMSECV = 0.393), water content (R(2) = 0.912 and RMSECV = 1.751), and a(w) (R(2) = 0.906 and RMSECV = 0.008), covering the whole elaboration process, were developed. These predictive models were used to build analytical tools such as distribution diagrams, line profiles, and regions of interest (ROIs) from the acquired computed tomography (CT) scans. These CT analytical tools provided quantitative information on salt, water, and a(w), in terms of content but also of distribution, throughout the process. The information obtained was applied to two industrial case studies. The main drawback of the predictive models and CT analytical tools is the disturbance that fat produces in water content and a(w) predictions.
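
    The record reports R(2) and RMSECV for its calibration models without listing them; the generic pattern (sketched here with invented data) is to fit a model predicting salt content from a CT attenuation value and to estimate RMSECV by leave-one-out cross-validation:

      import numpy as np

      # Invented calibration pairs: mean CT attenuation (HU) vs measured salt content (%)
      hu   = np.array([45.0, 52.0, 61.0, 70.0, 78.0, 85.0, 93.0])
      salt = np.array([1.9,  2.4,  3.1,  3.8,  4.2,  4.9,  5.5])

      residuals = []
      for i in range(len(hu)):                      # leave-one-out cross-validation
          mask = np.arange(len(hu)) != i
          slope, intercept = np.polyfit(hu[mask], salt[mask], 1)
          residuals.append(salt[i] - (slope * hu[i] + intercept))
      rmsecv = np.sqrt(np.mean(np.square(residuals)))

      r2 = np.corrcoef(hu, salt)[0, 1] ** 2         # fit quality on all data
      print(f"R^2 = {r2:.3f}, RMSECV = {rmsecv:.3f} % salt")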

  10. The secondary metabolite bioinformatics portal: Computational tools to facilitate synthetic biology of secondary metabolite production

    Tilmann Weber

    2016-06-01

    Natural products are among the most important sources of lead molecules for drug discovery. With the development of affordable whole-genome sequencing technologies and other 'omics tools, the field of natural products research is currently undergoing a shift in paradigms. While, for decades, mainly analytical and chemical methods gave access to this group of compounds, nowadays genomics-based methods offer complementary approaches to find, identify and characterize such molecules. This paradigm shift has also resulted in a high demand for computational tools to assist researchers in their daily work. In this context, this review gives a summary of the tools and databases that are currently available to mine, identify and characterize natural product biosynthesis pathways and their producers based on 'omics data. A web portal called the Secondary Metabolite Bioinformatics Portal (SMBP, http://www.secondarymetabolites.org) is introduced to provide a one-stop catalog and links to these bioinformatics resources. In addition, an outlook is presented on how existing tools and those to be developed will influence synthetic biology approaches in the natural products field.

  11. Validation of RetroPath, a computer-aided design tool for metabolic pathway engineering.

    Fehér, Tamás; Planson, Anne-Gaëlle; Carbonell, Pablo; Fernández-Castané, Alfred; Grigoras, Ioana; Dariy, Ekaterina; Perret, Alain; Faulon, Jean-Loup

    2014-11-01

    Metabolic engineering has succeeded in the biosynthesis of numerous commodity and high-value compounds. However, the choice of pathways and enzymes used for production was often made ad hoc, or required expert knowledge of the specific biochemical reactions. In order to rationalize the process of engineering producer strains, we developed the computer-aided design (CAD) tool RetroPath, which explores and enumerates metabolic pathways connecting the endogenous metabolites of a chassis cell to the target compound. To experimentally validate our tool, we constructed 12 top-ranked enzyme combinations producing the flavonoid pinocembrin, four of which displayed significant yields. Specifically, our tool queried the enzymes found in metabolic databases based on their annotated and predicted activities. Next, it ranked pathways based on the predicted efficiency of the available enzymes, the toxicity of the intermediate metabolites, and the calculated maximum product flux. To implement the top-ranking pathway, our procedure narrowed down a list of nine million possible enzyme combinations to 12, a number easily assembled and tested. One round of metabolic network optimization based on RetroPath output further increased pinocembrin titers 17-fold. In total, 12 out of the 13 enzymes tested in this work displayed a relative performance that was in accordance with their predicted scores. These results validate the ranking function of our CAD tool, and open the way to its utilization in the biosynthesis of novel compounds.
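
    RetroPath itself is not reproduced here, but the core step the abstract describes - enumerating pathways from chassis metabolites to a target over a reaction network and ranking them - can be sketched with a toy network and an invented scoring rule:

      # Toy reaction network: (substrate, product, enzyme score); all values invented
      reactions = [
          ("A", "B", 0.9), ("B", "D", 0.7), ("A", "C", 0.6),
          ("C", "D", 0.8), ("D", "target", 0.95),
      ]
      chassis = {"A"}

      def pathways(metabolite, seen=()):
          """Yield all reaction chains from a chassis metabolite to `metabolite`."""
          if metabolite in chassis:
              yield []
              return
          for s, p, score in reactions:
              if p == metabolite and s not in seen:
                  for prefix in pathways(s, seen + (metabolite,)):
                      yield prefix + [(s, p, score)]

      def rank(path):
          # Invented ranking: product of enzyme scores, mildly penalizing length
          score = 1.0
          for _, _, s in path:
              score *= s
          return score * 0.9 ** len(path)

      for path in sorted(pathways("target"), key=rank, reverse=True):
          route = " -> ".join([path[0][0]] + [p for _, p, _ in path])
          print(f"score = {rank(path):.3f}  {route}")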

  12. Introduction of the computer-based operation training tools in classrooms to support simulator training

    Noji, K.; Suzuki, K.; Kobayashi, A.

    1997-01-01

    Operation training with full-scope simulators is effective in improving trainees' operating competency. To obtain more effective results from simulator training, the role of "classroom operation training", closely coordinated with simulator training, is important. "Classroom operation training" is aimed at pre- and post-study of the operational knowledge related to training sessions on full-scope simulators. We have been developing computer-based operation training tools to be used in classroom training sessions. As the first step, we developed the Simulator Training Replay System, an aiding tool used in the classroom to enhance trainees' operating performance. This system can replay plant behavior on a CRT display synchronously with the operators' actions on a video monitor recorded during the simulator training sessions. It is used to review plant behavior and trainee responses after simulator training sessions, and to study plant behavior and operating procedures before operation training. (author)

  13. A computational tool to characterize particle tracking measurements in optical tweezers

    Taylor, Michael A; Bowen, Warwick P

    2013-01-01

    Here, we present a computational tool for optical tweezers which calculates the particle tracking signal measured with a quadrant detector and the shot-noise limit to position resolution. The tool is a piece of Matlab code which functions within the freely available Optical Tweezers Toolbox. It allows the measurements performed in most optical tweezer experiments to be theoretically characterized in a fast and easy manner. The code supports particles with arbitrary size, any optical fields and any combination of objective and condenser, and performs a full vector calculation of the relevant fields. Example calculations are presented which show the tracking signals for different particles, and the shot-noise limit to position sensitivity as a function of the effective condenser NA. (paper)

  14. Tools for 3D scientific visualization in computational aerodynamics at NASA Ames Research Center

    Bancroft, G.; Plessel, T.; Merritt, F.; Watson, V.

    1989-01-01

    Hardware, software, and techniques used by the Fluid Dynamics Division (NASA) for performing visualization of computational aerodynamics, which can be applied to the visualization of flow fields from computer simulations of fluid dynamics about the Space Shuttle, are discussed. The three visualization techniques applied - post-processing, tracking, and steering - are described, as well as the post-processing software packages used: PLOT3D, SURF (Surface Modeller), GAS (Graphical Animation System), and FAST (Flow Analysis Software Toolkit). Using post-processing methods, a flow simulation was executed on a supercomputer and, after the simulation was complete, the results were processed for viewing. It is shown that a high-resolution, high-performance three-dimensional workstation combined with specially developed display and animation software provides a good tool for analyzing flow field solutions obtained from supercomputers. 7 refs

  15. Porting of Bio-Informatics Tools for Plant Virology on a Computational Grid

    Lanzalone, G.; Lombardo, A.; Muoio, A.; Iacono-Manno, M.

    2007-01-01

    The goal of the TriGrid Project and PI2S2 is the creation of the first Sicilian regional computational Grid. In particular, they aim to build various software-hardware interfaces between the infrastructure and a number of scientific and industrial applications. In this context, we have integrated some of the most innovative computing applications in virology research into this Grid infrastructure. In particular, we have implemented, in a complete workflow, various tools for pairwise or multiple sequence alignment and phylogenetic tree construction (ClustalW-MPI), phylogenetic networks (SplitsTree), detection of recombination by phylogenetic methods (TOPALi) and prediction of DNA or RNA secondary consensus structures (KnetFold). This work will show how the ported applications decrease the execution time of the analysis programs, improve accessibility to the data storage system and allow the use of metadata for data processing. (Author)

  16. Variation of densitometry on computed tomography in COPD--influence of different software tools.

    Mark O Wielpütz

    Quantitative multidetector computed tomography (MDCT) as a potential biomarker is increasingly used for severity assessment of emphysema in chronic obstructive pulmonary disease (COPD). The aim of this study was to evaluate the user-independent measurement variability between five different fully automatic densitometry software tools. MDCT and full-body plethysmography (including forced expiratory volume in 1 s and total lung capacity) were available for 49 patients with advanced COPD (age = 64±9 years, forced expiratory volume in 1 s = 31±6% predicted). Measurement variation regarding lung volume, emphysema volume, emphysema index, and mean lung density was evaluated for two scientific and three commercially available lung densitometry software tools designed to analyze MDCT from different scanner types. One scientific tool and one commercial tool failed to process most or all datasets, respectively, and were excluded. One scientific and another commercial tool analyzed 49 datasets, the remaining commercial tool 30. Lung volume, emphysema volume, emphysema index and mean lung density were significantly different amongst these three tools (p<0.001). Limits of agreement for lung volume were [-0.195, -0.052 l], [-0.305, -0.131 l], and [-0.123, -0.052 l], with correlation coefficients of r = 1.00 each. Limits of agreement for emphysema index were [-6.2, 2.9%], [-27.0, 16.9%], and [-25.5, 18.8%], with r = 0.79 to 0.98. Correlation of lung volume with total lung capacity was good to excellent (r = 0.77 to 0.91, p<0.001), but segmented lung volumes (6.7±1.3 to 6.8±1.3 l) were significantly lower than total lung capacity (7.7±1.7 l, p<0.001). Technical incompatibilities hindered evaluation of two of five tools. The remaining three showed significant measurement variation for emphysema, hampering quantitative MDCT as a biomarker in COPD. Follow-up studies should currently use identical software, and standardization efforts should encompass software as well.
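
    For readers outside the field: the emphysema index the tools disagree on is conventionally the fraction of segmented lung voxels below a density threshold (often -950 HU; the threshold and the voxel data below are assumptions for illustration):

      import numpy as np

      rng = np.random.default_rng(0)
      lung_hu = rng.normal(loc=-870, scale=60, size=100_000)   # invented lung voxel densities (HU)

      threshold = -950.0                                       # common emphysema cutoff (assumed)
      emphysema_index = 100.0 * np.mean(lung_hu < threshold)
      print(f"emphysema index = {emphysema_index:.1f} %, MLD = {lung_hu.mean():.0f} HU")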

  17. Nsite, NsiteH and NsiteM Computer Tools for Studying Transcription Regulatory Elements

    Shahmuradov, Ilham; Solovyev, Victor

    2015-01-01

    … regions. Computer methods for identification of REs remain a widely used tool for studying and understanding transcriptional regulation mechanisms. The Nsite, NsiteH and NsiteM programs perform searches for statistically significant (non-random) motifs …

  18. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning

    Retalis, Symeon; Sloep, Peter

    2009-01-01

    Retalis, S., & Sloep, P. B. (Eds.) (2009). Collection of 4 symposium papers at EC-TEL 2009. Proceedings of the Workshop on Methods & Tools for Computer Supported Collaborative Creativity Process: Linking creativity & informal learning. September 30, 2009, Nice, France.

  19. An Interactive Computer Tool for Teaching About Desalination and Managing Water Demand in the US

    Ziolkowska, J. R.; Reyes, R.

    2016-12-01

    This paper presents an interactive tool to geospatially and temporally analyze desalination developments and trends in the US over the period 1950-2013, their current contribution to satisfying water demands, and their future potential. The computer tool is open access and can be used by any user with an Internet connection, thus facilitating interactive learning about water resources. The tool can also be used by stakeholders and policy makers for decision-making support and in designing sustainable water management strategies. Desalination technology has been acknowledged as a solution for sustainable management of water demand stemming from many sectors, including municipalities, industry, agriculture, power generation, and other users. Desalination has been applied successfully in the US and many countries around the world since the 1950s. As of 2013, around 1,336 desalination plants were operating in the US alone, with a daily production capacity of 2 BGD (billion gallons per day) (GWI, 2013). Despite a steady increase in the number of new desalination plants and growing production capacity, in many regions the costs of desalination are still prohibitive. At the same time, the technology offers a tremendous potential for 'enormous supply expansion that exceeds all likely demands' (Chowdhury et al., 2013). The model and tool are based on data from Global Water Intelligence (GWI, 2013). The analysis shows that more than 90% of all the plants in the US are small-scale plants with a capacity below 4.31 MGD. Most of the plants, and especially the larger plants, are located on the US East Coast, as well as in California, Texas, Oklahoma, and Florida. The models and the tool provide information about the economic feasibility of potential new desalination plants based on access to feed water, energy sources, water demand, and the experience of other plants in the region.

  20. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  1. COMPUTING

    M. Kasemann

    Overview: During the past three months, activities were focused on data operations, testing and re-enforcing shift and operational procedures for data production and transfer, MC production, and on user support. Planning of the computing resources in view of the new LHC calendar is ongoing. Two new task forces were created to support the integration work: Site Commissioning, which develops tools helping distributed sites to monitor job and data workflows, and Analysis Support, collecting user experience and feedback during analysis activities and developing tools to increase efficiency. The development plan for DMWM for 2009-2011 was drawn up at the beginning of the year, based on the requirements from the Physics, Computing and Offline groups (see Offline section). The Computing management meeting at FermiLab on February 19th and 20th was an excellent opportunity to discuss the impact of, and to address issues and solutions to, the main challenges facing CMS computing. The lack of manpower is particul...

  2. DR2DI: a powerful computational tool for predicting novel drug-disease associations

    Lu, Lu; Yu, Hua

    2018-05-01

    Finding new related candidate diseases for known drugs provides an effective route to fast and low-risk drug development. However, experimental identification of drug-disease associations is expensive and time-consuming. This motivates the need for developing in silico computational methods that can infer true drug-disease pairs with high confidence. In this study, we presented a novel and powerful computational tool, DR2DI, for accurately uncovering the potential associations between drugs and diseases using high-dimensional and heterogeneous omics data as information sources. Based on a unified and extended similarity kernel framework, DR2DI inferred the unknown relationships between drugs and diseases using a Regularized Kernel Classifier. Importantly, DR2DI employed a semi-supervised and global learning algorithm which can be applied to uncover the diseases (drugs) associated with known and novel drugs (diseases). In silico global validation experiments showed that DR2DI significantly outperforms two recent approaches for predicting drug-disease associations. Detailed case studies further demonstrated that the therapeutic indications and side effects of drugs predicted by DR2DI could be validated by existing database records and literature, suggesting that DR2DI can serve as a useful bioinformatic tool for identifying potential drug-disease associations and guiding drug repositioning. Our software and comparison codes are freely available at https://github.com/huayu1111/DR2DI.
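
    DR2DI's exact kernel framework is not given in this record; the sketch below only illustrates the generic ingredient it names - a regularized classifier operating on a similarity kernel - using invented feature vectors and association labels:

      import numpy as np

      rng = np.random.default_rng(1)
      X = rng.normal(size=(20, 8))              # invented drug feature vectors
      y = rng.integers(0, 2, size=20) * 2 - 1   # invented labels: associated (+1) or not (-1)

      def rbf_kernel(A, B, gamma=0.1):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      # Kernel ridge classifier: alpha = (K + lambda*I)^-1 y, score(x) = k(x, X) @ alpha
      K = rbf_kernel(X, X)
      alpha = np.linalg.solve(K + 1.0 * np.eye(len(X)), y)

      x_new = rng.normal(size=(1, 8))           # a new, hypothetical drug
      score = rbf_kernel(x_new, X) @ alpha
      print("association score:", float(score[0]))   # sign gives the predicted class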

  3. Smartphone qualification & linux-based tools for CubeSat computing payloads

    Bridges, C. P.; Yeomans, B.; Iacopino, C.; Frame, T. E.; Schofield, A.; Kenyon, S.; Sweeting, M. N.

    Modern computers are now far in advance of satellite systems, and leveraging these technologies for space applications could lead to cheaper and more capable spacecraft. Together with NASA AMES's PhoneSat, the STRaND-1 nanosatellite team has been developing and designing new ways to bring smart-phone technologies to the popular CubeSat platform whilst mitigating numerous risks. Surrey Space Centre (SSC) and Surrey Satellite Technology Ltd. (SSTL) have taken a leading role in qualifying state-of-the-art COTS technologies and capabilities - contributing to numerous low-cost satellite missions. The focus of this paper is to answer 1) whether modern smart-phone software is compatible with the fast and low-cost development required by CubeSats, and 2) whether the components utilised are robust to the space environment. The STRaND-1 smart-phone payload software explored in this paper is united using various open-source Linux tools and generic interfaces found in terrestrial systems. A major result from our developments is that many existing software and hardware processes are more than sufficient to provide autonomous and operational payload object-to-object and file-based management solutions. The paper will provide methodologies on the software chains and tools used for the STRaND-1 smartphone computing platform, the hardware built with space qualification results (thermal, thermal vacuum, and TID radiation), and how they can be implemented in future missions.

  4. A computer tool for daily application of the linear quadratic model

    Macias Jaen, J.; Galan Montenegro, P.; Bodineau Gil, C.; Wals Zurita, A.; Serradilla Gil, A.M.

    2001-01-01

    The aim of this paper is to indicate the relevance of the A.S.A.R.A. (As Short As Reasonably Achievable) criterion in the optimization of a fractionated radiotherapy schedule, and to present a Windows computer program as an easy tool to: evaluate the Biological Equivalent Dose (BED) of a fractionated schedule; compare different treatments; and compensate a treatment when a delay has occurred, using a version of the Linear Quadratic model that takes into account accelerated repopulation. Conclusions: Delays in the normal radiotherapy schedule have to be controlled as much as possible, because overall treatment time can be a very important parameter for the good delivery of a treatment, particularly when the tumour is fast-growing. It is necessary to evaluate them. The ASARA criterion is useful to indicate the relevance of this aspect, and computer tools like this one can help achieve it. (author)
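
    For reference, the Linear Quadratic quantities the abstract refers to are easy to state in code. The sketch below is a minimal illustration using textbook formulas, not the Windows program described; the parameter values in the example are assumptions.

```python
import math

def bed(n, d, alpha_beta):
    """Biological Equivalent Dose of n fractions of d Gy (no time factor):
    BED = n * d * (1 + d / (alpha/beta))."""
    return n * d * (1 + d / alpha_beta)

def bed_with_repopulation(n, d, alpha_beta, alpha, t_total, t_kickoff, t_pot):
    """LQ model with accelerated repopulation: once the overall time T exceeds
    the kick-off time Tk, subtract (ln 2 / (alpha * Tpot)) * (T - Tk)."""
    repop = (math.log(2) / (alpha * t_pot)) * max(0.0, t_total - t_kickoff)
    return bed(n, d, alpha_beta) - repop

# Illustrative comparison: 30 x 2 Gy delivered in 39 days versus the same
# schedule delayed by a week (alpha/beta = 10 Gy, alpha = 0.3 / Gy,
# Tk = 21 d, Tpot = 3 d -- assumed values for a fast-growing tumour).
print(bed_with_repopulation(30, 2.0, 10.0, 0.3, 39, 21, 3.0))  # ~58.1 Gy
print(bed_with_repopulation(30, 2.0, 10.0, 0.3, 46, 21, 3.0))  # ~52.7 Gy
```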

  5. Covariance Analysis Tool (G-CAT) for Computing Ascent, Descent, and Landing Errors

    Boussalis, Dhemetrios; Bayard, David S.

    2013-01-01

    G-CAT is a covariance analysis tool that enables fast and accurate computation of error ellipses for descent, landing, ascent, and rendezvous scenarios, and quantifies knowledge error contributions needed for error budgeting purposes. Because G-CAT supports hardware/system trade studies in spacecraft and mission design, it is useful in both early and late mission/proposal phases where Monte Carlo simulation capability is not mature, Monte Carlo simulation takes too long to run, and/or there is a need to perform multiple parametric system design trades that would require an unwieldy number of Monte Carlo runs. G-CAT is formulated as a variable-order square-root linearized Kalman filter (LKF), typically using over 120 filter states. An important property of G-CAT is that it is based on a 6-DOF (degrees of freedom) formulation that completely captures the combined effects of both attitude and translation errors on the propagated trajectories. This ensures its accuracy for guidance, navigation, and control (GN&C) analysis. G-CAT provides the desired fast turnaround analysis needed for error budgeting in support of mission concept formulations, design trade studies, and proposal development efforts. The main usefulness of a covariance analysis tool such as G-CAT is its ability to calculate the performance envelope directly from a single run. This is in sharp contrast to running thousands of simulations to obtain similar information using Monte Carlo methods. It does this by propagating the "statistics" of the overall design, rather than simulating individual trajectories. G-CAT supports applications to lunar, planetary, and small body missions. It characterizes onboard knowledge propagation errors associated with inertial measurement unit (IMU) errors (gyro and accelerometer), gravity errors/dispersions (spherical harmonics, masscons), and radar errors (multiple altimeter beams, multiple Doppler velocimeter beams). G-CAT is a standalone MATLAB-based tool intended to
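
    The covariance machinery at the heart of a tool like this fits in a few lines. A generic sketch of linearized covariance propagation and the resulting error ellipse, under assumed matrices F, Q, H, R (not G-CAT's actual 6-DOF, 120-state formulation):

```python
import numpy as np

def propagate(P, F, Q):
    """One time step of linearized covariance propagation: P' = F P F^T + Q."""
    return F @ P @ F.T + Q

def measurement_update(P, H, R):
    """Kalman covariance update for a measurement with Jacobian H and noise R."""
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    return (np.eye(P.shape[0]) - K @ H) @ P

def error_ellipse(P_xy, n_sigma=3.0):
    """Semi-axes and orientation of the n-sigma error ellipse from a 2x2
    position covariance block -- the 'single run' output that would otherwise
    take thousands of Monte Carlo trajectories to estimate."""
    vals, vecs = np.linalg.eigh(P_xy)
    return n_sigma * np.sqrt(vals), np.arctan2(vecs[1, -1], vecs[0, -1])
```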

  6. SOAP. A tool for the fast computation of photometry and radial velocity induced by stellar spots

    Boisse, I.; Bonfils, X.; Santos, N. C.

    2012-09-01

    We define and make available to the community SOAP, Spot Oscillation And Planet, a software tool that simulates the effect of stellar spots and plages on radial velocimetry and photometry. This paper describes the tool release and provides instructions for its use. We present detailed tests with previous computations and real data to assess the code's performance and to validate its suitability. We characterize the variations of the radial velocity, line bisector, and photometric amplitude as a function of the main variables: projected stellar rotational velocity, filling factor of the spot, resolution of the spectrograph, linear limb-darkening coefficient, latitude of the spot, and inclination of the star. Finally, we model the spot distributions on the active stars HD 166435, TW Hya and HD 189733, which reproduce the observations. We show that the software is remarkably fast, allowing several extensions of its capabilities that could be implemented to study the next challenges in the exoplanetary field connected with stellar variability. The tool is available at http://www.astro.up.pt/soap
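
    The first-order physics SOAP models can be sketched directly: a dark spot removes flux, and because the removed light carries the local rotational Doppler shift, the flux-weighted line centroid (the measured RV) is dragged. The sketch below is a heavily simplified, single-equatorial-spot toy with assumed contrast and limb-darkening values, not SOAP's algorithm:

```python
import numpy as np

def spot_signatures(phase, f_spot, vsini_kms, contrast=0.7, limb_c=0.6):
    """Toy photometric and RV signature of one equatorial spot.
    phase: rotation phase in [0, 1); f_spot: spot filling factor."""
    lon = 2 * np.pi * phase
    mu = np.cos(lon)                          # foreshortening; visible if mu > 0
    limb = 1.0 - limb_c * (1.0 - mu)          # linear limb darkening
    d_flux = np.where(mu > 0, f_spot * (1 - contrast) * mu * limb, 0.0)
    v_spot = vsini_kms * np.sin(lon)          # projected rotation speed at spot
    d_rv = d_flux * v_spot                    # missing light drags the centroid
    return 1.0 - d_flux, d_rv * 1e3           # normalized flux, RV in m/s

phases = np.linspace(0.0, 1.0, 200)
flux, rv_ms = spot_signatures(phases, f_spot=0.01, vsini_kms=3.0)
```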

  7. Development of a computer tool to support scenario analysis for safety assessment of HLW geological disposal

    Makino, Hitoshi; Kawamura, Makoto; Wakasugi, Keiichiro; Okubo, Hiroo; Takase, Hiroyasu

    2007-02-01

    In the 'H12 Project to Establishing Technical Basis for HLW Disposal in Japan', a systematic approach based on an international consensus was adopted to develop the scenarios to be considered in performance assessment. The adequacy of the approach was, in general terms, appreciated in the domestic and international peer review. However, it was also suggested that there were issues related to improving the transparency and traceability of the procedure. To achieve this, improvements to the scenario analysis method have been studied. In this study, based on an improved method for the treatment of FEP interactions, a computer tool to support scenario analysis by performance assessment specialists has been developed. The anticipated effects of this tool are to improve the efficiency of complex and time-consuming scenario analysis work and to reduce the possibility of human error in this work. The tool also makes it possible to describe the interactions among a vast number of FEPs and the related information as an interaction matrix, and to analyse those interactions from a variety of perspectives. (author)

  8. Computer-aided tool for the teaching of relational algebra in data base courses

    Johnny Villalobos Murillo

    2016-03-01

    This article describes the design and implementation of a computer-aided tool called Relational Algebra Translator (RAT) for the teaching of relational algebra in database courses. There was a problem when introducing the relational algebra topic in the course EIF 211 Design and Implementation of Databases, which belongs to the Engineering in Information Systems programme of the National University of Costa Rica: students attending the course lacked deep mathematical knowledge, which led to a learning problem, this being an important subject for understanding what database searches and queries do. RAT was developed to enhance the teaching-learning process. The article introduces the architectural and design principles required for its implementation, such as the language symbol table, the grammatical rules and the basic algorithms that RAT uses to translate from relational algebra to the SQL language. The tool has been used for one period and has proved effective in the teaching-learning process. This encouraged the investigators to publish it on the website www.slinfo.una.ac.cr so that it can be used in other university courses.
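
    The operator-to-SQL mapping at the heart of such a translator can be illustrated with a toy sketch. These two helper functions are hypothetical and only show the idea; RAT itself works from a symbol table and grammar rules:

```python
# sigma_{cond}(R)  ->  SELECT * FROM R WHERE cond
def select_(condition, relation_sql):
    return f"SELECT * FROM ({relation_sql}) AS r WHERE {condition}"

# pi_{cols}(R)  ->  SELECT cols FROM R
def project(columns, relation_sql):
    return f"SELECT {', '.join(columns)} FROM ({relation_sql}) AS r"

# pi_{name}(sigma_{grade > 80}(students)):
sql = project(["name"], select_("grade > 80", "SELECT * FROM students"))
print(sql)
# SELECT name FROM (SELECT * FROM (SELECT * FROM students) AS r WHERE grade > 80) AS r
```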

  9. Ratsnake: A Versatile Image Annotation Tool with Application to Computer-Aided Diagnosis

    D. K. Iakovidis

    2014-01-01

    Image segmentation and annotation are key components of image-based medical computer-aided diagnosis (CAD) systems. In this paper we present Ratsnake, a publicly available generic image annotation tool providing annotation efficiency, semantic awareness, versatility, and extensibility, features that can be exploited to transform it into an effective CAD system. In order to demonstrate this unique capability, we present its novel application for the evaluation and quantification of salient objects and structures of interest in kidney biopsy images. Accurate annotation identifying and quantifying such structures in microscopy images can provide an estimation of pathogenesis in obstructive nephropathy, which is a rather common disease with severe implications in children and infants. However, a tool for detecting and quantifying the disease is not yet available. A machine learning-based approach, which utilizes prior domain knowledge and textural image features, is considered for the generation of an image force field customizing the presented tool for automatic evaluation of kidney biopsy images. The experimental evaluation of the proposed application of Ratsnake demonstrates its efficiency and effectiveness and promises its wide applicability across a variety of medical imaging domains.

  10. A remote sensing computer-assisted learning tool developed using the unified modeling language

    Friedrich, J.; Karslioglu, M. O.

    The goal of this work has been to create an easy-to-use and simple-to-make learning tool for remote sensing at an introductory level. Many students struggle to comprehend what seems to be a very basic knowledge of digital images, image processing and image arithmetic, for example. Because professional programs are generally too complex and overwhelming for beginners and often not tailored to the specific needs of a course regarding functionality, a computer-assisted learning (CAL) program was developed based on the unified modeling language (UML), the present standard for object-oriented (OO) system development. A major advantage of this approach is an easier transition from modeling to coding of such an application, if modern UML tools are being used. After introducing the constructed UML model, its implementation is briefly described followed by a series of learning exercises. They illustrate how the resulting CAL tool supports students taking an introductory course in remote sensing at the author's institution.

  11. Computer tool to evaluate the cue reactivity of chemically dependent individuals.

    Silva, Meire Luci da; Frère, Annie France; Oliveira, Henrique Jesus Quintino de; Martucci Neto, Helio; Scardovelli, Terigi Augusto

    2017-03-01

    Anxiety is one of the major influences on relapse and dropout from substance abuse treatment. Chemically dependent individuals (CDI) need to be aware of their emotional state in situations of risk during their treatment. Many patients do not agree with the diagnosis of the therapist when considering them vulnerable to environmental stimuli related to drugs. This research presents a cue reactivity detection tool based on a device acquiring physiological signals connected to a personal computer. Depending on the variations of the emotional state of the drug addict, alterations of the physiological signals will be detected by the computer tool (CT), which will modify the displayed virtual sets without intervention of the therapist. Developed in 3ds Max® software, the CT is composed of scenarios and objects that are part of the daily life of marijuana- and cocaine-dependent individuals. The interaction with the environment is accomplished using a Human-Computer Interface (HCI) that converts incoming physiological signals indicating an anxiety state into commands that change the scenes. Anxiety was characterized by the average variability of the cardiac and respiratory rates of 30 volunteers submitted to stressful situations. To evaluate the effectiveness of cue reactivity detection, a total of 50 volunteers dependent on marijuana, cocaine or both were followed. Prior to the CT, the results demonstrated a poor correlation between the therapists' predictions and those of the chemically dependent individuals. After exposure to the CT, there was a significant increase of 73% in awareness of the risks of relapse. We confirmed the hypothesis that the CT, controlled only by physiological signals, increases the perception of vulnerability to risk situations of individuals with dependence on marijuana, cocaine or both. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
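
    The decision logic such an HCI needs can be small. Below is a hypothetical detector in the spirit of the description: it flags an anxiety state when both signals rise well above a rolling baseline, at which point the virtual environment would switch scenes. The threshold factor k and window length are assumptions, not values from the paper:

```python
import statistics

def anxiety_trigger(heart_rates, resp_rates, k=1.5, window=30):
    """Return True when the latest cardiac AND respiratory rates both exceed
    their rolling-baseline mean by k standard deviations.
    Assumes each list holds at least window + 1 samples (e.g. one per second)."""
    def exceeds(series):
        baseline = series[-window - 1:-1]
        return series[-1] > statistics.mean(baseline) + k * statistics.pstdev(baseline)
    return exceeds(heart_rates) and exceeds(resp_rates)

# The scene loop would poll this each second and change the virtual set,
# without therapist intervention, whenever it returns True.
```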

  12. Wigner functions and density matrices in curved spaces as computational tools

    Habib, S.; Kandrup, H.E.

    1989-01-01

    This paper contrasts two alternative approaches to statistical quantum field theory in curved spacetimes, namely (1) a canonical Hamiltonian approach, in which the basic object is a density matrix ρ characterizing the noncovariant, but globally defined, modes of the field; and (2) a Wigner function approach, in which the basic object is a Wigner function f defined quasilocally from the Hadamard, or correlation, function G_1(x_1, x_2). The key objective is to isolate the conceptual biases underlying each of these approaches and then to assess their utility and limitations in effecting concrete calculations. The following questions are therefore addressed and largely answered. What sorts of spacetimes (e.g., de Sitter or Friedmann-Robertson-Walker) are comparatively easy to consider? What sorts of objects (e.g., average fields or renormalized stress energies) are easy to compute approximately? What, if anything, can be computed exactly? What approximations are intrinsic to each approach or convenient as computational tools? What sorts of ''field entropies'' are natural to define? copyright 1989 Academic Press, Inc
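
    For orientation, in flat spacetime (and with ħ = 1) a Wigner function built from the Hadamard function takes the schematic form below; the paper's quasilocal curved-space construction dresses this transform with geodesic machinery:

```latex
f(x,p) \;=\; \frac{1}{(2\pi)^{n}} \int \mathrm{d}^{n}y \;
  e^{-\,i\,p\cdot y}\, G_{1}\!\left(x+\tfrac{y}{2},\; x-\tfrac{y}{2}\right)
```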

  13. C-arm Cone Beam Computed Tomography: A New Tool in the Interventional Suite.

    Raj, Santhosh; Irani, Farah Gillan; Tay, Kiang Hiong; Tan, Bien Soo

    2013-11-01

    C-arm Cone Beam CT (CBCT) is a technology that is being integrated into many of the newer angiography systems in the interventional suite. Due to its ability to provide cross sectional imaging, it has opened a myriad of opportunities for creating new clinical applications. We review the technical aspects, current reported clinical applications and potential benefits of this technology. Searches were made via PubMed using the string "CBCT", "Cone Beam CT", "Cone Beam Computed Tomography" and "C-arm Cone Beam Computed Tomography". All relevant articles in the results were reviewed. CBCT clinical applications have been reported in both vascular and non-vascular interventions. They encompass many aspects of a procedure including preprocedural planning, intraprocedural guidance and postprocedural assessment. As a result, they have allowed the interventionalist to be safer and more accurate in performing image guided procedures. There are however several technical limitations. The quality of images produced is not comparable to conventional computed tomography (CT). Radiation doses are also difficult to quantify when compared to CT and fluoroscopy. CBCT technology in the interventional suite has contributed significant benefits to the patient despite its current limitations. It is a tool that will evolve and potentially become an integral part of imaging guidance for intervention.

  14. N2A: a computational tool for modeling from neurons to algorithms

    Fredrick eRothganger

    2014-01-01

    The exponential increase in available neural data has combined with the exponential growth in computing (Moore's law) to create new opportunities to understand neural systems at large scale and high detail. The ability to produce large and sophisticated simulations has introduced unique challenges to neuroscientists. Computational models in neuroscience are increasingly broad efforts, often involving the collaboration of experts in different domains. Furthermore, the size and detail of models have grown to levels for which understanding the implications of variability and assumptions is no longer trivial. Here, we introduce the model design platform N2A which aims to facilitate the design and validation of biologically realistic models. N2A uses a hierarchical representation of neural information to enable the integration of models from different users. N2A streamlines computational validation of a model by natively implementing standard tools in sensitivity analysis and uncertainty quantification. The part-relationship representation allows both network-level analysis and dynamical simulations. We will demonstrate how N2A can be used in a range of examples, including a simple Hodgkin-Huxley cable model, basic parameter sensitivity of an 80/20 network, and the expression of the structural plasticity of a growing dendrite and stem cell proliferation and differentiation.

  15. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  16. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The prediction of their behavior is carried out by means of computational models whose basic ingredients are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized version of such PDEs it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES [1]. Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2]. Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3]. Herrera, I., & Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software To Problems of Elasticity", Geofísica Internacional, 2015 (In press)

  17. Design of Intelligent Robot as A Tool for Teaching Media Based on Computer Interactive Learning and Computer Assisted Learning to Improve the Skill of University Student

    Zuhrie, M. S.; Basuki, I.; Asto B, I. G. P.; Anifah, L.

    2018-01-01

    The focus of the research is a teaching module which incorporates manufacturing, planning, mechanical design, control through microprocessor technology, and the maneuverability of the robot. Computer interactive and computer-assisted learning are strategies that emphasize the use of computers and learning aids in teaching and learning activities. This research applied the 4-D research and development model suggested by Thiagarajan et al. (1974), which consists of four stages: Define, Design, Develop, and Disseminate. The research was conducted by applying this development design with the objective of producing a learning tool in the form of intelligent robot modules and kits based on Computer Interactive Learning and Computer Assisted Learning. From the data of the Indonesia Robot Contest during the period 2009-2015, it can be seen that the developed modules confirm the fourth stage of the method, dissemination. The modules guide students to produce an intelligent robot tool for teaching based on Computer Interactive Learning and Computer Assisted Learning. Students' responses also showed positive feedback on the robotics module and computer-based interactive learning.

  18. M4D: a powerful tool for structured programming at assembly level for MODCOMP computers

    Shah, R.R.; Basso, R.A.J.

    1984-04-01

    Structured programming techniques offer numerous benefits for software designers and form the basis of the current high-level languages. However, these techniques are generally not available to assembly programmers. The M4D package was therefore developed for a large project to enable the use of structured programming constructs such as DO.WHILE-ENDDO and IF-ORIF-ORIF...-ELSE-ENDIF in the assembly code for MODCOMP computers. Programs can thus be produced that have clear semantics and are considerably easier to read than normal assembly code, resulting in reduced program development and testing effort, and in improved long-term maintainability of the code. This paper describes the M4D structured programming tool as implemented for MODCOMP's MAX III and MAX IV assemblers, and illustrates the use of the facility with a number of examples

  19. Modeling of edge effect in subaperture tool influence functions of computer controlled optical surfacing.

    Wan, Songlin; Zhang, Xiangchao; He, Xiaoying; Xu, Min

    2016-12-20

    Computer controlled optical surfacing requires an accurate tool influence function (TIF) for reliable path planning and deterministic fabrication. Near the edge of the workpieces, the TIF has a nonlinear removal behavior, which will cause a severe edge-roll phenomenon. In the present paper, a new edge pressure model is developed based on finite element analysis results. The model is represented as the product of a basic pressure function and a correcting function. The basic pressure distribution is calculated according to the surface shape of the polishing pad, and the correcting function is used to compensate for the errors caused by the edge effect. Practical experimental results demonstrate that the new model can accurately predict the edge TIFs with different overhang ratios. The relative error of the new edge model can be reduced to 15%.
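
    The structure described (basic pressure times correcting function, feeding a Preston-type removal rate) can be sketched as follows. The exponential correcting term and its constants are illustrative assumptions, not the paper's fitted finite-element coefficients:

```python
import numpy as np

def edge_tif_pressure(r, overhang_ratio, basic_pressure, a=0.5, b=2.0):
    """Pressure near the workpiece edge: total = basic pad-shape pressure
    x correcting function. r is distance from the edge; basic_pressure is
    a callable derived from the polishing pad's surface shape."""
    correction = 1.0 + a * overhang_ratio * np.exp(-b * r)  # grows toward r = 0
    return basic_pressure(r) * correction

def removal_rate(pressure, velocity, preston_k=1e-6):
    """Preston's equation, the usual CCOS removal law: dz/dt = k * P * v."""
    return preston_k * pressure * velocity

# Example: Gaussian-ish pad pressure with 30% of the tool overhanging the edge.
r = np.linspace(0.0, 1.0, 50)
tif = removal_rate(edge_tif_pressure(r, 0.3, lambda x: np.exp(-x ** 2)), velocity=1.0)
```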

  1. INTERACTIONS: DESIGN, IMPLEMENTATION AND EVALUATION OF A COMPUTATIONAL TOOL FOR TEACHING INTERMOLECULAR FORCES IN HIGHER EDUCATION

    Francisco Geraldo Barbosa

    2015-12-01

    Intermolecular forces are a useful concept that can explain the attraction between particles of matter as well as numerous phenomena in our lives such as viscosity, solubility, drug interactions, and the dyeing of fibers. However, studies show that students have difficulty understanding this important concept, which has led us to develop free educational software in English and Portuguese. The software can be used interactively by teachers and students, thus facilitating better understanding. Professors and students, both graduate and undergraduate, were questioned about the software's quality, intuitiveness of use, ease of navigation, and pedagogical application using a Likert scale. The results led to the conclusion that the developed computer application can be characterized as an auxiliary tool to assist teachers in their lectures and students in their learning process of intermolecular forces.

  2. A computer tool for a minimax criterion in binary response and heteroscedastic simple linear regression models.

    Casero-Alonso, V; López-Fidalgo, J; Torsney, B

    2017-01-01

    Binary response models are used in many real applications. For these models the Fisher information matrix (FIM) is proportional to the FIM of a weighted simple linear regression model. The same is also true when the weight function has a finite integral. Thus, optimal designs for one binary model are also optimal for the corresponding weighted linear regression model. The main objective of this paper is to provide a tool for the construction of MV-optimal designs, minimizing the maximum of the variances of the estimates, for a general design space. MV-optimality is a potentially difficult criterion because of its nondifferentiability at equal variance designs. A methodology for obtaining MV-optimal designs where the design space is a compact interval [a, b] will be given for several standard weight functions. The methodology will allow us to build a user-friendly computer tool based on Mathematica to compute MV-optimal designs. Some illustrative examples will show a representation of MV-optimal designs in the Euclidean plane, taking a and b as the axes. The applet will be explained using two relevant models. In the first one the case of a weighted linear regression model is considered, where the weight function is directly chosen from a typical family. In the second example a binary response model is assumed, where the probability of the outcome is given by a typical probability distribution. Practitioners can use the provided applet to identify the solution and to know the exact support points and design weights. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  3. Computer simulation tools for X-ray analysis scattering and diffraction methods

    Morelhão, Sérgio Luiz

    2016-01-01

    The main goal of this book is to break down the huge barrier of difficulties faced by beginners from many fields (Engineering, Physics, Chemistry, Biology, Medicine, Material Science, etc.) in using X-rays as an analytical tool in their research. Besides fundamental concepts, MatLab routines are provided, showing how to test and implement the concepts. The major difficulty in analyzing materials by X-ray techniques is that the analysis strongly depends on simulation software. This book teaches users how to construct a library of routines to simulate scattering and diffraction by almost any kind of sample. It provides a young student with the knowledge that would take more than 20 years to acquire by working on X-rays and relying on the available textbooks. In this book, fundamental concepts in applied X-ray physics are demonstrated through available computer simulation tools. Using MatLab, more than eighty routines are developed for solving the proposed exercises, most of which can be directly used in experimental...

  4. First Studies for the Development of Computational Tools for the Design of Liquid Metal Electromagnetic Pumps

    Carlos O. Maidana

    2017-02-01

    Liquid alloy systems have a high degree of thermal conductivity, far superior to ordinary nonmetallic liquids, and inherently high densities and electrical conductivities. This results in the use of these materials for specific heat conduction and dissipation applications in the nuclear and space sectors. Uniquely, they can be used to conduct heat and electricity between nonmetallic and metallic surfaces. The motion of liquid metals in strong magnetic fields generally induces electric currents, which, while interacting with the magnetic field, produce electromagnetic forces. Electromagnetic pumps exploit the fact that liquid metals are conducting fluids capable of carrying currents, which is a source of electromagnetic fields useful for pumping and diagnostics. The coupling between the electromagnetic and thermo-fluid mechanical phenomena, and the determination of the pump's geometry and electrical configuration, give rise to complex engineering magnetohydrodynamics problems. The development of tools to model, characterize, design, and build liquid metal thermo-magnetic systems for space, nuclear, and industrial applications is of paramount importance and represents a cross-cutting technology that can provide unique design and development capabilities as well as a better understanding of the physics behind the magnetohydrodynamics of liquid metals. First studies for the development of computational tools for the design of liquid metal electromagnetic pumps are discussed.

  5. CAD-RADS - a new clinical decision support tool for coronary computed tomography angiography.

    Foldyna, Borek; Szilveszter, Bálint; Scholtz, Jan-Erik; Banerji, Dahlia; Maurovich-Horvat, Pál; Hoffmann, Udo

    2018-04-01

    Coronary computed tomography angiography (CTA) has been established as an accurate method to non-invasively assess coronary artery disease (CAD). The proposed 'Coronary Artery Disease Reporting and Data System' (CAD-RADS) may enable standardised reporting of the broad spectrum of coronary CTA findings related to the presence, extent and composition of coronary atherosclerosis. The CAD-RADS classification is a comprehensive tool for summarising findings on a per-patient-basis dependent on the highest-grade coronary artery lesion, ranging from CAD-RADS 0 (absence of CAD) to CAD-RADS 5 (total occlusion of a coronary artery). In addition, it provides suggestions for clinical management for each classification, including further testing and therapeutic options. Despite some limitations, CAD-RADS may facilitate improved communication between imagers and patient caregivers. As such, CAD-RADS may enable a more efficient use of coronary CTA leading to more accurate utilisation of invasive coronary angiograms. Furthermore, widespread use of CAD-RADS may facilitate registry-based research of diagnostic and prognostic aspects of CTA. • CAD-RADS is a tool for standardising coronary CTA reports. • CAD-RADS includes clinical treatment recommendations based on CTA findings. • CAD-RADS has the potential to reduce variability of CTA reports.

  6. SOAP: A Tool for the Fast Computation of Photometry and Radial Velocity Induced by Stellar Spots

    Boisse, I.; Bonfils, X.; Santos, N. C.; Figueira, P.

    2013-04-01

    Dark spots and bright plages are present on the surface of dwarf stars from spectral types F to M, even in their low-activity phase (like the Sun). Their appearance and disappearance on the stellar photosphere, combined with the stellar rotation, may lead to errors and uncertainties in the characterization of planets in both radial velocity (RV) and photometry. Spot Oscillation and Planet (SOAP) is a tool offered to the community that makes it possible to simulate spots and plages on rotating stars and to compute their impact on RV and photometric measurements. This tool will help in understanding the challenges related to the knowledge of stellar activity for the next decade: detecting telluric planets in the habitable zone of their stars (from G to M dwarfs), understanding activity at the low-mass end of the M dwarfs (on which future projects, like SPIRou or CARMENES, will focus), limitations to the characterization of exoplanetary atmospheres (from the ground or with Spitzer, JWST), and the search for planets around young stars. These can be simulated with SOAP in order to search for indices of, and corrections to, the effect of activity.

  7. Computational fluid dynamic simulations of coal-fired utility boilers: An engineering tool

    Efim Korytnyi; Roman Saveliev; Miron Perelman; Boris Chudnovsky; Ezra Bar-Ziv [Ben-Gurion University of the Negev, Beer-Sheva (Israel)

    2009-01-15

    The objective of this study was to develop an engineering tool by which the combustion behavior of coals in coal-fired utility boilers can be predicted. We show in this paper that computational fluid dynamic (CFD) codes can successfully predict the performance of - and emissions from - full-scale pulverized-coal utility boilers of various types, provided that the model parameters required for the simulation are properly chosen and validated. For that purpose we developed a methodology combining measurements in a 50 kW pilot-scale test facility with CFD simulations using the same CFD code configured for both the test and full-scale furnaces. In this method, the model parameters of the coal processes are extracted and validated. This paper presents the importance of the validation of the model parameters which are used in CFD codes. Our results show a very good fit of the CFD simulations with various parameters measured in the test furnace and in several types of utility boilers. The results of this study demonstrate the viability of the present methodology as an effective tool for optimizing coal burning in full-scale utility boilers. 41 refs., 9 figs., 3 tabs.

  8. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    Wilson David P

    2008-02-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated.
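
    As a flavor of two of the listed tasks - equitable sampling of parameter space and a correlation-based sensitivity measure - here is a minimal NumPy/SciPy sketch. SaSAT itself is a Matlab toolbox; the toy model and parameter names below are invented for illustration:

```python
import numpy as np
from scipy import stats

def latin_hypercube(n, bounds, rng):
    """Stratified (Latin hypercube) sample of the box given by bounds."""
    d = len(bounds)
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    u = (strata + rng.random((n, d))) / n
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

rng = np.random.default_rng(0)
X = latin_hypercube(200, [(0.1, 0.5), (1.0, 5.0)], rng)  # gamma, beta of a toy SIR model
y = X[:, 1] / X[:, 0]                                    # output: R0 = beta / gamma
for j, name in enumerate(["gamma", "beta"]):
    rho, _ = stats.spearmanr(X[:, j], y)                 # rank correlation as sensitivity
    print(f"{name}: rho = {rho:+.2f}")
```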

  9. Computational tools for genome-wide miRNA prediction and study

    Malas, T.B.; Ravasi, Timothy

    2012-01-01

    MicroRNAs (miRNAs) are single-stranded non-coding RNAs, usually 22 nucleotides in length, that play an important post-transcriptional regulation role in many organisms. MicroRNAs bind via a seed sequence to the 3'-untranslated region (UTR) of the target messenger RNA (mRNA), inducing degradation or inhibition of translation and resulting in a reduction in the protein level. This regulatory mechanism is central to many biological processes, and its perturbation could lead to diseases such as cancer. Given the biological importance of miRNAs, there is a great need to identify and study their targets and functions. However, miRNAs are very difficult to clone in the lab, and this has hindered the identification of novel miRNAs. Next-generation sequencing coupled with new computational tools has recently evolved to help researchers efficiently identify large numbers of novel miRNAs. In this review, we describe recent miRNA prediction tools and discuss their priorities, advantages and disadvantages.

  10. WebCN: A web-based computation tool for in situ-produced cosmogenic nuclides

    Ma Xiuzeng [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)]. E-mail: hongju@purdue.edu; Li Yingkui [Department of Geography, University of Missouri-Columbia, Columbia, MO 65211 (United States); Bourgeois, Mike [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Caffee, Marc [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Elmore, David [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Granger, Darryl [Department of Earth and Atmospheric Sciences, Purdue University, West Lafayette, IN 47907 (United States); Muzikar, Paul [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States); Smith, Preston [Department of Physics, Purdue University, West Lafayette, IN 47907 (United States)

    2007-06-15

    Cosmogenic nuclide techniques are increasingly being utilized in geoscience research. For this it is critical to establish an effective, easily accessible and well defined tool for cosmogenic nuclide computations. We have been developing a web-based tool (WebCN) to calculate surface exposure ages and erosion rates based on the nuclide concentrations measured by accelerator mass spectrometry. WebCN for ¹⁰Be and ²⁶Al has been finished and published at http://www.physics.purdue.edu/primelab/for_users/rockage.html. WebCN for ³⁶Cl is under construction. WebCN is designed as a three-tier client/server model and uses the open source PostgreSQL for the database management and PHP for the interface design and calculations. On the client side, an internet browser and Microsoft Access are used as application interfaces to access the system. Open Database Connectivity is used to link PostgreSQL and Microsoft Access. WebCN accounts for both spatial and temporal distributions of the cosmic ray flux to calculate the production rates of in situ-produced cosmogenic nuclides at the Earth's surface.
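
    In the simplest no-erosion case, the computation behind any such calculator is the inversion of the textbook nuclide build-up equation. The sketch below uses that form with an assumed ¹⁰Be decay constant and made-up sample values; WebCN additionally scales production rates for the spatial and temporal variation of the cosmic ray flux:

```python
import math

def exposure_age(N, P, lam):
    """Surface exposure age (no erosion) from N = (P/lam) * (1 - exp(-lam t)):
    t = -ln(1 - N lam / P) / lam. N in atoms/g, P in atoms/g/yr, lam in 1/yr."""
    return -math.log(1.0 - N * lam / P) / lam

def steady_state_concentration(P, lam, erosion_cm_yr, attenuation=160.0, rho=2.7):
    """Erosion-limited concentration: N = P / (lam + rho * eps / Lambda),
    with attenuation length in g/cm^2 and rock density in g/cm^3."""
    return P / (lam + rho * erosion_cm_yr / attenuation)

lam_be10 = math.log(2) / 1.387e6          # 10Be half-life ~1.387 Myr
print(f"{exposure_age(5.0e5, 5.0, lam_be10):,.0f} yr")   # ~103,000 yr
```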

  11. BioSPICE: access to the most current computational tools for biologists.

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard and a methodology for continued software integration. These contributed software modules are the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  12. Using brain-computer interfaces and brain-state dependent stimulation as tools in cognitive neuroscience

    Ole eJensen

    2011-05-01

    Large efforts are currently being made to develop and improve online analysis of brain activity which can be used, e.g., for brain-computer interfacing (BCI). A BCI allows a subject to control a device by willfully changing his/her own brain activity. BCI therefore holds promise as a tool for aiding the disabled and for augmenting human performance. While technical developments obviously are important, we will here argue that new insight gained from cognitive neuroscience can be used to identify signatures of neural activation which can reliably be modulated by the subject at will. This review will focus mainly on oscillatory activity in the alpha band, which is strongly modulated by changes in covert attention. Besides developing BCIs for their traditional purpose, they might also be used as a research tool for cognitive neuroscience. There is currently a strong interest in how brain state fluctuations impact cognition. These state fluctuations are partly reflected by ongoing oscillatory activity. The functional role of the brain state can be investigated by introducing stimuli in real time to subjects depending on the actual state of the brain. This principle of brain-state dependent stimulation may also be used as a practical tool for augmenting human behavior. In conclusion, new approaches based on online analysis of ongoing brain activity are currently in rapid development. These approaches are amongst others informed by new insight gained from EEG/MEG studies in cognitive neuroscience and hold the promise of providing new ways for investigating the brain at work.
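
    The alpha-band feature such closed-loop systems track can be computed in a few lines. A minimal sketch, assuming a single-channel signal buffer and a per-subject threshold chosen during calibration:

```python
import numpy as np
from scipy.signal import welch

def alpha_power(signal, fs, band=(8.0, 12.0)):
    """Band power in the alpha range from one EEG/MEG channel."""
    f, pxx = welch(signal, fs=fs, nperseg=int(fs))   # 1-second windows
    sel = (f >= band[0]) & (f <= band[1])
    return np.trapz(pxx[sel], f[sel])

# A brain-state dependent stimulation loop would poll the latest buffer and
# present the stimulus only when posterior alpha exceeds the calibrated level:
#     if alpha_power(buffer, fs=250) > threshold: present_stimulus()
```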

  13. Design of a computation tool for neutron spectrometry and dosimetry through evolutionary neural networks

    Ortiz R, J. M.; Vega C, H. R.; Martinez B, M. R.; Gallego, E.

    2009-10-01

    Neutron dosimetry is one of the most complicated tasks of radiation protection, because it is a complex technique and highly dependent on neutron energy. One of the first devices used to perform neutron spectrometry is the Bonner sphere spectrometer, which continues to be one of the most commonly used. This system has disadvantages such as the weight of its components, the low resolution of the spectrum, and a long, drawn-out procedure for spectrum reconstruction, which requires an expert user and a reconstruction code such as BUNKIE, SAND, etc. These codes are based on iterative reconstruction algorithms whose greatest inconvenience is that an initial spectrum as close as possible to the desired spectrum must be provided. Consequently, researchers have mentioned the need to develop alternative measurement techniques to improve existing monitoring systems for workers. Among these alternative techniques, several reconstruction procedures based on artificial intelligence techniques have been reported, such as genetic algorithms, artificial neural networks and hybrid systems of evolutionary artificial neural networks using genetic algorithms. However, the use of these techniques in the nuclear science area is not free of problems, so it has been suggested that more research be conducted to resolve these disadvantages. Because they are emerging technologies, there are no tools for analysing the results, so in this paper we first present the design of a computation tool that allows the analysis of the neutron spectra and equivalent doses obtained through the hybrid technology of neural networks and genetic algorithms. This tool provides a friendly, intuitive and easy-to-operate graphical user environment. The speed of program operation is high, executing the analysis in a few seconds, so it may store and/or print the obtained information for

  14. A least-squares computational "tool kit". Nuclear data and measurements series

    Smith, D.L.

    1993-04-01

    The information assembled in this report is intended to offer a useful computational "tool kit" to individuals who are interested in a variety of practical applications for the least-squares method of parameter estimation. The fundamental principles of Bayesian analysis are outlined first and these are applied to development of both the simple and the generalized least-squares conditions. Formal solutions that satisfy these conditions are given subsequently. Their application to both linear and non-linear problems is described in detail. Numerical procedures required to implement these formal solutions are discussed and two utility computer algorithms are offered for this purpose (codes LSIOD and GLSIOD written in FORTRAN). Some simple, easily understood examples are included to illustrate the use of these algorithms. Several related topics are then addressed, including the generation of covariance matrices, the role of iteration in applications of least-squares procedures, the effects of numerical precision and an approach that can be pursued in developing data analysis packages that are directed toward special applications.
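
    The generalized least-squares condition the report develops has a closed-form solution that is easy to state in code. A plain NumPy sketch (not the report's FORTRAN codes; the data values are illustrative):

```python
import numpy as np

def generalized_least_squares(A, b, V):
    """Minimize (b - A x)^T V^{-1} (b - A x) for data b with covariance V.
    Returns the estimate x = C A^T V^{-1} b and its covariance
    C = (A^T V^{-1} A)^{-1}."""
    Vinv = np.linalg.inv(V)
    C = np.linalg.inv(A.T @ Vinv @ A)
    return C @ A.T @ Vinv @ b, C

# Straight-line fit with correlated errors (a common systematic adds
# off-diagonal covariance):
t = np.array([0.0, 1.0, 2.0, 3.0])
A = np.column_stack([np.ones_like(t), t])
b = np.array([1.1, 2.9, 5.2, 6.8])
V = 0.04 * np.eye(4) + 0.01
x, C = generalized_least_squares(A, b, V)
print(x, np.sqrt(np.diag(C)))   # parameters and their standard errors
```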

  15. MVPACK: a computer-aided design tool for multivariable control systems

    Mensah, S.; Frketich, G.

    1985-10-01

    The design and analysis of high-performance controllers for complex plants require a collection of interactive, powerful computer software. MVPACK, an open-ended package for the computer-aided design of control systems, has been developed in the Reactor Control Branch of the Chalk River Nuclear Laboratories. The package is fully interactive and includes a comprehensive state-of-the-art mathematical library to support development of complex, multivariable, control algorithms. Coded in RATFOR, MVPACK is portable with minimal changes. It operates with a flexible data structure which makes efficient use of minicomputer resources and provides a standard framework for program generation. The existence of a help mechanism enhances the simplicity of package utilization. This paper provides a brief tutorial overview of the package. It reviews the specifications used in the design and implementation of the package and briefly describes the database structure, supporting libraries and some design and analysis modules of MVPACK. Several application examples to illustrate the capability of the package are given. Experience with MVPACK shows that the package provides a synergistic environment for the design of control and regulation systems, and that it is a unique tool for training of control system engineers

  16. Server consolidation for heterogeneous computer clusters using Colored Petri Nets and CPN Tools

    Issam Al-Azzoni

    2015-10-01

    In this paper, we present a new approach to server consolidation in heterogeneous computer clusters using Colored Petri Nets (CPNs). Server consolidation aims to reduce energy costs and improve resource utilization by reducing the number of servers necessary to run the existing virtual machines in the cluster. It exploits the emerging technology of live migration which allows migrating virtual machines between servers without stopping their provided services. Server consolidation approaches attempt to find migration plans that aim to minimize the necessary size of the cluster. Our approach finds plans which not only minimize the overall number of used servers, but also minimize the total data migration overhead. The latter objective is not taken into consideration by other approaches and heuristics. We explore the use of CPN Tools in analyzing the state spaces of the CPNs. Since the state space of the CPN model can grow exponentially with the size of the cluster, we examine different techniques to generate and analyze the state space in order to find good plans for server consolidation within acceptable time and computing power.
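
    For contrast with the exhaustive state-space exploration the paper performs with CPN Tools, the two objectives can be illustrated by a simple greedy heuristic: pack the largest virtual machines first, and break ties in favor of a VM's current host so no data needs to migrate. All names, demands and capacities below are invented:

```python
import itertools

def consolidate(vms, capacity):
    """vms: dict name -> (demand, current_server). Returns name -> target server."""
    fresh = (f"spare{i}" for i in itertools.count())             # new servers if needed
    order = sorted(vms, key=lambda v: vms[v][0], reverse=True)   # largest first
    load, plan = {}, {}
    for name in order:
        demand, home = vms[name]
        fits = [s for s in load if load[s] + demand <= capacity]
        # prefer the current host (zero migration), then the fullest server
        fits.sort(key=lambda s: (s != home, -load[s]))
        target = fits[0] if fits else (home if home not in load else next(fresh))
        load[target] = load.get(target, 0) + demand
        plan[name] = target
    return plan

# Three VMs spread over three servers are packed onto two, migrating only vm2:
print(consolidate({"vm1": (4, "a"), "vm2": (3, "b"), "vm3": (2, "c")}, capacity=8))
```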

  17. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
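
    The three components can be made concrete with a toy discrete chain, say rain -> runoff -> flood: each factor is a conditional probability table, their product is the joint distribution, and conditioning on an observation yields the posterior. The numbers are invented, and inference is by brute-force enumeration rather than the efficient distributed algorithms the framework provides:

```python
# Factors (local conditional probability tables) of a three-node chain.
p_rain = {0: 0.7, 1: 0.3}
p_runoff = {(0, 0): 0.9, (0, 1): 0.1, (1, 0): 0.2, (1, 1): 0.8}   # P(q | r)
p_flood = {(0, 0): 0.99, (0, 1): 0.01, (1, 0): 0.6, (1, 1): 0.4}  # P(f | q)

def joint(r, q, f):
    """Product of the factors = joint distribution P(r, q, f)."""
    return p_rain[r] * p_runoff[(r, q)] * p_flood[(q, f)]

# Posterior P(rain | flood = 1): condition on the observation, renormalize.
num = {r: sum(joint(r, q, 1) for q in (0, 1)) for r in (0, 1)}
z = sum(num.values())
print({r: round(num[r] / z, 3) for r in (0, 1)})   # {0: 0.262, 1: 0.738}
```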

  18. How accurate are adolescents in portion-size estimation using the computer tool young adolescents' nutrition assessment on computer (YANA-C)?

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-01-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amou...

  1. The UEA Small RNA Workbench: A Suite of Computational Tools for Small RNA Analysis.

    Mohorianu, Irina; Stocks, Matthew Benedict; Applegate, Christopher Steven; Folkes, Leighton; Moulton, Vincent

    2017-01-01

    RNA silencing (RNA interference, RNAi) is a complex, highly conserved mechanism mediated by short, typically 20-24 nt in length, noncoding RNAs known as small RNAs (sRNAs). They act as guides for the sequence-specific transcriptional and posttranscriptional regulation of target mRNAs and play a key role in the fine-tuning of biological processes such as growth, response to stresses, or defense mechanisms. High-throughput sequencing (HTS) technologies are employed to capture the expression levels of sRNA populations. The processing of the resulting big data sets facilitated the computational analysis of the sRNA patterns of variation within biological samples such as time point experiments, tissue series or various treatments. Rapid technological advances enable larger experiments, often with biological replicates leading to a vast amount of raw data. As a result, in this fast-evolving field, the existing methods for sequence characterization and prediction of interaction (regulatory) networks periodically require adapting or, in extreme cases, a complete redesign to cope with the data deluge. In addition, the presence of numerous tools focused only on particular steps of HTS analysis hinders the systematic parsing of the results and their interpretation. The UEA small RNA Workbench (v1-4), described in this chapter, provides a user-friendly, modular, interactive analysis in the form of a suite of computational tools designed to process and mine sRNA datasets for interesting characteristics that can be linked back to the observed phenotypes. First, we show how to preprocess the raw sequencing output and prepare it for downstream analysis. Then we review some quality checks that can be used as a first indication of sources of variability between samples. Next we show how the Workbench can provide a comparison of the effects of different normalization approaches on the distributions of expression, enhanced methods for the identification of differentially expressed

  2. THE ISSUE OF FORMING FUTURE MUSIC TEACHERS’ PROFESSIONAL COMPETENCE BY COMPUTER TECHNOLOGY TOOLS IN THE THEORY OF NATIONAL ART

    Lyudmila Gavrilova

    2017-04-01

    The article deals with theoretical aspects of forming future music teachers’ professional competence by computer technology tools. The concept of professional competence has become a major criterion of preparing students for professional activities. The issue of the article is relevant as the competence approach has become a basis for implementing computer technologies into future music teachers’ training. The authors give a detailed analysis of implementing computer technologies into musical education. Special attention is paid to using a computer in musical education and creating electronic pedagogical resources. The aim of the article is to outline the directions of national art research in the process of implementing computer tools, which is one of the most efficient ways of updating the process of future music teachers’ training. The authors point out that implementing musical and computer technologies into music art practice is realized in several directions: using a computer as a new musical instrument in the activities of composers, sound engineers, and arrangers; using a computer for studying the quality of musical sound, analysing sounds and music compositions, and the spectral analysis of the acoustic characteristics of singers’ voices; studying ancient music manuscripts with digital technology; and developing hardware and software for music education. A distinct direction of research is the pedagogical aspect of using a computer in music education (the use of special software for recording and editing music, the use of multimedia to enhance visibility in education, the development of e-learning resources, etc.). The authors conclude that implementing computer technologies into future music teachers’ training makes this process more efficient. In the authors’ opinion the widespread introduction of distance learning

  3. Minyoo Matata - The Vicious Worm - A Taenia solium Computer-Based Health-Education Tool - in Swahili

    Trevisan, Chiara; Fèvre, Eric M.; Owiny, Maurice

    2017-01-01

    Lack of knowledge is one of the main risk factors for the spread of the zoonotic parasite Taenia solium. The computer-based health-education tool 'The Vicious Worm' was developed to create awareness and provide evidence-based health education as a specific measure in control strategies. To increase the reach of the tool, a new version in Swahili was developed and can now be downloaded for free from http://theviciousworm.sites.ku.dk.

  4. Collidoscope: An Improved Tool for Computing Collisional Cross-Sections with the Trajectory Method

    Ewing, Simon A.; Donor, Micah T.; Wilson, Jesse W.; Prell, James S.

    2017-04-01

    Ion mobility-mass spectrometry (IM-MS) can be a powerful tool for determining structural information about ions in the gas phase, from small covalent analytes to large, native-like or denatured proteins and complexes. For large biomolecular ions, which may have a wide variety of possible gas-phase conformations and multiple charge sites, quantitative, physically explicit modeling of collisional cross sections (CCSs) for comparison to IMS data can be challenging and time-consuming. We present a "trajectory method" (TM) based CCS calculator, named "Collidoscope," which utilizes parallel processing and optimized trajectory sampling, and implements both He and N2 as collision gas options. Also included is a charge-placement algorithm for determining probable charge site configurations for protonated protein ions given an input geometry in pdb file format. Results from Collidoscope are compared with those from the current state-of-the-art CCS simulation suite, IMoS. Collidoscope CCSs are within 4% of IMoS values for ions with masses from 18 Da to 800 kDa. Collidoscope CCSs using X-ray crystal geometries are typically within a few percent of IM-MS experimental values for ions with mass up to 3.5 kDa (melittin), and discrepancies for larger ions up to 800 kDa (GroEL) are attributed in large part to changes in ion structure during and after the electrospray process. Due to its physically explicit modeling of scattering, computational efficiency, and accuracy, Collidoscope can be a valuable tool for IM-MS research, especially for large biomolecular ions.
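
    Collidoscope's trajectory method integrates actual ion-gas collision trajectories; as a deliberately cruder illustration of what a collisional cross section measures geometrically, the sketch below uses the much simpler projection approximation (a different technique, not the TM). The toy coordinates and the 1.7 Å collision radii are invented.

```python
# Projection-approximation CCS sketch: average the projected area of a set of
# hard discs over random orientations. Far cruder than a trajectory method.
import numpy as np

rng = np.random.default_rng(0)

def pa_ccs(coords, radii, n_orient=100, n_mc=5000):
    """Monte Carlo projected area averaged over random orientations (A^2)."""
    areas = []
    for _ in range(n_orient):
        q, _ = np.linalg.qr(rng.normal(size=(3, 3)))   # random rotation
        xy = (coords @ q.T)[:, :2]                     # project onto a plane
        lo = xy.min(axis=0) - radii.max()
        hi = xy.max(axis=0) + radii.max()
        pts = rng.uniform(lo, hi, size=(n_mc, 2))      # sample the bounding box
        dist = np.linalg.norm(pts[:, None, :] - xy[None, :, :], axis=2)
        hit = (dist <= radii).any(axis=1)              # inside any atom's disc?
        areas.append(hit.mean() * np.prod(hi - lo))
    return float(np.mean(areas))

# toy three-atom "molecule" with assumed 1.7 A collision radii
coords = np.array([[0.0, 0.0, 0.0], [1.5, 0.0, 0.0], [0.0, 1.5, 0.0]])
print(pa_ccs(coords, np.full(3, 1.7)))
```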

  5. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. Rainbow is available
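
    One of Rainbow's listed improvements is splitting large sequence files for better downstream load balance. A minimal sketch of that idea for FASTQ input follows; the chunk size and output naming are assumptions, not Rainbow's actual behaviour.

```python
# Split a FASTQ file into fixed-size chunks so alignment jobs can be
# distributed evenly across cloud instances. Illustrative only.
import itertools

def split_fastq(path: str, reads_per_chunk: int = 1_000_000) -> None:
    with open(path) as fh:
        for n in itertools.count():
            # a FASTQ record is exactly 4 lines, so slice in multiples of 4
            chunk = list(itertools.islice(fh, 4 * reads_per_chunk))
            if not chunk:
                break
            with open(f"{path}.part{n:04d}", "w") as out:
                out.writelines(chunk)

# split_fastq("sample.fastq", reads_per_chunk=250_000)  # hypothetical input
```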

  6. INTRODUCING CAFein, A NEW COMPUTATIONAL TOOL FOR STELLAR PULSATIONS AND DYNAMIC TIDES

    Valsecchi, F.; Farr, W. M.; Willems, B.; Rasio, F. A.; Kalogera, V.

    2013-01-01

    Here we present CAFein, a new computational tool for investigating radiative dissipation of dynamic tides in close binaries and of non-adiabatic, non-radial stellar oscillations in isolated stars in the linear regime. For the latter, CAFein computes the non-adiabatic eigenfrequencies and eigenfunctions of detailed stellar models. The code is based on the so-called Riccati method, a numerical algorithm that has been successfully applied to a variety of stellar pulsators, and which does not suffer from the major drawbacks of commonly used shooting and relaxation schemes. Here we present an extension of the Riccati method to investigate dynamic tides in close binaries. We demonstrate CAFein's capabilities as a stellar pulsation code both in the adiabatic and non-adiabatic regimes, by reproducing previously published eigenfrequencies of a polytrope, and by successfully identifying the unstable modes of a stellar model in the β Cephei/SPB region of the Hertzsprung-Russell diagram. Finally, we verify CAFein's behavior in the dynamic tides regime by investigating the effects of dynamic tides on the eigenfunctions and orbital and spin evolution of massive main sequence stars in eccentric binaries, and of hot Jupiter host stars. The plethora of asteroseismic data provided by NASA's Kepler satellite, some of which include the direct detection of tidally excited stellar oscillations, make CAFein quite timely. Furthermore, the increasing number of observed short-period detached double white dwarfs (WDs) and the observed orbital decay in the tightest of such binaries open up a new possibility of investigating WD interiors through the effects of tides on their orbital evolution

  7. Computational Ecology and Open Science: Tools to Help Manage Lakes for Cyanobacteria

    Computational ecology is an interdisciplinary field that takes advantage of modern computation abilities to expand our ecological understanding. As computational ecologists, we use large data sets, which often cover large spatial extents, and advanced statistical/mathematical co...

  8. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is a task to which good planning and policy design aspire, and it may be non-trivial. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable, sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs (a clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.). On the other hand, the energy system is non-trivial: it requires the mapping of resources, their conversion into usable energy, and the machines we use to meet our needs. That requires new tools that draw from standard techniques and best-in-class models while allowing the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, run, and store input and results data for linear programming models. It is a browser-based, open-source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model represents the current energy system of a country, region, or city; in it, equations and constraints are specified and calibrated to a base year. From that, future technologies and policy options are represented; from those, scenarios are designed and run. The efficient allocation of energy resources and expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity-building activities. The novel
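
    The linear programming models MoManI manages reduce, at their core, to cost minimization under demand and capacity constraints. The toy below is a generic two-technology sketch of that structure; it is not an OSeMOSYS model, and the costs, demand, and capacity cap are invented.

```python
# Minimal energy-allocation LP: meet 100 GWh of demand from two assumed
# technologies at least cost, with one technology capped at 60 GWh.
from scipy.optimize import linprog

cost = [40.0, 55.0]           # assumed unit costs: [solar, gas]
A_eq, b_eq = [[1, 1]], [100]  # total generation must equal demand
A_ub, b_ub = [[1, 0]], [60]   # solar output is capacity-limited

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None), (0, None)])
print(res.x)  # -> [60. 40.]: cheap solar up to its cap, gas covers the rest
```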

  9. Abdominal computed tomography scan as a screening tool in blunt trauma

    Brasel, K.J.; Borgstrom, D.C.; Kolewe, K.A.

    1997-01-01

    Background. One of the most difficult problems in blunt trauma is evaluation for potential intraabdominal injury. Admission for serial abdominal exams remains the standard of care after intraabdominal injury has been initially excluded. We hypothesized that a normal abdominal computed tomography (CT) scan in a subgroup of minimally injured patients would obviate admission for serial abdominal examinations, allowing safe discharge from the emergency department (ED). Methods. We reviewed our blunt trauma experience with patients admitted solely for serial abdominal examinations after a normal CT. Patients were identified from the trauma registry at a Level 1 trauma center from July 1991 through June 1995. Patients with abnormal CTs, extra-abdominal injuries necessitating admission, hemodynamic abnormalities, a Glasgow Coma Scale score less than 13, or injury severity scores (ISSs) greater than 15 were excluded. Records of 238 patients remained; we reviewed them to determine the presence of missed abdominal injury. Results. None of the 238 patients had a missed abdominal injury. The average ISS of these patients was 3.2 (range, 0 to 10). Discharging these patients from the ED would result in a yearly cost savings of $32,874 to our medical system. Conclusions. Abdominal CT scan is a safe and cost-effective screening tool in patients with blunt trauma. A normal CT scan in minimally injured patients allows safe discharge from the ED. (authors)

  10. Burned bodies: post-mortem computed tomography, an essential tool for modern forensic medicine.

    Coty, J-B; Nedelcu, C; Yahya, S; Dupont, V; Rougé-Maillart, C; Verschoore, M; Ridereau Zins, C; Aubé, C

    2018-06-07

    Currently, post-mortem computed tomography (PMCT) has become an accessible and contemporary tool for forensic investigations. In the case of burn victims, it presents specific semiologies requiring a prudent reading to differentiate normal post-mortem changes from heat-related changes. The aim of this pictorial essay is to give the radiologist the keys to establishing complete and focused reports on PMCT of burn victims. The radiologist must discern all contextual divergences from the forensic history and must be able to report all the relevant elements to answer the forensic pathologist's questions: Are there tomographic features that could help to identify the victim? Is there evidence of remains of biological fluids in liquid form available for toxicological analysis and DNA sampling? Is there another obvious cause of death than heat-related lesions, especially metallic foreign bodies of ballistic origin? Finally, what are the characteristic burn-related injuries seen on the corpse that should be sought during the autopsy? • CT is highly useful for finding features permitting the identification of a severely burned body. • PMCT is a major asset in gunshot injuries, depicting ballistic foreign bodies in burned cadavers. • CT is able to recognise blood accessible for tests versus heat clot (air-crescent sign). • Heat-related fractures are easily differentiated from traumatic fractures. • Epidural collections with a subdural appearance are typical heat-related head lesions.

  11. Unraveling the web of viroinformatics: computational tools and databases in virus research.

    Sharma, Deepak; Priyadarshini, Pragya; Vrati, Sudhanshu

    2015-02-01

    The beginning of the second century of research in the field of virology (the first virus was discovered in 1898) was marked by its amalgamation with bioinformatics, resulting in the birth of a new domain--viroinformatics. The availability of more than 100 Web servers and databases embracing all or specific viruses (for example, dengue virus, influenza virus, hepatitis virus, human immunodeficiency virus [HIV], hemorrhagic fever virus [HFV], human papillomavirus [HPV], West Nile virus, etc.) as well as distinct applications (comparative/diversity analysis, viral recombination, small interfering RNA [siRNA]/short hairpin RNA [shRNA]/microRNA [miRNA] studies, RNA folding, protein-protein interaction, structural analysis, and phylotyping and genotyping) will definitely aid the development of effective drugs and vaccines. However, information about their access and utility is not available at any single source or on any single platform. Therefore, a compendium of various computational tools and resources dedicated specifically to virology is presented in this article. Copyright © 2015, American Society for Microbiology. All Rights Reserved.

  12. Comparing the use of computer-supported collaboration tools among university students with different life circumstances

    Miikka J. Eriksson

    2014-11-01

    The proportion of higher education students who integrate learning with various life circumstances, such as employment or raising children, is increasing. This study compares whether, and what kinds of, differences exist in the perceived use of synchronous and asynchronous computer-mediated communication tools between university students with children or in full-time employment and students without these commitments. The data were collected at a Finnish university by means of an online questionnaire. The results indicate that students with multiple commitments used virtual learning environments more and instant messaging (IM) less, especially when communicating with their peers. The low level of IM use might be an indication of not being able, or not wanting, to create close ties with their peer students. The practical implication of the study is that pedagogical choices should support different kinds of learning strategies. Students with multiple commitments, and especially students with children, should be encouraged and assisted to create stronger ties with their peers, if they are willing to do so.

  13. Dynamic 3-D computer graphics for designing a diagnostic tool for patients with schizophrenia.

    Farkas, Attila; Papathomas, Thomas V; Silverstein, Steven M; Kourtev, Hristiyan; Papayanopoulos, John F

    2016-11-01

    We introduce a novel procedure that uses dynamic 3-D computer graphics as a diagnostic tool for assessing disease severity in schizophrenia patients, based on the reduced influence of top-down cognitive processes on the interpretation of bottom-up sensory input in these patients. Our procedure uses the hollow-mask illusion, in which the concave side of a mask is misperceived as convex, because familiarity with convex faces dominates sensory cues signaling a concave mask. It is known that schizophrenia patients resist this illusion and that their resistance increases with illness severity. Our method uses virtual masks rendered with two competing textures: (a) realistic features that enhance the illusion; and (b) random-dot visual noise that reduces the illusion. We control the relative weights of the two textures to obtain psychometric functions for controls and patients and to assess illness severity. The primary novelty is the use of a rotating mask that is easy to implement on a wide variety of portable devices and avoids the elaborate stereoscopic devices that have been used in the past. Our method, which can also be used to assess the efficacy of treatments, thus gives clinicians the advantage of bringing the test to the patient's own environment, instead of having to bring patients to the clinic.

  14. Computer-Based Driving in Dementia Decision Tool With Mail Support: Cluster Randomized Controlled Trial.

    Rapoport, Mark J; Zucchero Sarracini, Carla; Kiss, Alex; Lee, Linda; Byszewski, Anna; Seitz, Dallas P; Vrkljan, Brenda; Molnar, Frank; Herrmann, Nathan; Tang-Wai, David F; Frank, Christopher; Henry, Blair; Pimlott, Nicholas; Masellis, Mario; Naglie, Gary

    2018-05-25

    Physicians often find significant challenges in assessing automobile driving in persons with mild cognitive impairment and mild dementia and deciding when to report to transportation administrators. Care must be taken to balance the safety of patients and other road users with potential negative effects of issuing such reports. The aim of this study was to assess whether a computer-based Driving in Dementia Decision Tool (DD-DT) increased appropriate reporting of patients with mild dementia or mild cognitive impairment to transportation administrators. The study used a parallel-group cluster nonblinded randomized controlled trial design to test a multifaceted knowledge translation intervention. The intervention included a computer-based decision support system activated by the physician-user, which provides a recommendation about whether to report patients with mild dementia or mild cognitive impairment to transportation administrators, based on an algorithm derived from earlier work. The intervention also included a mailed educational package and Web-based specialized reporting forms. Specialists and family physicians with expertise in dementia or care of the elderly were stratified by sex and randomized to either use the DD-DT or a control version of the tool that required identical data input as the intervention group, but instead generated a generic reminder about the reporting legislation in Ontario, Canada. The trial ran from September 9, 2014 to January 29, 2016, and the primary outcome was the number of reports made to the transportation administrators concordant with the algorithm. A total of 69 participating physicians were randomized, and 36 of these used the DD-DT; 20 of the 35 randomized to the intervention group used DD-DT with 114 patients, and 16 of the 34 randomized to the control group used it with 103 patients. The proportion of all assessed patients reported to the transportation administrators concordant with recommendation did not differ

  15. Tool development for organ dose optimization taking into account the image quality in Computed Tomography

    Adrien-Decoene, Camille

    2015-01-01

    Due to the significant rise of computed tomography (CT) exams in the past few years and the increase of the collective dose due to medical exams, dose estimation in CT imaging has become a major public health issue. However dose optimization cannot be considered without taking into account the image quality which has to be good enough for radiologists. In clinical practice, optimization is obtained through empirical index and image quality using measurements performed on specific phantoms like the CATPHAN. Based on this kind of information, it is thus difficult to correctly optimize protocols regarding organ doses and radiologist criteria. Therefore our goal is to develop a tool allowing the optimization of the patient dose while preserving the image quality needed for diagnosis. The work is divided into two main parts: (i) the development of a Monte Carlo dose simulator based on the PENELOPE code, and (ii) the assessment of an objective image quality criterion. For that purpose, the GE Lightspeed VCT 64 CT tube was modelled with information provided by the manufacturer technical note and by adapting the method proposed by Turner et al (Med. Phys. 36: 2154-2164). The axial and helical movements of the X-ray tube were then implemented into the MC tool. To improve the efficiency of the simulation, two variance reduction techniques were used: a circular and a translational splitting. The splitting algorithms allow a uniform particle distribution along the gantry path to simulate the continuous gantry motion in a discrete way. Validations were performed in homogeneous conditions using a home-made phantom and the well-known CTDI phantoms. Then, dose values were measured in CIRS ATOM anthropomorphic phantom using both optically stimulated luminescence dosimeters for point doses and XR-QA Gafchromic films for relative dose maps. Comparisons between measured and simulated values enabled us to validate the MC tool used for dosimetric purposes. Finally, organ doses for

  16. A computationally efficient tool for assessing the depth resolution in large-scale potential-field inversion

    Paoletti, Valeria; Hansen, Per Christian; Hansen, Mads Friis

    2014-01-01

    In potential-field inversion, careful management of singular value decomposition components is crucial for obtaining information about the source distribution with respect to depth. In principle, the depth-resolution plot provides a convenient visual tool for this analysis, but its computational … on memory and computing time. We used the ApproxDRP to study retrievable depth resolution in inversion of the gravity field of the Neapolitan Volcanic Area. Our main contribution is the combined use of the Lanczos bidiagonalization algorithm, established in the scientific computing community, and the depth…
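
    A toy illustration of the depth-resolution idea under stated assumptions: build a kernel whose sensitivity decays with depth, take a truncated SVD (SciPy's svds uses a Lanczos-type iteration, echoing the bidiagonalization approach above), and inspect how much singular-vector energy reaches each layer. The kernel and sizes are invented; this is not the ApproxDRP code.

```python
# Truncated-SVD look at depth resolution for a toy potential-field kernel.
import numpy as np
from scipy.sparse.linalg import svds

n_data, n_layers = 200, 50
z = np.linspace(0.1, 5.0, n_layers)           # layer depths (arbitrary units)
x = np.linspace(-1.0, 1.0, n_data)[:, None]   # measurement locations
K = z / (x**2 + z**2) ** 1.5                  # gravity-like decaying kernel

U, s, Vt = svds(K, k=10)                      # 10 largest singular triplets
coverage = (Vt**2).sum(axis=0)                # singular-vector energy per layer
print(np.round(coverage[:5], 3))              # shallow layers: well resolved
print(np.round(coverage[-5:], 3))             # deep layers: poorly resolved
```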

  17. A user's manual of Tools for Error Estimation of Complex Number Matrix Computation (Ver.1.0)

    Ichihara, Kiyoshi.

    1997-03-01

    'Tools for Error Estimation of Complex Number Matrix Computation' is a subroutine library which aids users in obtaining error ranges for the solutions of complex-number linear systems or for the eigenvalues of Hermitian matrices. The library contains routines for both sequential and parallel computers. The subroutines for linear-system error estimation calculate norms of residual vectors, condition numbers of matrices, error bounds of solutions, and so on. The error-estimation subroutines for Hermitian-matrix eigenvalues derive the error ranges of the eigenvalues according to the Korn-Kato formula. This user's manual contains a brief mathematical background on error analysis in linear algebra and the usage of the subroutines. (author)
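
    The linear-system quantities the manual lists can be illustrated in a few lines: the residual norm and the condition number together give the classic first-order bound on the relative error of a computed solution. A minimal NumPy sketch for a complex system (not the library's actual interface):

```python
# Residual norm, condition number, and the resulting error bound for a
# complex linear system Ax = b.
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(4, 4)) + 1j * rng.normal(size=(4, 4))
b = rng.normal(size=4) + 1j * rng.normal(size=4)

x = np.linalg.solve(A, b)
residual = np.linalg.norm(b - A @ x)      # ||b - Ax||
kappa = np.linalg.cond(A)                 # condition number of A

# first-order bound: relative error <= cond(A) * relative residual
rel_err_bound = kappa * residual / np.linalg.norm(b)
print(residual, kappa, rel_err_bound)
```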

  18. The computer in the mathematics classroom: the tool, the tutor and ...

    Computers in the educational environment are not a new concept. Some schools in Zimbabwe have established computer laboratories; however, these computers are being used for computer literacy courses only and not for teaching purposes. This paper seeks to enlighten other educational practitioners that ...

  19. Evaluating a computational support tool for set-based configuration of production systems : Results from an industrial case

    Unglert, Johannes; Hoekstra, Sipke; Jauregui Becker, Juan Manuel

    2017-01-01

    This paper describes research conducted in the context of an industrial case dealing with the design of reconfigurable cellular manufacturing systems. Reconfiguring such systems represents a complex task due to the interdependences between the constituent subsystems. A novel computational tool was

  20. Magnetic resonance imaging and computed tomography as tools for the investigation of sperm whale (Physeter macrocephalus) teeth and eye

    Alstrup, Aage Kristian Olsen; Munk, Ole Lajord; Jensen, Trine Hammer

    2017-01-01

    Background: Scanning techniques such as magnetic resonance imaging (MRI) and computed tomography (CT) are useful tools in veterinary and human medicine. Here we demonstrate the usefulness of these techniques in the study of the anatomy of wild marine mammals as part of a necropsy. MRI and CT scan...

  1. Workshop on the applications of new computer tools to thermal engineering; Applications a la thermique des nouveaux outils informatiques

    NONE

    1996-12-31

    This workshop on the applications of new computer tools to thermal engineering has been organized by the French society of thermal engineers. Seven papers have been presented, from which two papers dealing with thermal diffusivity measurements in materials and with the optimization of dryers have been selected for ETDE. (J.S.)

  2. The Impact of Computer Simulations as Interactive Demonstration Tools on the Performance of Grade 11 Learners in Electromagnetism

    Kotoka, Jonas; Kriek, Jeanne

    2014-01-01

    The impact of computer simulations on the performance of 65 grade 11 learners in electromagnetism in a South African high school in the Mpumalanga province is investigated. Learners did not use the simulations individually, but teachers used them as an interactive demonstration tool. Basic concepts in electromagnetism are difficult to understand…

  3. The Use of Computer Tools in the Design Process of Students’ Architectural Projects. Case Studies in Algeria

    Saighi, Ouafa; Salah Zerouala, Mohamed

    2017-12-01

    The paper deals with the way in which computer tools are used by students in their design studio projects. Four institutions of architecture education in Algeria are considered as a case study to evaluate the impact of such tools on the student design process. The aim is to inspect this use in depth and to sort out its advantages and shortcomings in order to suggest some solutions. A field survey was undertaken on a sample of students and their teachers at the same institutions. The analysed results mainly show that computer tools are focused heavily on improving the quality of drawing representations and images, seeking observers' satisfaction and hence influencing their decisions. Some teachers are not very keen on overuse of the computer during the design phase; they prefer the "traditional" approach. This is the present situation that the Algerian university is facing, which leads to conflict and disagreement between students and teachers. Meanwhile, there is no doubt that computer tools have effectively contributed to improving the competitive level among students.

  4. High Energy Physics Forum for Computational Excellence: Working Group Reports (I. Applications Software II. Software Libraries and Tools III. Systems)

    Habib, Salman [Argonne National Lab. (ANL), Argonne, IL (United States); Roser, Robert [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); LeCompte, Tom [Argonne National Lab. (ANL), Argonne, IL (United States); Marshall, Zach [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Borgland, Anders [SLAC National Accelerator Lab., Menlo Park, CA (United States); Viren, Brett [Brookhaven National Lab. (BNL), Upton, NY (United States); Nugent, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Asai, Makato [SLAC National Accelerator Lab., Menlo Park, CA (United States); Bauerdick, Lothar [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Finkel, Hal [Argonne National Lab. (ANL), Argonne, IL (United States); Gottlieb, Steve [Indiana Univ., Bloomington, IN (United States); Hoeche, Stefan [SLAC National Accelerator Lab., Menlo Park, CA (United States); Sheldon, Paul [Vanderbilt Univ., Nashville, TN (United States); Vay, Jean-Luc [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Elmer, Peter [Princeton Univ., NJ (United States); Kirby, Michael [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Patton, Simon [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Potekhin, Maxim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Yanny, Brian [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Calafiura, Paolo [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Dart, Eli [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Gutsche, Oliver [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Izubuchi, Taku [Brookhaven National Lab. (BNL), Upton, NY (United States); Lyon, Adam [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States); Petravick, Don [Univ. of Illinois, Urbana-Champaign, IL (United States). National Center for Supercomputing Applications (NCSA)

    2015-10-29

    Computing plays an essential role in all aspects of high energy physics. As computational technology evolves rapidly in new directions, and data throughput and volume continue to follow a steep trend-line, it is important for the HEP community to develop an effective response to a series of expected challenges. In order to help shape the desired response, the HEP Forum for Computational Excellence (HEP-FCE) initiated a roadmap planning activity with two key overlapping drivers -- 1) software effectiveness, and 2) infrastructure and expertise advancement. The HEP-FCE formed three working groups, 1) Applications Software, 2) Software Libraries and Tools, and 3) Systems (including systems software), to provide an overview of the current status of HEP computing and to present findings and opportunities for the desired HEP computational roadmap. The final versions of the reports are combined in this document, and are presented along with introductory material.

  6. The GLEaMviz computational tool, a publicly available software to explore realistic epidemic spreading scenarios at the global scale

    Quaggiotto Marco

    2011-02-01

    Background: Computational models play an increasingly important role in the assessment and control of public health crises, as demonstrated during the 2009 H1N1 influenza pandemic. Much research has been done in recent years on the development of sophisticated data-driven models for realistic computer-based simulations of infectious disease spreading. However, only a few computational tools are presently available for assessing scenarios, predicting epidemic evolutions, and managing health emergencies that can benefit a broad audience of users, including policy makers and health institutions. Results: We present GLEaMviz, a publicly available software system that simulates the spread of emerging human-to-human infectious diseases across the world. The GLEaMviz tool comprises three components: the client application, the proxy middleware, and the simulation engine; the latter two components constitute the GLEaMviz server. The simulation engine leverages the Global Epidemic and Mobility (GLEaM) framework, a stochastic computational scheme that integrates worldwide high-resolution demographic and mobility data to simulate disease spread on the global scale. The GLEaMviz design aims at maximizing flexibility in defining the disease compartmental model and configuring the simulation scenario; it allows the user to set a variety of parameters including compartment-specific features, transition values, and environmental effects. The output is a dynamic map and a corresponding set of charts that quantitatively describe the geo-temporal evolution of the disease. The software is designed as a client-server system. The multi-platform client, which can be installed on the user's local machine, is used to set up simulations that will be executed on the server, thus avoiding specific requirements for large computational capabilities on the user side. Conclusions: The user-friendly graphical interface of the GLEaMviz tool, along with its high level
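
    The compartmental models GLEaMviz lets users define follow the standard transition structure; a deterministic single-population SIR toy shows the idea. GLEaM itself is stochastic and metapopulation-based, so this is only the smallest possible analogue, with assumed values for the transmission and recovery rates.

```python
# Minimal SIR compartmental model: S -> I -> R with assumed rates.
import numpy as np
from scipy.integrate import odeint

def sir(y, t, beta, gamma):
    S, I, R = y
    new_inf = beta * S * I              # transmission term
    return [-new_inf, new_inf - gamma * I, gamma * I]

t = np.linspace(0, 120, 500)            # days
y0 = [0.999, 0.001, 0.0]                # population fractions
S, I, R = odeint(sir, y0, t, args=(0.3, 0.1)).T   # assumed beta, gamma
print(f"epidemic peak: {I.max():.3f} of the population infectious")
```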

  7. Electronic structure of BN-aromatics: Choice of reliable computational tools

    Mazière, Audrey; Chrostowska, Anna; Darrigan, Clovis; Dargelos, Alain; Graciaa, Alain; Chermette, Henry

    2017-10-01

    The importance of having reliable calculation tools to interpret and predict the electronic properties of BN-aromatics is directly linked to the growing interest in these very promising new systems in the fields of materials science, biomedical research, and energy sustainability. Ionization energy (IE) is one of the most important parameters for approaching the electronic structure of molecules. It can be estimated theoretically, but in order to evaluate their persistence and to propose the most reliable tools for evaluating the electronic properties of existent or only imagined BN-containing compounds, we took as reference experimental ionization energies provided by ultra-violet photoelectron spectroscopy (UV-PES) in the gas phase, the only technique giving access to the energy levels of filled molecular orbitals. Thus, a set of 21 aromatic molecules containing B-N bonds and B-N-B patterns has been assembled for a comparison between experimental IEs obtained by UV-PES and various theoretical approaches for their estimation. Time-Dependent Density Functional Theory (TD-DFT) methods using B3LYP and long-range corrected CAM-B3LYP functionals are used, combined with the ΔSCF approach, and compared with electron propagator theory such as the outer valence Green's function (OVGF, P3) and symmetry-adapted cluster-configuration interaction ab initio methods. Direct Kohn-Sham estimation and "corrected" Kohn-Sham estimation are also given. The deviation between experimental and theoretical values is computed for each molecule, and a statistical study is performed over the average and the root mean square for the whole set and sub-sets of molecules. It is shown that (i) ΔSCF+TDDFT(CAM-B3LYP), OVGF, and P3 are the most efficient ways of reaching good agreement with UV-PES values, (ii) the CAM-B3LYP range-separated hybrid functional is significantly better than B3LYP for this purpose, especially for extended conjugated systems, and (iii) the "corrected" Kohn-Sham result is a

  8. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  9. Colon dissection: a new three-dimensional reconstruction tool for computed tomography colonography

    Roettgen, R.; Fischbach, F.; Plotkin, M.; Herzog, H.; Freund, T.; Schroeder, R. J.; Felix, R.

    2005-01-01

    Purpose: To improve the sensitivity of computed tomography (CT) colonography in the detection of polyps by comparing the 3D reconstruction tool 'colon dissection' and the endoluminal view (virtual colonoscopy) with axial 2D reconstructions. Material and Methods: Forty-eight patients (22 M, 26 F, mean age 57±21) were studied after intra-anal air insufflation in the supine and prone positions using a 16-slice helical CT (16x0.625 mm; pitch 1.7; detector rotation time 0.5 s; 160 mAs and 120 kV) and conventional colonoscopy. Two radiologists blinded to the results of the conventional colonoscopy analyzed the 3D reconstructions in virtual-endoscopy mode and colon-dissection mode, and the axial 2D slices. Results: Conventional colonoscopy revealed a total of 35 polyps in 15 patients; 33 polyps were disclosed by CT methods. Sensitivity and specificity for detecting colon polyps were 94% and 94%, respectively, when using 'colon dissection', 89% and 94% when using 'virtual endoscopy', and 62% and 100% when using axial 2D reconstruction. Sensitivity in relation to the diameter of colon polyps with 'colon dissection', 'virtual colonoscopy', and axial 2D slices was: polyps with a diameter >5.0 mm, 100%, 100%, and 71%, respectively; polyps with a diameter between 3 and 4.9 mm, 92%, 85%, and 46%; and polyps with a diameter <3 mm, 89%, 78%, and 56%. The difference between 'virtual endoscopy' and 'colon dissection' in diagnosing polyps up to 4.9 mm in diameter was statistically significant. Conclusion: The 3D reconstruction software 'colon dissection' improves the sensitivity of CT colonography compared with the endoluminal view.

  10. FlowMax: A Computational Tool for Maximum Likelihood Deconvolution of CFSE Time Courses.

    Maxim Nikolaievich Shokhirev

    The immune response is a concerted dynamic multi-cellular process. Upon infection, the dynamics of lymphocyte populations are an aggregate of molecular processes that determine the activation, division, and longevity of individual cells. The timing of these single-cell processes is remarkably widely distributed with some cells undergoing their third division while others undergo their first. High cell-to-cell variability and technical noise pose challenges for interpreting popular dye-dilution experiments objectively. It remains an unresolved challenge to avoid under- or over-interpretation of such data when phenotyping gene-targeted mouse models or patient samples. Here we develop and characterize a computational methodology to parameterize a cell population model in the context of noisy dye-dilution data. To enable objective interpretation of model fits, our method estimates fit sensitivity and redundancy by stochastically sampling the solution landscape, calculating parameter sensitivities, and clustering to determine the maximum-likelihood solution ranges. Our methodology accounts for both technical and biological variability by using a cell fluorescence model as an adaptor during population model fitting, resulting in improved fit accuracy without the need for ad hoc objective functions. We have incorporated our methodology into an integrated phenotyping tool, FlowMax, and used it to analyze B cells from two NFκB knockout mice with distinct phenotypes; we not only confirm previously published findings at a fraction of the expended effort and cost, but reveal a novel phenotype of nfkb1/p105/50 in limiting the proliferative capacity of B cells following B-cell receptor stimulation. In addition to complementing experimental work, FlowMax is suitable for high throughput analysis of dye dilution studies within clinical and pharmacological screens with objective and quantitative conclusions.

  11. Reduction of radiation exposure and image quality using dose reduction tool on computed tomography fluoroscopy

    Sakabe, Daisuke; Tochihara, Syuichi; Ono, Michiaki; Tokuda, Masaki; Kai, Noriyuki; Nakato, Kengo; Hashida, Masahiro; Funama, Yoshinori; Murazaki, Hiroo

    2012-01-01

    The purpose of our study was to measure the reduction rate of radiation dose and the variability of image noise using angular beam modulation (ABM) in computed tomography (CT) fluoroscopy. The Alderson-Rando phantom and a homemade phantom were used in our study. These phantoms were scanned at the on-center position and at an off-center position of -12 cm along the y-axis, with and without the ABM technique. With this technique, the x-ray tube is turned off in a 100-degree sector centered on the 12 o'clock, 10 o'clock, or 2 o'clock position during CT fluoroscopy. CT fluoroscopic images were obtained with tube voltage, 120 kV; tube current-time product per reconstructed image, 30 mAs; rotation time, 0.5 s/rot; slice thickness, 4.8 mm; and reconstruction kernel B30s in each scan. After CT scanning, radiation exposure and image noise were measured, and image artifacts were evaluated with and without the technique. The reduction rate for radiation exposure with the technique was 75-80% at the on-center position, regardless of angle position. At the off-center position of -12 cm, the reduction rate was 50%. In contrast, image noise remained constant with and without the technique. Visual inspection of image artifacts gave almost the same scores with and without the technique, with no statistically significant difference (p>0.05). ABM is an appropriate tool for reducing radiation exposure while maintaining image noise and artifacts during CT fluoroscopy. (author)

  12. Computer simulation models as a tool to investigate the role of microRNAs in osteoarthritis.

    Carole J Proctor

    The aim of this study was to show how computational models can be used to increase our understanding of the role of microRNAs in osteoarthritis (OA), using miR-140 as an example. Bioinformatics analysis and experimental results from the literature were used to create and calibrate models of gene regulatory networks in OA involving miR-140 along with key regulators such as NF-κB, SMAD3, and RUNX2. The individual models were created with the modelling standard Systems Biology Markup Language and integrated to examine the overall effect of miR-140 on cartilage homeostasis. Down-regulation of miR-140 may have either detrimental or protective effects for cartilage, indicating that the role of miR-140 is complex. Studies of individual networks in isolation may therefore lead to different conclusions. This indicated the need to combine the five chosen individual networks involving miR-140 into an integrated model. This model suggests that the overall effect of miR-140 is to change the response to an IL-1 stimulus from a prolonged increase in matrix-degrading enzymes to a pulse-like response, so that cartilage degradation is temporary. Our current model can easily be modified and extended as more experimental data become available about the role of miR-140 in OA. In addition, networks of other microRNAs that are important in OA could be incorporated. A fully integrated model could not only aid our understanding of the mechanisms of microRNAs in ageing cartilage but could also provide a useful tool to investigate the effect of potential interventions to prevent cartilage loss.
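
    The pulse-versus-plateau behaviour described above can be reproduced by a two-variable ODE toy in which an induced inhibitor (standing in for miR-140-dependent feedback) throttles enzyme production. All parameters are invented for illustration; this is not the authors' SBML model.

```python
# Toy negative-feedback motif: a sustained stimulus drives enzyme production;
# the enzyme induces an inhibitor that suppresses further production.
import numpy as np
from scipy.integrate import odeint

def model(y, t, feedback):
    enzyme, inhibitor = y
    il1 = 1.0                                        # sustained IL-1 stimulus
    d_enz = il1 / (1 + feedback * inhibitor) - 0.2 * enzyme
    d_inh = 0.1 * enzyme - 0.05 * inhibitor          # enzyme induces inhibitor
    return [d_enz, d_inh]

t = np.linspace(0, 300, 3000)
plateau = odeint(model, [0.0, 0.0], t, args=(0.0,))[:, 0]   # no feedback
pulse = odeint(model, [0.0, 0.0], t, args=(5.0,))[:, 0]     # with feedback
print(f"no feedback: final level {plateau[-1]:.2f} (sustained degradation)")
print(f"with feedback: peak {pulse.max():.2f}, final {pulse[-1]:.2f} (transient)")
```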

  13. A suite of MATLAB-based computational tools for automated analysis of COPAS Biosort data

    Morton, Elizabeth; Lamitina, Todd

    2010-01-01

    Complex Object Parametric Analyzer and Sorter (COPAS) devices are large-object, fluorescence-capable flow cytometers used for high-throughput analysis of live model organisms, including Drosophila melanogaster, Caenorhabditis elegans, and zebrafish. The COPAS is especially useful in C. elegans high-throughput genome-wide RNA interference (RNAi) screens that utilize fluorescent reporters. However, analysis of data from such screens is relatively labor-intensive and time-consuming. Currently, there are no computational tools available to facilitate high-throughput analysis of COPAS data. We used MATLAB to develop algorithms (COPAquant, COPAmulti, and COPAcompare) to analyze different types of COPAS data. COPAquant reads single-sample files, filters and extracts values and value ratios for each file, and then returns a summary of the data. COPAmulti reads 96-well autosampling files generated with the ReFLX adapter, performs sample filtering, graphs features across both wells and plates, performs some common statistical measures for hit identification, and outputs results in graphical formats. COPAcompare performs a correlation analysis between replicate 96-well plates. For many parameters, thresholds may be defined through a simple graphical user interface (GUI), allowing our algorithms to meet a variety of screening applications. In a screen for regulators of stress-inducible GFP expression, COPAquant dramatically accelerated data analysis and allowed us to rapidly move from raw data to hit identification. Because the COPAS file structure is standardized and our MATLAB code is freely available, our algorithms should be extremely useful for analysis of COPAS data from multiple platforms and organisms. The MATLAB code is freely available at our web site (www.med.upenn.edu/lamitinalab/downloads.shtml). PMID:20569218

  14. Automated Parallel Computing Tools for Multicore Machines and Clusters, Phase I

    National Aeronautics and Space Administration — We propose to improve productivity of high performance computing for applications on multicore computers and clusters. These machines built from one or more chips...

  15. Applying knowledge engineering tools for the personal computer to the operation and maintenance of radiopharmaceutical production systems

    Alexoff, D.L.

    1990-01-01

    A practical consequence of over three decades of Artificial Intelligence (AI) research has been the emergence of personal-computer-based AI programming tools. A special class of this microcomputer-based software, called expert system shells, is now applied routinely outside the realm of classical AI to solve many types of problems, particularly in analytical chemistry. These AI tools offer not only some of the advantages inherent to symbolic programming languages but, just as significantly, they bring with them advanced program development environments which can facilitate software development and maintenance. Exploitation of this enhanced programming environment was a major motivation for using an AI tool. The goal of this work is to evaluate the use of an example-based expert system shell (1st Class FUSION, 1st Class Expert Systems, Inc.) as a programming tool for developing software useful for automated radiopharmaceutical production

  16. Rapid mental computation system as a tool for algorithmic thinking of elementary school students development

    Ziatdinov, Rushan; Musa, Sajid

    2013-01-01

    In this paper, we describe the possibilities of using a rapid mental computation system in elementary education. The system consists of a number of readily memorized operations that allow one to perform arithmetic computations very quickly. These operations are in fact simple algorithms which can develop or improve pupils' algorithmic thinking. Using a rapid mental computation system forms a basis for the study of computer science in secondary school. An example of such an operation is sketched below.
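
    One classic operation of this kind, chosen by us as an illustration rather than taken from the cited system: squaring a number ending in 5, where n5² is n·(n+1) followed by 25 (e.g., 85² = 7225). The check below confirms the shortcut is exact.

```python
# Mental-math shortcut: m = 10n + 5  =>  m^2 = 100*n*(n+1) + 25.
def square_ending_in_5(m: int) -> int:
    assert m % 10 == 5
    n = m // 10
    return n * (n + 1) * 100 + 25

for m in (15, 35, 85, 105):
    assert square_ending_in_5(m) == m * m   # the shortcut is exact
    print(m, "->", square_ending_in_5(m))
```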

  17. Analysis and Verification of a Key Agreement Protocol over Cloud Computing Using Scyther Tool

    Hazem A Elbaz

    2015-01-01

    Most cloud computing authentication mechanisms use public key infrastructure (PKI). Hierarchical Identity Based Cryptography (HIBC) has several advantages that align well with the demands of cloud computing. The main objectives of cloud computing authentication protocols are security and efficiency. In this paper, we present the Hierarchical Identity Based Authentication Key Agreement (HIB-AKA) protocol, providing a lightweight key management approach for cloud computing users. Then, we...

  18. Approaches and Tools Used to Teach the Computer Input/Output Subsystem: A Survey

    Larraza-Mendiluze, Edurne; Garay-Vitoria, Nestor

    2015-01-01

    This paper surveys how the computer input/output (I/O) subsystem is taught in introductory undergraduate courses. It is important to study the educational process of the computer I/O subsystem because, in the curricula recommendations, it is considered a core topic in the area of knowledge of computer architecture and organization (CAO). It is…

  19. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of process or plant detail: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established through reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and

  20. Computational assessment of hemodynamics-based diagnostic tools using a database of virtual subjects: Application to three case studies.

    Willemet, Marie; Vennin, Samuel; Alastruey, Jordi

    2016-12-08

    Many physiological indexes and algorithms based on pulse wave analysis have been suggested in order to better assess cardiovascular function. Because these tools are often computed from in-vivo hemodynamic measurements, their validation is time-consuming, challenging, and biased by measurement errors. Recently, a new methodology has been suggested for assessing these computed tools theoretically: a database of virtual subjects generated using numerical 1D-0D modeling of arterial hemodynamics. The generated set of simulations encloses a wide selection of healthy cases that could be encountered in a clinical study. We applied this new methodology to three case studies that demonstrate the potential of our new tool, and illustrated each of them with a clinically relevant example: (i) we assessed the accuracy of indexes estimating pulse wave velocity; (ii) we validated and refined an algorithm that computes central blood pressure; and (iii) we investigated theoretical mechanisms behind the augmentation index. Our database of virtual subjects is a new tool to assist the clinician: it provides insight into the physical mechanisms underlying the correlations observed in clinical practice. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
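
    Case study (i) above concerns pulse wave velocity; the simplest estimator is foot-to-foot transit: PWV = path length / transit time between two waveforms. A sketch with synthetic signals follows; the sampling rate, waveforms, 10% threshold, and 0.5 m path length are all assumed values.

```python
# Foot-to-foot pulse wave velocity from two synthetic pressure waveforms.
import numpy as np

fs = 1000.0                                   # Hz, assumed sampling rate
t = np.arange(0, 1, 1 / fs)
proximal = np.exp(-((t - 0.2) / 0.03) ** 2)   # Gaussian pulse at the first site
distal = np.roll(proximal, 60)                # same wave arriving 60 ms later

def foot_index(sig):
    """Crude 'foot' detector: first crossing of 10% of the peak value."""
    return np.argmax(sig >= 0.1 * sig.max())

transit = (foot_index(distal) - foot_index(proximal)) / fs   # seconds
print(0.5 / transit)   # 0.5 m path length -> PWV of roughly 8.3 m/s
```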

  1. vFitness: a web-based computing tool for improving estimation of in vitro HIV-1 fitness experiments

    Demeter Lisa

    2010-05-01

    Background: The replication rate (or fitness) of viral variants has been investigated in vivo and in vitro for human immunodeficiency virus (HIV). HIV fitness plays an important role in the development and persistence of drug resistance. The accurate estimation of viral fitness relies on complicated computations based on statistical methods. This calls for tools that are easy to access and intuitive to use for various experiments on viral fitness. Results: Based on a mathematical model and several statistical methods (a least-squares approach and measurement error models), a Web-based computing tool has been developed for improving the estimation of virus fitness in growth competition assays of human immunodeficiency virus type 1 (HIV-1). Conclusions: Unlike the two-point calculation used in previous studies, the estimation here uses linear regression methods with all observed data in the competition experiment to estimate relative viral fitness parameters more accurately. A dilution factor is introduced to make the computational tool more flexible in accommodating various experimental conditions. This Web-based tool is implemented in C# with Microsoft ASP.NET, and is publicly available on the Web at http://bis.urmc.rochester.edu/vFitness/.
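
    The regression idea is compact: if the mutant and wild-type variants grow exponentially, the log count ratio ln(m/w) is linear in time and its slope is the net fitness difference. A sketch with invented counts (not the vFitness implementation, which also handles measurement error and dilution factors):

```python
# Estimate a relative fitness difference as the slope of ln(mutant/wild-type).
import numpy as np

days = np.array([0.0, 2.0, 4.0, 6.0])
mutant = np.array([1000.0, 1800.0, 3500.0, 6600.0])     # invented counts
wildtype = np.array([1000.0, 1400.0, 2100.0, 3000.0])   # invented counts

slope, intercept = np.polyfit(days, np.log(mutant / wildtype), 1)
print(f"fitness difference d = {slope:.3f} per day")  # d > 0: mutant is fitter
```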

  2. Improved mathematical and computational tools for modeling photon propagation in tissue

    Calabro, Katherine Weaver

    Light interacts with biological tissue through two predominant mechanisms: scattering and absorption, which are sensitive to the size and density of cellular organelles and to biochemical composition (e.g., hemoglobin), respectively. During the progression of disease, tissues undergo a predictable set of changes in cell morphology and vascularization, which directly affect their scattering and absorption properties. Hence, quantification of these optical property differences can be used to identify physiological biomarkers of disease, with interest often focused on cancer. Diffuse reflectance spectroscopy is a diagnostic tool wherein broadband visible light is transmitted through a fiber-optic probe into a turbid medium, and, after propagating through the sample, a fraction of the light is collected at the surface as reflectance. The measured reflectance spectrum can be analyzed with appropriate mathematical models to extract the optical properties of the tissue, and from these, a set of physiological properties. A number of models have been developed for this purpose using a variety of approaches, from diffusion theory to computational simulations and empirical observations. However, these models are generally limited to narrow ranges of tissue and probe geometries. In this thesis, reflectance models were developed for a much wider range of measurement parameters, and influences such as the scattering phase function and probe design were investigated rigorously for the first time. The results provide a comprehensive understanding of the factors that influence reflectance, with novel insights that, in some cases, challenge current assumptions in the field. An improved Monte Carlo simulation program, designed to run on a graphics processing unit (GPU), was built to simulate the data used in the development of the reflectance models. Rigorous error analysis was performed to identify how inaccuracies in modeling assumptions can be expected to affect the accuracy
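
    The core of such a Monte Carlo photon simulation is small: free path lengths follow the Beer-Lambert distribution, and each interaction is absorption or scattering in proportion to the medium's coefficients. A sketch under assumed coefficients (a toy, not the thesis's GPU code):

```python
# Sample photon free paths until absorption; the mean total path should
# approach 1/mu_a, the analytic expectation.
import numpy as np

rng = np.random.default_rng(7)
mu_a, mu_s = 0.1, 10.0            # assumed absorption/scattering coeffs, 1/mm
mu_t = mu_a + mu_s                # total interaction coefficient

def photon_path_length() -> float:
    """Track one photon until absorption; return total distance travelled."""
    total = 0.0
    while True:
        total += -np.log(rng.random()) / mu_t    # Beer-Lambert free path
        if rng.random() < mu_a / mu_t:           # absorbed at this interaction
            return total
        # otherwise the photon scatters and continues

paths = [photon_path_length() for _ in range(10_000)]
print(np.mean(paths))   # ~ 1/mu_a = 10 mm, as expected analytically
```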

  3. Can clinical prediction tools predict the need for computed tomography in blunt abdominal trauma? A systematic review.

    Sharples, Alistair; Brohi, Karim

    2016-08-01

    Blunt abdominal trauma is a common reason for admission to the Emergency Department. Early detection of injuries is an important goal but is often not straightforward, as physical examination alone is not a good predictor of serious injury. Computed tomography (CT) has become the primary method for assessing the stable trauma patient. It has high sensitivity and specificity, but there remains concern regarding the long-term consequences of high doses of radiation. Therefore an accurate and reliable method of assessing which patients are at higher risk of injury, and hence require a CT, would be clinically useful. We performed a systematic review to investigate the use of clinical prediction tools (CPTs) for the identification of abdominal injuries in patients suffering blunt trauma. A literature search was performed using Medline, Embase, The Cochrane Library and NHS Evidence up to August 2014. English-language, prospective and retrospective studies were included if they derived, validated or assessed a CPT aimed at identifying intra-abdominal injuries or the need for intervention to treat an intra-abdominal injury after blunt trauma. Methodological quality was assessed using a 14-point scale. Performance was assessed predominantly by sensitivity. Seven relevant studies were identified. All studies were derivative studies and no CPT was validated in a separate study. There were large differences in the study design, composition of the CPTs, the outcomes analysed and the methodological quality of the included studies. Sensitivities ranged from 86 to 100%. The highest-performing CPT had a lower limit of the 95% CI of 95.8% and was of high methodological quality (11 of 14). Had this rule been applied to the population then 25.1% of patients would have avoided a CT scan. Seven CPTs were identified, of varying designs and methodological quality. All demonstrate relatively high sensitivity, with some achieving very high sensitivity whilst still managing to reduce the number of CTs performed.
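
    The review's headline numbers are sensitivities with 95% confidence intervals; a minimal sketch of that computation (Wilson score interval, with invented true-positive and false-negative counts) is:

        import math

        def sensitivity_wilson(tp, fn, z=1.96):
            # tp = injuries correctly flagged by the CPT, fn = injuries missed
            n = tp + fn
            p = tp / n
            centre = (p + z*z / (2*n)) / (1 + z*z / n)
            half = z * math.sqrt(p*(1 - p)/n + z*z/(4*n*n)) / (1 + z*z / n)
            return p, centre - half, centre + half

        sens, low, high = sensitivity_wilson(tp=98, fn=2)
        print(f"sensitivity {sens:.1%}, 95% CI {low:.1%} to {high:.1%}")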

  4. iPad and computer devices in preschool: A tool for literacy development among teachers and children in preschool

    Oladunjoye, Olayemi Kemi

    2013-01-01

    The title of this thesis is "iPad and Computer devices in Preschool: A tool for literacy development among teachers and children in preschool." The study was an exploration of how teachers and their pupils put iPad and other computer devices into use in early childhood education. This study was a qualitative research study, based on the observation of the pupils and the interviews of the teachers. In this study, observation of the children and interviewing of the teachers over a period of fiv...

  5. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  6. A Usability Study of Users' Perceptions toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2008-01-01

    This usability study evaluated users' perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption.…

  7. TME (Task Mapping Editor): tool for executing distributed parallel computing. TME user's manual

    Takemiya, Hiroshi; Yamagishi, Nobuhiro; Imamura, Toshiyuki

    2000-03-01

    At the Center for Promotion of Computational Science and Engineering, a software environment PPExe has been developed to support scientific computing on a parallel computer cluster (distributed parallel scientific computing). TME (Task Mapping Editor) is one of the components of PPExe and provides a visual programming environment for distributed parallel scientific computing. Users can specify data dependence among tasks (programs) visually, as a data flow diagram, and map these tasks onto computers interactively through the GUI of TME. The specified tasks are processed by other components of PPExe, such as the Meta-scheduler, RIM (Resource Information Monitor) and EMS (Execution Management System), according to the execution order of these tasks determined by TME. In this report, we describe the usage of TME. (author)
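
    The core idea, tasks connected by data dependences and executed in a consistent order, amounts to a topological sort of a task DAG; a minimal sketch with a hypothetical three-stage workflow (not TME's actual code):

        from graphlib import TopologicalSorter  # Python 3.9+

        # hypothetical task graph: preprocessing feeds two solvers, which feed a merge
        deps = {
            "solver_a": {"preprocess"},
            "solver_b": {"preprocess"},
            "merge": {"solver_a", "solver_b"},
        }
        order = list(TopologicalSorter(deps).static_order())
        print(order)  # e.g. ['preprocess', 'solver_a', 'solver_b', 'merge']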

  8. Non-Intrusive Computational Method and Uncertainty Quantification Tool for isolator operability calculations, Phase I

    National Aeronautics and Space Administration — Computational fluid dynamics (CFD) simulations are extensively used by NASA for hypersonic aerothermodynamics calculations. The physical models used in CFD codes and...

  9. ATLAS OpenData and OpenKey: using low tech computational tools for students training in High Energy Physics

    Sanchez Pineda, Arturos; The ATLAS collaboration

    2018-01-01

    One of the big challenges in High Energy Physics development is the fact that many potential, and very valuable, students and young researchers live in countries where internet access and computational infrastructure are poor compared to those of the institutions already participating. In order to accelerate the process, the ATLAS Open Data project releases useful and meaningful data and tools using standard and easy-to-deploy computational means, such as custom and light Linux Virtual Machines, open-source technologies, and web and desktop applications. The ATLAS Open Key, a simple USB pen, allows transporting all those resources around the globe. As simple as it sounds, this approach is helping to train students who are now PhD candidates and to integrate HEP educational programs at Master level in universities where they did not exist before. The software tools and resources used will be presented, as well as results and stories, ideas, and the next steps of the ATLAS Open Data project.

  10. Using the Eclipse Parallel Tools Platform to Assist Earth Science Model Development and Optimization on High Performance Computers

    Alameda, J. C.

    2011-12-01

    Development and optimization of computational science models on high performance computers, and, with the advent of ubiquitous multicore processors, on practically every system, has long been accomplished with basic software tools: typically command-line compilers, debuggers and performance tools that have not changed substantially since the days of serial and early vector computers. However, model complexity, including the complexity added by modern message passing libraries such as MPI, and the need for hybrid code models (such as OpenMP plus MPI) to take full advantage of high performance computers with an increasing core count per shared-memory node, has made development and optimization of such codes an increasingly arduous task. Additional architectural developments, such as many-core processors, only complicate the situation further. In this paper, we describe how our NSF-funded project, "SI2-SSI: A Productive and Accessible Development Workbench for HPC Applications Using the Eclipse Parallel Tools Platform" (WHPC), seeks to improve the Eclipse Parallel Tools Platform, an environment designed to support scientific code development targeted at a diverse set of high performance computing systems. Our WHPC project takes an application-centric view to improve PTP. We are using a set of scientific applications, each with a variety of challenges, both to drive further improvements to the scientific applications themselves and to understand shortcomings in Eclipse PTP from an application developer perspective, which drives the list of improvements we seek to make. We are also partnering with performance tool providers to achieve higher-quality performance tool integration. We have partnered with the Cactus group at Louisiana State University to improve Eclipse's ability to work with computational frameworks and extremely complex build systems, as well as to develop educational materials to incorporate into

  11. GPU-FS-kNN: a software tool for fast and scalable kNN computation using GPUs.

    Ahmed Shamsul Arefin

    Full Text Available BACKGROUND: The analysis of biological networks has become a major challenge due to the recent development of high-throughput techniques that are rapidly producing very large data sets. The exploding volumes of biological data call for extreme computational power and special computing facilities (i.e. super-computers). An inexpensive solution, such as General Purpose computation based on Graphics Processing Units (GPGPU), can be adapted to tackle this challenge, but the limitation of the device internal memory can pose a new problem of scalability. An efficient data and computational parallelism with partitioning is required to provide a fast and scalable solution to this problem. RESULTS: We propose an efficient parallel formulation of the k-Nearest Neighbour (kNN) search problem, which is a popular method for classifying objects in several fields of research, such as pattern recognition, machine learning and bioinformatics. Being very simple and straightforward, the performance of the kNN search degrades dramatically for large data sets, since the task is computationally intensive. The proposed approach is not only fast but also scalable to large-scale instances. Based on our approach, we implemented a software tool GPU-FS-kNN (GPU-based Fast and Scalable k-Nearest Neighbour) for CUDA enabled GPUs. The basic approach is simple and adaptable to other available GPU architectures. We observed speed-ups of 50-60 times compared with CPU implementation on a well-known breast microarray study and its associated data sets. CONCLUSION: Our GPU-based Fast and Scalable k-Nearest Neighbour search technique (GPU-FS-kNN) provides a significant performance improvement for nearest neighbour computation in large-scale networks. Source code and the software tool are available under GNU Public License (GPL) at https://sourceforge.net/p/gpufsknn/.
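
    A CPU-side sketch of the chunked brute-force kNN pattern the tool implements on GPUs (illustrative only; chunking keeps the distance matrix within memory, and the data here are random):

        import numpy as np

        def knn_chunked(data, queries, k=5, chunk=256):
            """Brute-force kNN, processing queries in chunks to bound memory."""
            out = np.empty((len(queries), k), dtype=np.int64)
            for s in range(0, len(queries), chunk):
                q = queries[s:s + chunk]
                # squared Euclidean distances, chunk x n
                d2 = ((q[:, None, :] - data[None, :, :]) ** 2).sum(axis=-1)
                out[s:s + chunk] = np.argsort(d2, axis=1)[:, :k]
            return out

        rng = np.random.default_rng(0)
        points = rng.normal(size=(2000, 16))
        print(knn_chunked(points, points[:3], k=3))  # each row starts with the point itself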

  12. Teaching Advanced Concepts in Computer Networks: VNUML-UM Virtualization Tool

    Ruiz-Martinez, A.; Pereniguez-Garcia, F.; Marin-Lopez, R.; Ruiz-Martinez, P. M.; Skarmeta-Gomez, A. F.

    2013-01-01

    In the teaching of computer networks the main problem that arises is the high price and limited number of network devices the students can work with in the laboratories. Nowadays, with virtualization we can overcome this limitation. In this paper, we present a methodology that allows students to learn advanced computer network concepts through…

  13. Computer-Mediated Communication as an Autonomy-Enhancement Tool for Advanced Learners of English

    Wach, Aleksandra

    2012-01-01

    This article examines the relevance of modern technology for the development of learner autonomy in the process of learning English as a foreign language. Computer-assisted language learning and computer-mediated communication (CMC) appear to be particularly conducive to fostering autonomous learning, as they naturally incorporate many elements of…

  14. The Hopes and Realities of the Computer as a School Administration and School Management Tool

    Butler, Rory; Visscher, Arend J.; Tatnall, Arthur; Davey, Bill

    2014-01-01

    Software for school administration and school management started as teachers with a science background started to develop computer programs in order that school office staff did not have to repeatedly type and re-type student lists. Later, computing companies entered the market and software packages

  15. THE METHODICAL ASPECTS OF MAXIMA USING AS A TOOL FOR FUNDAMENTAL TRAINING OF BACHELORS OF COMPUTER SCIENCE

    M. Shyshkina

    2014-07-01

    Full Text Available Within the formation of the information society, where the pace of scientific progress is rapidly growing, it is difficult to provide training that allows immediate inclusion of a person into the production chain at a workplace or in an educational system. The way out is fundamentalization of informatics education. Specialists must be trained so that they can adapt quickly to the changes occurring in the technological development of the industry, and given knowledge that is universal in nature, so that they can quickly find their bearings when solving professional tasks. The article describes trends in the pedagogical use of systems of computer mathematics (SCM) for teaching computer science disciplines. The general characteristics of, and conditions for, effective use of Maxima as a tool for fundamentalization of the bachelors' learning process are outlined. Methodological approaches to teaching informatics disciplines are revealed, and the peculiarities of cloud-based learning solutions are considered. The purpose of the article is to analyse contemporary approaches to the use of systems of computer mathematics as a tool for fundamentalization of informatics training courses, and to identify methodological aspects of applying these systems to the teaching of operations research, using SCM Maxima as an example. The object of investigation is the learning process of informatics bachelors with the use of SCM; the subject is the peculiarities of using SCM Maxima as a learning tool to support informatics courses.
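
    The article centres on teaching operations research with SCM Maxima; as a rough Python analogue of the kind of small exercise involved (the coefficients below are invented), a linear programme can be solved in a few lines:

        from scipy.optimize import linprog

        # maximise 3x + 2y subject to x + y <= 4, x + 3y <= 6, x >= 0, y >= 0
        # (linprog minimises, so we negate the objective)
        res = linprog(c=[-3, -2], A_ub=[[1, 1], [1, 3]], b_ub=[4, 6])
        print(res.x, -res.fun)  # optimum x = 4, y = 0, objective 12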

  16. Use of computed tomography slices 3D-reconstruction as a powerful tool to improve manufacturing processes on aeroengine components

    Castellan, C.; Dastarac, D.

    2000-01-01

    TURBOMECA has been using computed tomography for several years as a powerful inner-health analysis tool for engine components. From 2D slices of the examined part, detailed information about voids or inclusions could easily be extracted. But measurements on internal features were soon required, because no other NDT method was able to provide them. CT has thus logically become a powerful 2D dimensional measuring tool. Recently, with new software and the latest computers able to deal with huge files, CT has become a powerful 3D digitization tool, and TOMO ADOUR can now offer a complete solution for reverse engineering of complex parts. Several months ago, TURBOMECA introduced CT into many development, validation and industrialization processes and has demonstrated how to take corrective actions against process deviation on their aeroengine components by: extracting the non-existing CAD model of a part; generating CAD-compatible data to check dimensional conformity and, eventually, correct design misfits or manufacturing drifts; highlighting the metallurgical health of first-article parts; making the decision to repair and defining the appropriate method; generating a file (.STL) to build a rapid prototype or to drive tooling for machining; calculating physical properties such as behavior or flow analysis on a 'real' model. The image also allows a drawing to be made of a part that was originally produced by a supplier or competitor. This paper will be illustrated with a large number of examples.

  17. Computational Tool for Kinetic Modeling of Non-Equilibrium Multiphase Flows in Ablation, Phase I

    National Aeronautics and Space Administration — Development of highly accurate tools to predict aerothermal environments and associated effects on vehicles is needed to enable advanced spacecraft for future NASA...

  18. Water System Adaptation To Hydrological Changes: Module 11, Methods and Tools: Computational Models

    This course will introduce students to the fundamental principles of water system adaptation to hydrological changes, with emphasis on data analysis and interpretation, technical planning, and computational modeling. Starting with real-world scenarios and adaptation needs, the co...

  19. A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools

    2015-07-14

    Access to the platform is granted via an encrypted connection based on the Secure Socket Layer (SSL) protocol, implemented as a Virtual Private Network with OpenVPN [44]. Each user is given a security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows and Apple OSX.

  20. The Multimorbidity Cluster Analysis Tool: Identifying Combinations and Permutations of Multiple Chronic Diseases Using a Record-Level Computational Analysis

    Kathryn Nicholson

    2017-12-01

    Full Text Available Introduction: Multimorbidity, or the co-occurrence of multiple chronic health conditions within an individual, is an increasingly dominant presence and burden in modern health care systems.  To fully capture its complexity, further research is needed to uncover the patterns and consequences of these co-occurring health states.  As such, the Multimorbidity Cluster Analysis Tool and the accompanying Multimorbidity Cluster Analysis Toolkit have been created to allow researchers to identify distinct clusters that exist within a sample of participants or patients living with multimorbidity.  Development: The Tool and Toolkit were developed at Western University in London, Ontario, Canada.  This open-access computational program (JAVA code and executable file) was developed and tested to support an analysis of thousands of individual records and up to 100 disease diagnoses or categories.  Application: The computational program can be adapted to the methodological elements of a research project, including type of data, type of chronic disease reporting, measurement of multimorbidity, sample size and research setting.  The computational program will identify all existing, and mutually exclusive, combinations and permutations within the dataset.  An application of this computational program is provided as an example, in which more than 75,000 individual records and 20 chronic disease categories resulted in the detection of 10,411 unique combinations and 24,647 unique permutations among female and male patients.  Discussion: The Tool and Toolkit are now available for use by researchers interested in exploring the complexities of multimorbidity.  Its careful use, and the comparison between results, will be valuable additions to the nuanced understanding of multimorbidity.
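
    The combination/permutation logic the Tool applies can be sketched in a few lines, assuming each record lists its chronic conditions (the sample records below are invented; order-insensitive sets give combinations, order-sensitive tuples permutations):

        from collections import Counter

        records = [
            ("diabetes", "hypertension"),
            ("hypertension", "diabetes"),            # same combination, different order
            ("copd", "diabetes", "hypertension"),
        ]
        combinations = Counter(tuple(sorted(r)) for r in records)   # order-insensitive
        permutations = Counter(records)                             # order-sensitive
        print(len(combinations), "combinations,", len(permutations), "permutations")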

  1. NetH2pan: A Computational Tool to Guide MHC peptide prediction on Murine Tumors

    DeVette, Christa I; Andreatta, Massimo; Bardet, Wilfried

    2018-01-01

    With the advancement of personalized cancer immunotherapies, new tools are needed to identify tumor antigens and evaluate T-cell responses in model systems, specifically those that exhibit clinically relevant tumor progression. Key transgenic mouse models of breast cancer are generated and maintained … for evaluating antigen specificity in the murine FVB strain. Our study provides the first detailed molecular and immunoproteomic characterization of the FVB H-2q MHC Class I alleles, including >8500 unique peptide ligands, a multi-allele murine MHC peptide prediction tool, and in vivo validation of these data…

  2. Development of a New Data Tool for Computing Launch and Landing Availability with Respect to Surface Weather

    Burns, K. Lee; Altino, Karen

    2008-01-01

    The Marshall Space Flight Center Natural Environments Branch has a long history of expertise in the modeling and computation of statistical launch availabilities with respect to weather conditions. Their existing data analysis product, the Atmospheric Parametric Risk Assessment (APRA) tool, computes launch availability given an input set of vehicle hardware and/or operational weather constraints by calculating the climatological probability of exceeding the specified constraint limits. APRA has been used extensively to provide the Space Shuttle program the ability to estimate the impacts that various proposed design modifications would have on overall launch availability. The model accounts for both seasonal and diurnal variability at a single geographic location and provides output probabilities for a single arbitrary launch attempt. Recently, the Shuttle program has shown interest in having additional capabilities added to the APRA model, including analysis of humidity parameters, inclusion of landing-site weather to produce landing availability, and concurrent analysis of multiple sites to assist in operational landing site selection. In addition, the Constellation program has also expressed interest in the APRA tool and has requested several additional capabilities to address some Constellation-specific issues, both in the specification and verification of design requirements and in the development of operations concepts. The combined scope of the requested capability enhancements suggests an evolution of the model beyond a simple revision process. Development has begun for a new data analysis tool that will satisfy the requests of both programs. This new tool, Probabilities of Atmospheric Conditions and Environmental Risk (PACER), will provide greater flexibility and significantly enhanced functionality compared to the currently existing tool.
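
    A toy version of the climatological availability calculation described above, with a synthetic wind climatology and a hypothetical constraint limit (neither taken from APRA or PACER):

        import numpy as np

        rng = np.random.default_rng(1)
        peak_wind = rng.weibull(2.0, size=100_000) * 12.0  # synthetic hourly peak winds (kt)

        wind_limit = 20.0                                  # hypothetical vehicle constraint
        # availability = climatological probability of NOT exceeding the limit
        availability = np.mean(peak_wind <= wind_limit)
        print(f"availability w.r.t. the wind constraint ~ {availability:.1%}")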

  3. WNoDeS, a tool for integrated Grid and Cloud access and computing farm virtualization

    Salomoni, Davide; Italiano, Alessandro; Ronchieri, Elisabetta

    2011-01-01

    INFN CNAF is the National Computing Center, located in Bologna, Italy, of the Italian National Institute for Nuclear Physics (INFN). INFN CNAF, also called the INFN Tier-1, provides computing and storage facilities to the International High-Energy Physics community and to several multi-disciplinary experiments. Currently, the INFN Tier-1 supports more than twenty different collaborations; in this context, optimization of the usage of computing resources is essential. This is one of the main drivers behind the development of a software called WNoDeS (Worker Nodes on Demand Service). WNoDeS, developed at INFN CNAF and deployed on the INFN Tier-1 production infrastructure, is a solution to virtualize computing resources and to make them available through local, Grid or Cloud interfaces. It is designed to be fully integrated with a Local Resource Management System; it is therefore inherently scalable and permits full integration with existing scheduling, policing, monitoring, accounting and security workflows. WNoDeS dynamically instantiates Virtual Machines (VMs) on-demand, i.e. only when the need arises; these VMs can be tailored and used for purposes like batch job execution, interactive analysis or service instantiation. WNoDeS supports interaction with user requests through traditional batch or Grid jobs and also via the Open Cloud Computing Interface standard, making it possible to allocate compute, storage and network resources on a pay-as-you-go basis. User authentication is supported via several authentication methods, while authorization policies are handled via gLite Argus. WNoDeS is an ambitious solution aimed at virtualizing cluster resources in medium or large scale computing centers, with up to several thousands of Virtual Machines up and running at any given time. In this paper, we describe the WNoDeS architecture.

  4. WNoDeS, a tool for integrated Grid and Cloud access and computing farm virtualization

    Salomoni, Davide; Italiano, Alessandro; Ronchieri, Elisabetta

    2011-12-01

    INFN CNAF is the National Computing Center, located in Bologna, Italy, of the Italian National Institute for Nuclear Physics (INFN). INFN CNAF, also called the INFN Tier-1, provides computing and storage facilities to the International High-Energy Physics community and to several multi-disciplinary experiments. Currently, the INFN Tier-1 supports more than twenty different collaborations; in this context, optimization of the usage of computing resources is essential. This is one of the main drivers behind the development of a software called WNoDeS (Worker Nodes on Demand Service). WNoDeS, developed at INFN CNAF and deployed on the INFN Tier-1 production infrastructure, is a solution to virtualize computing resources and to make them available through local, Grid or Cloud interfaces. It is designed to be fully integrated with a Local Resource Management System; it is therefore inherently scalable and permits full integration with existing scheduling, policing, monitoring, accounting and security workflows. WNoDeS dynamically instantiates Virtual Machines (VMs) on-demand, i.e. only when the need arises; these VMs can be tailored and used for purposes like batch job execution, interactive analysis or service instantiation. WNoDeS supports interaction with user requests through traditional batch or Grid jobs and also via the Open Cloud Computing Interface standard, making it possible to allocate compute, storage and network resources on a pay-as-you-go basis. User authentication is supported via several authentication methods, while authorization policies are handled via gLite Argus. WNoDeS is an ambitious solution aimed at virtualizing cluster resources in medium or large scale computing centers, with up to several thousands of Virtual Machines up and running at any given time. In this paper, we describe the WNoDeS architecture.

  5. Development and Evaluation of a Computer-Based, Self-Management Tool for People Recently Diagnosed with Type 2 Diabetes

    Alison O. Booth

    2016-01-01

    Full Text Available Aim. The purpose of this study was to develop and evaluate a computer-based, dietary and physical activity self-management program for people recently diagnosed with type 2 diabetes. Methods. The computer-based program was developed in conjunction with the target group and evaluated in a 12-week randomised controlled trial (RCT). Participants were randomised to the intervention (computer program) or control group (usual care). Primary outcomes were diabetes knowledge and goal setting (ADKnowl questionnaire) and the Diabetes Obstacles Questionnaire (DOQ), measured at baseline and week 12. User feedback on the program was obtained via a questionnaire and focus groups. Results. Seventy participants completed the 12-week RCT (32 intervention, 38 control; mean age 59 years). After completion there was a significant between-group difference in the "knowledge and beliefs scale" of the DOQ. Two-thirds of the intervention group rated the program as either good or very good, 92% would recommend the program to others, and 96% agreed that the information within the program was clear and easy to understand. Conclusions. The computer program resulted in a small but statistically significant improvement in diet-related knowledge, and user satisfaction was high. With some further development, this computer-based educational tool may be a useful adjunct to diabetes self-management. This trial is registered with clinicaltrials.gov, NCT number NCT00877851.

  6. Computer simulation in conjunction with medical thermography as an adjunct tool for early detection of breast cancer

    Sudharsan NM

    2004-04-01

    Full Text Available Abstract Background Mathematical modelling and analysis is now accepted in engineering design on a par with experimental approaches. Computer simulations enable one to perform several 'what-if' analyses cost-effectively. High-speed computers and the low cost of memory have helped in simulating large-scale models in a relatively short time frame. The possibility of extending numerical modelling in the area of breast cancer detection in conjunction with medical thermography is considered in this work. Methods Thermography enables one to see the temperature pattern and look for abnormality. In a thermogram there is no radiation risk, as it only captures the infrared radiation from the skin, and it is totally painless. But a thermogram is only a test of physiology, whereas a mammogram is a test of anatomy. It is hoped that a thermogram along with numerical modelling will serve as an adjunct tool. Presently, the mammogram is the 'gold standard' in breast cancer detection, but the interpretation of a mammogram is largely dependent on the radiologist. Therefore, a thermogram that looks into the physiological changes, in combination with numerical simulation performing 'what-if' analysis, could act as an adjunct tool to mammography. Results The proposed framework suggested that it could reduce the occurrence of false-negative/positive cases. Conclusion A numerical bioheat model of a female breast is developed and simulated. The results are compared with experimental results. The possibility of this method as an early detection tool is discussed.

  7. Computer simulation in conjunction with medical thermography as an adjunct tool for early detection of breast cancer

    Ng, Eddie Y-K; Sudharsan, NM

    2004-01-01

    Mathematical modelling and analysis is now accepted in engineering design on a par with experimental approaches. Computer simulations enable one to perform several 'what-if' analyses cost-effectively. High-speed computers and the low cost of memory have helped in simulating large-scale models in a relatively short time frame. The possibility of extending numerical modelling in the area of breast cancer detection in conjunction with medical thermography is considered in this work. Thermography enables one to see the temperature pattern and look for abnormality. In a thermogram there is no radiation risk, as it only captures the infrared radiation from the skin, and it is totally painless. But a thermogram is only a test of physiology, whereas a mammogram is a test of anatomy. It is hoped that a thermogram along with numerical modelling will serve as an adjunct tool. Presently, the mammogram is the 'gold standard' in breast cancer detection, but the interpretation of a mammogram is largely dependent on the radiologist. Therefore, a thermogram that looks into the physiological changes, in combination with numerical simulation performing 'what-if' analysis, could act as an adjunct tool to mammography. The proposed framework suggested that it could reduce the occurrence of false-negative/positive cases. A numerical bioheat model of a female breast is developed and simulated. The results are compared with experimental results. The possibility of this method as an early detection tool is discussed.
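
    The abstracts do not print the governing equation, but numerical breast bioheat models of this kind are commonly based on the Pennes equation (an assumption here, not a quotation from the paper), with tissue temperature T, blood density and specific heat rho_b and c_b, perfusion rate omega_b, arterial temperature T_a and metabolic heat source q_m:

        \rho c \frac{\partial T}{\partial t}
            = \nabla \cdot ( k \nabla T )
            + \rho_b c_b \omega_b ( T_a - T )
            + q_m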

  8. Simulation tools for scattering corrections in spectrally resolved x-ray computed tomography using McXtrace

    Busi, Matteo; Olsen, Ulrik L.; Knudsen, Erik B.; Frisvad, Jeppe R.; Kehres, Jan; Dreier, Erik S.; Khalil, Mohamad; Haldrup, Kristoffer

    2018-03-01

    Spectral computed tomography is an emerging imaging method that involves using recently developed energy-discriminating photon-counting detectors (PCDs). This technique enables measurements at isolated high-energy ranges, in which the dominant interaction between the x-rays and the sample is incoherent scattering. The scattered radiation causes a loss of contrast in the results, and its correction has proven to be a complex problem due to its dependence on energy, material composition and geometry. Monte Carlo simulations can utilize a physical model to estimate the scattering contribution to the signal, at the cost of high computational time. We present a fast Monte Carlo simulation tool, based on McXtrace, to predict the energy-resolved radiation being scattered and absorbed by objects of complex shapes. We validate the tool through measurements using a CdTe single PCD (Multix ME-100) and use it for scattering correction in a simulation of a spectral CT. We found the correction to account for up to 7% relative amplification in the reconstructed linear attenuation. It is a useful tool for x-ray CT to obtain a more accurate material discrimination, especially in the high-energy range, where incoherent scattering interactions become prevalent (>50 keV).
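
    The correction itself reduces to subtracting the simulated scatter estimate from the measured signal before taking the log that linearises Beer-Lambert attenuation; a one-pixel illustration with invented counts:

        import numpy as np

        i0 = 1.0e5        # open-beam counts in one energy bin
        primary = 2.0e4   # true transmitted (primary) counts
        scatter = 1.5e3   # scatter predicted by the Monte Carlo simulation
        measured = primary + scatter

        biased = -np.log(measured / i0)                  # scatter lowers the estimate
        corrected = -np.log((measured - scatter) / i0)   # scatter removed first
        print(f"path attenuation: biased {biased:.3f} vs corrected {corrected:.3f}")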

  9. Computational tools for the construction of calibration curves for use in dose calculations in radiotherapy treatment planning

    Oliveira, Alex C.H.; Vieira, Jose W.; Escola Politecnica de Pernambuco, Recife, PE

    2011-01-01

    The realization of tissue inhomogeneity corrections in image-based treatment planning improves the accuracy of radiation dose calculations for patients undergoing external-beam radiotherapy. Before the tissue inhomogeneity correction can be applied, the relationship between the computed tomography (CT) numbers and density must be established. This relationship is typically established by a calibration curve empirically obtained from CT images of a phantom that has several inserts of tissue-equivalent materials covering a wide range of densities. This calibration curve is scanner-dependent and allows the conversion of CT numbers into densities for use in dose calculations. This paper describes the implementation of the computational tools necessary to construct calibration curves. These tools are used for reading and displaying CT images in DICOM format, determining the mean CT numbers (and their standard deviations) of each tissue-equivalent material, and constructing calibration curves by fits with bilinear equations. All these tools have been implemented in Microsoft Visual Studio 2010 in the C# programming language. (author)
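
    The calibration-curve step can be sketched as a two-segment (bilinear) fit of density against CT number with a breakpoint near water; the insert values below are invented, not taken from the paper:

        import numpy as np

        hu = np.array([-1000, -500, -100, 0, 300, 800, 1200])     # CT numbers
        rho = np.array([0.0, 0.5, 0.92, 1.0, 1.16, 1.53, 1.82])   # densities, g/cm^3

        a1, b1 = np.polyfit(hu[hu <= 0], rho[hu <= 0], 1)   # lung/soft-tissue segment
        a2, b2 = np.polyfit(hu[hu >= 0], rho[hu >= 0], 1)   # bone segment

        def density(h):
            """Convert a CT number to density via the bilinear calibration."""
            return a1 * h + b1 if h < 0 else a2 * h + b2

        print(density(-700), density(500))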

  10. Computer system for identification of tool wear model in hot forging

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently and potentially form a powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
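
    The abrasive component of such hybrid wear models is often a phenomenological law of the Archard type; this is an assumption for illustration, not necessarily the paper's formulation:

        def archard_volume(k, load_n, sliding_m, hardness_pa):
            """Worn volume (m^3) = k * F * s / H (Archard-type abrasive wear law)."""
            return k * load_n * sliding_m / hardness_pa

        # invented wear coefficient, die load, sliding distance and tool hardness
        print(archard_volume(k=1e-4, load_n=5e4, sliding_m=120.0, hardness_pa=6e9))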

  11. Evaluation of Computer Tools for Idea Generation and Team Formation in Project-Based Learning

    Ardaiz-Villanueva, Oscar; Nicuesa-Chacon, Xabier; Brene-Artazcoz, Oscar; Sanz de Acedo Lizarraga, Maria Luisa; Sanz de Acedo Baquedano, Maria Teresa

    2011-01-01

    The main objective of this research was to validate the effectiveness of Wikideas and Creativity Connector tools to stimulate the generation of ideas and originality by university students organized into groups according to their indexes of creativity and affinity. Another goal of the study was to evaluate the classroom climate created by these…

  12. The hierarchical expert tuning of PID controllers using tools of soft computing.

    Karray, F; Gueaieb, W; Al-Sharhan, S

    2002-01-01

    We present soft computing-based results pertaining to the hierarchical tuning process of PID controllers located within the control loop of a class of nonlinear systems. The results are compared with PID controllers implemented either in a stand-alone scheme or as part of a conventional gain-scheduling structure. This work is motivated by the increasing need in industry to design highly reliable and efficient controllers for dealing with the regulation and tracking capabilities of complex processes characterized by nonlinearities and possibly time-varying parameters. The proposed soft computing-based controllers are hybrid in nature, in that they integrate within a well-defined hierarchical structure the benefits of hard algorithmic controllers with those having supervisory capabilities. The controllers proposed also have the distinct features of learning and auto-tuning without the need for tedious and computationally intensive online system identification schemes.
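
    For reference, the hard algorithmic layer being tuned is the familiar discrete PID loop; a minimal sketch on a toy first-order plant, with hand-picked gains rather than the paper's hierarchical soft-computing tuning:

        def simulate_pid(kp=2.0, ki=1.0, kd=0.1, dt=0.05, steps=200, setpoint=1.0):
            y, integral, prev_err = 0.0, 0.0, setpoint
            for _ in range(steps):
                err = setpoint - y
                integral += err * dt
                derivative = (err - prev_err) / dt
                u = kp * err + ki * integral + kd * derivative   # controller output
                prev_err = err
                y += dt * (-y + u)                               # plant: dy/dt = -y + u
            return y

        print(f"plant output after 10 s: {simulate_pid():.3f}")  # settles near 1.0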

  13. A new DoD initiative: the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program

    Arevalo, S; Atwood, C; Bell, P; Blacker, T D; Dey, S; Fisher, D; Fisher, D A; Genalis, P; Gorski, J; Harris, A; Hill, K; Hurwitz, M; Kendall, R P; Meakin, R L; Morton, S; Moyer, E T; Post, D E; Strawn, R; Veldhuizen, D v; Votta, L G

    2008-01-01

    In FY2008, the U.S. Department of Defense (DoD) initiated the Computational Research and Engineering Acquisition Tools and Environments (CREATE) program, a $360M program with a two-year planning phase and a ten-year execution phase. CREATE will develop and deploy three computational engineering tool sets for DoD acquisition programs to use to design aircraft, ships and radio-frequency antennas. The planning and execution of CREATE are based on the 'lessons learned' from case studies of large-scale computational science and engineering projects. The case studies stress the importance of a stable, close-knit development team; a focus on customer needs and requirements; verification and validation; flexible and agile planning, management, and development processes; risk management; realistic schedules and resource levels; balanced short- and long-term goals and deliverables; and stable, long-term support by the program sponsor. Since it began in FY2008, the CREATE program has built a team and project structure, developed requirements and begun validating them, identified candidate products, established initial connections with the acquisition programs, begun detailed project planning and development, and generated the initial collaboration infrastructure necessary for success by its multi-institutional, multidisciplinary teams

  14. Simulation tools for two-dimensional experiments in x-ray computed tomography using the FORBILD head phantom.

    Yu, Zhicong; Noo, Frédéric; Dennerlein, Frank; Wunderlich, Adam; Lauritsch, Günter; Hornegger, Joachim

    2012-07-07

    Mathematical phantoms are essential for the development and early stage evaluation of image reconstruction algorithms in x-ray computed tomography (CT). This note offers tools for computer simulations using a two-dimensional (2D) phantom that models the central axial slice through the FORBILD head phantom. Introduced in 1999, in response to a need for a more robust test, the FORBILD head phantom is now seen by many as the gold standard. However, the simple Shepp-Logan phantom is still heavily used by researchers working on 2D image reconstruction. Universal acceptance of the FORBILD head phantom may have been prevented by its significantly higher complexity: software that allows computer simulations with the Shepp-Logan phantom is not readily applicable to the FORBILD head phantom. The tools offered here address this problem. They are designed for use with Matlab®, as well as open-source variants, such as FreeMat and Octave, which are all widely used in both academia and industry. To get started, the interested user can simply copy and paste the codes from this PDF document into Matlab® M-files.

  15. Simulation tools for two-dimensional experiments in x-ray computed tomography using the FORBILD head phantom

    Yu Zhicong; Noo, Frédéric; Wunderlich, Adam; Dennerlein, Frank; Lauritsch, Günter; Hornegger, Joachim

    2012-01-01

    Mathematical phantoms are essential for the development and early stage evaluation of image reconstruction algorithms in x-ray computed tomography (CT). This note offers tools for computer simulations using a two-dimensional (2D) phantom that models the central axial slice through the FORBILD head phantom. Introduced in 1999, in response to a need for a more robust test, the FORBILD head phantom is now seen by many as the gold standard. However, the simple Shepp–Logan phantom is still heavily used by researchers working on 2D image reconstruction. Universal acceptance of the FORBILD head phantom may have been prevented by its significantly higher complexity: software that allows computer simulations with the Shepp–Logan phantom is not readily applicable to the FORBILD head phantom. The tools offered here address this problem. They are designed for use with Matlab®, as well as open-source variants, such as FreeMat and Octave, which are all widely used in both academia and industry. To get started, the interested user can simply copy and paste the codes from this PDF document into Matlab® M-files. (note)
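
    The note's own Matlab codes live in its PDF; as a generic illustration of how such analytic 2D phantoms are defined, here is a tiny axis-aligned ellipse rasterizer (made-up ellipse parameters, far simpler than the FORBILD geometry):

        import numpy as np

        # each ellipse: (centre_x, centre_y, semi_axis_a, semi_axis_b, added value)
        ellipses = [(0.0, 0.0, 0.69, 0.92, 1.0),
                    (0.0, -0.02, 0.66, 0.87, -0.8),
                    (0.22, 0.0, 0.11, 0.31, 0.2)]

        n = 256
        x, y = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))
        image = np.zeros((n, n))
        for cx, cy, a, b, v in ellipses:
            image[((x - cx) / a) ** 2 + ((y - cy) / b) ** 2 <= 1.0] += v
        print(image.min(), image.max())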

  16. Perceived problems with computer gaming and internet use among adolescents: measurement tool for non-clinical survey studies

    2014-01-01

    Background Existing instruments for measuring problematic computer and console gaming and internet use are often lengthy and often based on a pathological perspective. The objective was to develop and present a new and short non-clinical measurement tool for perceived problems related to computer use and gaming among adolescents, and to study the association between screen time and perceived problems. Methods Cross-sectional school survey of 11-, 13-, and 15-year-old students in thirteen schools in the City of Aarhus, Denmark, participation rate 89%, n = 2100. The main exposure was time spent on weekdays on computer and console gaming and internet use for communication and surfing. The outcome measures were three indexes on perceived problems related to computer and console gaming and internet use. Results The three new indexes showed high face validity and acceptable internal consistency. Most schoolchildren with high screen time did not experience problems related to computer use. Still, there was a strong and graded association between time use and perceived problems related to computer gaming, console gaming (only boys) and internet use, with odds ratios ranging from 6.90 to 10.23. Conclusion The three new measures of perceived problems related to computer and console gaming and internet use among adolescents are appropriate, reliable and valid for use in non-clinical surveys about young people's everyday life and behaviour. These new measures do not assess Internet Gaming Disorder as it is listed in the DSM and therefore have no parity with DSM criteria. We found an increasing risk of perceived problems with increasing time spent on gaming and internet use. Nevertheless, most schoolchildren who spent much time with gaming and internet use did not experience problems. PMID:24731270
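
    The quoted odds ratios come from 2x2 tables of exposure (high screen time) against outcome (perceived problems); a minimal computation with invented counts:

        a, b = 120, 80   # high screen time: with / without perceived problems
        c, d = 60, 340   # low screen time:  with / without perceived problems
        odds_ratio = (a / b) / (c / d)
        print(f"OR = {odds_ratio:.2f}")  # 8.50 with these made-up counts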

  17. Review. Supporting problem structuring with computer-based tools in participatory forest planning

    Hujala, T.; Khadka, C.; Wolfslehner, B.; Vacik, H.

    2013-09-01

    Aim of study: This review presents the state of the art of using computerized techniques for problem structuring (PS) in participatory forest planning. The frequency and modes of using different computerized tool types and their contribution to planning processes, as well as critical observations, are described, followed by recommendations on how to better integrate PS with the use of forest decision support systems. Area of study: The reviewed research cases are from Asia, Europe, North America, Africa and Australia. Material and methods: Via a Scopus search and screening of abstracts, 32 research articles from the years 2002-2011 were selected for review. Explicit and implicit evidence of using computerized tools for PS was recorded and assessed with content-driven qualitative analysis. Main results: GIS and forest-specific simulation tools were the most prevalent software types, whereas cognitive modelling software and spreadsheet and calculation tools were less frequently used, followed by multi-criteria and interactive tools. The typical use was to provide outputs of simulation-optimization or spatial analysis to negotiation situations, or to compile summaries or illustrations afterwards; using software during group negotiation to foster interaction was observed in only a few cases. Research highlights: Expertise in both decision support systems and group learning is needed to better integrate PS and computerized decision analysis. From the knowledge management perspective, it is recommended to consider how the results of PS (e.g. conceptual models) could be stored in a problem perception database, and how PS and decision making could be streamlined by retrievals from such systems. (Author)

  18. Review. Supporting problem structuring with computer-based tools in participatory forest planning

    T. Hujala

    2013-07-01

    Full Text Available Aim of study: This review presents the state of the art of using computerized techniques for problem structuring (PS) in participatory forest planning. The frequency and modes of using different computerized tool types and their contribution to planning processes, as well as critical observations, are described, followed by recommendations on how to better integrate PS with the use of forest decision support systems. Area of study: The reviewed research cases are from Asia, Europe, North America, Africa and Australia. Materials and methods: Via a Scopus search and screening of abstracts, 32 research articles from the years 2002-2011 were selected for review. Explicit and implicit evidence of using computerized tools for PS was recorded and assessed with content-driven qualitative analysis. Main results: GIS and forest-specific simulation tools were the most prevalent software types, whereas cognitive modelling software and spreadsheet and calculation tools were less frequently used, followed by multi-criteria and interactive tools. The typical use was to provide outputs of simulation-optimization or spatial analysis to negotiation situations, or to compile summaries or illustrations afterwards; using software during group negotiation to foster interaction was observed in only a few cases. Research highlights: Expertise in both decision support systems and group learning is needed to better integrate PS and computerized decision analysis. From the knowledge management perspective, it is recommended to consider how the results of PS (e.g. conceptual models) could be stored in a problem perception database, and how PS and decision making could be streamlined by retrievals from such systems. Keywords: facilitated modeling; group negotiation; knowledge management; natural resource management; PSM; soft OR; stakeholders.

  19. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing activity has been running at a sustained, high rate as we collect data at high luminosity, process simulation, and begin to process the parked data. The system is functional, though a number of improvements are planned during LS1. Many of the changes will impact users, we hope only in positive ways. We are trying to improve the distributed analysis tools as well as the ability to access more data samples more transparently. Operations Office [Figure 2: number of events per month, for 2012] Since the June CMS Week, Computing Operations teams successfully completed data re-reconstruction passes and finished the CMSSW_53X MC campaign with over three billion events available in AOD format. Recorded data was successfully processed in parallel, exceeding 1.2 billion raw physics events per month for the first time in October 2012 due to the increase in data-parking rate. In parallel, large efforts were dedicated to WMAgent development and integrati...

  20. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like organisms in the sea which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time and presence in all marine deposits made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides their success in the oil and gas industry, forams are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. However, until recently the best information on this architecture was obtained by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues to internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise and contrast are all important parameters that strongly affect the accuracy of the results and the speed of data processing. Combining this tomography technique with specific image-processing software for segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is presented
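
    The segmentation stage the authors evaluate can be caricatured as threshold-plus-label; a toy 2D version on synthetic data (the paper's pipeline is far more elaborate):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        slice_img = rng.normal(0.2, 0.05, size=(128, 128))  # synthetic background
        slice_img[40:70, 50:90] += 0.6                      # synthetic "shell" region

        binary = slice_img > 0.5                   # global threshold
        labels, n_objects = ndimage.label(binary)  # connected-component labelling
        print(n_objects, "object(s); pixel counts:",
              ndimage.sum(binary, labels, range(1, n_objects + 1)))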

  1. BUILD-IT: a computer vision-based interaction technique for a planning tool

    Rauterberg, G.W.M.; Fjeld, M.; Krueger, H.; Bichsel, M.; Leonhardt, U.; Meier, M.; Thimbleby, H.; O'Conaill, B.; Thomas, P.J.

    1997-01-01

    Shows a method that goes beyond the established approaches of human-computer interaction. We first present a serious critique of traditional interface types, showing their major drawbacks and limitations. Promising alternatives are offered by virtual (or immersive) reality (VR) and by augmented reality (AR).

  2. Imaging of peripheral arteries by 16-slice computed tomography angiography: a valuable tool

    Mishra, A.; Ehtuish, Ehtuish F.

    2007-01-01

    To evaluate the efficacy of multidetector (16-row) computed tomography (MDCT) in imaging the upper and lower limb arterial tree in trauma and peripheral vascular disease. Thirty-three patients underwent multislice computed tomography angiography (MSCTA) of the upper or lower limb on a multislice (16-slice) CT scanner between November 2004 and July 2005 in the Department of Radiology, National Organ Transplant Center, Tripoli, Libya. The findings were retrospectively compared with the surgical outcome in cases of trauma with suspected arterial injuries, or color Doppler correlation was obtained for patients with peripheral vascular disease. Multislice computed tomography angiography allowed a comprehensive diagnostic work-up in all trauma cases with suspected arterial injuries. In the 23 cases of peripheral vascular disease, MSCTA adequately demonstrated the presence of any stenosis or occlusion, its degree and extent, the presence of collaterals and distal reformation if any, and the presence of plaques. Our experience of computed tomography angiography with the 16-row MDCT scanner has clearly demonstrated its efficacy as a promising, new, fast, accurate, safe and non-invasive imaging modality of choice in cases of trauma with suspected arterial injuries, and as a useful screening modality in cases of peripheral vascular disease for diagnosis and grading. (author)

  3. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans

    Ritchie, A.J.; Sanghera, C.; Jacobs, C.; Zhang, W.; Mayo, J.; Schmidt, H.; Gingras, M.; Pasian, S.; Stewart, L.; Tsai, S.; Manos, D.; Seely, J.M.; Burrowes, P.; Bhatia, R.; Atkar-Khattra, S.; Ginneken, B. van; Tammemagi, M.; Tsao, M.S.; Lam, S.; et al.,

    2016-01-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in

  4. A new concept in glasshouse computer automation with SCADA and CASE Tools

    Meurs, van W.Th.M.; Gieling, Th.H.; Janssen, H.J.J.

    1996-01-01

    Climate control computers in greenhouses control heating and ventilation, supply water, dilute and dispense nutrients and integrate models into an optimally controlled system. This paper describes how information technology, as in use in other sectors of industry, applies to greenhouse control. In

  5. Promoting Creativity through Assessment: A Formative Computer-Assisted Assessment Tool for Teachers

    Cropley, David; Cropley, Arthur

    2016-01-01

    Computer-assisted assessment (CAA) is problematic when it comes to fostering creativity, because in educational thinking the essence of creativity is not finding the correct answer but generating novelty. The idea of "functional" creativity provides rubrics that can serve as the basis for forms of CAA leading to either formative or…

  6. Computed tomography as a tool for tolerance verification of industrial parts

    Müller, Pavel; Cantatore, Angela; Andreasen, J.L.

    2013-01-01

    Computed tomography (CT) is becoming an important technology for industrial applications, enabling fast and accurate control of manufactured parts. In only a few minutes, a complete 3D model of a part may be obtained, allowing measurements of external and internal features. This paper presents...

  7. Augmentation of Teaching Tools: Outsourcing the HSD Computing for SPSS Application

    Wang, Jianjun

    2010-01-01

    The widely used Tukey's HSD index is not produced in the current version of SPSS (i.e., PASW Statistics, version 18), and a computer program named "HSD Calculator" has been chosen to address this problem. In comparison to hand calculation, this program does not require table checking, which eliminates potential concern on the size of a…
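
    What the "HSD Calculator" automates is a short formula: the honestly significant difference is the studentized-range quantile times sqrt(MS_within / n per group). A sketch with invented values:

        import math
        from scipy.stats import studentized_range  # SciPy >= 1.7

        k, n, ms_within = 4, 12, 2.5      # groups, per-group n, within-group mean square
        df_error = k * (n - 1)
        q_crit = studentized_range.ppf(0.95, k, df_error)
        hsd = q_crit * math.sqrt(ms_within / n)
        print(f"HSD = {hsd:.3f}")  # any pair of means differing by more is significant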

  8. Educational Impact of Digital Visualization Tools on Digital Character Production Computer Science Courses

    van Langeveld, Mark Christensen

    2009-01-01

    Digital character production courses have traditionally been taught in art departments. The digital character production course at the University of Utah is centered, drawing uniformly from art and engineering disciplines. Its design has evolved to include a synergy of computer science, functional art and human anatomy. It gives students an…

  9. The Strategy Blueprint : A Strategy Process Computer-Aided Design Tool

    Aldea, Adina Ioana; Febriani, Tania Rizki; Daneva, Maya; Iacob, Maria Eugenia

    2017-01-01

    Strategy has always been a main concern of organizations because it dictates their direction, and therefore determines their success. Thus, organizations need to have adequate support to guide them through their strategy formulation process. The goal of this research is to develop a computer-based

  10. Programming Languages or Generic Software Tools, for Beginners' Courses in Computer Literacy?

    Neuwirth, Erich

    1987-01-01

    Discussion of methods that can be used to teach beginner courses in computer literacy focuses on students aged 10-12. The value of using a programing language versus using a generic software package is highlighted; Logo and Prolog are reviewed; and the use of databases is discussed. (LRW)

  11. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Siegelmann-Danieli, Nava; Farkash, Ariel; Katzir, Itzhak; Vesterman Landes, Janet; Rotem Rabinovich, Hadas; Lomnicky, Yossef; Carmeli, Boaz; Parush-Shear-Yashuv, Naama

    2016-01-01

    Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < …). Analyses controlling for age and gender identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton-pump inhibitors. Our tool provided insights that confirmed/complemented information gained from randomized clinical trials. Prospective tool implementation is warranted.

  12. Automatic brain matter segmentation of computed tomography images using a statistical model: A tool to gain working time!

    Bertè, Francesco; Lamponi, Giuseppe; Bramanti, Placido; Calabrò, Rocco S

    2015-10-01

    Brain computed tomography (CT) is a useful diagnostic tool for the evaluation of several neurological disorders due to its accuracy, reliability, safety and wide availability. In this field, a potentially interesting research topic is the automatic segmentation and recognition of medical regions of interest (ROIs). Herein, we propose a novel automated method, based on the use of the active appearance model (AAM), for the segmentation of brain matter in CT images to assist radiologists in the evaluation of the images. The method, applied to 54 CT images from a sample of outpatients affected by cognitive impairment, enabled us to generate a model overlapping the original image with good precision. Since CT neuroimaging is in widespread use for detecting neurological disease, including neurodegenerative conditions, the development of automated tools enabling technicians and physicians to reduce working time and reach a more accurate diagnosis is needed. © The Author(s) 2015.

  13. Interstitial lung disease associated with collagen vascular disorders: disease quantification using a computer-aided diagnosis tool

    Marten, K.; Engelke, C.; Dicken, V.; Kneitz, C.; Hoehmann, M.; Kenn, W.; Hahn, D.

    2009-01-01

    The purpose of this study was to evaluate a computer-aided diagnosis (CAD) tool compared to human observers in quantification of interstitial lung disease (ILD) in patients with collagen-vascular disorders. A total of 52 patients with rheumatoid arthritis (n=24), scleroderma (n=14) and systemic lupus erythematosus (n=14) underwent thin-section CT. Two independent observers assessed the extent of ILD (EoILD), reticulation (EoRet) and ground-glass opacity (EoGGO). CAD assessed EoILD twice. Pulmonary function tests were obtained. Statistical evaluation used 95% limits of agreement and linear regression analysis. CAD-derived EoILD correlated well with diffusing capacity (DLCO; R = -0.531) and closely with the human observers (R = -0.705); observer-assessed EoILD correlated closely with DLCO and moderately with FVC (DLCO: R = -0.663; FVC: R = -0.436; P ≤ 0.005). The CAD system is a promising tool for ILD quantification, showing close correlation with human observers and physiologic impairment. (orig.)

  14. A review of the design and validation of web- and computer-based 24-h dietary recall tools.

    Timon, Claire M; van den Barg, Rinske; Blain, Richard J; Kehoe, Laura; Evans, Katie; Walton, Janette; Flynn, Albert; Gibney, Eileen R

    2016-12-01

    Technology-based dietary assessment offers solutions to many of the limitations of traditional dietary assessment methodologies including cost, participation rates and the accuracy of data collected. The 24-h dietary recall (24HDR) method is currently the most utilised method for the collection of dietary intake data at a national level. Recently there have been many developments using web-based platforms to collect food intake data using the principles of the 24HDR method. This review identifies web- and computer-based 24HDR tools that have been developed for both children and adult population groups, and examines common design features and the methods used to investigate the performance and validity of these tools. Overall, there is generally good to strong agreement between web-based 24HDR and respective reference measures for intakes of macro- and micronutrients.

  15. Reliability of a computer software angle tool for measuring spine and pelvic flexibility during the sit-and-reach test.

    Mier, Constance M; Shapiro, Belinda S

    2013-02-01

    The purpose of this study was to determine the reliability of a computer software angle tool that measures thoracic (T), lumbar (L), and pelvic (P) angles as a means of evaluating spine and pelvic flexibility during the sit-and-reach (SR) test. Thirty adults performed the SR twice on separate days. The SR test was captured on video and later analyzed for T, L, and P angles using the computer software angle tool. During the test, 3 markers were placed over T1, T12, and L5 vertebrae to identify T, L, and P angles. Intraclass correlation coefficient (ICC) indicated a very high internal consistency (between trials) for T, L, and P angles (0.95-0.99); thus, the average of trials was used for test-retest (between days) reliability. Mean (±SD) values did not differ between days for T (51.0 ± 14.3 vs. 52.3 ± 16.2°), L (23.9 ± 7.1 vs. 23.0 ± 6.9°), or P (98.4 ± 15.6 vs. 98.3 ± 14.7°) angles. Test-retest reliability (ICC) was high for T (0.96) and P (0.97) angles and moderate for L angle (0.84). Both intrarater and interrater reliabilities were high for T (0.95, 0.94) and P (0.97, 0.97) angles and moderate for L angle (0.87, 0.82). Thus, the computer software angle tool is a highly objective method for assessing spine and pelvic flexibility during a video-captured SR test.

  16. Technical Note: SPEKTR 3.0—A computational tool for x-ray spectrum modeling and analysis

    Punnoose, J.; Xu, J.; Sisniega, A.; Zbijewski, W.; Siewerdsen, J. H., E-mail: jeff.siewerdsen@jhu.edu [Department of Biomedical Engineering, Johns Hopkins University, Baltimore, Maryland 21205 (United States)

    2016-08-15

    Purpose: A computational toolkit (SPEKTR 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS), updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP). The toolkit includes a MATLAB (The Mathworks, Natick, MA) function library and improved user interface (UI), along with an optimization algorithm to match calculated beam quality with measurements. Methods: The SPEKTR code generates x-ray spectra (photons/mm²/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts between a TASMICS and a TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV, with the largest percentage differences arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra, with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, SPEKTR, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the SPEKTR function library, UI, and optimization tool are available.
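
    To make the optimization step concrete, the following minimal Python sketch reproduces the idea on toy data: attenuate a spectrum with the Beer-Lambert law for an added aluminum thickness and solve for the thickness whose computed output matches a measurement. The spectrum shape, attenuation curve and "measured" output below are placeholders, not SPEKTR's TASMICS data.

        # Toy filtration-matching optimization in the spirit of SPEKTR's tool.
        import numpy as np
        from scipy.optimize import minimize_scalar

        energies = np.arange(20.0, 121.0)                       # 1 keV bins
        phi0 = np.exp(-0.5 * ((energies - 60.0) / 20.0) ** 2)   # toy spectrum shape
        mu_al = 0.05 + 50.0 * energies ** -2.5                  # toy Al attenuation, 1/mm

        def tube_output(t_al_mm):
            """Energy-weighted output after t_al_mm of added Al (arbitrary units)."""
            phi = phi0 * np.exp(-mu_al * t_al_mm)               # Beer-Lambert attenuation
            return float(np.sum(phi * energies))

        measured = 0.6 * tube_output(0.0)                       # pretend measurement

        # Find the Al thickness whose computed output best matches the measurement.
        res = minimize_scalar(lambda t: (tube_output(t) - measured) ** 2,
                              bounds=(0.0, 10.0), method="bounded")
        print(f"matched added filtration: {res.x:.2f} mm Al")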

  18. Synthetic Biology Outside the Cell: Linking Computational Tools to Cell-Free Systems

    Lewis, Daniel D. [Integrative Genetics and Genomics, University of California Davis, Davis, CA (United States); Department of Biomedical Engineering, University of California Davis, Davis, CA (United States); Villarreal, Fernando D.; Wu, Fan; Tan, Cheemeng, E-mail: cmtan@ucdavis.edu [Department of Biomedical Engineering, University of California Davis, Davis, CA (United States)

    2014-12-09

    As mathematical models become more commonly integrated into the study of biology, a common language for describing biological processes is manifesting. Many tools have emerged for the simulation of in vivo synthetic biological systems, with only a few examples of prominent work done on predicting the dynamics of cell-free synthetic systems. At the same time, experimental biologists have begun to study dynamics of in vitro systems encapsulated by amphiphilic molecules, opening the door for the development of a new generation of biomimetic systems. In this review, we explore both in vivo and in vitro models of biochemical networks with a special focus on tools that could be applied to the construction of cell-free expression systems. We believe that quantitative studies of complex cellular mechanisms and pathways in synthetic systems can yield important insights into what makes cells different from conventional chemical systems.

  20. [Dietopro.com: a new tool for dietotherapeutical management based on cloud computing technology].

    García, Candido Gabriel; Sebastià, Natividad; Blasco, Esther; Soriano, José Miguel

    2014-09-01

    Dietotherapy software is now a basic tool in the dietary management of patients, from both a physiological and a pathological point of view. New technologies and research in this area have favored the emergence of new applications for dietary and nutritional management that facilitate the running of a dietotherapy practice. Objective: to comparatively study the main dietotherapy applications on the market, in order to give professionals in diet and nutrition criteria for selecting among these tools. Dietopro.com is, from our point of view, one of the most comprehensive applications for the dietotherapeutic management of patients; based on the user's needs, it offers a choice of different dietary software modules. We conclude that no application is better or worse than another; rather, applications are more or less adapted to the needs of professionals. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  2. A Computationally Efficient Tool for Assessing the Depth Resolution in Potential-Field Inversion

    Paoletti, V.; Hansen, Per Christian; Hansen, Mads Friis

    In potential-field inversion problems, it can be difficult to obtain reliable information about the source distribution with respect to depth. Moreover, the spatial resolution of the reconstructions decreases with depth; in fact, the more ill-posed the problem - and the noisier the data - the less reliable the depth information. Based on earlier work using the singular value decomposition, we introduce a tool, ApproxDRP, which uses approximations of the singular vectors obtained by the iterative Lanczos bidiagonalization algorithm, making it well suited for large-scale problems. The tool can successfully show the limitations of depth resolution resulting from noise in the data, allowing a reliable analysis of the retrievable depth information and effectively guiding the user in choosing the optimal number of iterations for a given problem.
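
    A minimal sketch of the Lanczos (Golub-Kahan) bidiagonalization on which, per the abstract, ApproxDRP builds: k iterations reduce A to a small bidiagonal matrix whose SVD yields approximate singular values and vectors of A. This toy version uses a random dense matrix and no reorthogonalization; it illustrates the technique, not ApproxDRP itself.

        import numpy as np

        def golub_kahan(A, b, k):
            """k steps of Golub-Kahan bidiagonalization of A, started from b."""
            m, n = A.shape
            U = np.zeros((m, k + 1)); V = np.zeros((n, k))
            alpha = np.zeros(k); beta = np.zeros(k + 1)
            beta[0] = np.linalg.norm(b); U[:, 0] = b / beta[0]
            for i in range(k):
                r = A.T @ U[:, i] - (beta[i] * V[:, i - 1] if i > 0 else 0.0)
                alpha[i] = np.linalg.norm(r); V[:, i] = r / alpha[i]
                p = A @ V[:, i] - alpha[i] * U[:, i]
                beta[i + 1] = np.linalg.norm(p); U[:, i + 1] = p / beta[i + 1]
            B = np.zeros((k + 1, k))                    # lower bidiagonal factor
            B[np.arange(k), np.arange(k)] = alpha
            B[np.arange(1, k + 1), np.arange(k)] = beta[1:]
            return U, B, V

        rng = np.random.default_rng(0)
        A = rng.standard_normal((500, 200))
        U, B, V = golub_kahan(A, rng.standard_normal(500), k=20)
        _, s, Wt = np.linalg.svd(B, full_matrices=False)
        V_approx = V @ Wt.T   # approximate leading right singular vectors of A
        print(s[:5])                                    # Lanczos estimates
        print(np.linalg.svd(A, compute_uv=False)[:5])   # reference values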

  3. Computational Analysis of Brain Images: Towards a Useful Tool in Clinical Practice

    Puonti, Oula

    Many of the developed methods are not readily extendible to clinical applications due to the variability of clinical MRI data and the presence of pathologies, such as tumors or lesions. Thus, clinicians are forced to manually analyze the MRI data, which is a time-consuming task that introduces rater-dependent variability and reduces the accuracy and sensitivity of the results. The goal of this PhD project was to enlarge the scope of the automatic tools to clinical applications. In order to tackle the variability of the data and the presence of pathologies, we base our methods on Bayesian modeling; this framework can be extended with models of brain lesions. This results in a set of fast, robust and fully automatic tools for segmenting MRI brain scans of both healthy subjects and subjects suffering from brain disorders such as multiple sclerosis. Having access to quantitative measures of both lesions…

  5. The Classification and Evaluation of Computer-Aided Software Engineering Tools

    1990-09-01

    …document generators and templates to meet certain standards (i.e., DOD-STD-2167A), with interfaces to technical publishing systems from Interleaf, FrameMaker, etc. Fourth Generation Language (4GL): tool contains a high-level language providing database access.

  6. Biopython: freely available Python tools for computational molecular biology and bioinformatics

    Cock, Peter J A; Antao, Tiago; Chang, Jeffrey T

    2009-01-01

    SUMMARY: The Biopython project is a mature open source international collaboration of volunteer developers, providing Python libraries for a wide range of bioinformatics problems. Biopython includes modules for reading and writing different sequence file formats and multiple sequence alignments, dealing with 3D macromolecular structures, interacting with common tools such as BLAST, ClustalW and EMBOSS, accessing key online databases, and providing numerical methods for statistical learning. AVAILABILITY: Biopython is freely available, with documentation and source code at (www…
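
    A short, runnable example of the sequence I/O and sequence-object facilities named above (the FASTA record here is inlined so the snippet is self-contained):

        from io import StringIO
        from Bio import SeqIO

        fasta = StringIO(">demo\nATGGCCATTGTAATGGGCCGCTGAAAGGGTGCCCGA\n")
        for record in SeqIO.parse(fasta, "fasta"):
            print(record.id, len(record.seq))
            print(record.seq.reverse_complement())  # basic sequence manipulation
            print(record.seq.translate())           # protein translation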

  7. Computer-aided tool for solvent selection in pharmaceutical processes: Solvent swap

    Papadakis, Emmanouil; K. Tula, Anjan; Gernaey, Krist V.

    The framework relies on thermodynamic property models (e.g., …-liquid equilibria). The application of the developed model-based framework is highlighted through several case studies published in the literature. In its current state, the framework is suitable for problems where the original solvent is exchanged by distillation. A solvent selection guide for fast identification of suitable candidates is part of the computer-aided framework, whose objective is to assist the pharmaceutical industry in gaining better process understanding. A software interface to improve the usability of the tool has also been created.

  8. A Usability Study of Users’ Perceptions Toward a Multimedia Computer-Assisted Learning Tool for Neuroanatomy

    Gould, Douglas J.; Terrell, Mark A.; Fleming, Jo

    2015-01-01

    This usability study evaluated users’ perceptions of a multimedia prototype for a new e-learning tool: Anatomy of the Central Nervous System: A Multimedia Course. Usability testing is a collection of formative evaluation methods that inform the developmental design of e-learning tools to maximize user acceptance, satisfaction, and adoption. Sixty-two study participants piloted the prototype and completed a usability questionnaire designed to measure two usability properties: program need and program applicability. Statistical analyses were used to test the hypothesis that the multimedia prototype was well designed and highly usable: it was perceived as (1) highly needed across a spectrum of educational contexts, (2) highly applicable in supporting the pedagogical processes of teaching and learning neuroanatomy, and (3) highly usable by all types of users. Three independent variables represented user differences: level of expertise (faculty vs. student), age, and gender. Analysis of the results supports the research hypotheses that the prototype was designed well for different types of users in various educational contexts and for supporting the pedagogy of neuroanatomy. In addition, the results suggest that the multimedia program will be most useful as a neuroanatomy review tool for health-professions students preparing for licensing or board exams. This study demonstrates the importance of integrating quality properties of usability with principles of human learning during the instructional design process for multimedia products. PMID:19177405

  9. A development of a quantitative situation awareness measurement tool: Computational Representation of Situation Awareness with Graphical Expressions (CoRSAGE)

    Yim, Ho Bin; Lee, Seung Min; Seong, Poong Hyun

    2014-01-01

    Highlights: • We proposed a quantitative situation awareness (SA) evaluation technique. • We developed a computer-based SA evaluation tool for the NPP training environment. • We introduced three rules and components to express more human-like results. • We conducted three sets of training with real plant operators. • Results showed that the tool could reasonably represent operators’ SA. – Abstract: Operator performance measures are used for multiple purposes, such as control room design, human system interface (HSI) evaluation, training, and so on. Performance measures are often focused on results; however, especially for training purposes – at least in the nuclear industry – more detailed descriptions of processes are required. Situation awareness (SA) measurements have directly or indirectly served as a complementary measure and provided descriptive insights on how to improve operator performance in the next training. Unfortunately, most well-developed SA measurement techniques, such as the Situation Awareness Global Assessment Technique (SAGAT), need an expert opinion, which sometimes hinders the wide application of such measurements. A quantitative SA measurement tool named Computational Representation of Situation Awareness with Graphical Expressions (CoRSAGE) is introduced to resolve some of these concerns. CoRSAGE is based on production rules to represent a human operator’s cognitive process of problem solving, and Bayesian inference to quantify it. The Petri Net concept is also used for graphical expressions of SA flow. Three components – inference transition, volatile/non-volatile memory tokens – were newly developed to achieve the required functions. Training data for a Loss of Coolant Accident (LOCA) scenario for an emergency condition and an earthquake scenario for an abnormal condition, performed by real plant operators, were used to validate the tool. The validation result showed that CoRSAGE performed a reasonable match to other performance…

  10. Computer Optimization of Geometric Form of Tool and Preform for Closed-die Forging of Compressor Blade Simulator

    A. V. Botkin

    2014-07-01

    Full Text Available Using the software package DEFORM 3D in developing an isothermal forging technology for blade workpieces makes it possible to reduce pre-production time, improve forging quality and increase the lifetime of forging dies. Computer modeling allows the prediction of forging defects such as notches and wrinkles and underfilling of the die impression, and the estimation of tool loads. The preform shape and the angular position of the blade simulator were optimized in order to minimize the lateral forces generated during the forging operation.

  11. Nsite, NsiteH and NsiteM Computer Tools for Studying Transcription Regulatory Elements

    Shahmuradov, Ilham

    2015-07-02

    Summary: Gene transcription is mostly conducted through interactions of various transcription factors and their binding sites on DNA (regulatory elements, REs). Today, we are still far from understanding the real regulatory content of promoter regions. Computer methods for identification of REs remain a widely used tool for studying and understanding transcriptional regulation mechanisms. The Nsite, NsiteH and NsiteM programs perform searches for statistically significant (non-random) motifs of known human, animal and plant one-box and composite REs in a single genomic sequence, in a pair of aligned homologous sequences and in a set of functionally related sequences, respectively.
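
    The notion of a statistically significant (non-random) motif can be sketched as follows: count the (possibly overlapping) occurrences of a motif in a sequence and compute the binomial tail probability of seeing at least that many under a uniform background. The motif and background model below are deliberately simplistic, far simpler than what the Nsite programs actually use.

        import re
        from math import comb

        def motif_significance(seq, motif):
            hits = len(re.findall(f"(?={motif})", seq))  # overlapping matches
            p = 0.25 ** len(motif)                       # uniform i.i.d. background
            n = len(seq) - len(motif) + 1                # number of scan positions
            # P(X >= hits) for X ~ Binomial(n, p)
            p_val = sum(comb(n, k) * p**k * (1 - p)**(n - k)
                        for k in range(hits, n + 1))
            return hits, p_val

        seq = "TATAAA" * 3 + "GCGCGC" * 50
        print(motif_significance(seq, "TATAAA"))  # few hits, tiny p-value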

  12. Connectivity among computer-aided engineering methods, procedures, and tools used in developing the SSC collider magnets

    Kallas, N.; Jalloh, A.R.

    1992-01-01

    The accomplishment of functional productivity for the computer aided engineering (CAE) environment at the magnet engineering department (ME) of the magnet systems division (MSD) at the Superconducting Super Collider Laboratory (SSCL) involves most of the basic aspects of information engineering. It is highly desirable to arrive at a software and hardware topology that offers total, two-way (back and forth), automatic and direct software and hardware connectivity among computer-aided design and drafting (CADD), analysis codes, and office automation tools applicable to the disciplines involved. This paper describes the components, data flow, and practices employed in the development of the CAE environment from a systems engineering aspect rather than from the analytical angle. When appropriate, references to case studies are made in order to demonstrate the connectivity of the techniques used.

  14. USING SECOND LIFE VIRTUAL COMPUTER WORLD AS A TRAINING TOOL FOR THE SUBJECTIVE GLOBAL ASSESSMENT (SGA).

    G. Clark Connery

    2012-06-01

    Full Text Available The SGA is a clinical tool used to assess protein-energy wasting. Although well validated, it is still not widely incorporated into clinical practice; a barrier to use may be the physical assessment section. Therefore, the purpose of this project was to develop a free and effective tool to train clinicians on performing the SGA. Second Life (SL) is a free virtual reality program accessed through the internet using human-like “avatars.” A museum environment was created with panels presenting SGA background information through text, images, and videos of the SGA being performed. Users are able to navigate the information by logging onto a provided avatar. After the initial panels, this avatar is able to interact with avatar bots and perform animations which mimic each body assessment within the SGA. Two trial periods were conducted to assess the efficacy of this training tool. The alpha trial consisted of 3 hospital dietitians and 3 nutrition students, who came to the investigators’ facility to test the program. Subjective responses were collected and used to improve the training tool. Feedback was positive regarding the information, delivery, and direction of the project; however, testers did report difficulty with controlling the avatar. The beta trial consists of users, including academic and clinical dietitians, accessing the module remotely. Responses are being collected via 5 surveys covering each portion of the module. While 16 dietitians responded to the beta trial, only 4 have completed the training. Current survey responses state: the use of SL is easy and enjoyable; all SGA information was clear and in a desirable format; tactile comparison objects were beneficial for understanding; the in-depth description of each assessment is beneficial; the animations that the avatars perform on the bots need improvement; a patient avatar on which users could perform the full SGA is desirable; the use of SL in the learning…

  15. Computational modeling of local hemodynamics phenomena: methods, tools and clinical applications

    Ponzini, R.; Rizzo, G.; Vergara, C.; Veneziani, A.; Morbiducci, U.; Montevecchi, F.M.; Redaelli, A.

    2009-01-01

    Local hemodynamics plays a key role in the onset of vessel wall pathophysiology, with peculiar blood flow structures (i.e. spatial velocity profiles, vortices, recirculating zones, helical patterns and so on) characterizing the behavior of specific vascular districts. Thanks to evolving technologies in computer science, mathematical modeling and hardware performance, the study of local hemodynamics can today also make use of a virtual environment to perform hypothesis testing, product development, protocol design and method validation that just a couple of decades ago would not have been thinkable. Computational fluid dynamics (CFD) appears to be more than a complementary partner to in vitro modeling and a possible substitute for animal models, furnishing a privileged environment for cheap, fast and reproducible data generation.

  16. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2016-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess if sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. …

  17. MRIVIEW: An interactive computational tool for investigation of brain structure and function

    Ranken, D.; George, J.

    1993-01-01

    MRIVIEW is a software system which uses image processing and visualization to provide neuroscience researchers with an integrated environment for combining functional and anatomical information. Key features of the software include semi-automated segmentation of volumetric head data and an interactive coordinate reconciliation method which utilizes surface visualization. The current system is a precursor to a computational brain atlas. We describe features this atlas will incorporate, including methods under development for visualizing brain functional data obtained from several different research modalities

  18. Development of a Tool for Measuring and Analyzing Computer User Satisfaction

    James E. Bailey; Sammy W. Pearson

    1983-01-01

    This paper reports on a technique for measuring and analyzing computer user satisfaction. Starting with the literature and using the critical incident interview technique, 39 factors affecting satisfaction were identified. Adapting the semantic differential scaling technique, a questionnaire for measuring satisfaction was then created. Finally, the instrument was pilot tested to prove its validity and reliability. The results of this effort and suggested uses of the questionnaire are reported...

  19. Design Tools for Accelerating Development and Usage of Multi-Core Computing Platforms

    2014-04-01

    …multicore PDSP platforms. The GPU-based capabilities of TDIF are currently oriented towards NVIDIA GPUs, based on the Compute Unified Device Architecture (CUDA) programming language [NVIDIA 2007], which can be viewed as an extension of C. The multicore PDSP capabilities currently in TDIF are oriented…

  20. Nispero: a cloud-computing based Scala tool specially suited for bioinformatics data processing

    Evdokim Kovach; Alexey Alekhin; Eduardo Pareja Tobes; Raquel Tobes; Eduardo Pareja; Marina Manrique

    2014-01-01

    Nowadays it is widely accepted that bioinformatics data analysis is a real bottleneck in many research activities related to the life sciences. High-throughput technologies like Next Generation Sequencing (NGS) have completely reshaped the biology and bioinformatics landscape. Undoubtedly, NGS has enabled important progress in many life-sciences fields, but it has also presented interesting challenges in terms of computation capabilities and algorithms. Many kinds of tasks related to NGS…

  1. 5th Conference on Advanced Mathematical and Computational Tools in Metrology

    Cox, M G; Filipe, E; Pavese, F; Richter, D

    2001-01-01

    Advances in metrology depend on improvements in scientific and technical knowledge and in instrumentation quality, as well as on better use of advanced mathematical tools and development of new ones. In this volume, scientists from both the mathematical and the metrological fields exchange their experiences. Industrial sectors, such as instrumentation and software, will benefit from this exchange, since metrology has a high impact on the overall quality of industrial products, and applied mathematics is becoming more and more important in industrial processes. This book is of interest to people…

  2. Creation of an Open Framework for Point-of-Care Computer-Assisted Reporting and Decision Support Tools for Radiologists.

    Alkasab, Tarik K; Bizzo, Bernardo C; Berland, Lincoln L; Nair, Sujith; Pandharipande, Pari V; Harvey, H Benjamin

    2017-09-01

    Decreasing unnecessary variation in radiology reporting and producing guideline-concordant reports is fundamental to radiology's success in value-based payment models and good for patient care. In this article, we present an open authoring system for point-of-care clinical decision support tools integrated into the radiologist reporting environment referred to as the computer-assisted reporting and decision support (CAR/DS) framework. The CAR/DS authoring system, described herein, includes: (1) a definition format for representing radiology clinical guidelines as structured, machine-readable Extensible Markup Language documents and (2) a user-friendly reference implementation to test the fidelity of the created definition files with the clinical guideline. The proposed definition format and reference implementation will enable content creators to develop CAR/DS tools that voice recognition software (VRS) vendors can use to extend the commercial tools currently in use. In making the definition format and reference implementation software freely available, we hope to empower individual radiologists, expert groups such as the ACR, and VRS vendors to develop a robust ecosystem of CAR/DS tools that can further improve the quality and efficiency of the patient care that our field provides. We hope that this initial effort can serve as the basis for a community-owned open standard for guideline definition that the imaging informatics and VRS vendor communities will embrace and strengthen. To this end, the ACR Assist™ initiative is intended to make the College's clinical content, including the Incidental Findings Committee White Papers, available for decision support tool creation based upon the herein described CAR/DS framework. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.

  3. TAOI B- Computational Microstructural Optimization Design Tool for High Temperature Structural Materials

    Mishra, Rajiv [Univ. Of North Texas, Denton, TX (United States); Charit, Indrajit [Univ. of Idaho, Moscow, ID (United States)

    2015-02-28

    The objectives of this research were two-fold: (a) develop a methodology for microstructural optimization of alloys - a genetic algorithm approach for alloy microstructural optimization using theoretical models based on fundamental micro-mechanisms, and (b) develop a new computationally designed Ni-Cr alloy for coal-fired power plant applications. The broader outcome of these objectives is expected to be the creation of an integrated approach for 'structural materials by microstructural design'. Three alloy systems were considered for computational optimization and validation: (i) a Ni-20Cr (wt.%) base alloy using only solid solution strengthening, (ii) a nano-Y2O3 containing Ni-20Cr-1.2Y2O3 (wt.%) alloy for dispersion strengthening, and (iii) a sub-micron Al2O3 composite-strengthened Ni-20Cr-1.2Y2O3-5.0Al2O3 (wt.%) alloy. The specimens were synthesized by mechanical alloying and consolidated using spark plasma sintering. Detailed microstructural characterization was done along with initial mechanical properties to validate the computational prediction. A key target property is a creep rate of 1x10^-9 s^-1 at 100 MPa and 800°C. The initial results were quite promising and require additional quantification of strengthening contributions from dislocation-particle attractive interaction and load transfer. The observed creep rate was on the order of 10^-9 s^-1 for the longer-time creep test of Ni-20Cr-1.2Y2O3-5Al2O3, lending support to the overall approach pursued in this project.

  5. SnapAnatomy, a computer-based interactive tool for independent learning of human anatomy.

    Yip, George W; Rajendran, Kanagasuntheram

    2008-06-01

    Computer-aided instruction materials are becoming increasingly popular in medical education, particularly in the teaching of human anatomy. This paper describes SnapAnatomy, a new interactive program that the authors designed for independent learning of anatomy. SnapAnatomy is primarily tailored for the beginner student to encourage the learning of anatomy by developing the three-dimensional visualization of human structure that is essential to applications in clinical practice and to the understanding of function. The program allows the student to take apart and to accurately put together body components in an interactive, self-paced and variable manner to achieve the learning outcome.

  6. MRUniNovo: an efficient tool for de novo peptide sequencing utilizing the hadoop distributed computing framework.

    Li, Chuang; Chen, Tao; He, Qiang; Zhu, Yunping; Li, Kenli

    2017-03-15

    Tandem mass spectrometry-based de novo peptide sequencing is a complex and time-consuming process. Current algorithms for de novo peptide sequencing cannot rapidly and thoroughly process large mass spectrometry datasets. In this paper, we propose MRUniNovo, a novel tool for parallel de novo peptide sequencing. MRUniNovo parallelizes UniNovo on the Hadoop compute platform. Our experimental results demonstrate that MRUniNovo significantly reduces the computation time of de novo peptide sequencing without sacrificing the correctness and accuracy of the results, and thus can process very large datasets that UniNovo cannot. MRUniNovo is an open source software tool implemented in Java. The source code and parameter settings are available at http://bioinfo.hupo.org.cn/MRUniNovo/index.php. Contact: s131020002@hnu.edu.cn; taochen1019@163.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  7. MultiSIMNRA: A computational tool for self-consistent ion beam analysis using SIMNRA

    Silva, T.F.; Rodrigues, C.L.; Mayer, M.; Moro, M.V.; Trindade, G.F.; Aguirre, F.R.; Added, N.; Rizzutto, M.A.; Tabacniks, M.H.

    2016-01-01

    Highlights: • MultiSIMNRA enables the self-consistent analysis of multiple ion beam techniques. • Self-consistent analysis enables unequivocal and reliable modeling of the sample. • Four different computational algorithms available for model optimizations. • Definition of constraints enables to include prior knowledge into the analysis. - Abstract: SIMNRA is widely adopted by the scientific community of ion beam analysis for the simulation and interpretation of nuclear scattering techniques for material characterization. Taking advantage of its recognized reliability and quality of the simulations, we developed a computer program that uses multiple parallel sessions of SIMNRA to perform self-consistent analysis of data obtained by different ion beam techniques or in different experimental conditions of a given sample. In this paper, we present a result using MultiSIMNRA for a self-consistent multi-elemental analysis of a thin film produced by magnetron sputtering. The results demonstrate the potentialities of the self-consistent analysis and its feasibility using MultiSIMNRA.

  8. Applying Ancestry and Sex Computation as a Quality Control Tool in Targeted Next-Generation Sequencing.

    Mathias, Patrick C; Turner, Emily H; Scroggins, Sheena M; Salipante, Stephen J; Hoffman, Noah G; Pritchard, Colin C; Shirts, Brian H

    2016-03-01

    To apply techniques for ancestry and sex computation from next-generation sequencing (NGS) data as an approach to confirm sample identity and detect sample processing errors. We combined a principal component analysis method with k-nearest neighbors classification to compute the ancestry of patients undergoing NGS testing. By combining this calculation with X chromosome copy number data, we determined the sex and ancestry of patients for comparison with self-report. We also modeled the sensitivity of this technique in detecting sample processing errors. We applied this technique to 859 patient samples with reliable self-report data. Our k-nearest neighbors ancestry screen had an accuracy of 98.7% for patients reporting a single ancestry. Visual inspection of principal component plots was consistent with self-report in 99.6% of single-ancestry and mixed-ancestry patients. Our model demonstrates that approximately two-thirds of potential sample swaps could be detected in our patient population using this technique. Patient ancestry can be estimated from NGS data incidentally sequenced in targeted panels, enabling an inexpensive quality control method when coupled with patient self-report. © American Society for Clinical Pathology, 2016. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
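
    A minimal sketch of the PCA-plus-k-nearest-neighbors screen, with simulated genotypes standing in for the variant calls from a targeted panel (population structure, sample counts and k are illustrative):

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neighbors import KNeighborsClassifier

        rng = np.random.default_rng(1)
        # Toy genotype matrix: 300 samples x 1000 variants from 3 populations
        # with different allele frequencies.
        freqs = rng.uniform(0.1, 0.9, size=(3, 1000))
        labels = np.repeat([0, 1, 2], 100)
        genotypes = rng.binomial(2, freqs[labels])

        pcs = PCA(n_components=4).fit_transform(genotypes)
        knn = KNeighborsClassifier(n_neighbors=5).fit(pcs[::2], labels[::2])
        predicted = knn.predict(pcs[1::2])

        # Disagreement with "self-report" (here, the simulation labels) would
        # flag a potential sample swap.
        mismatch = predicted != labels[1::2]
        print(f"flagged {mismatch.sum()} of {mismatch.size} samples")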

  9. A computational tool integrating host immunity with antibiotic dynamics to study tuberculosis treatment.

    Pienaar, Elsje; Cilfone, Nicholas A; Lin, Philana Ling; Dartois, Véronique; Mattila, Joshua T; Butler, J Russell; Flynn, JoAnne L; Kirschner, Denise E; Linderman, Jennifer J

    2015-02-21

    While active tuberculosis (TB) is a treatable disease, many complex factors prevent its global elimination. Part of the difficulty in developing optimal therapies is the large design space of antibiotic doses, regimens and combinations. Computational models that capture the spatial and temporal dynamics of antibiotics at the site of infection can aid in reducing the design space of costly and time-consuming animal pre-clinical and human clinical trials. The site of infection in TB is the granuloma, a collection of immune cells and bacteria that form in the lung, and new data suggest that penetration of drugs throughout granulomas is problematic. Here we integrate our computational model of granuloma formation and function with models for plasma pharmacokinetics, lung tissue pharmacokinetics and pharmacodynamics for two first line anti-TB antibiotics. The integrated model is calibrated to animal data. We make four predictions. First, antibiotics are frequently below effective concentrations inside granulomas, leading to bacterial growth between doses and contributing to the long treatment periods required for TB. Second, antibiotic concentration gradients form within granulomas, with lower concentrations toward their centers. Third, during antibiotic treatment, bacterial subpopulations are similar for INH and RIF treatment: mostly intracellular with extracellular bacteria located in areas non-permissive for replication (hypoxic areas), presenting a slowly increasing target population over time. Finally, we find that on an individual granuloma basis, pre-treatment infection severity (including bacterial burden, host cell activation and host cell death) is predictive of treatment outcome. Copyright © 2014 Elsevier Ltd. All rights reserved.
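
    The first prediction is easy to illustrate with a toy one-compartment pharmacokinetic model under once-daily dosing; every number below (absorption and elimination rates, dose, volume, effective concentration) is an assumed illustrative value, not one of the study's calibrated parameters.

        import numpy as np

        ka, ke = 1.5, 0.35       # absorption / elimination rates (1/h), assumed
        dose, V = 300.0, 40.0    # dose (mg) and volume of distribution (L)
        c_eff = 1.0              # "effective concentration" (mg/L), assumed

        t = np.linspace(0.0, 96.0, 961)          # four days
        conc = np.zeros_like(t)
        for t_dose in range(0, 96, 24):          # one dose every 24 h
            dt = np.clip(t - t_dose, 0.0, None)  # no contribution before the dose
            conc += (dose / V) * ka / (ka - ke) * (np.exp(-ke * dt) - np.exp(-ka * dt))

        below = conc < c_eff
        print(f"fraction of time below effective concentration: {below.mean():.0%}")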

  10. Outcomes from a pilot study using computer-based rehabilitative tools in a military population.

    Sullivan, Katherine W; Quinn, Julia E; Pramuka, Michael; Sharkey, Laura A; French, Louis M

    2012-01-01

    Novel therapeutic approaches and outcome data are needed for cognitive rehabilitation for patients with a traumatic brain injury; computer-based programs may play a critical role in filling existing knowledge gaps. Brain-fitness computer programs can complement existing therapies, maximize neuroplasticity, provide treatment beyond the clinic, and deliver objective efficacy data. However, these approaches have not been extensively studied in the military and traumatic brain injury population. Walter Reed National Military Medical Center established its Brain Fitness Center (BFC) in 2008 as an adjunct to traditional cognitive therapies for wounded warriors. The BFC offers commercially available "brain-training" products for military Service Members to use in a supportive, structured environment. Over 250 Service Members have utilized this therapeutic intervention. Each patient receives subjective assessments pre and post BFC participation including the Mayo-Portland Adaptability Inventory-4 (MPAI-4), the Neurobehavioral Symptom Inventory (NBSI), and the Satisfaction with Life Scale (SWLS). A review of the first 29 BFC participants, who finished initial and repeat measures, was completed to determine the effectiveness of the BFC program. Two of the three questionnaires of self-reported symptom change completed before and after participation in the BFC revealed a statistically significant reduction in symptom severity based on MPAI and NBSI total scores (p < .05). There were no significant differences in the SWLS score. Despite the typical limitations of a retrospective chart review, such as variation in treatment procedures, preliminary results reveal a trend towards improved self-reported cognitive and functional symptoms.

  11. ARCHER, a new Monte Carlo software tool for emerging heterogeneous computing environments

    Xu, X. George; Liu, Tianyu; Su, Lin; Du, Xining; Riblett, Matthew; Ji, Wei; Gu, Deyang; Carothers, Christopher D.; Shephard, Mark S.; Brown, Forrest B.; Kalra, Mannudeep K.; Liu, Bob

    2015-01-01

    Highlights: • A fast Monte Carlo based radiation transport code ARCHER was developed. • ARCHER supports different hardware including CPU, GPU and Intel Xeon Phi coprocessor. • The code is benchmarked against MCNP for medical applications. • A typical CT scan dose simulation takes only 6.8 s on an NVIDIA M2090 GPU. • GPU and coprocessor-based codes are 5–8 times faster than the CPU-based codes. - Abstract: The Monte Carlo radiation transport community faces a number of challenges associated with peta- and exa-scale computing systems that rely increasingly on heterogeneous architectures involving hardware accelerators such as GPUs and Xeon Phi coprocessors. Existing Monte Carlo codes and methods must be strategically upgraded to meet emerging hardware and software needs. In this paper, we describe the development of a software package called ARCHER (Accelerated Radiation-transport Computations in Heterogeneous EnviRonments), which is designed as a versatile testbed for future Monte Carlo codes. Preliminary results from five projects in nuclear engineering and medical physics are presented.
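
    As a minimal illustration of the kind of computation such codes parallelize, the sketch below samples photon free paths through a uniform slab and compares the uncollided transmission against the analytic answer; the attenuation coefficient and thickness are assumed toy values, unrelated to ARCHER's benchmarks.

        import numpy as np

        rng = np.random.default_rng(0)
        mu = 0.2          # total attenuation coefficient (1/cm), assumed
        thickness = 10.0  # slab thickness (cm), assumed

        n = 1_000_000
        free_paths = rng.exponential(1.0 / mu, size=n)  # sampled interaction depths
        transmitted = np.mean(free_paths > thickness)   # uncollided fraction

        print(f"MC: {transmitted:.4f}   analytic: {np.exp(-mu * thickness):.4f}")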

  12. Selection Finder (SelFi): A computational metabolic engineering tool to enable directed evolution of enzymes

    Neda Hassanpour

    2017-06-01

    Full Text Available Directed evolution of enzymes consists of an iterative process of creating mutant libraries and choosing desired phenotypes through screening or selection until the enzymatic activity reaches a desired goal. The biggest challenge in directed enzyme evolution is identifying high-throughput screens or selections to isolate the variant(s) with the desired property. We present in this paper a computational metabolic engineering framework, Selection Finder (SelFi), to construct a selection pathway from a desired enzymatic product to a cellular host and to couple the pathway with cell survival. We applied SelFi to construct selection pathways for four enzymes and their desired enzymatic products xylitol, D-ribulose-1,5-bisphosphate, methanol, and aniline. Two of the selection pathways identified by SelFi were previously experimentally validated for engineering Xylose Reductase and RuBisCO. Importantly, SelFi advances directed evolution of enzymes, as there are currently no known generalized strategies or computational techniques for identifying high-throughput selections for enzyme engineering.
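
    At its core, constructing a selection pathway is a path search through a reaction network from the desired product to a metabolite the host needs to grow; a breadth-first sketch over a toy network (metabolite names and links are illustrative only, not SelFi's curated database):

        from collections import deque

        # Toy reaction network: metabolite -> metabolites reachable in one step.
        network = {
            "xylitol": ["xylulose"],
            "xylulose": ["xylulose-5-phosphate"],
            "xylulose-5-phosphate": ["pentose-phosphate-pathway"],
            "pentose-phosphate-pathway": ["biomass"],
        }

        def find_selection_pathway(product, essential="biomass"):
            """Breadth-first search from the product to an essential metabolite."""
            queue, seen = deque([[product]]), {product}
            while queue:
                path = queue.popleft()
                if path[-1] == essential:
                    return path
                for nxt in network.get(path[-1], []):
                    if nxt not in seen:
                        seen.add(nxt)
                        queue.append(path + [nxt])
            return None

        print(find_selection_pathway("xylitol"))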

  13. Land Cover Classification from Multispectral Data Using Computational Intelligence Tools: A Comparative Study

    André Mora

    2017-11-01

    Full Text Available This article discusses how computational intelligence techniques are applied to fuse spectral images into a higher level image of land cover distribution for remote sensing, specifically for satellite image classification. We compare a fuzzy-inference method with two other computational intelligence methods, decision trees and neural networks, using a case study of land cover classification from satellite images. Further, an unsupervised approach based on k-means clustering has also been taken into consideration for comparison. The fuzzy-inference method includes training the classifier with a fuzzy-fusion technique and then performing land cover classification using reinforcement aggregation operators. To assess the robustness of the four methods, a comparative study including three years of land cover maps for the district of Mandimba, Niassa province, Mozambique, was undertaken. Our results show that the fuzzy-fusion method performs similarly to decision trees, achieving reliable classifications; neural networks suffer from overfitting; while k-means clustering constitutes a promising technique to identify land cover types from unknown areas.
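
    A compact version of such a comparison can be run with scikit-learn, here using a decision tree (supervised) and k-means (unsupervised) on synthetic four-band pixels; the class spectra are invented for illustration:

        import numpy as np
        from sklearn.tree import DecisionTreeClassifier
        from sklearn.cluster import KMeans
        from sklearn.metrics import accuracy_score

        rng = np.random.default_rng(42)
        # Three cover classes with different mean spectra over 4 bands.
        means = np.array([[0.10, 0.20, 0.10, 0.60],   # vegetation (high NIR)
                          [0.30, 0.30, 0.30, 0.30],   # bare soil
                          [0.05, 0.10, 0.15, 0.02]])  # water (low NIR)
        y = rng.integers(0, 3, 3000)
        X = means[y] + rng.normal(0.0, 0.03, (3000, 4))

        tree = DecisionTreeClassifier(max_depth=5).fit(X[:2000], y[:2000])
        print("tree accuracy:", accuracy_score(y[2000:], tree.predict(X[2000:])))

        clusters = KMeans(n_clusters=3, n_init=10).fit_predict(X)
        # Cluster ids are arbitrary; map each cluster to its majority class.
        mapping = {c: np.bincount(y[clusters == c]).argmax() for c in range(3)}
        print("k-means accuracy:", accuracy_score(y, [mapping[c] for c in clusters]))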

  14. User's Manual for FOMOCO Utilities-Force and Moment Computation Tools for Overset Grids

    Chan, William M.; Buning, Pieter G.

    1996-01-01

    In the numerical computations of flows around complex configurations, accurate calculations of force and moment coefficients for aerodynamic surfaces are required. When overset grid methods are used, the surfaces on which force and moment coefficients are sought typically consist of a collection of overlapping surface grids. Direct integration of flow quantities on the overlapping grids would result in the overlapped regions being counted more than once. The FOMOCO Utilities is a software package for computing flow coefficients (force, moment, and mass flow rate) on a collection of overset surfaces with accurate accounting of the overlapped zones. FOMOCO Utilities can be used in stand-alone mode or in conjunction with the Chimera overset grid compressible Navier-Stokes flow solver OVERFLOW. The software package consists of two modules corresponding to a two-step procedure: (1) hybrid surface grid generation (MIXSUR module), and (2) flow quantities integration (OVERINT module). Instructions on how to use this software package are described in this user's manual. Equations used in the flow coefficients calculation are given in Appendix A.

  15. TAREAN: a computational tool for identification and characterization of satellite DNA from unassembled short reads.

    Novák, Petr; Ávila Robledillo, Laura; Koblížková, Andrea; Vrbová, Iva; Neumann, Pavel; Macas, Jirí

    2017-07-07

    Satellite DNA is one of the major classes of repetitive DNA, characterized by tandemly arranged repeat copies that form contiguous arrays up to megabases in length. This type of genomic organization makes satellite DNA difficult to assemble, which hampers characterization of satellite sequences by computational analysis of genomic contigs. Here, we present tandem repeat analyzer (TAREAN), a novel computational pipeline that circumvents this problem by detecting satellite repeats directly from unassembled short reads. The pipeline first employs graph-based sequence clustering to identify groups of reads that represent repetitive elements. Putative satellite repeats are subsequently detected by the presence of circular structures in their cluster graphs. Consensus sequences of repeat monomers are then reconstructed from the most frequent k-mers obtained by decomposing read sequences from corresponding clusters. The pipeline performance was successfully validated by analyzing low-pass genome sequencing data from five plant species where satellite DNA was previously experimentally characterized. Moreover, novel satellite repeats were predicted for the genome of Vicia faba and three of these repeats were verified by detecting their sequences on metaphase chromosomes using fluorescence in situ hybridization. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
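
    The final reconstruction step can be sketched as greedy extension over a k-mer frequency table built from the reads; this toy version omits the graph-based clustering and circularity detection that precede it in the actual pipeline:

        from collections import Counter

        def kmer_consensus(reads, k=8, monomer_len=12):
            counts = Counter()
            for r in reads:
                for i in range(len(r) - k + 1):
                    counts[r[i:i + k]] += 1
            consensus = counts.most_common(1)[0][0]       # most frequent k-mer
            while len(consensus) < monomer_len:
                suffix = consensus[-(k - 1):]             # extend over best overlap
                ext = max((km for km in counts if km.startswith(suffix)),
                          key=counts.__getitem__, default=None)
                if ext is None:
                    break
                consensus += ext[-1]
            return consensus

        monomer = "ATGGCCTTAGCA"                          # toy 12 bp repeat unit
        reads = [(monomer * 3)[i:i + 20] for i in range(16)]
        print(kmer_consensus(reads))                      # a rotation of the monomer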

  16. An Accurate Computational Tool for Performance Estimation of FSO Communication Links over Weak to Strong Atmospheric Turbulent Channels

    Theodore D. Katsilieris

    2017-03-01

    Full Text Available Terrestrial optical wireless communication links have attracted significant research and commercial worldwide interest over the last few years, due to the fact that they offer very high and secure data rate transmission with relatively low installation and operational costs, and without need of licensing. However, since the propagation path of the information signal, i.e., the laser beam, is the atmosphere, their effectiveness is strongly affected by the atmospheric conditions in the specific area. Thus, system performance depends significantly on rain, fog, hail, atmospheric turbulence, etc. Due to the influence of these effects, it is necessary to study such systems very carefully, theoretically and numerically, before installation. In this work, we present exact and accurate approximate mathematical expressions for the estimation of the average capacity and the outage probability performance metrics, as functions of the link's parameters, the transmitted power, the attenuation due to fog, the ambient noise and the atmospheric turbulence phenomenon. The latter causes the scintillation effect, which results in random and fast fluctuations of the irradiance at the receiver's end. These fluctuations can be studied accurately with statistical methods. Thus, in this work, we use either the lognormal or the gamma-gamma distribution for weak or moderate-to-strong turbulence conditions, respectively. Moreover, using the derived mathematical expressions, we design, implement and present a computational tool for the estimation of these systems' performance, taking into account the parameters of the link and the atmospheric conditions. Furthermore, in order to increase the accuracy of the presented tool, for the cases where the obtained analytical mathematical expressions are complex, the performance results are verified with numerical estimation of the appropriate integrals. Finally, using…
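
    For the weak-turbulence (lognormal) case, the outage probability reduces to a Gaussian tail that is straightforward to evaluate numerically; the sketch below assumes irradiance normalized to unit mean and an illustrative normalized threshold:

        import math

        def outage_probability_lognormal(sigma2_l, i_th):
            """P(I < i_th) for lognormal irradiance with log-variance sigma2_l,
            normalized so that E[I] = 1."""
            sigma = math.sqrt(sigma2_l)
            z = (math.log(i_th) + sigma2_l / 2.0) / sigma
            return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

        for sigma2_l in (0.1, 0.3, 0.5):  # weak-turbulence log-irradiance variances
            print(sigma2_l, outage_probability_lognormal(sigma2_l, i_th=0.3))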

  17. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend on building their software takes a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  18. Artificial intelligence-based computer modeling tools for controlling slag foaming in electric arc furnaces

    Wilson, Eric Lee

    Due to increased competition in a world economy, steel companies are currently interested in developing techniques that will allow for the improvement of the steelmaking process, either by increasing output efficiency or by improving the quality of their product, or both. Slag foaming is one practice that has been shown to contribute to both these goals. However, slag foaming is highly dynamic and difficult to model or control. This dissertation describes an effort to use artificial intelligence-based tools (genetic algorithms, fuzzy logic, and neural networks) to both model and control the slag foaming process. Specifically, a neural network is trained and tested on slag foaming data provided by a steel plant. This neural network model is then controlled by a fuzzy logic controller, which in turn is optimized by a genetic algorithm. This tuned controller was then installed at a steel plant and shown to be a more efficient slag foaming controller than the one previously used by the plant.
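
    The pipeline described above can be caricatured in a few lines. The sketch below is a stand-in with a toy first-order plant rather than the plant-trained neural network; it shows a fuzzy controller whose width and gain parameters are tuned by a simple genetic algorithm, and all names and numbers are illustrative assumptions.

        import random

        def plant_step(height, u):
            """Toy first-order stand-in for the plant-trained neural network model."""
            return height + 0.1 * (u - 0.5 * height)

        def fuzzy_controller(error, params):
            """Three triangular fuzzy sets on the error axis feeding one output gain.
            params = (half-width of the 'near zero' set, output gain)."""
            width, gain = params
            pos = max(0.0, min(1.0, error / width))      # foam too low -> push up
            neg = max(0.0, min(1.0, -error / width))     # foam too high -> push down
            return gain * (pos - neg)                    # 'near zero' rule holds steady

        def fitness(params, target=2.0, steps=100):
            """Negative squared tracking error of a closed-loop run (higher is better)."""
            h, cost = 0.0, 0.0
            for _ in range(steps):
                h = plant_step(h, fuzzy_controller(target - h, params))
                cost += (target - h) ** 2
            return -cost

        def genetic_algorithm(pop_size=30, generations=40):
            pop = [(random.uniform(0.1, 5.0), random.uniform(0.1, 5.0))
                   for _ in range(pop_size)]
            for _ in range(generations):
                pop.sort(key=fitness, reverse=True)      # selection
                survivors = pop[: pop_size // 2]
                pop = survivors + [
                    tuple(max(0.05, g + random.gauss(0.0, 0.2)) for g in p)  # mutation
                    for p in random.choices(survivors, k=pop_size - len(survivors))]
            return max(pop, key=fitness)

        print("tuned (width, gain):", genetic_algorithm())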

  19. CasimirSim - A Tool to Compute Casimir Polder Forces for Nontrivial 3D Geometries

    Sedmik, Rene; Tajmar, Martin

    2007-01-01

    The so-called Casimir effect is one of the most interesting macro-quantum effects. Negligible on the macro-scale, it becomes a governing factor below structure sizes of 1 μm, where it typically accounts for pressures of about 100 kN m-2. The force does not depend on gravity or electric charge, but solely on the material properties and geometrical shape. This makes the effect a strong candidate for micro- and nano-mechanical devices, M(N)EMS. Despite a long history of research, the theory lacks a uniform description valid for arbitrary geometries, which hinders technical application. We present an advanced state-of-the-art numerical tool that overcomes the usual geometrical restrictions and is capable of calculating arbitrary 3D geometries by utilizing the Casimir-Polder approximation for the Casimir force.
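
    For orientation, the textbook result for ideal parallel plates, P(d) = pi^2 * hbar * c / (240 * d^4), fixes the orders of magnitude quoted above; the tool's contribution is precisely that it is not limited to this geometry. A quick evaluation:

        import math

        hbar = 1.054571817e-34   # J s
        c = 2.99792458e8         # m / s

        def casimir_pressure(d):
            """Attractive pressure (N/m^2) between ideal parallel plates at gap d (m)."""
            return math.pi ** 2 * hbar * c / (240.0 * d ** 4)

        for d in (1e-6, 1e-7, 1e-8):    # 1 um, 100 nm, 10 nm
            print(f"gap {d:.0e} m -> {casimir_pressure(d):.3g} N/m^2")
        # The ~100 kN/m^2 regime is reached at gaps of order 10 nm.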

  20. Computer Tool for Automatically Generated 3D Illustration in Real Time from Archaeological Scanned Pieces

    Luis López

    2012-11-01

    Full Text Available The graphical documentation process of archaeological pieces requires the active involvement of a professional artist to recreate beautiful illustrations using a wide variety of expressive techniques. Frequently, the artist's work is limited by the inconvenience of working only with photographs of the pieces to be illustrated. This paper presents a software tool that allows the easy generation of illustrations in real time from 3D scanned models. The developed interface allows the user to simulate very elaborate artistic styles through the creation of diagrams using the available virtual lights. The software processes the diagrams to render an illustration from any given angle or position. Among the available virtual lights there are well-known techniques such as silhouette enhancement, hatching and toon shading.
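
    Two of the styles mentioned, toon shading and silhouette enhancement, reduce to a few lines of array arithmetic. The sketch below uses an assumed textbook shading model, not the paper's renderer: it quantizes the Lambertian term n.l into discrete bands and flags silhouette points where the normal is nearly perpendicular to the view direction.

        import numpy as np

        def toon_shade(normals, light_dir, bands=4):
            """Quantize the Lambertian diffuse term into a few discrete bands."""
            l = np.asarray(light_dir) / np.linalg.norm(light_dir)
            diffuse = np.clip(normals @ l, 0.0, 1.0)
            q = np.minimum(np.floor(diffuse * bands), bands - 1)
            return q / (bands - 1)

        def silhouette_mask(normals, view_dir, threshold=0.2):
            """Mark points whose normals are nearly perpendicular to the view ray."""
            v = np.asarray(view_dir) / np.linalg.norm(view_dir)
            return np.abs(normals @ v) < threshold

        # Toy normals for three surface points
        normals = np.array([[0.0, 0.0, 1.0],
                            [0.707, 0.0, 0.707],
                            [1.0, 0.0, 0.0]])
        print(toon_shade(normals, light_dir=[0.0, 0.0, 1.0]))       # [1. 0.667 0.]
        print(silhouette_mask(normals, view_dir=[0.0, 0.0, 1.0]))   # [False False True]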

  1. Vortex filament method as a tool for computational visualization of quantum turbulence

    Hänninen, Risto; Baggaley, Andrew W.

    2014-01-01

    The vortex filament model has become a standard and powerful tool to visualize the motion of quantized vortices in helium superfluids. In this article, we present an overview of the method and highlight its impact in aiding our understanding of quantum turbulence, particularly superfluid helium. We present an analysis of the structure and arrangement of quantized vortices. Our results are in agreement with previous studies showing that under certain conditions, vortices form coherent bundles, which allows for classical vortex stretching, giving quantum turbulence a classical nature. We also offer an explanation for the differences between the observed properties of counterflow and pure superflow turbulence in a pipe. Finally, we suggest a mechanism for the generation of coherent structures in the presence of normal fluid shear. PMID:24704873
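
    At the core of the vortex filament model is the Biot-Savart law, u(r) = Gamma/(4 pi) times the line integral of ds' x (r - s)/|r - s|^3 along the filament, evaluated over a discretized filament. The sketch below (an illustrative discretization, not the authors' code) recovers the known axial velocity Gamma/(2R) at the centre of a vortex ring, using the quantum of circulation in superfluid helium-4.

        import numpy as np

        def biot_savart_velocity(r, points, gamma=9.97e-8, eps=1e-10):
            """Velocity at r induced by a closed filament through `points` (m).
            gamma ~ 9.97e-8 m^2/s is the quantum of circulation in 4He."""
            u = np.zeros(3)
            for i in range(len(points)):
                a, b = points[i], points[(i + 1) % len(points)]   # closed loop
                mid, ds = 0.5 * (a + b), b - a                    # segment midpoint, element
                rel = r - mid
                u += np.cross(ds, rel) / (np.linalg.norm(rel) ** 3 + eps)
            return gamma / (4.0 * np.pi) * u

        # Vortex ring of radius 1 mm; velocity sampled at its centre
        theta = np.linspace(0.0, 2.0 * np.pi, 200, endpoint=False)
        ring = np.stack([1e-3 * np.cos(theta), 1e-3 * np.sin(theta),
                         np.zeros_like(theta)], axis=1)
        print(biot_savart_velocity(np.zeros(3), ring))   # ~[0, 0, gamma/(2R)]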

  2. A GIS-based Computational Tool for Multidimensional Flow Velocity by Acoustic Doppler Current Profilers

    Kim, D; Winkler, M; Muste, M

    2015-01-01

    Acoustic Doppler Current Profilers (ADCPs) provide efficient and reliable flow measurements compared to other tools for characterizing riverine environments. In addition to the originally targeted discharge measurements, ADCPs are increasingly utilized to assess river flow characteristics. The newly developed VMS (Velocity Mapping Software) aims at providing an efficient process for quality assurance, mapping velocity vectors for visualization, and facilitating comparison with physical and numerical model results. VMS was designed to provide efficient and smooth workflows for processing groups of transects. The software allows the user to select a group of files and subsequently to conduct statistical and graphical quality assurance on the files as a group or individually, as appropriate. VMS also enables spatial averaging in the horizontal and vertical planes for ADCP data in single or multiple transects over the same or consecutive cross sections. The analysis results are displayed in numerical and graphical formats. (paper)
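
    The spatial-averaging step can be sketched as a simple binning operation. The data layout below is a hypothetical stand-in for VMS's internal format: velocity samples located across the section and over depth are averaged per grid cell.

        import numpy as np

        def grid_average(x, z, v, x_edges, z_edges):
            """Average velocity samples v located at (x, z) over a 2-D grid.
            x: distance across the section (m); z: depth (m)."""
            sums, _, _ = np.histogram2d(x, z, bins=[x_edges, z_edges], weights=v)
            counts, _, _ = np.histogram2d(x, z, bins=[x_edges, z_edges])
            with np.errstate(invalid="ignore"):
                return sums / counts          # NaN where a cell holds no samples

        rng = np.random.default_rng(0)
        x = rng.uniform(0.0, 50.0, 1000)      # 50 m wide cross section (assumed)
        z = rng.uniform(0.0, 5.0, 1000)       # 5 m deep (assumed)
        v = 1.0 + 0.1 * rng.standard_normal(1000)   # streamwise velocity (m/s)
        mean_v = grid_average(x, z, v, np.linspace(0, 50, 11), np.linspace(0, 5, 6))
        print(mean_v.shape)                   # (10, 5) grid cells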

  3. The Use Of Computational Human Performance Modeling As Task Analysis Tool

    Jacques Hugo; David Gertman

    2012-07-01

    During a review of the Advanced Test Reactor safety basis at the Idaho National Laboratory, human factors engineers identified ergonomic and human reliability risks involving the inadvertent exposure of a fuel element to the air during manual fuel movement and inspection in the canal. There were clear indications that these risks increased the probability of human error and possible severe physical outcomes to the operator. In response to this concern, a detailed study was conducted to determine the probability of the inadvertent exposure of a fuel element. Due to practical and safety constraints, the task network analysis technique was employed to study the work procedures at the canal. Discrete-event simulation software was used to model the entire procedure as well as the salient physical attributes of the task environment, such as distances walked, the effect of dropped tools, the effect of hazardous body postures, and physical exertion due to strenuous tool handling. The model also allowed analysis of the effect of cognitive processes such as visual perception demands, auditory information and verbal communication. The model made it possible to obtain reliable predictions of operator performance and workload estimates. It was also found that operator workload as well as the probability of human error in the fuel inspection and transfer task were influenced by the concurrent nature of certain phases of the task and the associated demand on cognitive and physical resources. More importantly, it was possible to determine with reasonable accuracy the stages as well as physical locations in the fuel handling task where operators would be most at risk of losing their balance and falling into the canal. The model also provided sufficient information for a human reliability analysis that indicated that the postulated fuel exposure accident was less than credible.
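
    In miniature, a task network analysis of this kind reduces to sampling task durations and error probabilities through the network. The toy model below is purely illustrative (the tasks, distributions and probabilities are invented, not the ATR study's data) but shows the shape of the computation.

        import random

        TASKS = [  # (name, mean duration s, sd, human error probability): all assumed
            ("position_tool", 30.0, 5.0, 0.001),
            ("lift_element",  60.0, 10.0, 0.005),
            ("walk_transfer", 45.0, 8.0, 0.002),
            ("inspect",       90.0, 15.0, 0.003),
        ]

        def simulate_once():
            """One pass through the task network: total time and whether any error occurred."""
            total, error = 0.0, False
            for _, mean, sd, p_err in TASKS:
                total += max(0.0, random.gauss(mean, sd))
                error = error or (random.random() < p_err)
            return total, error

        def run(n=100_000):
            times, errors = [], 0
            for _ in range(n):
                t, e = simulate_once()
                times.append(t)
                errors += e
            print(f"mean completion: {sum(times)/n:.1f} s, "
                  f"P(at least one error): {errors/n:.4f}")

        run()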

  4. Computational tool for postoperative evaluation of cochlear implant patients; Ferramenta computacional para avaliacao pos-operatoria de pacientes com implante coclear

    Giacomini, Guilherme; Pavan, Ana Luiza M.; Pina, Diana R. de [Universidade Estadual Paulista Julio de Mesquita Filho (IBB/UNESP), Botucatu, SP (Brazil). Instituto de Biociencias; Altemani, Joao M.C.; Castilho, Arthur M. [Universidade Estadual de Campinas (HC/UNICAMP), Campinas, SP (Brazil). Hospital de Clinicas

    2016-07-01

    The aim of this study was to develop a tool to calculate the insertion depth angle of cochlear implants from computed tomography exams. The tool uses different image processing techniques, such as thresholding and active contours. We then compared the average insertion depth angle of implants from three different manufacturers. The developed tool can be used, in the future, to compare the insertion depth angle of the cochlear implant with the patient's postoperative hearing response. (author)
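
    Once the electrode contacts and the cochlear centre have been extracted from the CT volume, the insertion depth angle is the cumulative polar angle swept by the contacts about that centre. A sketch with hypothetical coordinates:

        import numpy as np

        def insertion_depth_angle(contacts, center):
            """Cumulative polar angle (degrees) swept by electrode contacts,
            measured about the cochlear centre in the cochlear view plane."""
            rel = np.asarray(contacts, float) - np.asarray(center, float)
            angles = np.unwrap(np.arctan2(rel[:, 1], rel[:, 0]))
            return np.degrees(abs(angles[-1] - angles[0]))

        # Toy inward spiral standing in for segmented electrode contacts
        t = np.linspace(0.0, 1.5 * 2.0 * np.pi, 12)        # 1.5 turns = 540 degrees
        contacts = np.stack([(5 - 0.4 * t) * np.cos(t),
                             (5 - 0.4 * t) * np.sin(t)], axis=1)
        print(insertion_depth_angle(contacts, center=(0.0, 0.0)))  # ~540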

  5. Development of a tool to aid the radiologic technologist using augmented reality and computer vision

    MacDougall, Robert D.; Scherrer, Benoit; Don, Steven

    2018-01-01

    This technical innovation describes the development of a novel device to aid technologists in reducing exposure variation and repeat imaging in computed and digital radiography. The device consists of a color video and depth camera in combination with proprietary software and user interface. A monitor in the x-ray control room displays the position of the patient in real time with respect to automatic exposure control chambers and image receptor area. The thickness of the body part of interest is automatically displayed along with a motion indicator for the examined body part. The aim is to provide an automatic measurement of patient thickness to set the x-ray technique and to assist the technologist in detecting errors in positioning and motion before the patient is exposed. The device has the potential to reduce the incidence of repeat imaging by addressing problems technologists encounter daily during the acquisition of radiographs. (orig.)

  6. Computer Animations as Astronomy Educational Tool: Immanuel Kant and the Island Universes Hypothesis

    Mijic, M.; Park, D.; Zumaeta, J.; Simonian, V.; Levitin, S.; Sullivan, A.; Kang, E. Y. E.; Longson, T.

    2008-11-01

    Development of astronomy is based on well-defined watershed moments, when an individual or a group of individuals makes a discovery or a measurement that expands, and sometimes dramatically improves, our knowledge of the Universe. The purpose of the Scientific Visualization project at Cal State Los Angeles is to bring these moments to life with the use of computer animations, the medium of the 21st century that appeals to the generations which grew up in the Internet age. Our first story describes Immanuel Kant's remarkable Island Universes hypothesis. Using elementary principles of the then-new Newtonian mechanics, Kant made a bold and ultimately correct interpretation of the Milky Way and the objects that we now call galaxies.

  7. [Elderlies in street situation or social vulnerability: facilities and difficulties in the use of computational tools].

    Frias, Marcos Antonio da Eira; Peres, Heloisa Helena Ciqueto; Pereira, Valclei Aparecida Gandolpho; Negreiros, Maria Célia de; Paranhos, Wana Yeda; Leite, Maria Madalena Januário

    2014-01-01

    This study aimed to identify the advantages and difficulties encountered by older people living on the streets or in social vulnerability when using computers or the internet. It is an exploratory qualitative study in which five elderly people attended at a non-governmental organization located in the city of São Paulo participated. The discourses were analyzed by the content analysis technique and revealed, among the facilities, clarifying doubts with the monitors, the stimulus for new discoveries coupled with proactivity and curiosity, and developing new skills. The difficulties mentioned were related to physical or cognitive issues, lack of an instructor, and lack of knowledge of how to interact with the machine. Studies focusing on the elderly population living on the streets or in social vulnerability may contribute evidence to guide the formulation of public policies for this population.

  8. A content validity approach to creating an end-user computer skill assessment tool

    Shirley Gibbs

    Full Text Available Practical assessment instruments are commonly used in the workplace and educational environments to assess a person's level of digital literacy and end-user computer skill. However, it is often difficult to find statistical evidence of the actual validity of instruments being used. To ensure that the correct factors are being assessed for a particular purpose it is necessary to undertake some type of psychometric testing, and the first step is to study the content relevance of the measure. The purpose of this paper is to report on the rigorous judgment-quantification process using panels of experts in order to establish inter-rater reliability and agreement in the development of end-user instruments developed to measure workplace skills using spreadsheet and word-processing applications.
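
    Two standard quantities behind such expert-panel judgment quantification are Lawshe's content validity ratio and an inter-rater agreement statistic such as Cohen's kappa. The sketch below implements both on made-up panel data; the paper's actual instrument data are not reproduced here.

        from collections import Counter

        def content_validity_ratio(n_essential, n_panelists):
            """Lawshe's CVR = (n_e - N/2) / (N/2); +1 means all experts rate the
            item essential, 0 means exactly half do."""
            half = n_panelists / 2.0
            return (n_essential - half) / half

        def cohens_kappa(rater_a, rater_b):
            """kappa = (p_o - p_e) / (1 - p_e) for two nominal ratings."""
            n = len(rater_a)
            p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
            ca, cb = Counter(rater_a), Counter(rater_b)
            p_e = sum(ca[k] * cb[k] for k in ca) / (n * n)
            return (p_o - p_e) / (1.0 - p_e)

        print(content_validity_ratio(9, 10))        # 0.8: strong essentiality
        print(cohens_kappa("yynyyny", "yynynny"))   # agreement beyond chance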

  9. Development of a tool to aid the radiologic technologist using augmented reality and computer vision

    MacDougall, Robert D.; Scherrer, Benoit [Boston Children' s Hospital, Department of Radiology, Boston, MA (United States); Don, Steven [Washington University School of Medicine, Mallinckrodt Institute of Radiology, St. Louis, MO (United States)

    2018-01-15

    This technical innovation describes the development of a novel device to aid technologists in reducing exposure variation and repeat imaging in computed and digital radiography. The device consists of a color video and depth camera in combination with proprietary software and user interface. A monitor in the x-ray control room displays the position of the patient in real time with respect to automatic exposure control chambers and image receptor area. The thickness of the body part of interest is automatically displayed along with a motion indicator for the examined body part. The aim is to provide an automatic measurement of patient thickness to set the x-ray technique and to assist the technologist in detecting errors in positioning and motion before the patient is exposed. The device has the potential to reduce the incidence of repeat imaging by addressing problems technologists encounter daily during the acquisition of radiographs. (orig.)

  10. Computer game as a tool for training the identification of phonemic length.

    Pennala, Riitta; Richardson, Ulla; Ylinen, Sari; Lyytinen, Heikki; Martin, Maisa

    2014-12-01

    Computer-assisted training of Finnish phonemic length was conducted with 7-year-old Russian-speaking second-language learners of Finnish. Phonemic length plays a different role in these two languages. The training included game activities with two- and three-syllable word and pseudo-word minimal pairs with prototypical vowel durations. The lowest accuracy scores were recorded for two-syllable words. Accuracy scores were higher for the minimal pairs with larger rather than smaller differences in duration. Accuracy scores were lower for long duration than for short duration. The ability to identify quantity degree was generalized to stimuli used in the identification test in two of the children. Ideas for improving the game are introduced.

  11. Integrating GRID tools to build a computing resource broker: activities of DataGrid WP1

    Anglano, C.; Barale, S.; Gaido, L.; Guarise, A.; Lusso, S.; Werbrouck, A.

    2001-01-01

    Resources on a computational Grid are geographically distributed, heterogeneous in nature, owned by different individuals or organizations with their own scheduling policies, and have different access-cost models with dynamically varying loads and availability conditions. This makes traditional approaches to workload management, load balancing and scheduling inappropriate. The first work package (WP1) of the EU-funded DataGrid project is addressing the issue of optimizing the distribution of jobs onto Grid resources, based on knowledge of the status and characteristics of these resources that is necessarily out-of-date (collected in a finite amount of time at a very loosely coupled site). The authors describe the DataGrid approach to integrating existing software components (from Condor, Globus, etc.) to build a Grid Resource Broker, and the early efforts to define a workable scheduling strategy.

  12. Computer Vision Tool and Technician as First Reader of Lung Cancer Screening CT Scans.

    Ritchie, Alexander J; Sanghera, Calvin; Jacobs, Colin; Zhang, Wei; Mayo, John; Schmidt, Heidi; Gingras, Michel; Pasian, Sergio; Stewart, Lori; Tsai, Scott; Manos, Daria; Seely, Jean M; Burrowes, Paul; Bhatia, Rick; Atkar-Khattra, Sukhinder; van Ginneken, Bram; Tammemagi, Martin; Tsao, Ming Sound; Lam, Stephen

    2016-05-01

    To implement a cost-effective low-dose computed tomography (LDCT) lung cancer screening program at the population level, accurate and efficient interpretation of a large volume of LDCT scans is needed. The objective of this study was to evaluate a workflow strategy to identify abnormal LDCT scans in which a technician assisted by computer vision (CV) software acts as a first reader, with the aim of improving the speed, consistency, and quality of scan interpretation. Without knowledge of the diagnosis, a technician reviewed 828 randomly batched scans (136 with lung cancers, 556 with benign nodules, and 136 without nodules) from the baseline Pan-Canadian Early Detection of Lung Cancer Study that had been annotated by the CV software CIRRUS Lung Screening (Diagnostic Image Analysis Group, Nijmegen, The Netherlands). The scans were classified as either normal (no nodules ≥1 mm or benign nodules) or abnormal (nodules or other abnormality). The results were compared with the diagnostic interpretation by Pan-Canadian Early Detection of Lung Cancer Study radiologists. The overall sensitivity and specificity of the technician in identifying an abnormal scan were 97.8% (95% confidence interval: 96.4-98.8) and 98.0% (95% confidence interval: 89.5-99.7), respectively. Of the 112 prevalent nodules that were found to be malignant in follow-up, 92.9% were correctly identified by the technician plus CV compared with 84.8% by the study radiologists. The average time taken by the technician to review a scan after CV processing was 208 ± 120 seconds. Prescreening CV software and a technician as first reader is a promising strategy for improving the consistency and quality of screening interpretation of LDCT scans. Copyright © 2016 International Association for the Study of Lung Cancer. Published by Elsevier Inc. All rights reserved.

  13. Technologies and tools for high-performance distributed computing. Final report

    Karonis, Nicholas T.

    2000-05-01

    In this project we studied the practical use of the MPI message-passing interface in advanced distributed computing environments. We built on the existing software infrastructure provided by the Globus Toolkit™, the MPICH portable implementation of MPI, and the MPICH-G integration of MPICH with Globus. As a result of this project we have replaced MPICH-G with its successor MPICH-G2, which is also an integration of MPICH with Globus. MPICH-G2 delivers significant improvements in message-passing performance when compared to its predecessor MPICH-G, and was based on superior software design principles, resulting in a software base in which it was much easier to make the functional extensions and improvements we did. Using Globus services we replaced the default implementation of MPI's collective operations in MPICH-G2 with more efficient multilevel topology-aware collective operations which, in turn, led to the development of a new timing methodology for broadcasts [8]. MPICH-G2 was extended to include client/server functionality from the MPI-2 standard [23] to facilitate remote visualization applications and, through the use of MPI idioms, MPICH-G2 provided application-level control of quality-of-service parameters as well as application-level discovery of underlying Grid-topology information. Finally, MPICH-G2 was successfully used in a number of applications, including an award-winning record-setting computation in numerical relativity. In the sections that follow we describe in detail the accomplishments of this project, present experimental results quantifying the performance improvements, and conclude with a discussion of our application experiences. This project resulted in a significant increase in the utility of MPICH-G2.

  14. Predicting Structures of Ru-Centered Dyes: A Computational Screening Tool.

    Fredin, Lisa A; Allison, Thomas C

    2016-04-07

    Dye-sensitized solar cells (DSCs) represent a means for harvesting solar energy to produce electrical power. Though a number of light harvesting dyes are in use, the search continues for more efficient and effective compounds to make commercially viable DSCs a reality. Computational methods have been increasingly applied to understand the dyes currently in use and to aid in the search for improved light harvesting compounds. Semiempirical quantum chemistry methods have a well-deserved reputation for giving good quality results in a very short amount of computer time. The most recent semiempirical models such as PM6 and PM7 are parametrized for a wide variety of molecule types, including organometallic complexes similar to DSC chromophores. In this article, the performance of PM6 is tested against a set of 20 molecules whose geometries were optimized using a density functional theory (DFT) method. It is found that PM6 gives geometries that are in good agreement with the optimized DFT structures. In order to reduce the differences between geometries optimized using PM6 and geometries optimized using DFT, the PM6 basis set parameters have been optimized for a subset of the molecules. It is found that it is sufficient to optimize the basis set for Ru alone to improve the agreement between the PM6 results and the DFT results. When this optimized Ru basis set is used, the mean unsigned error in Ru-ligand bond lengths is reduced from 0.043 to 0.017 Å in the set of 20 test molecules. Though the magnitude of these differences is small, the effect on the calculated UV/vis spectra is significant. These results clearly demonstrate the value of using PM6 to screen DSC chromophores as well as the value of optimizing PM6 basis set parameters for a specific set of molecules.

  15. Computer Simulation as a Tool for Assessing Decision-Making in Pandemic Influenza Response Training

    James M Leaming

    2013-05-01

    Full Text Available Introduction: We sought to develop and test a computer-based, interactive simulation of a hypothetical pandemic influenza outbreak. Fidelity was enhanced with integrated video and branching decision trees, built upon the 2007 federal planning assumptions. We conducted a before-and-after study of the simulation's effectiveness in assessing participants' beliefs regarding their own hospitals' mass casualty incident preparedness. Methods: Development: Using a Delphi process, we finalized a simulation that presents more than 50 key decisions to 6 role-players on networked laptops in a conference area. The simulation played out an 8-week scenario, beginning with pre-incident decisions. Testing: Role-players and trainees (N=155) were facilitated to make decisions during the pandemic. Because decision responses vary, the simulation plays out differently, and a casualty counter quantifies hypothetical losses. The facilitator reviews and critiques key factors for casualty control, including effective communications, working with external organizations, development of internal policies and procedures, maintaining supplies and services, technical infrastructure support, public relations and training. Pre- and post-survey data were compared on trainees. Results: Post-simulation, trainees indicated a greater likelihood of needing to improve their organization in terms of communications, mass casualty incident planning, public information and training. Participants also recognized which key factors required immediate attention at their own home facilities. Conclusion: The use of a computer simulation was effective in providing a facilitated environment for determining the perception of preparedness, evaluating general preparedness concepts and introducing participants to critical decisions involved in handling a regional pandemic influenza surge. [West J Emerg Med. 2013;14(3):236-242.]

  16. FY05 LDRD Final Report A Computational Design Tool for Microdevices and Components in Pathogen Detection Systems

    Trebotich, D

    2006-02-07

    We have developed new algorithms to model complex biological flows in integrated biodetection microdevice components. The proposed work is important because the design strategy for the next-generation Autonomous Pathogen Detection System at LLNL is the microfluidic-based Biobriefcase, being developed under the Chemical and Biological Countermeasures Program in the Homeland Security Organization. This miniaturization strategy introduces a new flow regime to systems where biological flow is already complex and not well understood. Also, design and fabrication of MEMS devices is time-consuming and costly due to the current trial-and-error approach. Furthermore, existing devices, in general, are not optimized. There are several MEMS CAD capabilities currently available, but their computational fluid dynamics modeling capabilities are rudimentary at best. Therefore, we proposed a collaboration to develop computational tools at LLNL which will (1) provide critical understanding of the fundamental flow physics involved in bioMEMS devices, (2) shorten the design and fabrication process, and thus reduce costs, (3) optimize current prototypes and (4) provide a prediction capability for the design of new, more advanced microfluidic systems. Computational expertise was provided by Comp-CASC and UC Davis-DAS. The simulation work was supported by key experiments for guidance and validation at UC Berkeley-BioE.

  17. G-LoSA: An efficient computational tool for local structure-centric biological studies and drug design.

    Lee, Hui Sun; Im, Wonpil

    2016-04-01

    Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and aid drug design. We present a novel local structure alignment tool, G-LoSA. G-LoSA aligns protein local structures in a sequence-order-independent way and provides a GA-score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G-LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure-centric comparative biology studies. In particular, G-LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G-LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer-aided drug design. We hope that G-LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and facilitating drug discovery research and development. G-LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. © 2016 The Protein Society.

  18. Computational fluid dynamics and particle image velocimetry assisted design tools for a new generation of trochoidal gear pumps

    M Garcia-Vilchez

    2015-06-01

    Full Text Available Trochoidal gear pumps produce significant flow pulsations that result in pressure pulsations, which interact with the system to which they are connected, shortening the life of both the pump and the circuit components. The complicated aspects of the operation of a gerotor pump make computational fluid dynamics the proper tool for modelling and simulating its flow characteristics. A three-dimensional deforming-mesh computational fluid dynamics model is presented, including the effects of the manufacturing tolerance and the leakage inside the pump. A new boundary condition is created for the simulation of the solid contact in the inter-teeth radial clearance. The experimental study of the pump is carried out by means of time-resolved particle image velocimetry, and the results are evaluated qualitatively against the numerical simulation results. Time-resolved particle image velocimetry is adapted to the gerotor pump and proves to be a feasible alternative for obtaining the instantaneous flow of the pump directly, which would allow the determination of geometries that minimize the undesired flow pulsations. Thus, a new methodology involving computational fluid dynamics and time-resolved particle image velocimetry is presented, which allows the instantaneous flow of the pump to be obtained directly without significantly altering its behaviour.

  19. PeTTSy: a computational tool for perturbation analysis of complex systems biology models.

    Domijan, Mirela; Brown, Paul E; Shulgin, Boris V; Rand, David A

    2016-03-10

    Over the last decade sensitivity analysis techniques have been shown to be very useful for analysing complex and high-dimensional Systems Biology models. However, many of the currently available toolboxes have either used parameter sampling, been focused on a restricted set of model observables of interest, studied optimisation of an objective function, or have not dealt with multiple simultaneous model parameter changes where the changes can be permanent or temporary. Here we introduce our new, freely downloadable toolbox, PeTTSy (Perturbation Theory Toolbox for Systems). PeTTSy is a package for MATLAB which implements a wide array of techniques for the perturbation theory and sensitivity analysis of large and complex ordinary differential equation (ODE) based models. PeTTSy is a comprehensive modelling framework that introduces a number of new approaches and that fully addresses analysis of oscillatory systems. It examines sensitivity analysis of the models to perturbations of parameters, where the perturbation timing, strength, length and overall shape can be controlled by the user. This can be done in a system-global setting, namely, the user can determine how many parameters to perturb, by how much and for how long. PeTTSy also offers the user the ability to explore the effect of the parameter perturbations on many different types of outputs: period, phase (timing of peak) and model solutions. PeTTSy can be employed on a wide range of mathematical models including free-running and forced oscillators and signalling systems. To enable experimental optimisation using the Fisher Information Matrix it efficiently allows one to combine multiple variants of a model (i.e. a model with multiple experimental conditions) in order to determine the value of new experiments. It is especially useful in the analysis of large and complex models involving many variables and parameters. PeTTSy is a comprehensive tool for analysing large and complex models of regulatory and signalling systems.
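
    The kind of experiment PeTTSy automates can be illustrated on a small scale: integrate a model, apply a parameter perturbation with a chosen timing, strength and length, and compare the outputs. The sketch below uses a van der Pol oscillator and scipy, not the PeTTSy API, and all numbers are assumptions.

        import numpy as np
        from scipy.integrate import solve_ivp

        def vdp(t, y, mu, bump, t_on, t_off):
            """van der Pol oscillator; mu receives an additive bump on [t_on, t_off],
            i.e. a temporary perturbation with chosen timing, strength and length."""
            m = mu + (bump if t_on <= t <= t_off else 0.0)
            return [y[1], m * (1.0 - y[0] ** 2) * y[1] - y[0]]

        t_eval = np.linspace(0.0, 40.0, 2000)
        base = solve_ivp(vdp, (0.0, 40.0), [2.0, 0.0], t_eval=t_eval,
                         args=(1.0, 0.0, 0.0, 0.0), max_step=0.05)
        pert = solve_ivp(vdp, (0.0, 40.0), [2.0, 0.0], t_eval=t_eval,
                         args=(1.0, 0.5, 10.0, 15.0), max_step=0.05)

        # Simple sensitivity readout: timing shift of the largest late-time peak
        i_base = 1500 + int(np.argmax(base.y[0][1500:]))
        i_pert = 1500 + int(np.argmax(pert.y[0][1500:]))
        print(f"phase shift: {t_eval[i_pert] - t_eval[i_base]:.3f} time units")

    The explicit max_step keeps the integrator from stepping over the short perturbation window, a detail any such perturbation driver has to handle.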

  20. How accurate are adolescents in portion-size estimation using the computer tool Young Adolescents' Nutrition Assessment on Computer (YANA-C)?

    Vereecken, Carine; Dohogne, Sophie; Covents, Marc; Maes, Lea

    2010-06-01

    Computer-administered questionnaires have received increased attention for large-scale population research on nutrition. In Belgium-Flanders, Young Adolescents' Nutrition Assessment on Computer (YANA-C) has been developed. In this tool, standardised photographs are available to assist in portion-size estimation. The purpose of the present study is to assess how accurate adolescents are in estimating portion sizes of food using YANA-C. A convenience sample, aged 11-17 years, estimated the amounts of ten commonly consumed foods (breakfast cereals, French fries, pasta, rice, apple sauce, carrots and peas, crisps, creamy velouté, red cabbage, and peas). Two procedures were followed: (1) short-term recall: adolescents (n 73) self-served their usual portions of the ten foods and estimated the amounts later the same day; (2) real-time perception: adolescents (n 128) estimated two sets (different portions) of pre-weighed portions displayed near the computer. Self-served portions were, on average, 8 % underestimated; significant underestimates were found for breakfast cereals, French fries, peas, and carrots and peas. Spearman's correlations between the self-served and estimated weights varied between 0.51 and 0.84, with an average of 0.72. The kappa statistics were moderate (>0.4) for all but one item. Pre-weighed portions were, on average, 15 % underestimated, with significant underestimates for fourteen of the twenty portions. Photographs of food items can serve as a good aid in ranking subjects; however, to assess the actual intake at a group level, underestimation must be considered.

  1. An Interactive Tool for Outdoor Computer Controlled Cultivation of Microalgae in a Tubular Photobioreactor System

    Raquel Dormido

    2014-03-01

    Full Text Available This paper describes an interactive virtual laboratory for experimenting with an outdoor tubular photobioreactor (henceforth PBR for short). This virtual laboratory makes it possible to: (a) accurately reproduce the structure of a real plant (the PBR designed and built by the Department of Chemical Engineering of the University of Almería, Spain); (b) simulate a generic tubular PBR by changing the PBR geometry; (c) simulate the effects of changing different operating parameters such as the conditions of the culture (pH, biomass concentration, dissolved O2, injected CO2, etc.); (d) simulate the PBR in its environmental context; it is possible to change the geographic location of the system or the solar irradiation profile; (e) apply different control strategies to adjust different variables such as the CO2 injection, culture circulation rate or culture temperature in order to maximize the biomass production; (f) simulate the harvesting. In this way, users can learn in an intuitive way how productivity is affected by any change in the design. It facilitates the learning of how to manipulate the variables essential for microalgae growth in order to design an optimal PBR. The simulator has been developed with Easy Java Simulations, a freeware open-source tool developed in Java, specifically designed for the creation of interactive dynamic simulations.

  2. The mechanical design of a transfemoral prosthesis using computational tools and design methodology

    John Sánchez Otero

    2012-09-01

    Full Text Available Artificial limb replacement with lower limb prostheses has been widely reported in the current scientific literature. There are many lower limb prosthetic designs, ranging from a single-axis knee mechanism to complex mechanisms involving microcontrollers, made from many materials ranging from lightweight, high-specific-strength ones (e.g., carbon fibre) to traditional forms (e.g., stainless steel). However, the challenge is to design prostheses whose movement resembles the human body's natural movement as closely as possible. Advances in prosthetics have enabled many amputees to return to their everyday activities; however, such prostheses are expensive, some costing as much as $60,000. Many of the affected population in Colombia have scarce economic resources; there is therefore a need to develop affordable functional prostheses. The Universidad del Norte's Materials, Processes and Design Research Group and the Robotics and Intelligent Systems Group have been working on this line of research to develop modular prostheses which can be adjusted to each patient's requirements. This research represents an initial methodological approach to developing a prosthesis in which software tools have been used (the finite element method) with a criteria relationship matrix for selecting the best alternative while considering different aspects such as modularity, cost, stiffness and weight.

  3. Risk monitor-a tool for computer aided risk assessment for NPPs

    Vinod, Gopika; Saraf, R.K.; Babar, A.K.; Kushwaha, H.S.; Hadap, Nikhil

    2001-01-01

    Considerable changes occur in component status, system design and subsequent operation due to changes in plant configuration and operating procedures. These changes arise because some components go down randomly while others are taken down, as planned, for test, maintenance and repair. This results in a fluctuation of the risk level over operating time, which is termed the risk profile. Probabilistic Safety Assessment (PSA) is an analytical technique for assessing risk by integrating diverse aspects of the design and operation of a Nuclear Power Plant. Risk can be defined as the product of the probability of an accident and the consequences of that accident. The Reactor Safety Division of BARC has developed a PC-based tool that can assess this risk profile. This package, termed the Risk Monitor, can be used to optimise operation in Nuclear Power Plants with respect to a minimum risk level over the operating time. The Risk Monitor is user friendly and can re-evaluate the core damage frequency for changes in component status, test interval, initiating event frequency, etc. The package also provides plant restoration advice when the plant is in a high-risk configuration, the current status of all plant equipment, and equipment prioritization. (author)
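
    The core recomputation such a risk monitor performs can be sketched with a toy minimal-cut-set model (the components, cut sets and frequencies below are invented for illustration): setting a component's unavailability to 1 while it is down for maintenance immediately shows the risk spike.

        import math

        COMPONENTS = {"pump_A": 1e-3, "pump_B": 1e-3, "diesel": 5e-3, "valve": 2e-4}
        CUT_SETS = [("pump_A", "pump_B"), ("pump_A", "diesel"), ("valve",)]
        IE_FREQUENCY = 0.1   # initiating events per year (assumed)

        def core_damage_frequency(q):
            """Rare-event approximation: CDF ~ IE frequency * sum of cut-set products."""
            return IE_FREQUENCY * sum(math.prod(q[c] for c in cs) for cs in CUT_SETS)

        baseline = core_damage_frequency(COMPONENTS)
        # Configuration change: pump_B out for maintenance -> unavailability = 1
        maintenance = core_damage_frequency({**COMPONENTS, "pump_B": 1.0})
        print(f"baseline CDF: {baseline:.2e}/yr, pump_B down: {maintenance:.2e}/yr")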

  4. PRIDE and "Database on Demand" as valuable tools for computational proteomics.

    Vizcaíno, Juan Antonio; Reisinger, Florian; Côté, Richard; Martens, Lennart

    2011-01-01

    The Proteomics Identifications Database (PRIDE, http://www.ebi.ac.uk/pride ) provides users with the ability to explore and compare mass spectrometry-based proteomics experiments that reveal details of the protein expression found in a broad range of taxonomic groups, tissues, and disease states. A PRIDE experiment typically includes identifications of proteins, peptides, and protein modifications. Additionally, many of the submitted experiments also include the mass spectra that provide the evidence for these identifications. Finally, one of the strongest advantages of PRIDE in comparison with other proteomics repositories is the amount of metadata it contains, a key point for putting the above-mentioned data in biological and/or technical context. Several informatics tools have been developed in support of the PRIDE database. The most recent one is called "Database on Demand" (DoD), which allows custom sequence databases to be built in order to optimize the results from search engines. We describe the use of DoD in this chapter. Additionally, in order to show the potential of PRIDE as a source for data mining, we also explore complex federated BioMart queries to integrate PRIDE data with other resources, such as Ensembl, Reactome, or UniProt.

  5. A Novel Computational Tool for Mining Real-Life Data: Application in the Metastatic Colorectal Cancer Care Setting.

    Nava Siegelmann-Danieli

    Full Text Available Randomized clinical trials constitute the gold standard for evaluating new anti-cancer therapies; however, real-life data are key in complementing clinically useful information. We developed a computational tool for real-life data analysis and applied it to the metastatic colorectal cancer (mCRC) setting. This tool addressed the impact of oncology/non-oncology parameters on treatment patterns and clinical outcomes. The developed tool enables extraction of any computerized information, including comorbidities and use of drugs (oncological/non-oncological), per individual HMO member. The study in which we evaluated this tool was a retrospective cohort study that included Maccabi Healthcare Services members with mCRC receiving bevacizumab with fluoropyrimidines (FP), FP plus oxaliplatin (FP-O), or FP plus irinotecan (FP-I) in the first line between 9/2006 and 12/2013. The analysis included 753 patients, of whom 15.4% underwent subsequent metastasectomy (the Surgery group). For the entire cohort, median overall survival (OS) was 20.5 months; in the Surgery group, the median duration of bevacizumab-containing therapy (DOT) pre-surgery was 6.1 months and median OS was not reached. In the Non-surgery group, median OS and DOT were 18.7 and 11.4 months, respectively; no significant OS differences were noted between FP-O and FP-I, whereas FP use was associated with shorter OS (12.3 months; p < 0.002); notably, these patients were older. Patients who received both FP-O- and FP-I-based regimens achieved numerically longer OS vs. those who received only one of these regimens (22.1 [19.9-24.0] vs. 18.9 [15.5-21.9] months). Among patients assessed for wild-type KRAS and treated with a subsequent anti-EGFR agent, OS was 25.4 months and 18.7 months for the 124 treated vs. 37 non-treated patients (non-significant). Cox analysis (controlling for age and gender) identified several non-oncology parameters associated with poorer clinical outcomes, including concurrent use of diuretics and proton pump inhibitors.

  6. Computer simulation of the relationship between selected properties of laser remelted tool steel surface layer

    Bonek, Mirosław, E-mail: miroslaw.bonek@polsl.pl; Śliwa, Agata; Mikuła, Jarosław

    2016-12-01

    Highlights: • Prediction of the properties of a laser remelted surface layer with the use of FEM analysis. • The simulation was applied to determine the shape of the molten pool of the remelted surface. • Applying the numerical FEM model to the simulation of laser surface treatment meaningfully shortens the time needed to select optimum parameters. • An FEM model was established for the purpose of building a computer simulation. - Abstract: Investigations include a Finite Element Method simulation model of the remelting of the PMHSS6-5-3 high-speed steel surface layer using a high power diode laser (HPDL). The Finite Element Method computations were performed using ANSYS software. The scope of the FEM simulation was the determination of the temperature distribution during the laser alloying process at various process configurations regarding the laser beam power and the method of powder deposition, as a pre-coated paste or a surface with machined grooves. The Finite Element Method simulation was performed on five different 3-dimensional models. The models assumed nonlinear changes of thermal conductivity, specific heat and density that depend on temperature. The heating process was realized as a heat flux corresponding to laser beam powers of 1.4, 1.7 and 2.1 kW. Latent heat effects are considered during solidification. The molten pool is composed of the same material as the substrate and there is no chemical reaction. The absorptivity of laser energy was dependent on the simulated materials' properties and their surface condition. The Finite Element Method simulation allows specifying the heat affected zone and the temperature distribution in the sample as a function of time, and thus allows the estimation of the structural changes taking place during the laser remelting process. The simulation was applied to determine the shape of the molten pool.
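
    As a much-simplified stand-in for the physics being simulated (a 1-D explicit finite-difference model, not the paper's 3-D ANSYS model), the sketch below applies a surface heat flux representing the laser beam and tracks the transient temperature; the material values and beam footprint are assumed, steel-like numbers.

        import numpy as np

        k, rho, cp = 24.0, 7800.0, 460.0      # W/(m K), kg/m^3, J/(kg K): assumed
        alpha = k / (rho * cp)                # thermal diffusivity (m^2/s)
        L, n = 2e-3, 200                      # 2 mm deep 1-D domain, 200 nodes
        dx = L / (n - 1)
        dt = 0.4 * dx * dx / alpha            # explicit stability: dt <= dx^2/(2 alpha)
        q = 2.1e3 / (6.8e-3 * 1.8e-3)         # 2.1 kW over an assumed 6.8 x 1.8 mm spot

        T = np.full(n, 300.0)                 # initial and ambient temperature (K)
        for _ in range(int(5e-3 / dt)):       # 5 ms of heating
            Tn = T.copy()
            Tn[1:-1] = T[1:-1] + dt * alpha * (T[:-2] - 2.0 * T[1:-1] + T[2:]) / (dx * dx)
            # surface node: one-sided conduction plus the laser flux (ghost-node form)
            Tn[0] = T[0] + dt * (2.0 * alpha * (T[1] - T[0]) / (dx * dx)
                                 + 2.0 * q / (rho * cp * dx))
            Tn[-1] = 300.0                    # far boundary held at ambient
            T = Tn

        # Roughly the melting range of steel, consistent with surface remelting
        print(f"surface temperature after 5 ms: {T[0]:.0f} K")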

  7. Computer-aided sperm analysis: a useful tool to evaluate patient's response to varicocelectomy.

    Ariagno, Julia I; Mendeluk, Gabriela R; Furlan, María J; Sardi, M; Chenlo, P; Curi, Susana M; Pugliese, Mercedes N; Repetto, Herberto E; Cohen, Mariano

    2017-01-01

    Preoperative and postoperative sperm parameter values from infertile men with varicocele were analyzed by computer-aided sperm analysis (CASA) to assess whether sperm characteristics improved after varicocelectomy. Semen samples of men with proven fertility (n = 38) and men with varicocele-related infertility (n = 61) were also analyzed. Conventional semen analysis was performed according to WHO (2010) criteria and a CASA system was employed to assess kinetic parameters and sperm concentration. Seminal parameter values in the fertile group were far above those of the patients, both before and after surgery. No significant improvement in the percentage of normal sperm morphology (P = 0.10), sperm concentration (P = 0.52), total sperm count (P = 0.76), subjective motility (%) (P = 0.97) or kinematics (P = 0.30) was observed after varicocelectomy when all groups were compared. Neither was significant improvement found in the percentage of normal sperm morphology (P = 0.91), sperm concentration (P = 0.10), total sperm count (P = 0.89) or percentage motility (P = 0.77) after varicocelectomy in paired comparisons of preoperative and postoperative data. Analysis of paired samples revealed that the total sperm count (P = 0.01) and most sperm kinetic parameters: curvilinear velocity (P = 0.002), straight-line velocity (P = 0.0004), average path velocity (P = 0.0005), linearity (P = 0.02), and wobble (P = 0.006) improved after surgery. CASA offers the potential for accurate quantitative assessment of each patient's response to varicocelectomy.

  8. Calculation of brain atrophy using computed tomography and a new atrophy measurement tool

    Bin Zahid, Abdullah; Mikheev, Artem; Yang, Andrew Il; Samadani, Uzma; Rusinek, Henry

    2015-03-01

    Purpose: To determine whether brain atrophy can be calculated by performing volumetric analysis on conventional computed tomography (CT) scans in spite of the relatively low contrast of this modality. Materials & Methods: CTs for 73 patients from the local Veterans Affairs database were selected. Exclusion criteria: AD, NPH, tumor, and alcohol abuse. Protocol: conventional clinical acquisition (Toshiba; helical, 120 kVp, X-ray tube current 300 mA, slice thickness 3-5 mm). A locally developed, automatic algorithm was used to segment the intracranial cavity (ICC) using (a) a white matter seed, (b) constrained growth, limited by the inner skull layer, and (c) topological connectivity. The ICC was further segmented into CSF and brain parenchyma using a threshold of 16 HU. Results: Age distribution: 25-95 yrs (mean 67 +/- 17.5 yrs). A significant correlation was found between age and CSF/ICC (r = 0.695). Conclusion: brain atrophy can be calculated using automated software and conventional CT. Compared to MRI, CT is more widely available, cheaper, and less affected by head motion due to the ~100 times shorter scan time. Work is in progress to improve the precision of the measurements, possibly leading to assessment of longitudinal changes within the patient.
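
    The final segmentation step reported above amounts to a threshold split of the intracranial voxels. The sketch below applies it to a synthetic volume (the data and mask are invented; only the 16 HU threshold comes from the study) and reports the CSF/ICC atrophy index.

        import numpy as np

        def atrophy_index(volume_hu, icc_mask, threshold_hu=16.0):
            """CSF fraction of the intracranial cavity (higher means more atrophy)."""
            icc_voxels = volume_hu[icc_mask]
            csf = np.count_nonzero(icc_voxels < threshold_hu)
            return csf / icc_voxels.size

        rng = np.random.default_rng(1)
        vol = rng.normal(30.0, 8.0, (64, 64, 64))        # parenchyma-like HU values
        mask = np.zeros_like(vol, dtype=bool)
        mask[8:56, 8:56, 8:56] = True                    # stand-in ICC mask
        vol[10:20, 10:20, 10:20] = rng.normal(5.0, 4.0, (10, 10, 10))  # CSF-like pocket
        print(f"CSF/ICC = {atrophy_index(vol, mask):.3f}")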

  9. Imaging of peripheral arteries by 16-row multidetector computed tomography angiography: A feasible tool?

    Mishra, Anuj [Department of Radiology, National Organ Transplant Program, Tripoli (Libyan Arab Jamahiriya)]. E-mail: dranujmish@yahoo.com; Bhaktarahalli, Jahnavi Narayanaswamy [Department of Clinical Pathology, Tripoli Medical Centre, Tripoli (Libyan Arab Jamahiriya); Ehtuish, Ehtuish F. [Department of Surgery, National Organ Transplant Program, Tripoli (Libyan Arab Jamahiriya)

    2007-03-15

    Objective: To evaluate the efficacy of multidetector (16-row) computed tomography (MDCT) in imaging the upper and lower limb arterial tree in trauma and peripheral arterial occlusive disease (PAOD). Methods: Thirty-three patients underwent MDCT angiography (MDCTA) of the upper or the lower limb on a 16-row MDCT scanner between November 2004 and July 2005. The findings were compared with the surgical outcome in cases of trauma with suspected arterial injuries, or color Doppler correlation was obtained for patients with PAOD. Results: MDCTA allowed a comprehensive diagnostic work-up in all trauma cases with suspected arterial injuries. In the 23 cases of PAOD, MDCT adequately demonstrated the presence of stenosis or occlusion, its degree and extent, and the presence of collaterals and plaques. Conclusion: Our experience of CT angiography (CTA) with a 16-row MDCT scanner has clearly demonstrated its efficacy as a promising, new, fast, accurate, safe and non-invasive imaging modality of choice in cases of trauma with suspected arterial injuries, and as a useful screening modality in cases of PAOD for diagnosis and for grading.

  10. Imaging of peripheral arteries by 16-row multidetector computed tomography angiography: A feasible tool?

    Mishra, Anuj; Bhaktarahalli, Jahnavi Narayanaswamy; Ehtuish, Ehtuish F.

    2007-01-01

    Objective: To evaluate the efficacy of multidetector (16-row) computed tomography (MDCT) in imaging the upper and lower limb arterial tree in trauma and peripheral arterial occlusive disease (PAOD). Methods: Thirty-three patients underwent MDCT angiography (MDCTA) of the upper or the lower limb on a 16-row MDCT scanner between November 2004 and July 2005. The findings were compared with the surgical outcome in cases of trauma with suspected arterial injuries, or color Doppler correlation was obtained for patients with PAOD. Results: MDCTA allowed a comprehensive diagnostic work-up in all trauma cases with suspected arterial injuries. In the 23 cases of PAOD, MDCT adequately demonstrated the presence of stenosis or occlusion, its degree and extent, and the presence of collaterals and plaques. Conclusion: Our experience of CT angiography (CTA) with a 16-row MDCT scanner has clearly demonstrated its efficacy as a promising, new, fast, accurate, safe and non-invasive imaging modality of choice in cases of trauma with suspected arterial injuries, and as a useful screening modality in cases of PAOD for diagnosis and for grading.

  11. Tools for Brain-Computer Interaction: a general concept for a hybrid BCI (hBCI)

    Gernot R. Mueller-Putz

    2011-11-01

    Full Text Available The aim of this work is to present the development of a hybrid Brain-Computer Interface (hBCI) which combines existing input devices with a BCI. Thereby, the BCI should be available if the user wishes to extend the types of inputs available to an assistive technology system, but the user can also choose not to use the BCI at all; the BCI is active in the background. The hBCI might, on the one hand, decide which input channel(s) offer the most reliable signal(s) and switch between input channels to improve the information transfer rate, usability, or other factors, or, on the other hand, fuse various input channels. One major goal is therefore to bring BCI technology to a level where it can be used in a maximum number of scenarios in a simple way. To achieve this, it is of great importance that the hBCI is able to operate reliably for long periods, recognizing and adapting to changes as it does so. This goal is only possible if many different subsystems in the hBCI can work together. Since one research institute alone cannot provide such different functionality, collaboration between institutes is necessary. To allow for such a collaboration, a common software framework was investigated.
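
    The switching-versus-fusion idea can be made concrete in a few lines. In the sketch below the decision values and reliability scores are invented placeholders, not part of any project API: one function simply trusts the most reliable channel, the other takes a reliability-weighted consensus.

        import numpy as np

        def select_channel(decisions, reliabilities):
            """Switching: trust only the channel with the highest reliability."""
            best = int(np.argmax(reliabilities))
            return decisions[best], best

        def fuse_channels(decisions, reliabilities):
            """Fusion: reliability-weighted vote over continuous decision values
            in [-1, 1] (e.g. 'move left' vs 'move right')."""
            w = np.asarray(reliabilities, float)
            return float(np.dot(w, decisions) / w.sum())

        # Three inputs: BCI, joystick, eye tracker (assumed decision values)
        decisions = np.array([0.3, 0.9, -0.2])
        reliabilities = np.array([0.55, 0.95, 0.60])
        print(select_channel(decisions, reliabilities))   # (0.9, 1): joystick wins
        print(fuse_channels(decisions, reliabilities))    # weighted consensus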

  12. Computational tools for Breakthrough Propulsion Physics: State of the art and future prospects

    Maccone, Claudio

    2000-01-01

    To address problems in Breakthrough Propulsion Physics (BPP) one needs sheer computing capabilities. This is because General Relativity and Quantum Field Theory are so mathematically sophisticated that the amount of analytical calculation is prohibitive and one can hardly do it all by hand. In this paper we make a comparative review of the main tensor calculus capabilities of the three most advanced and commercially available 'symbolic manipulator' codes: Macsyma, Maple V and Mathematica. We also point out that currently one faces such a variety of different conventions in tensor calculus that it is difficult or impossible to compare results obtained by different scholars in General Relativity and Quantum Field Theory. Mathematical physicists, experimental physicists and engineers each have their own way of customizing tensors, especially by using different metric signatures, different metric determinant signs, different definitions of the basic Riemann and Ricci tensors, and by adopting different systems of physical units. This chaos greatly hampers progress toward the chief NASA BPP goal: the design of the NASA Warp Drive. It is thus concluded that NASA should impose order by establishing international standards in symbolic tensor calculus and requiring anyone working in BPP to adopt these NASA BPP Standards.
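
    Two of the convention splits listed above, written out explicitly (these are the standard textbook variants, not any proposed NASA standard): the metric signature may be taken as (+,-,-,-) or (-,+,+,+), and the Riemann and Ricci tensors may each be defined with either overall sign.

        % Metric signature (both in common use):
        %   g_{\mu\nu} = \mathrm{diag}(+,-,-,-)  or  g_{\mu\nu} = \mathrm{diag}(-,+,+,+)
        \[
        R^{\rho}{}_{\sigma\mu\nu}
          = \partial_{\mu}\Gamma^{\rho}{}_{\nu\sigma}
          - \partial_{\nu}\Gamma^{\rho}{}_{\mu\sigma}
          + \Gamma^{\rho}{}_{\mu\lambda}\Gamma^{\lambda}{}_{\nu\sigma}
          - \Gamma^{\rho}{}_{\nu\lambda}\Gamma^{\lambda}{}_{\mu\sigma},
        \qquad
        R_{\sigma\nu} = R^{\lambda}{}_{\sigma\lambda\nu}
        \]
        % Some authors adopt the opposite sign for both tensors, so results
        % computed under different conventions differ by overall signs.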

  13. Development of Computational Tools for Metabolic Model Curation, Flux Elucidation and Strain Design

    Maranas, Costas D

    2012-05-21

    An overarching goal of the Department of Energy mission is the efficient deployment and engineering of microbial and plant systems to enable biomass conversion in pursuit of high energy density liquid biofuels. This has spurred the pace at which new organisms are sequenced and annotated. This torrent of genomic information has opened the door to understanding metabolism in not just skeletal pathways and a handful of microorganisms but for truly genome-scale reconstructions derived for hundreds of microbes and plants. Understanding and redirecting metabolism is crucial because metabolic fluxes are unique descriptors of cellular physiology that directly assess the current cellular state and quantify the effect of genetic engineering interventions. At the same time, however, trying to keep pace with the rate of genomic data generation has ushered in a number of modeling and computational challenges related to (i) the automated assembly, testing and correction of genome-scale metabolic models, (ii) metabolic flux elucidation using labeled isotopes, and (iii) comprehensive identification of engineering interventions leading to the desired metabolism redirection.
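
    The workhorse of flux elucidation at genome scale is flux balance analysis, a linear program: maximize c^T v subject to the steady-state constraint S v = 0 and bounds on the fluxes v. The sketch below solves it for a toy four-reaction network, an invented example far smaller than the genome-scale reconstructions discussed.

        import numpy as np
        from scipy.optimize import linprog

        # Toy stoichiometric matrix: metabolites (rows) x reactions (columns)
        #            uptake  r1    r2   biomass
        S = np.array([[ 1.0, -1.0,  0.0,  0.0],    # metabolite A
                      [ 0.0,  1.0, -1.0,  0.0],    # metabolite B
                      [ 0.0,  0.0,  1.0, -1.0]])   # metabolite C
        bounds = [(0.0, 10.0)] * 4                 # irreversible, uptake capped at 10
        c = np.array([0.0, 0.0, 0.0, 1.0])         # objective: biomass flux

        # linprog minimizes, so negate the objective to maximize biomass
        res = linprog(-c, A_eq=S, b_eq=np.zeros(3), bounds=bounds)
        print("optimal fluxes:", res.x)            # all fluxes pinned to 10 here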

  14. Computational tools for experimental determination and theoretical prediction of protein structure

    O'Donoghue, S.; Rost, B.

    1995-12-31

    This tutorial was one of eight tutorials selected to be presented at the Third International Conference on Intelligent Systems for Molecular Biology which was held in the United Kingdom from July 16 to 19, 1995. The authors intend to review the state of the art in the experimental determination of protein 3D structure (with a focus on nuclear magnetic resonance), and in the theoretical prediction of protein function and of protein structure in 1D, 2D and 3D from sequence. All the atomic resolution structures determined so far have been derived from either X-ray crystallography (the majority so far) or Nuclear Magnetic Resonance (NMR) Spectroscopy (becoming increasingly more important). The authors briefly describe the physical methods behind both of these techniques; the major computational methods involved will be covered in some detail. They highlight parallels and differences between the methods, and also the current limitations. Special emphasis will be given to techniques which have application to ab initio structure prediction. Large-scale sequencing techniques increase the gap between the number of known protein sequences and that of known protein structures. They describe the scope and principles of methods that contribute successfully to closing that gap. Emphasis will be given to the specification of adequate testing procedures to validate such methods.

  15. Computer-assisted diagnostic tool to quantify the pulmonary veins in sickle cell associated pulmonary hypertension

    Jajamovich, Guido H.; Pamulapati, Vivek; Alam, Shoaib; Mehari, Alem; Kato, Gregory J.; Wood, Bradford J.; Linguraru, Marius George

    2012-03-01

    Pulmonary hypertension is a common cause of death among patients with sickle cell disease. This study investigates the use of pulmonary vein analysis to assist the non-invasive diagnosis of pulmonary hypertension with CT-angiography images. The characterization of the pulmonary veins from CT presents two main challenges. Firstly, the number of pulmonary veins is unknown a priori, and secondly, the contrast material is degraded by the time it reaches the pulmonary veins, making the edges of these vessels appear faint. Each image is first denoised and a fast marching approach is used to segment the left atrium and pulmonary veins. Afterward, a geodesic active contour is employed to isolate the left atrium. A thinning technique is then used to extract the skeleton of the atrium and the veins. The locations of the pulmonary vein ostia are determined by the intersection of the skeleton and the contour of the atrium. The diameters of the pulmonary veins are measured in each vein at fixed distances from the corresponding ostium, and for each distance, the sum of the diameters of all the veins is computed. These indicators are shown to be significantly larger in sickle-cell patients with pulmonary hypertension as compared to controls (p-values < 0.01).
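
    The diameter measurement described above can be sketched in 2-D using scipy and scikit-image: skeletonize the segmented vessel mask and read local radii from the Euclidean distance transform along the skeleton. The toy mask below stands in for the study's 3-D segmentations.

        import numpy as np
        from scipy.ndimage import distance_transform_edt
        from skimage.morphology import skeletonize

        mask = np.zeros((40, 120), dtype=bool)
        mask[17:23, :] = True                       # toy straight "vein", 6 px wide

        radius = distance_transform_edt(mask)       # distance to background
        skeleton = skeletonize(mask)
        diameters = 2.0 * radius[skeleton]          # local diameter along centerline
        print(f"mean diameter: {diameters.mean():.1f} px")  # ~6 px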

  16. Patient-specific surgical planning and hemodynamic computational fluid dynamics optimization through free-form haptic anatomy editing tool (SURGEM).

    Pekkan, Kerem; Whited, Brian; Kanter, Kirk; Sharma, Shiva; de Zelicourt, Diane; Sundareswaran, Kartik; Frakes, David; Rossignac, Jarek; Yoganathan, Ajit P

    2008-11-01

    The first version of an anatomy editing/surgical planning tool (SURGEM) targeting anatomical complexity and patient-specific computational fluid dynamics (CFD) analysis is presented. Novel three-dimensional (3D) shape editing concepts and human-shape interaction technologies have been integrated to facilitate interactive surgical morphology alterations, grid generation and CFD analysis. In order to implement "manual hemodynamic optimization" at the surgery planning phase for patients with congenital heart defects, these tools are applied to design and evaluate possible modifications of patient-specific anatomies. In this context, anatomies involve complex geometric topologies and tortuous 3D blood flow pathways with multiple inlets and outlets. These tools make it possible to freely deform the lumen surface and to bend and position baffles through real-time, direct manipulation of the 3D models with both hands, thus eliminating the tedious and time-consuming phase of entering the desired geometry using traditional computer-aided design (CAD) systems. The 3D models of the modified anatomies are seamlessly exported and meshed for patient-specific CFD analysis. Free-formed anatomical modifications are quantified using an in-house skeletonization-based cross-sectional geometry analysis tool. Hemodynamic performance of the systematically modified anatomies is compared with the original anatomy using CFD. CFD results showed the relative importance of the various surgically created features such as pouch size, vena cava to pulmonary artery (PA) flare and PA stenosis. An interactive surgical-patch size estimator is also introduced. The combined design/analysis cycle time is used for comparing and optimizing surgical plans, and improvements are tabulated. The reduced cost of the patient-specific shape design and analysis process made it possible to envision large clinical studies to assess the validity of predictive patient-specific CFD simulations. In this paper, model

  17. Availability of Neutronics Benchmarks in the ICSBEP and IRPhEP Handbooks for Computational Tools Testing

    Bess, John D.; Briggs, J. Blair; Ivanova, Tatiana; Hill, Ian; Gulliford, Jim

    2017-02-01

    In the past several decades, numerous experiments have been performed worldwide to support reactor operations, measurements, design, and nuclear safety. Those experiments represent an extensive international investment in infrastructure, expertise, and cost, and they constitute highly valuable resources of data supporting past, current, and future research activities. Those valuable assets represent the basis for recording, development, and validation of our nuclear methods and integral nuclear data [1]. The loss of these experimental data, which has occurred all too often in recent years, is tragic. The high cost of repeating many of these measurements can be prohibitive, if not impossible, to surmount. Two international projects were developed, and are under the direction of the Organisation for Economic Co-operation and Development Nuclear Energy Agency (OECD NEA), to address the challenges of not just data preservation, but evaluation of the data to determine its merit for modern and future use. The International Criticality Safety Benchmark Evaluation Project (ICSBEP) was established to identify and verify comprehensive critical benchmark data sets; evaluate the data, including quantification of biases and uncertainties; compile the data and calculations in a standardized format; and formally document the effort into a single source of verified benchmark data [2]. Similarly, the International Reactor Physics Experiment Evaluation Project (IRPhEP) was established to preserve integral reactor physics experimental data, including separate or special effects data for nuclear energy and technology applications [3]. Annually, contributors from around the world continue to collaborate in the evaluation and review of select benchmark experiments for preservation and dissemination. The extensively peer-reviewed integral benchmark data can then be utilized to support nuclear design and safety analysts in validating the analytical tools, methods, and data needed for next

  18. Computational tool for immunotoxic assessment of pyrethroids toward adaptive immune cell receptors.

    Kumar, Anoop; Behera, Padma Charan; Rangra, Naresh Kumar; Dey, Suddhasattya; Kant, Kamal

    2018-01-01

    Pyrethroids are prominently known for their insecticidal actions worldwide, but recent reports of anticancer and antiviral applications have generated considerable interest in further understanding their safety and immunotoxicity. This encouraged us to carry out the present study to evaluate the interactions of pyrethroids with adaptive immune cell receptors. Type 1 and Type 2 pyrethroids were tested on T-cell (CD4 and CD8) and B-cell (CD28 and CD45) immune receptors using Maestro 9.3 (Schrödinger, LLC, Cambridge, USA). In addition, the top-ranked tested ligands were also explored for toxicity prediction in rodents using the ProTOX tool. Pyrethroids (specifically Type 2) such as fenvalerate (-5.534 kcal/mol: CD8), fluvalinate (-4.644 and -4.431 kcal/mol: CD4 and CD45), and cypermethrin (-3.535 kcal/mol: CD28) showed lower binding energies, i.e., higher affinity, for B-cell and T-cell immune receptors, which may result in immunosuppressive and hypersensitivity reactions. The current findings indicate a further need to assess the Type 2 pyrethroids in wet-laboratory experiments to understand the chemical nature of pyrethroid-induced immunotoxicity. Fenvalerate showed the top glide score toward the CD8 immune receptor, while fluvalinate showed top-ranked binding with the CD4 and CD45 immune proteins. In addition, cypermethrin gave the top glide score against the CD28 immune receptor. Top dock hits (Type 2 pyrethroids) showed probable toxicity targets toward AOFA: amine oxidase (flavin-containing) A and PGH1: prostaglandin G/H synthase 1, respectively. Abbreviations used: PDB: Protein Data Bank; AOFA: Amine oxidase (flavin-containing) A; PGH 1: Prostaglandin G/H synthase 1.

  19. Introducing Enabling Computational Tools to the Climate Sciences: Multi-Resolution Climate Modeling with Adaptive Cubed-Sphere Grids

    Jablonowski, Christiane [Univ. of Michigan, Ann Arbor, MI (United States)

    2015-07-14

    The research investigates and advances strategies for bridging the scale discrepancies between local, regional and global phenomena in climate models without the prohibitive computational costs of global cloud-resolving simulations. In particular, the research explores new frontiers in computational geoscience by introducing high-order Adaptive Mesh Refinement (AMR) techniques into climate research. AMR and statically-adapted variable-resolution approaches represent an emerging trend for atmospheric models and are likely to become the new norm in future-generation weather and climate models. The research advances the understanding of multi-scale interactions in the climate system and showcases a pathway for modeling these interactions effectively with advanced computational tools, like the Chombo AMR library developed at the Lawrence Berkeley National Laboratory. The research is interdisciplinary and combines applied mathematics, scientific computing and the atmospheric sciences. In this research project, a hierarchy of high-order atmospheric models on cubed-sphere computational grids has been developed that serves as an algorithmic prototype for the finite-volume solution-adaptive Chombo-AMR approach. The foci of the investigations have been the characteristics of both static mesh adaptations and dynamically-adaptive grids that can capture flow fields of interest, like tropical cyclones. Six research themes were chosen. These are (1) the introduction of adaptive mesh refinement techniques into the climate sciences, (2) advanced algorithms for nonhydrostatic atmospheric dynamical cores, (3) an assessment of the interplay between resolved-scale dynamical motions and subgrid-scale physical parameterizations, (4) evaluation techniques for atmospheric model hierarchies, (5) the comparison of AMR refinement strategies and (6) tropical cyclone studies with a focus on multi-scale interactions and variable-resolution modeling. The results of this research project
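
    A minimal sketch of the refinement-flagging idea at the heart of AMR, assuming a simple gradient-magnitude criterion; the field, grid, and threshold are illustrative, and the Chombo library's actual refinement machinery is far richer.

```python
# Sketch of a refinement criterion of the kind AMR frameworks apply: flag
# cells whose local gradient magnitude exceeds a threshold, so the flagged
# patches can be refined. Field and threshold are hypothetical.
import numpy as np

def flag_cells(field, threshold):
    """Return a boolean mask of cells needing refinement."""
    gy, gx = np.gradient(field)             # per-cell finite differences
    return np.hypot(gx, gy) > threshold

# Idealized sharp local feature (a stand-in for, e.g., a tropical cyclone)
x, y = np.meshgrid(np.linspace(-1, 1, 64), np.linspace(-1, 1, 64))
field = np.exp(-((x - 0.2)**2 + (y + 0.1)**2) / 0.01)

mask = flag_cells(field, threshold=0.1)
print(f"{mask.sum()} of {mask.size} cells flagged for refinement")
```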

  20. Coleridge: a computer tool for assisting musical reflection and self-explanation

    John Cook

    1998-12-01

    Full Text Available Since the mid-1980s, there has been a movement away from knowledge supplied by the teacher and towards talking, reflecting and explaining as ways to learn. An example of this change in focus is provided by the self-explanation work of Chi et al. (1994), who describe an approach to talking science rather than hearing science. According to Chi and coworkers, generating explanations to oneself (self-explanations) facilitates the integration of new information into existing knowledge. Reflecting about one's own learning is the same as thinking about learning, or metacognition. Metacognition can be defined as the understanding of knowledge, an understanding that can be reflected in either effective use or overt description of the knowledge in question (Brown, 1987). This definition of metacognition requires of a learner both internalized thinking about learning (that is, reflection) and externalized communication, through language or action, that indicates an understanding of knowledge (that is, a self-explanation). In the work described in this paper the overall pedagogical goal is to encourage creative reflection in learners. Creative reflection is defined as the ability of a learner to imagine musical opportunities in novel situations, and then to make accurate predictions (verbally) about these opportunities. To succeed at creative reflection there should be a correspondence between what a learner predicts will happen and what actually happens. An example would be a learner first writing a musical phrase using musical notation, then predicting verbally how that phrase will sound, playing the phrase back on a piano, and finally evaluating whether the prediction was accurate or not. Very little work has been done on how computers can be used to support talking, reflecting and explaining in the creative subject-area of musical composition. The rest of this paper addresses this issue.

  1. Computational modeling as a tool for water resources management: an alternative approach to problems of multiple uses

    Haydda Manolla Chaves da Hora

    2012-04-01

    Full Text Available Today in Brazil there are many cases of incompatibility between the use of water and its availability. Due to the increase in the required variety and volume, the concept of multiple uses was created, as stated by Pinheiro et al. (2007). The use of the same resource to satisfy different needs with several restrictions (qualitative and quantitative) creates conflicts. Aiming to minimize these conflicts, this work was applied to the particular cases of Hydrographic Regions VI and VIII of Rio de Janeiro State, using computational modeling techniques (based on the MOHID software – Water Modeling System) as a tool for water resources management.

  2. Application of Soft Computing Tools for Wave Prediction at Specific Locations in the Arabian Sea Using Moored Buoy Observations

    J. Vimala

    2012-12-01

    Full Text Available The knowledge of design and operational values of significant wave heights is perhaps the single most important input needed in ocean engineering studies. Conventionally such information is obtained using classical statistical analysis and stochastic methods. As the causative variables are innumerable and the underlying physics is too complicated, the results obtained from numerical models may not always be very satisfactory. Soft computing tools like Artificial Neural Networks (ANN) and Adaptive Network-based Fuzzy Inference Systems (ANFIS) may therefore be useful to predict significant wave heights in some situations. The study is aimed at forecasting significant wave height values in real time over a period of 24 hrs at certain locations in Indian seas using ANN and ANFIS models. The data for the work were collected by the National Institute of Ocean Technology, Chennai. It was found that wave heights can be predicted by both methods with equal efficiency and satisfaction.
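
    A minimal sketch of the ANN half of the approach, assuming hourly buoy records recast as lagged inputs; the data here are synthetic, and scikit-learn's MLPRegressor stands in for whatever network architecture the authors trained.

```python
# Sketch of ANN-based wave forecasting: predict significant wave height 24 h
# ahead from the last few hourly observations. Data are synthetic; a real
# study would use moored-buoy records as in the abstract.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
hs = 2.0 + np.sin(np.arange(2000) * 2 * np.pi / 300) + 0.1 * rng.normal(size=2000)

lags, horizon = 6, 24
X = np.array([hs[i - lags:i] for i in range(lags, len(hs) - horizon)])
y = hs[lags + horizon:]                      # wave height 24 steps ahead

model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:-200], y[:-200])                # hold out the last 200 hours
print("test R^2:", model.score(X[-200:], y[-200:]))
```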

  3. Implementation of a computer-aided detection tool for quantification of intracranial radiologic markers on brain CT images

    Aghaei, Faranak; Ross, Stephen R.; Wang, Yunzhi; Wu, Dee H.; Cornwell, Benjamin O.; Ray, Bappaditya; Zheng, Bin

    2017-03-01

    Aneurysmal subarachnoid hemorrhage (aSAH) is a form of hemorrhagic stroke that affects middle-aged individuals and is associated with significant morbidity and/or mortality, especially in those presenting with higher clinical and radiologic grades at the time of admission. Previous studies suggested that the blood extravasated after aneurysmal rupture was a potential clinical prognostic factor, but all such studies used qualitative scales to predict prognosis. The purpose of this study is to develop and test a new interactive computer-aided detection (CAD) tool to detect, segment and quantify brain hemorrhage and ventricular cerebrospinal fluid on non-contrasted brain CT images. First, CAD segments the brain skull using a multilayer region-growing algorithm with adaptively adjusted thresholds. Second, CAD assigns pixels inside the segmented brain region to one of three classes, namely normal brain tissue, blood and fluid. Third, to avoid a "black-box" approach and to increase accuracy in quantifying these two image markers on CT images with large noise variation between cases, a graphical user interface (GUI) was implemented that allows users to visually examine segmentation results. If a user wishes to correct any errors (i.e., deleting clinically irrelevant blood or fluid regions, or filling in holes inside the relevant blood or fluid regions), he/she can manually define the region and select a corresponding correction function. CAD will automatically perform the correction and update the computed data. The new CAD tool is now being used in clinical and research settings to estimate various quantitative radiological parameters/markers, to determine the radiological severity of aSAH at presentation, and to correlate the estimates with various homeostatic/metabolic derangements and predict clinical outcome.
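
    The three-class pixel assignment step might look like the sketch below; the Hounsfield-unit cut-offs and the brain mask are illustrative assumptions, not the thresholds the CAD tool actually adapts case by case.

```python
# Sketch of the pixel-classification step: assign each voxel inside the
# segmented brain to fluid, normal tissue, or blood using Hounsfield-unit
# ranges. The HU cut-offs are illustrative, not the paper's adaptive values.
import numpy as np

def classify_brain(hu, brain_mask):
    labels = np.zeros_like(hu, dtype=np.uint8)   # 0 = outside brain
    fluid  = (hu >= 0)  & (hu < 15)              # CSF-like densities
    tissue = (hu >= 15) & (hu < 45)              # grey/white matter
    blood  = (hu >= 45) & (hu < 90)              # acute hemorrhage
    labels[brain_mask & fluid]  = 1
    labels[brain_mask & tissue] = 2
    labels[brain_mask & blood]  = 3
    return labels

hu = np.random.default_rng(1).uniform(-100, 100, size=(128, 128))
mask = np.ones_like(hu, dtype=bool)      # stand-in for the skull-stripped brain
labels = classify_brain(hu, mask)
print("blood fraction:", (labels == 3).mean())
```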

  4. Quantitative computed tomography (QCT) as a radiology reporting tool by using optical character recognition (OCR) and macro program.

    Lee, Young Han; Song, Ho-Taek; Suh, Jin-Suck

    2012-12-01

    The objectives are (1) to introduce a new concept for building a quantitative computed tomography (QCT) reporting system by using optical character recognition (OCR) and a macro program and (2) to illustrate the practical usage of the QCT reporting system in the radiology reading environment. The reporting system was created as a development tool by using open-source OCR software and an open-source macro program. The main module was designed to perform OCR on QCT images during the radiology reading process. The principal processing steps are as follows: (1) save a QCT report as a graphic file, (2) recognize the characters in the image as text, (3) extract the T-scores from the text, (4) perform error correction, (5) reformat the values into the QCT radiology reporting template, and (6) paste the report into the electronic medical record (EMR) or picture archiving and communication system (PACS). The accuracy of the OCR was tested on randomly selected QCTs. The system successfully performed OCR of the QCT reports, and the diagnosis of normal, osteopenia, or osteoporosis was also determined. Error correction of the OCR output is done with an AutoHotkey-coded module. The T-scores of the femoral neck and lumbar vertebrae were extracted with an accuracy of 100 and 95.4 %, respectively. A convenient QCT reporting system could thus be established by utilizing open-source OCR software and an open-source macro program. This method can be easily adapted for other QCT applications and PACS/EMR.
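
    Steps (2)-(4) of the pipeline can be sketched with open-source pieces, assuming the Tesseract OCR engine via the pytesseract wrapper; the file name, regex, and WHO-style T-score cut-offs are illustrative, not the published system's exact configuration.

```python
# Sketch of OCR-ing a saved QCT report image and pulling out T-scores.
# Assumes Tesseract is installed; the regex is a guess at one report layout.
import re
import pytesseract
from PIL import Image

def extract_t_scores(image_path):
    text = pytesseract.image_to_string(Image.open(image_path))
    # Illustrative pattern, e.g. "T-score: -2.7"; also accept a unicode minus.
    found = re.findall(r"T[- ]?score\s*[:=]?\s*([-\u2212]?\d+\.?\d*)", text)
    return [float(s.replace("\u2212", "-")) for s in found]

scores = extract_t_scores("qct_report.png")   # hypothetical saved report image
if scores:
    t = min(scores)                           # WHO-style categorization
    diagnosis = ("osteoporosis" if t <= -2.5
                 else "osteopenia" if t < -1.0 else "normal")
    print(scores, diagnosis)
```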

  5. Tool vibration detection with eddy current sensors in machining process and computation of stability lobes using fuzzy classifiers

    Devillez, Arnaud; Dudzinski, Daniel

    2007-01-01

    Today the knowledge of a process is very important for engineers to find the optimal combination of control parameters warranting productivity, quality and functioning without defects and failures. In our laboratory, we carry out research in the field of high speed machining with modelling, simulation and experimental approaches. The aim of our investigation is to develop software allowing optimisation of the cutting conditions, to limit the number of predictive tests, and monitoring of the process, to prevent any trouble during machining operations. This software is based on models and experimental data sets which constitute the knowledge of the process. In this paper, we deal with the problem of vibrations occurring during a machining operation. These vibrations may cause failures and defects in the process, like workpiece surface alteration and rapid tool wear. To measure the tool micro-movements on line, we equipped a lathe with specific instrumentation using eddy current sensors. The obtained signals were correlated with surface finish, and a signal processing algorithm was used to determine whether a test is stable or unstable. Then, a fuzzy classification method was proposed to classify the tests in a space defined by the width of cut and the cutting speed. Finally, it was shown that the fuzzy classification takes the measurement uncertainty into account when computing the stability limit, or stability lobes, of the process.
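
    A minimal sketch of fuzzy classification of tests in the (width of cut, cutting speed) plane, assuming simple Gaussian membership functions; the centers, widths, and min-based fuzzy AND are illustrative choices, not the classifier fitted in the study.

```python
# Sketch of grading machining tests as stable/unstable with fuzzy memberships
# over the (width of cut, cutting speed) plane. Parameters are invented.
import numpy as np

def gaussian_membership(x, center, sigma):
    return np.exp(-0.5 * ((x - center) / sigma) ** 2)

def stability_grade(width_mm, speed_m_min):
    # Degree of membership in "stable": shallow cuts at moderate speeds.
    mu_width = gaussian_membership(width_mm, center=1.0, sigma=0.8)
    mu_speed = gaussian_membership(speed_m_min, center=300.0, sigma=150.0)
    return min(mu_width, mu_speed)        # fuzzy AND (min t-norm)

for w, s in [(0.5, 280.0), (3.0, 600.0)]:
    grade = stability_grade(w, s)
    print(f"width={w} mm, speed={s} m/min -> membership {grade:.2f}",
          "(stable)" if grade > 0.5 else "(unstable)")
```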

  6. COMPUTER AIDED RESTORATION TOOLS TO ASSIST THE CONSERVATION OF AN ANCIENT SCULPTURE. THE COLOSSAL STATUE OF ZEUS ENTHRONED

    F. Di Paola

    2017-08-01

    Full Text Available The research focuses on the contribution of the integrated application of Computer Aided Restoration digital procedures as a means to guide the integration of an artifact, innovating and extending traditional investigation methods. The aim of the study was to provide effective geometrical-formal investigation tools in the frame of the conservation work on the Zeus enthroned from Soluntum, conserved in the Archaeological Museum "A. Salinas" of Palermo. The paper describes the workflow of the 3D acquisition and graphical modeling, with non-invasive digitalization and high-information-density techniques, to assist the conservation of the legs of the throne, especially the integration of the missing parts. Thanks to digital fabrication techniques, the two missing parts were reconstructed following the theoretical criteria of recognisability, compatibility and retractability. This innovative application of 3D digital technologies has shown that the integrated use of new technologies can be a useful tool for improving the conservation of a work of art.

  7. IBiSA_Tools: A Computational Toolkit for Ion-Binding State Analysis in Molecular Dynamics Trajectories of Ion Channels.

    Kota Kasahara

    Full Text Available Ion conduction mechanisms of ion channels are a long-standing conundrum. Although the molecular dynamics (MD) method has been extensively used to simulate ion conduction dynamics at the atomic level, analysis and interpretation of MD results are not straightforward due to the complexity of the dynamics. In our previous reports, we proposed an analytical method called ion-binding state analysis to scrutinize and summarize ion conduction mechanisms by taking advantage of a variety of analytical protocols, e.g., complex network analysis, sequence alignment, and hierarchical clustering. This approach effectively revealed the ion conduction mechanisms and their dependence on the conditions, i.e., ion concentration and membrane voltage. Here, we present an easy-to-use computational toolkit for ion-binding state analysis, called IBiSA_tools. This toolkit consists of a C++ program and a series of Python and R scripts. From the trajectory file of an MD simulation and a structure file, users can generate several images and statistics of the ion conduction process. A complex network named the ion-binding state graph is generated in a standard graph format (graph modeling language; GML), which can be visualized by standard network analyzers such as Cytoscape. As a tutorial, a trajectory of a 50 ns MD simulation of the Kv1.2 channel is also distributed with the toolkit. Users can trace the entire process of ion-binding state analysis step by step. This novel method for analysis of the ion conduction mechanisms of ion channels can be easily used by means of IBiSA_tools. The software is distributed under an open source license at the following URL: http://www.ritsumei.ac.jp/~ktkshr/ibisa_tools/.
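
    Since the toolkit emits its ion-binding state graph in GML, the data structure is easy to reproduce; the sketch below builds a tiny hypothetical state graph with networkx and writes Cytoscape-readable GML. The state names and counts are invented stand-ins for values mined from an MD trajectory.

```python
# Sketch of an ion-binding state graph exported as GML for Cytoscape.
import networkx as nx

g = nx.DiGraph()
# Node = occupancy pattern of binding sites (e.g. "S2,S4" = ions at S2 and S4)
for state, count in [("S2,S4", 420), ("S1,S3", 310), ("S2,S4,cavity", 95)]:
    g.add_node(state, frames=count)          # frames spent in each state
# Edge weight = number of observed transitions between states
g.add_edge("S2,S4", "S1,S3", weight=57)
g.add_edge("S1,S3", "S2,S4,cavity", weight=12)

nx.write_gml(g, "ion_binding_states.gml")    # loadable by Cytoscape
```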

  8. G‐LoSA: An efficient computational tool for local structure‐centric biological studies and drug design

    2016-01-01

    Abstract Molecular recognition by proteins mostly occurs in a local region on the protein surface. Thus, an efficient computational method for accurate characterization of protein local structural conservation is necessary to better understand biology and drug design. We present a novel local structure alignment tool, G‐LoSA. G‐LoSA aligns protein local structures in a sequence-order-independent way and provides a GA‐score, a chemical-feature-based and size-independent structure similarity score. Our benchmark validation shows the robust performance of G‐LoSA on local structures of diverse sizes and characteristics, demonstrating its universal applicability to local structure‐centric comparative biology studies. In particular, G‐LoSA is highly effective in detecting conserved local regions on the entire surface of a given protein. In addition, the applications of G‐LoSA to identifying template ligands and predicting ligand and protein binding sites illustrate its strong potential for computer‐aided drug design. We hope that G‐LoSA can be a useful computational method for exploring interesting biological problems through large-scale comparison of protein local structures and for facilitating drug discovery research and development. G‐LoSA is freely available to academic users at http://im.compbio.ku.edu/GLoSA/. PMID:26813336

  9. A malaria diagnostic tool based on computer vision screening and visualization of Plasmodium falciparum candidate areas in digitized blood smears.

    Nina Linder

    Full Text Available INTRODUCTION: Microscopy is the gold standard for diagnosis of malaria; however, manual evaluation of blood films is highly dependent on skilled personnel in a time-consuming, error-prone and repetitive process. In this study we propose a method using computer vision detection and visualization of only the diagnostically most relevant sample regions in digitized blood smears. METHODS: Giemsa-stained thin blood films with P. falciparum ring-stage trophozoites (n = 27) and uninfected controls (n = 20) were digitally scanned with an oil immersion objective (0.1 µm/pixel) to capture approximately 50,000 erythrocytes per sample. Parasite candidate regions were identified based on color and object size, followed by extraction of image features (local binary patterns, local contrast and scale-invariant feature transform descriptors) used as input to a support vector machine classifier. The classifier was trained on digital slides from ten patients and validated on six samples. RESULTS: The diagnostic accuracy was tested on 31 samples (19 infected and 12 controls). From each digitized area of a blood smear, a panel with the 128 most probable parasite candidate regions was generated. Two expert microscopists were asked to visually inspect the panel on a tablet computer and to judge whether the patient was infected with P. falciparum. The method achieved a diagnostic sensitivity and specificity of 95% and 100% as well as 90% and 100% for the two readers, respectively, using the diagnostic tool. Parasitemia was separately calculated by the automated system, and the correlation coefficient between manual and automated parasitemia counts was 0.97. CONCLUSION: We developed a decision support system for detecting malaria parasites using a computer vision algorithm combined with visualization of the sample areas with the highest probability of malaria infection. The system provides a novel method for blood smear screening with a significantly reduced need for
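
    The candidate-classification stage (local binary pattern features feeding a support vector machine) can be sketched as below; the patches and labels are synthetic, and the published system additionally uses local contrast and SIFT descriptors.

```python
# Sketch of LBP-histogram features + SVM for parasite candidate regions.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(patch, P=8, R=1.0):
    # Uniform LBP yields integer codes in [0, P+1]; histogram them as features.
    lbp = local_binary_pattern(patch, P, R, method="uniform")
    hist, _ = np.histogram(lbp, bins=P + 2, range=(0, P + 2), density=True)
    return hist

rng = np.random.default_rng(2)
patches = rng.integers(0, 256, size=(60, 32, 32), dtype=np.uint8)  # stand-ins
labels = rng.integers(0, 2, size=60)     # 1 = parasite, 0 = artefact (synthetic)

X = np.array([lbp_histogram(p) for p in patches])
clf = SVC().fit(X[:40], labels[:40])
print("held-out accuracy:", clf.score(X[40:], labels[40:]))
```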

  10. Reliability and reproducibility analysis of the Cobb angle and assessing sagittal plane by computer-assisted and manual measurement tools.

    Wu, Weifei; Liang, Jie; Du, Yuanli; Tan, Xiaoyi; Xiang, Xuanping; Wang, Wanhong; Ru, Neng; Le, Jinbo

    2014-02-06

    Although many studies on the reliability and reproducibility of measurement have been performed for the coronal Cobb angle, few results on reliability and reproducibility are reported for sagittal alignment measurement including the pelvis. We usually use SurgimapSpine software to measure the Cobb angle in our studies; however, there are no reports to date on its reliability and reproducibility. Sixty-eight standard standing posteroanterior whole-spine radiographs were reviewed. Three examiners carried out the measurements independently, using manual measurement on the X-ray radiographs and SurgimapSpine software on the computer. Parameters measured included pelvic incidence, sacral slope, pelvic tilt, lumbar lordosis (LL), thoracic kyphosis, and coronal Cobb angle. SPSS 16.0 software was used for statistical analyses. The means, standard deviations, intraclass and interclass correlation coefficients (ICC), and 95% confidence intervals (CI) were calculated. There was no notable difference between the two tools (P = 0.21) for the coronal Cobb angle. For the sagittal plane parameters, the ICC of intraobserver reliability for the manual measures varied from 0.65 (T2-T5 angle) to 0.95 (LL angle); for the SurgimapSpine tool, the ICC ranged from 0.75 to 0.98. No significant difference in intraobserver reliability was found between the two measurements (P > 0.05). As for interobserver reliability, measurements with the SurgimapSpine tool had better ICC (0.71 to 0.98 vs 0.59 to 0.96) and Pearson's coefficients (0.76 to 0.99 vs 0.60 to 0.97). The reliability of the SurgimapSpine measures was significantly higher for all parameters except the coronal Cobb angle, where the difference was not significant (P > 0.05). Although the differences between the two methods are very small, the results of this study indicate that the SurgimapSpine measurement is equivalent to the traditional manual tool for the coronal Cobb angle, but is advantageous in spino
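
    The core reliability statistic throughout is the intraclass correlation coefficient; a sketch of computing it, assuming the pingouin package (any two-way ANOVA-based ICC implementation would do) and hypothetical paired Cobb-angle readings.

```python
# Sketch of an ICC computation for repeated Cobb-angle readings.
import pandas as pd
import pingouin as pg

df = pd.DataFrame({
    "radiograph": [1, 1, 2, 2, 3, 3, 4, 4],         # measured subjects
    "reader":     ["A", "B"] * 4,                   # two raters
    "cobb_deg":   [32.0, 33.5, 18.0, 17.2, 47.5, 46.0, 25.0, 26.3],
})
icc = pg.intraclass_corr(data=df, targets="radiograph",
                         raters="reader", ratings="cobb_deg")
print(icc[["Type", "ICC", "CI95%"]])
```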

  11. What is needed to implement a computer-assisted health risk assessment tool? An exploratory concept mapping study

    Ahmad Farah

    2012-12-01

    Full Text Available Abstract Background Emerging eHealth tools could facilitate the delivery of comprehensive care in time-constrained clinical settings. One such tool is the interactive computer-assisted health-risk assessment (HRA), which may improve provider-patient communication at the point of care, particularly for psychosocial health concerns, which remain under-detected in clinical encounters. The research team explored the perspectives of healthcare providers representing a variety of disciplines (physicians, nurses, social workers, allied staff) regarding the factors required for implementation of an interactive HRA on psychosocial health. Methods The research team employed a semi-qualitative participatory method known as Concept Mapping, which involved three distinct phases. First, in face-to-face and online brainstorming sessions, participants responded to an open-ended central question: "What factors should be in place within your clinical setting to support an effective computer-assisted screening tool for psychosocial risks?" The brainstormed items were consolidated by the research team. Then, in face-to-face and online sorting sessions, participants grouped the items thematically as 'it made sense to them'. Participants also rated each item on a 5-point scale for its 'importance' and 'action feasibility' over the ensuing six-month period. The sorted and rated data were analyzed using multidimensional scaling and hierarchical cluster analyses, which produced visual concept maps. In the third and final phase, the face-to-face interpretation sessions, the concept maps were discussed and illuminated by participants collectively. Results Overall, 54 providers participated (emergency care 48%; primary care 52%). Participants brainstormed 196 items thought to be necessary for the implementation of an interactive HRA emphasizing psychosocial health. These were consolidated by the research team into 85 items. After sorting and rating, cluster analysis

  12. Development and Usability Testing of a Computer-Tailored Decision Support Tool for Lung Cancer Screening: Study Protocol.

    Carter-Harris, Lisa; Comer, Robert Skipworth; Goyal, Anurag; Vode, Emilee Christine; Hanna, Nasser; Ceppa, DuyKhanh; Rawl, Susan M

    2017-11-16

    Awareness of lung cancer screening remains low in the screening-eligible population, and when patients visit their clinician never having heard of lung cancer screening, engaging in shared decision making to arrive at an informed decision can be a challenge. Therefore, methods to effectively support both patients and clinicians to engage in these important discussions are essential. To facilitate shared decision making about lung cancer screening, effective methods to prepare patients to have these important discussions with their clinician are needed. Our objective is to develop a computer-tailored decision support tool that meets the certification criteria of the International Patient Decision Aid Standards instrument version 4.0 that will support shared decision making in lung cancer screening decisions. Using a 3-phase process, we will develop and test a prototype of a computer-tailored decision support tool in a sample of lung cancer screening-eligible individuals. In phase I, we assembled a community advisory board comprising 10 screening-eligible individuals to develop the prototype. In phase II, we recruited a sample of 13 screening-eligible individuals to test the prototype for usability, acceptability, and satisfaction. In phase III, we are conducting a pilot randomized controlled trial (RCT) with 60 screening-eligible participants who have never been screened for lung cancer. Outcomes tested include lung cancer and screening knowledge, lung cancer screening health beliefs (perceived risk, perceived benefits, perceived barriers, and self-efficacy), perception of being prepared to engage in a patient-clinician discussion about lung cancer screening, occurrence of a patient-clinician discussion about lung cancer screening, and stage of adoption for lung cancer screening. Phases I and II are complete. Phase III is underway. As of July 15, 2017, 60 participants have been enrolled into the study, and have completed the baseline survey, intervention, and first

  13. Creating Electronic Books-Chapters for Computers and Tablets Using Easy Java/JavaScript Simulations, EjsS Modeling Tool

    Wee, Loo Kang

    2015-01-01

    This paper shares my journey (tools used, design principles derived and modeling pedagogy implemented) when creating electronic book chapters (epub3 format) for computers and tablets using the Easy Java/JavaScript Simulations (old name EJS, new EjsS) Modeling Tool. The theory underpinning this work is grounded in learning by doing through dynamic and interactive simulation-models that can be made sense of more easily than static printed materials. I started combining related co...

  14. Using Computer-Aided Software Engineering (CASE)--tools to document the current logical model of a system for DoD requirements specifications.

    Ganzer, Donna A.

    1987-01-01

    Approved for public release; distribution is unlimited. The Naval Postgraduate School's final exam scheduling system serves as a test case with which to compare two commercially available Computer-Aided Software Engineering (CASE) tools. The tools, Nastec Corporation's DesignAid (Release 3.55) and Index Technology's Excelerator (Release 1.7), are used to create Section 4.1 of two Abbreviated Systems Decision Papers to determine if their output can satisfy and should replace some of the Life...

  15. State-of-the-art and dissemination of computational tools for drug-design purposes: a survey among Italian academics and industrial institutions.

    Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco

    2013-05-01

    During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.

  16. Interstitial lung disease associated with collagen vascular disorders: disease quantification using a computer-aided diagnosis tool

    Marten, K.; Engelke, C. [University Hospital of Goettingen, Department of Radiology, Goettingen (Germany); Dicken, V. [MeVis Research GmbH, Bremen (Germany); Kneitz, C. [University Hospital of Wuerzburg, Dept. of Rheumatology and Clinical Immunology, Medizinische Klinik and Poliklinik, Wuerzburg (Germany); Hoehmann, M.; Kenn, W.; Hahn, D. [University Hospital of Wuerzburg, Department of Radiology, Wuerzburg (Germany)

    2009-02-15

    The purpose of this study was to evaluate a computer-aided diagnosis (CAD) tool compared to human observers in quantification of interstitial lung disease (ILD) in patients with collagen-vascular disorders. A total of 52 patients with rheumatoid arthritis (n=24), scleroderma (n=14) and systemic lupus erythematosus (n=14) underwent thin-section CT. Two independent observers assessed the extent of ILD (EoILD), reticulation (EoRet) and ground-glass opacity (EoGGO). CAD assessed EoILD twice. Pulmonary function tests were obtained. Statistical evaluation used 95% limits of agreement and linear regression analysis. CAD correlated well with diffusing capacity (DLCO) (R=-0.531, P<0.0001) and moderately with forced vital capacity (FVC) (R=-0.483, P=0.0008). There was close correlation between CAD and the readers (EoILD vs. CAD: R=0.716, P<0.0001; EoRet vs. CAD: R=0.69, P<0.0001). Subgroup analysis including patients with minimal EoGGO (<15%) strengthened the correlations between CAD and the readers, readers and PFT, and CAD and PFT. EoILD by readers correlated strongly with DLCO (R=-0.705, P<0.0001) and moderately with FVC (R=-0.559, P=0.0002). EoRet correlated closely with DLCO and moderately with FVC (DLCO: R=-0.663; FVC: R=-0.436; P≤0.005). The CAD system is a promising tool for ILD quantification, showing close correlation with human observers and physiologic impairment. (orig.)

  17. MO-E-18C-04: Advanced Computer Simulation and Visualization Tools for Enhanced Understanding of Core Medical Physics Concepts

    Naqvi, S [Saint Agnes Cancer Institute, Department of Radiation Oncology, Baltimore, MD (United States)]

    2014-01-01

    Purpose: Most medical physics programs emphasize proficiency in routine clinical calculations and QA. The formulaic aspect of these calculations and the prescriptive nature of measurement protocols obviate the need to frequently apply basic physical principles, which therefore gradually decay away from memory. For example, few students appreciate the role of electron transport in photon dose, making it difficult to understand key concepts such as dose buildup, electronic disequilibrium effects and Bragg-Gray theory. These conceptual deficiencies manifest when the physicist encounters a new system requiring knowledge beyond routine activities. Methods: Two interactive computer simulation tools were developed to facilitate deeper learning of physical principles. One is a Monte Carlo code written with a strong educational aspect. The code can “label” regions and interactions to highlight specific aspects of the physics, e.g., certain regions can be designated as “starters” or “crossers,” and any interaction type can be turned on and off. Full 3D tracks with specific portions highlighted further enhance the visualization of radiation transport problems. The second code calculates and displays the trajectories of a collection of electrons under an arbitrary space/time-dependent Lorentz force using relativistic kinematics. Results: Using the Monte Carlo code, the student can interactively study photon and electron transport through visualization of dose components, particle tracks, and interaction types. The code can, for instance, be used to study the kerma-dose relationship, explore electronic disequilibrium near interfaces, or visualize kernels by using interaction forcing. The electromagnetic simulator enables the student to explore accelerating mechanisms and particle optics in devices such as cyclotrons and linacs. Conclusion: The proposed tools are designed to enhance understanding of abstract concepts by highlighting various aspects of the physics. The simulations serve as
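
    The second teaching code's core task (relativistic charged-particle trajectories under a Lorentz force) is commonly integrated with the Boris push; below is a minimal sketch under a uniform magnetic field, which is not the course software itself and uses illustrative field values and step counts.

```python
# Sketch of a relativistic electron trajectory under the Lorentz force,
# integrated with the standard Boris push (u = gamma * v).
import numpy as np

Q, M, C = -1.602e-19, 9.109e-31, 2.998e8   # electron charge, mass; light speed

def boris_step(x, u, E, B, dt):
    """Advance position x and u = gamma*v by one Boris step."""
    u_minus = u + (Q * dt / (2 * M)) * E
    gamma = np.sqrt(1 + np.dot(u_minus, u_minus) / C**2)
    t = (Q * dt / (2 * M * gamma)) * B
    s = 2 * t / (1 + np.dot(t, t))
    u_prime = u_minus + np.cross(u_minus, t)   # half magnetic rotation
    u_plus = u_minus + np.cross(u_prime, s)    # full magnetic rotation
    u_new = u_plus + (Q * dt / (2 * M)) * E
    gamma = np.sqrt(1 + np.dot(u_new, u_new) / C**2)
    return x + dt * u_new / gamma, u_new

x, u = np.zeros(3), np.array([1e7, 0.0, 0.0])            # ~0.03c initial speed
E, B = np.zeros(3), np.array([0.0, 0.0, 0.01])           # uniform B_z field
for _ in range(1000):                                    # electron gyrates in xy
    x, u = boris_step(x, u, E, B, dt=1e-11)
print("final position (m):", x)
```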

  18. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
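
    A sketch of the open-source pipeline idea, with RDKit standing in for the CDK descriptors used in the paper and a scikit-learn decision tree for the classifier; the molecules and stability labels below are invented for illustration.

```python
# Sketch: free molecular descriptors + an open-source decision tree for a
# (toy) metabolic stability model. Labels are synthetic, not measured data.
import numpy as np
from rdkit import Chem
from rdkit.Chem import Descriptors
from sklearn.tree import DecisionTreeClassifier

def featurize(smiles):
    mol = Chem.MolFromSmiles(smiles)
    return [Descriptors.MolWt(mol), Descriptors.MolLogP(mol),
            Descriptors.TPSA(mol), Descriptors.NumRotatableBonds(mol)]

smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
labels = [1, 0, 0, 1]                 # 1 = stable, 0 = unstable (invented)

X = np.array([featurize(s) for s in smiles])
clf = DecisionTreeClassifier(max_depth=2).fit(X, labels)
print("training accuracy:", clf.score(X, labels))
```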

  20. Computational challenges and human factors influencing the design and use of clinical research participant eligibility pre-screening tools

    Pressler Taylor R

    2012-05-01

    Full Text Available Abstract Background Clinical trials are the primary mechanism for advancing clinical care and evidence-based practice, yet challenges with the recruitment of participants for such trials are widely recognized as a major barrier to these types of studies. Data warehouses (DW) store large amounts of heterogeneous clinical data that can be used to enhance recruitment practices, but multiple challenges exist when using a data warehouse for such activities, due to the manner of collection, management, integration, analysis, and dissemination of the data. A critical step in leveraging the DW for recruitment purposes is being able to match trial eligibility criteria to discrete and semi-structured data types in the data warehouse, though trial eligibility criteria tend to be written without concern for their computability. We present the multi-modal evaluation of a web-based tool that can be used for pre-screening patients for clinical trial eligibility and assess the ability of this tool to be practically used for clinical research pre-screening and recruitment. Methods The study used a validation study, usability testing, and a heuristic evaluation to evaluate and characterize the operational characteristics of the software as well as the human factors affecting its use. Results Clinical trials from the Division of Cardiology and the Department of Family Medicine were used for this multi-modal evaluation, which included a validation study, a usability study, and a heuristic evaluation. From the results of the validation study, the software demonstrated positive predictive values (PPV) of 54.12% and 0.7%, respectively, and negative predictive values (NPV) of 73.3% and 87.5%, respectively, for two types of clinical trials. Heuristic principles concerning error prevention and documentation were characterized as the major usability issues during the heuristic evaluation. Conclusions This software is intended to provide an initial list of eligible patients to a
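
    The reported metrics reduce to the standard confusion-matrix formulas PPV = TP/(TP+FP) and NPV = TN/(TN+FN); the sketch below reproduces the first trial's 54.1%/73.3% figures from one hypothetical set of counts consistent with those values.

```python
# Worked example of the reported metrics from hypothetical screening counts.
def ppv_npv(tp, fp, tn, fn):
    return tp / (tp + fp), tn / (tn + fn)

tp, fp, tn, fn = 46, 39, 44, 16      # invented counts, chosen to match 54.1%/73.3%
ppv, npv = ppv_npv(tp, fp, tn, fn)
print(f"PPV = {ppv:.1%}, NPV = {npv:.1%}")   # -> PPV = 54.1%, NPV = 73.3%
```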

  1. TH-AB-202-03: A Novel Tool for Computing Deliverable Doses in Dynamic MLC Tracking Treatments

    Fast, M; Kamerling, C; Menten, M; Nill, S; Oelfke, U [The Institute of Cancer Research and The Royal Marsden NHS Foundation Trust, London (United Kingdom); Crijns, S; Raaymakers, B [University Medical Center Utrecht, Utrecht (Netherlands)

    2016-06-15

    Purpose: In tracked dynamic multi-leaf collimator (MLC) treatments, segments are continuously adapted to the target centroid motion in beams-eye-view. On-the-fly segment adaptation, however, potentially induces dosimetric errors due to the finite MLC leaf width and non-rigid target motion. In this study, we outline a novel tool for computing the 4d dose of lung SBRT plans delivered with MLC tracking. Methods: The following automated workflow was developed: A) centroid tracking, where the initial segments are morphed to each 4dCT phase based on the beams-eye-view GTV shift (followed by a dose calculation on each phase); B) re-optimized tracking, in which all morphed initial plans from (A) are further optimised (“warm-started”) in each 4dCT phase using the initial optimisation parameters but phase-specific volume definitions. Finally, both dose sets are accumulated to the reference phase using deformable image registration. Initial plans were generated according to the RTOG-1021 guideline (54Gy, 3-Fx, equidistant 9-beam IMRT) on the peak-exhale (reference) phase of a phase-binned 4dCT. Treatment planning and delivery simulations were performed in RayStation (research v4.6) using our in-house segment-morphing algorithm, which directly links to RayStation through a native C++ interface. Results: Computing the tracking plans and 4d dose distributions via the in-house interface takes 5 and 8 minutes respectively for centroid and re-optimized tracking. For a sample lung SBRT patient with 14mm peak-to-peak motion in sup-inf direction, mainly perpendicular leaf motion (0-collimator) resulted in small dose changes for PTV-D95 (−13cGy) and GTV-D98 (+18cGy) for the centroid tracking case compared to the initial plan. Modest reductions of OAR doses (e.g. spinal cord D2: −11cGy) were achieved in the idealized tracking case. Conclusion: This study presents an automated “1-click” workflow for computing deliverable MLC tracking doses in RayStation. Adding a non
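
    The centroid-tracking step amounts to rigidly shifting each segment's leaf positions by the target shift projected into beams-eye-view; below is a toy sketch under stated assumptions (0.5 cm leaves, wrap-around at the field edge ignored), not the in-house morphing algorithm itself.

```python
# Sketch of centroid tracking at the segment level: shift every MLC leaf pair
# by the beams-eye-view target displacement. All numbers are illustrative.
import numpy as np

def morph_segment(left_leaves, right_leaves, bev_shift_xy):
    """Shift leaf openings: continuous across-leaf (x), whole-leaf along-leaf (y)."""
    dx, dy = bev_shift_xy                         # cm in the isocenter plane
    leaf_width = 0.5                              # cm, finite leaf width
    shift_rows = int(round(dy / leaf_width))      # quantized along-leaf shift
    # np.roll wraps at the edges; a real implementation would pad with closed leaves
    left = np.roll(left_leaves, shift_rows) + dx
    right = np.roll(right_leaves, shift_rows) + dx
    return left, right

left = np.array([-1.0, -1.5, -1.8, -1.5, -1.0])   # toy 5-leaf-pair segment (cm)
right = np.array([1.0, 1.5, 1.8, 1.5, 1.0])
print(morph_segment(left, right, bev_shift_xy=(0.3, 1.0)))
```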

  3. The utility of computed tomography as a screening tool for the evaluation of pediatric blunt chest trauma.

    Markel, Troy A; Kumar, Rajiv; Koontz, Nicholas A; Scherer, L R; Applegate, Kimberly E

    2009-07-01

    There is a growing concern that computed tomography (CT) is being unnecessarily overused for the evaluation of pediatric patients. The purpose of this study was to analyze the trends and utility of chest CT use compared with chest X-ray (CXR) for the evaluation of children with blunt chest trauma. A 4-year retrospective review was performed for pediatric patients who underwent chest CT within 24 hours of sustaining blunt trauma at a Level-I trauma center. Trends in the use of CT and CXR were documented, and results of radiology reports were analyzed and compared with clinical outcomes. Three hundred thirty-three children, mean age 11 years, had chest CTs, increasing from 5.5% in 2001-2002 to 10.5% in 2004-2005 (p tool to analyze which patients may require CT evaluation. A multidisciplinary approach is warranted to develop guidelines that standardize the use of CT and thereby decrease unnecessary radiation exposure to pediatric patients.

  4. On the development of a computer-based handwriting assessment tool to objectively quantify handwriting proficiency in children.

    Falk, Tiago H; Tam, Cynthia; Schellnus, Heidi; Chau, Tom

    2011-12-01

    Standardized writing assessments such as the Minnesota Handwriting Assessment (MHA) can inform interventions for handwriting difficulties, which are prevalent among school-aged children. However, these tests usually involve the laborious task of subjectively rating the legibility of the written product, precluding their practical use in some clinical and educational settings. This study describes a portable computer-based handwriting assessment tool to objectively measure MHA quality scores and to detect handwriting difficulties in children. Several measures are proposed based on spatial, temporal, and grip force measurements obtained from a custom-built handwriting instrument. Thirty-five first and second grade students participated in the study, nine of whom exhibited handwriting difficulties. Students performed the MHA test and were subjectively scored based on speed and handwriting quality using five primitives: legibility, form, alignment, size, and space. Several spatial parameters are shown to correlate significantly with handwriting legibility and speed, respectively. Using only size and space parameters, promising discrimination between proficient and non-proficient handwriting can be achieved. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.

  5. Image Guided Virtual Autopsy: An Adjunct with Radiographic and Computed Tomography Modalities - An Important Tool in Forensic Identification

    Shalu Rai

    2017-01-01

    Full Text Available The forensic examination of dead bodies is very helpful in identifying the person, the cause of death, and gender, and in solving mysterious cases. It includes a number of techniques, of which autopsy is the primary investigation performed in every medicolegal case. Because it involves mutilation, the traditional autopsy technique is disturbing in terms of the emotions and rituals of relatives. The use of radiology in forensic science comprises the performance, interpretation, and reporting of radiographs, and is helpful in detecting changes that are not clinically visible. Forensic radiology plays an important role in the identification of humans in mass disasters, in criminal investigations, and in evaluation of the cause of death. The introduction of radiological modalities into autopsy techniques is a complementary tool for forensic identification and is known as virtual autopsy. Advanced imaging techniques such as computed tomography (CT) and magnetic resonance imaging (MRI) are used in virtual autopsy in order to visualize and reconstruct the internal organs and establish the site, type, and depth of injury. This review elaborates the role of maxillofacial imaging in image-guided virtual autopsy.

  6. Improving the Efficiency of the Nodal Integral Method With the Portable, Extensible Tool-kit for Scientific Computation

    Toreja, Allen J.; Uddin, Rizwan

    2002-01-01

    An existing implementation of the nodal integral method for the time-dependent convection-diffusion equation is modified to incorporate various PETSc (Portable, Extensible Tool-kit for Scientific Computation) solver and pre-conditioner routines. In the modified implementation, the default iterative Gauss-Seidel solver is replaced with one of the following PETSc iterative linear solver routines: Generalized Minimal Residuals, Stabilized Bi-conjugate Gradients, or Transpose-Free Quasi-Minimal Residuals. For each solver, a Jacobi or a Successive Over-Relaxation pre-conditioner is used. Two sample problems, one with a low Peclet number and one with a high Peclet number, are solved using the new implementation. In all the cases tested, the new implementation with the PETSc solver routines outperforms the original Gauss-Seidel implementation. Moreover, the PETSc Stabilized Bi-conjugate Gradients routine performs the best on the two sample problems leading to CPU times that are less than half the CPU times of the original implementation. (authors)
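
    The solver swap the abstract describes is easy to prototype; below, SciPy's preconditioned BiCGStab stands in for the PETSc routine, applied to a toy low-Peclet one-dimensional convection-diffusion system (the matrix, size, and Peclet number are illustrative).

```python
# Sketch: replace a Gauss-Seidel sweep with a Jacobi-preconditioned Krylov
# solve, here Stabilized Bi-conjugate Gradients on a toy 1D system.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import bicgstab, LinearOperator

n, pe = 200, 1.0                        # grid size and (low) cell Peclet number
A = diags([-1 - pe / 2, 2.0, -1 + pe / 2], offsets=[-1, 0, 1],
          shape=(n, n), format="csr")   # central-difference convection-diffusion
b = np.ones(n)

inv_diag = 1.0 / A.diagonal()           # Jacobi preconditioner: M^-1 = diag(A)^-1
M = LinearOperator((n, n), matvec=lambda v: inv_diag * v)

x, info = bicgstab(A, b, M=M, maxiter=500)
print("info:", info, "residual:", np.linalg.norm(b - A @ x))
```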

  7. Qualification of integrated tool environments (QUITE) for the development of computer-based safety systems in NPP

    Miedl, Horst

    2004-01-01

    In NPPs, I&C systems are meanwhile increasingly backfitted with computer-based systems (I&C platforms). The corresponding safety functions are implemented in software, and this software is developed, configured and administered with the help of integrated tool environments (ITE). An ITE offers a set of services which are used to construct an I&C system and consists typically of software packages for project control and documentation, specification and design, automatic code generation and so on. Commercial ITE are not necessarily conceived and qualified (type-tested) for nuclear-specific applications but are used - and will increasingly be used - for the implementation of nuclear safety-related I&C systems. Therefore, it is necessary to qualify commercial ITE with respect to their influence on the quality of the target system for each I&C platform (dependent on the safety category of the target system). Examples of commercial ITEs are I&C platforms like SPINLINE 3, TELEPERM XP, Common Q, TRICON, etc. (Author)

  8. COMPUTING

    M. Kasemann

    Overview In autumn the main focus was to process and handle CRAFT data and to perform the Summer08 MC production. The operational aspects were well covered by regular Computing Shifts, experts on duty and Computing Run Coordination. At the Computing Resource Board (CRB) in October a model to account for service work at Tier 2s was approved. The computing resources for 2009 were reviewed for presentation at the C-RRB. The quarterly resource monitoring is continuing. Facilities/Infrastructure operations Operations during CRAFT data taking ran fine. This proved to be a very valuable experience for T0 workflows and operations. The transfers of custodial data to most T1s went smoothly. A first round of reprocessing started at the Tier-1 centers end of November; it will take about two weeks. The Computing Shifts procedure was tested full scale during this period and proved to be very efficient: 30 Computing Shifts Persons (CSP) and 10 Computing Resources Coordinators (CRC). The shift program for the shut down w...

  9. Enhancing pediatric safety: assessing and improving resident competency in life-threatening events with a computer-based interactive resuscitation tool

    Lerner, Catherine; Gaca, Ana M.; Frush, Donald P.; Ancarana, Anjanette; Hohenhaus, Sue; Seelinger, Terry A.; Frush, Karen

    2009-01-01

    Though rare, allergic reactions occur as a result of administration of low osmolality nonionic iodinated contrast material to pediatric patients. Currently available resuscitation aids are inadequate in guiding radiologists' initial management of such reactions. To compare radiology resident competency with and without a computer-based interactive resuscitation tool in the management of life-threatening events in pediatric patients. The study was approved by the IRB. Radiology residents (n=19; 14 male, 5 female; 19 certified in basic life support/advanced cardiac life support; 1 certified in pediatric advanced life support) were videotaped during two simulated 5-min anaphylaxis scenarios involving 18-month-old and 8-year-old mannequins (order randomized). No advance warning was given. In half of the scenarios, a computer-based interactive resuscitation tool with a response-driven decision tree was available to residents (order randomized). Competency measures included: calling a code, administering oxygen and epinephrine, and correctly dosing epinephrine. Residents performed significantly more essential interventions with the computer-based resuscitation tool than without (72/76 vs. 49/76, P<0.001). Significantly more residents appropriately dosed epinephrine with the tool than without (17/19 vs. 1/19; P<0.001). More residents called a code with the tool than without (17/19 vs. 14/19; P = 0.08). A learning effect was present: average times to call a code, request oxygen, and administer epinephrine were shorter in the second scenario (129 vs. 93 s, P=0.24; 52 vs. 30 s, P<0.001; 152 vs. 82 s, P=0.025, respectively). All the trainees found the resuscitation tool helpful and potentially useful in a true pediatric emergency. A computer-based interactive resuscitation tool significantly improved resident performance in managing pediatric emergencies in the radiology department. (orig.)
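
    The tool itself is not described at the code level. As a purely hypothetical sketch of what a response-driven decision tree for such a tool might look like, the snippet below walks a user through yes/no prompts and computes a weight-based dose; all node names, prompts, and numbers are invented placeholders, not clinical guidance.

```python
# Hypothetical sketch of a response-driven decision tree like the one the
# abstract describes. All prompts, branches, and the dose constant are
# invented placeholders for illustration -- NOT clinical guidance.
TREE = {
    "start":  {"prompt": "Is the patient responsive?",
               "yes": "oxygen", "no": "code"},
    "code":   {"prompt": "Call a code. Done?",
               "yes": "oxygen", "no": "code"},
    "oxygen": {"prompt": "Administer oxygen. Symptoms resolving?",
               "yes": None, "no": "epi"},
    "epi":    {"prompt": "Give epinephrine (dose shown above). Improving?",
               "yes": None, "no": "code"},
}

def epinephrine_dose_mg(weight_kg, mg_per_kg=0.01):
    """Placeholder weight-based dose calculation (illustrative constant)."""
    return round(weight_kg * mg_per_kg, 3)

def run(tree, weight_kg):
    print(f"Suggested epinephrine dose: {epinephrine_dose_mg(weight_kg)} mg")
    node = "start"
    while node is not None:
        step = tree[node]
        answer = input(step["prompt"] + " [y/n] ").strip().lower()
        node = step["yes"] if answer.startswith("y") else step["no"]

# run(TREE, weight_kg=11.0)  # e.g. an 18-month-old mannequin scenario
```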

  10. COMPUTING

    I. Fisk

    2011-01-01

    Introduction CMS distributed computing system performed well during the 2011 start-up. The events in 2011 have more pile-up and are more complex than last year; this results in longer reconstruction times and harder events to simulate. Significant increases in computing capacity were delivered in April for all computing tiers, and the utilisation and load is close to the planning predictions. All computing centre tiers performed their expected functionalities. Heavy-Ion Programme The CMS Heavy-Ion Programme had a very strong showing at the Quark Matter conference. A large number of analyses were shown. The dedicated heavy-ion reconstruction facility at the Vanderbilt Tier-2 is still involved in some commissioning activities, but is available for processing and analysis. Facilities and Infrastructure Operations Facility and Infrastructure operations have been active with operations and several important deployment tasks. Facilities participated in the testing and deployment of WMAgent and WorkQueue+Request...

  11. COMPUTING

    P. McBride

    The Computing Project is preparing for a busy year where the primary emphasis of the project moves towards steady operations. Following the very successful completion of the Computing Software and Analysis challenge, CSA06, last fall, we have reorganized and established four groups in the computing area: Commissioning, User Support, Facility/Infrastructure Operations and Data Operations. These groups work closely together with groups from the Offline Project in planning for data processing and operations. Monte Carlo production has continued since CSA06, with about 30M events produced each month to be used for HLT studies and physics validation. Monte Carlo production will continue throughout the year in the preparation of large samples for physics and detector studies, ramping to 50M events/month for CSA07. Commissioning of the full CMS computing system is a major goal for 2007. Site monitoring is an important commissioning component and work is ongoing to devise CMS specific tests to be included in Service Availa...

  12. [Use of the computer as a tool for the implementation of the nursing process--the experience of the São Paulo/UNIFESP].

    de Barros, Alba Lúcia; Fakih, Flávio Trevisani; Michel, Jeanne Liliane

    2002-01-01

    This article reports the pathway used to build a prototype of a computerized clinical decision-making support system for nurses, using the NANDA, NIC and NOC classifications, as an auxiliary tool for entering nursing data into the computerized patient record of Hospital São Paulo/UNIFESP.

  13. A novel approach for computer-assisted template-guided autotransplantation of teeth with custom 3d designed/printed surgical tooling. An ex vivo proof of concept

    Anssari Moin, D.; Derksen, W.; Verweij, J.P.; van Merkesteyn, R.; Wismeijer, D.

    2016-01-01

    Purpose: The aim of this study was to introduce a novel method for accurate autotransplantation with computer-assisted guided templates and assembled custom-designed surgical tooling and to test the feasibility and accuracy of this method ex vivo. Materials and Methods: A partially edentulous human

  14. Accuracy of computer-assisted template-guided autotransplantation of teeth with custom three-dimensional designed/printed surgical tooling : A cadaveric study

    Anssari Moin, D.; Verweij, J.P.; Waars, H.; van Merkesteyn, R.; Wismeijer, D.

    2017-01-01

    Purpose: The aim of the present cadaveric study was to assess the accuracy of computer-assisted template-guided autotransplantation of teeth with custom 3-dimensional (3D) designed/printed surgical tooling. Materials and Methods: Ten partially edentulous human mandibular cadavers were scanned using

  15. COMPUTING

    I. Fisk

    2013-01-01

    Computing activity had ramped down after the completion of the reprocessing of the 2012 data and parked data, but is increasing with new simulation samples for analysis and upgrade studies. Much of the Computing effort is currently involved in activities to improve the computing system in preparation for 2015. Operations Office Since the beginning of 2013, the Computing Operations team successfully re-processed the 2012 data in record time, in part by using opportunistic resources like the San Diego Supercomputer Center, which made it possible to re-process the primary datasets HTMHT and MultiJet in Run2012D much earlier than planned. The Heavy-Ion data-taking period was successfully concluded in February, collecting almost 500 T. Figure 3: Number of events per month (data) In LS1, our emphasis is to increase the efficiency and flexibility of the infrastructure and operation. Computing Operations is working on separating disk and tape at the Tier-1 sites and the full implementation of the xrootd federation ...

  16. Developing a computational tool for predicting physical parameters of a typical VVER-1000 core based on artificial neural network

    Mirvakili, S.M.; Faghihi, F.; Khalafi, H.

    2012-01-01

    Highlights: ► Thermal–hydraulic parameters of a VVER-1000 core are predicted with an artificial neural network (ANN). ► The data required for ANN training are generated with a modified COBRA-EN code and linked together using MATLAB software. ► Based on the ANN, the average and maximum temperatures of fuel and clad as well as the MDNBR of each FA are predicted. -- Abstract: The main goal of the present article is to design a computational tool to predict physical parameters of the VVER-1000 nuclear reactor core based on an artificial neural network (ANN), taking into account a detailed physical model of the fuel rods and coolant channels in a fuel assembly. Predictions of the thermal characteristics of fuel, clad and coolant are performed using a cascade feed-forward ANN based on the linear fission power distribution and the power peaking factors of FAs and hot-channel factors (which were found in our previous neutronic calculations). A software package has been developed to prepare the required data for ANN training; it applies a modified COBRA-EN code for sub-channel analysis and links the codes using the MATLAB software. Based on the current estimation system, five main core TH parameters are predicted: the average and maximum temperatures of fuel and clad as well as the minimum departure from nucleate boiling ratio (MDNBR) for each FA. To find the best training conditions for the considered ANNs, a comprehensive sensitivity study has been performed to examine the effects of varying the hidden neurons, hidden layers, transfer functions, and learning algorithms on the training and simulation results. Performance evaluation results show that the developed ANN can be trained to estimate the core TH parameters of a typical VVER-1000 reactor quickly without loss of accuracy.
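
    As a toy illustration of the approach (not the authors' MATLAB/COBRA-EN tool chain), the sketch below trains a small feed-forward network to map two hypothetical input factors to five thermal-hydraulic outputs. The training data are synthetic stand-ins generated from an arbitrary smooth function, since the real targets would come from COBRA-EN sub-channel analysis.

```python
# Toy sketch with synthetic data (the real training targets would come from
# COBRA-EN sub-channel analysis, as the abstract describes).
import numpy as np
from sklearn.compose import TransformedTargetRegressor
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 2000
# Hypothetical inputs: assembly power peaking factor, hot-channel factor.
X = rng.uniform(low=[0.8, 1.0], high=[1.6, 1.8], size=(n, 2))
# Stand-ins for [avg fuel T, max fuel T, avg clad T, max clad T, MDNBR],
# generated from an arbitrary smooth function for the example.
y = np.column_stack([
    600 + 300 * X[:, 0],
    900 + 500 * X[:, 0] * X[:, 1],
    320 + 40 * X[:, 0],
    340 + 60 * X[:, 1],
    3.0 / (X[:, 0] * X[:, 1]),
])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
# Scale inputs and targets so the network trains on comparable magnitudes.
model = TransformedTargetRegressor(
    regressor=make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(32, 32), activation="tanh",
                     max_iter=3000, random_state=0),
    ),
    transformer=StandardScaler(),
)
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```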

  17. Using FlowLab, an educational computational fluid dynamics tool, to perform a comparative study of turbulence models

    Parihar, A.; Kulkarni, A.; Stern, F.; Xing, T.; Moeykens, S.

    2005-01-01

    Flow over an Ahmed body is a key benchmark case for validating the complex turbulent flow field around vehicles. In spite of the simple geometry, the flow field around an Ahmed body retains critical features of real, external vehicular flow. The present study is an attempt to implement such a real-life example into the course curriculum for undergraduate engineers. FlowLab, a Computational Fluid Dynamics (CFD) tool developed by Fluent Inc. for use in engineering education, allows students to conduct interactive application studies. This paper presents a synopsis of FlowLab, a description of one FlowLab exercise, and an overview of the educational experience students gained through using FlowLab, as assessed through student surveys and examinations. FlowLab-based CFD exercises were implemented into 57:020 Mechanics of Fluids and Transport Processes and 58:160 Intermediate Mechanics of Fluids courses at the University of Iowa in the fall of 2004, although this report focuses only on experiences with the Ahmed body exercise, which was used only in the intermediate-level fluids class, 58:160. This exercise was developed under National Science Foundation funding by the authors of this paper. The focus of this study does not include validating the various turbulence models used for the Ahmed body simulation, because a two-dimensional simplification was applied. With the two-dimensional simplification, students may set up, run, and post-process this model in a 50-minute class period using a single-CPU PC, as required for the 58:160 class at the University of Iowa. It is educational for students to understand the implications of a two-dimensional approximation of an essentially three-dimensional flow field, along with the consequent variation in both qualitative and quantitative results. Additionally, through this exercise, students may realize that the choice of turbulence model affects the simulation predictions. (author)

  18. COMPUTING

    I. Fisk

    2010-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion An activity that is still in progress is computing for the heavy-ion program. The heavy-ion events are collected without zero suppression, so the event size is much larger at roughly 11 MB per event of RAW. The central collisions are more complex and...

  19. COMPUTING

    M. Kasemann and P. McBride, edited by M-C. Sawley, with contributions from P. Kreuzer, D. Bonacorsi, S. Belforte, F. Wuerthwein, L. Bauerdick, K. Lassila-Perini and M-C. Sawley

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the comput...

  20. COMPUTING

    P. McBride

    It has been a very active year for the computing project with strong contributions from members of the global community. The project has focused on site preparation and Monte Carlo production. The operations group has begun processing data from P5 as part of the global data commissioning. Improvements in transfer rates and site availability have been seen as computing sites across the globe prepare for large scale production and analysis as part of CSA07. Preparations for the upcoming Computing Software and Analysis Challenge CSA07 are progressing. Ian Fisk and Neil Geddes have been appointed as coordinators for the challenge. CSA07 will include production tests of the Tier-0 production system, reprocessing at the Tier-1 sites and Monte Carlo production at the Tier-2 sites. At the same time there will be a large analysis exercise at the Tier-2 centres. Pre-production simulation of the Monte Carlo events for the challenge is beginning. Scale tests of the Tier-0 will begin in mid-July and the challenge it...

  1. COMPUTING

    M. Kasemann

    Introduction During the past six months, Computing participated in the STEP09 exercise, had a major involvement in the October exercise and has been working with CMS sites on improving open issues relevant for data taking. At the same time, operations for MC production, real data reconstruction and re-reconstructions and data transfers at large scales were performed. STEP09 was successfully conducted in June as a joint exercise with ATLAS and the other experiments. It gave a good indication of the readiness of the WLCG infrastructure, with the two major LHC experiments stressing the reading, writing and processing of physics data. The October Exercise, in contrast, was conducted as an all-CMS exercise, where Physics, Computing and Offline worked on a common plan to exercise all steps to efficiently access and analyze data. As one of the major results, the CMS Tier-2s demonstrated that they are fully capable of performing data analysis. In recent weeks, efforts were devoted to CMS Computing readiness. All th...

  2. COMPUTING

    I. Fisk

    2011-01-01

    Introduction It has been a very active quarter in Computing with interesting progress in all areas. The activity level at the computing facilities, driven by both organised processing from data operations and user analysis, has been steadily increasing. The large-scale production of simulated events that has been progressing throughout the fall is wrapping-up and reprocessing with pile-up will continue. A large reprocessing of all the proton-proton data has just been released and another will follow shortly. The number of analysis jobs by users each day, that was already hitting the computing model expectations at the time of ICHEP, is now 33% higher. We are expecting a busy holiday break to ensure samples are ready in time for the winter conferences. Heavy Ion The Tier 0 infrastructure was able to repack and promptly reconstruct heavy-ion collision data. Two copies were made of the data at CERN using a large CASTOR disk pool, and the core physics sample was replicated ...

  3. COMPUTING

    I. Fisk

    2012-01-01

    Introduction Computing continued with a high level of activity over the winter in preparation for conferences and the start of the 2012 run. 2012 brings new challenges with a new energy, more complex events, and the need to make the best use of the available time before the Long Shutdown. We expect to be resource constrained on all tiers of the computing system in 2012 and are working to ensure the high-priority goals of CMS are not impacted. Heavy ions After a successful 2011 heavy-ion run, the programme is moving to analysis. During the run, the CAF resources were well used for prompt analysis. Since then in 2012 on average 200 job slots have been used continuously at Vanderbilt for analysis workflows. Operations Office As of 2012, the Computing Project emphasis has moved from commissioning to operation of the various systems. This is reflected in the new organisation structure where the Facilities and Data Operations tasks have been merged into a common Operations Office, which now covers everything ...

  4. COMPUTING

    M. Kasemann

    CCRC’08 challenges and CSA08 During the February campaign of the Common Computing readiness challenges (CCRC’08), the CMS computing team had achieved very good results. The link between the detector site and the Tier0 was tested by gradually increasing the number of parallel transfer streams well beyond the target. Tests covered the global robustness at the Tier0, processing a massive number of very large files and with a high writing speed to tapes.  Other tests covered the links between the different Tiers of the distributed infrastructure and the pre-staging and reprocessing capacity of the Tier-1s: response time, data transfer rate and success rate for Tape to Buffer staging of files kept exclusively on Tape were measured. In all cases, coordination with the sites was efficient and no serious problem was found. These successful preparations prepared the ground for the second phase of the CCRC’08 campaign, in May. The Computing Software and Analysis challen...

  5. COMPUTING

    I. Fisk

    2010-01-01

    Introduction The first data taking period of November produced a first scientific paper, and this is a very satisfactory step for Computing. It also gave the invaluable opportunity to learn and debrief from this first, intense period, and make the necessary adaptations. The alarm procedures between different groups (DAQ, Physics, T0 processing, Alignment/calibration, T1 and T2 communications) have been reinforced. A major effort has also been invested into remodeling and optimizing operator tasks in all activities in Computing, in parallel with the recruitment of new Cat A operators. The teams are being completed and by mid year the new tasks will have been assigned. CRB (Computing Resource Board) The Board met twice since last CMS week. In December it reviewed the experience of the November data-taking period and could measure the positive improvements made for the site readiness. It also reviewed the policy under which Tier-2 are associated with Physics Groups. Such associations are decided twice per ye...

  6. COMPUTING

    M. Kasemann

    Introduction More than seventy CMS collaborators attended the Computing and Offline Workshop in San Diego, California, April 20-24th to discuss the state of readiness of software and computing for collisions. Focus and priority were given to preparations for data taking and providing room for ample dialog between groups involved in Commissioning, Data Operations, Analysis and MC Production. Throughout the workshop, aspects of software, operating procedures and issues addressing all parts of the computing model were discussed. Plans for the CMS participation in STEP’09, the combined scale testing for all four experiments due in June 2009, were refined. The article in CMS Times by Frank Wuerthwein gave a good recap of the highly collaborative atmosphere of the workshop. Many thanks to UCSD and to the organizers for taking care of this workshop, which resulted in a long list of action items and was definitely a success. A considerable amount of effort and care is invested in the estimate of the co...

  7. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    Ramos Hector

    2011-03-01

    Abstract Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments, including target selection, transition optimization and post-acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Results We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management and generation of protein, peptide and transition lists, and the validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of Java algorithm classes for their own algorithm plug-ins or connection via an external web site. This integrated system supports all steps in an SRM-based experiment and provides a user-friendly GUI that can be run on any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted
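
    As a toy illustration of one step such a pipeline automates (not ATAQS code; all values below are invented), the following snippet ranks candidate SRM transitions per peptide by library fragment intensity and keeps the top N for the instrument method.

```python
# Toy illustration (not ATAQS code) of transition selection: rank candidate
# SRM transitions per peptide by library fragment intensity, keep the top N.
# All peptides, m/z values, and intensities here are invented.
from collections import defaultdict

# (peptide, precursor m/z, fragment ion, fragment m/z, library intensity)
candidates = [
    ("ELVISLIVESK", 613.8, "y7", 775.5, 8400),
    ("ELVISLIVESK", 613.8, "y6", 662.4, 9900),
    ("ELVISLIVESK", 613.8, "y5", 549.3, 3100),
    ("ELVISLIVESK", 613.8, "b4", 441.3, 1200),
    ("PEPTIDER",    478.7, "y5", 617.3, 7200),
    ("PEPTIDER",    478.7, "y4", 504.2, 6800),
    ("PEPTIDER",    478.7, "y3", 391.2, 900),
]

def select_transitions(candidates, n_per_peptide=3):
    by_peptide = defaultdict(list)
    for row in candidates:
        by_peptide[row[0]].append(row)
    selected = []
    for peptide, rows in by_peptide.items():
        rows.sort(key=lambda r: r[4], reverse=True)  # most intense first
        selected.extend(rows[:n_per_peptide])
    return selected

for pep, q1, ion, q3, intensity in select_transitions(candidates):
    print(f"{pep}: Q1={q1} -> Q3={q3} ({ion}, intensity {intensity})")
```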

  8. Dinosaurs and fossils living without dangerous tools: Social representations of computers and the Internet by elderly Finnish and American non-users

    Päivi Rasi (previously Hakkarainen)

    2014-11-01

    This study compares the computer- and Internet-related conceptions of Finnish and American elderly people who deliberately refuse to use the Internet. It seeks to answer the following questions based on various social representations: Are there similarities and differences in the way the Finnish and American respondents classify the computer and the Internet? Are there similarities and differences in the images the Finnish and American respondents use to depict the computer and the Internet? How do the social representations of the computer and the Internet express the respondents’ distinct identities, history and culture? An analysis of written accounts provided by elderly Finnish and American people showed that both groups expressed an understanding of the computer and the Internet as a ‘Tool and Thing’ and a ‘Danger’. However, differences existed between their understandings of the computer as a ‘Depriver of Freedom’ and a ‘Marker of Differences’. The study concludes that their distinct identities, interests, history and culture may be among the factors that limit their motivation and capacity to welcome and use the computer. To promote digital inclusion, the elderly should be provided with Internet-related information, training and support. At the same time, however, digital inclusion policies should also encompass a choice for Internet non-use.

  9. Report of the 2. research co-ordination meeting of the co-ordinated research programme on the development of computer-based troubleshooting tools and instruments

    1998-11-01

    The Research coordination meeting reviewed current results on the Development of Computer-Based Troubleshooting Tools and Instruments. Presentations at the meeting were made by the participants, and the project summary reports include: PC based software for troubleshooting microprocessor-based instruments; technical data base software; design and construction of a random pulser for maintenance and quality control of a nuclear counting system; microprocessor-based power conditioner; in-circuit emulator for microprocessor-based nuclear instruments; PC-based analog signal generator for simulated detector signals and arbitrary test waveforms for testing of nuclear instruments; expert system for nuclear instrument troubleshooting; development and application of versatile computer-based measurement and diagnostic tools; and development of a programmable signal generator for troubleshooting of nuclear instrumentation

  10. Nanopore sequencing technology and tools for genome assembly: computational analysis of the current state, bottlenecks and future directions.

    Senol Cali, Damla; Kim, Jeremie S; Ghose, Saugata; Alkan, Can; Mutlu, Onur

    2018-04-02

    Nanopore sequencing technology has the potential to render other sequencing technologies obsolete with its ability to generate long reads and provide portability. However, high error rates of the technology pose a challenge while generating accurate genome assemblies. The tools used for nanopore sequence analysis are of critical importance, as they should overcome the high error rates of the technology. Our goal in this work is to comprehensively analyze current publicly available tools for nanopore sequence analysis to understand their advantages, disadvantages and performance bottlenecks. It is important to understand where current tools do not perform well in order to develop better ones. To this end, we (1) analyze the multiple steps and the associated tools in the genome assembly pipeline using nanopore sequence data, and (2) provide guidelines for determining the appropriate tools for each step. Based on our analyses, we make four key observations: (1) The choice of the tool for basecalling plays a critical role in overcoming the high error rates of nanopore sequencing technology. (2) Read-to-read overlap finding tools, GraphMap and Minimap, perform similarly in terms of accuracy. However, Minimap has a lower memory usage, and it is faster than GraphMap. (3) There is a trade-off between accuracy and performance when deciding on the appropriate tool for the assembly step. The fast but less accurate assembler Miniasm can be used for quick initial assembly, and further polishing can be applied on top of it to increase the accuracy, which leads to faster overall assembly. (4) The state-of-the-art polishing tool, Racon, generates high-quality consensus sequences while providing a significant speedup over another polishing tool, Nanopolish. We analyze various combinations of different tools and expose the trade-offs between accuracy, performance, memory usage and scalability. We conclude that our observations can guide researchers and practitioners in making conscious
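
    As a rough sketch of the overlap-layout-polish pipeline the review analyzes, the snippet below drives minimap2 (the successor of the Minimap tool discussed in the abstract), miniasm, and racon from Python. The flags shown reflect common usage of those tools but should be verified against their documentation, and the file names are placeholders.

```python
# Sketch of the overlap -> layout -> polish pipeline the review analyzes,
# driven via subprocess. Tool flags reflect common minimap2/miniasm/racon
# usage and should be checked against each tool's docs; file names are
# placeholders.
import subprocess

reads = "reads.fastq"

def sh(cmd, stdout_path):
    with open(stdout_path, "w") as out:
        subprocess.run(cmd, stdout=out, check=True)

# 1. All-vs-all read overlaps (minimap2 ONT all-vs-all preset).
sh(["minimap2", "-x", "ava-ont", reads, reads], "overlaps.paf")

# 2. Layout with miniasm (fast but unpolished draft assembly).
sh(["miniasm", "-f", reads, "overlaps.paf"], "draft.gfa")

# Extract FASTA contigs from the GFA 'S' (segment) lines.
with open("draft.gfa") as gfa, open("draft.fasta", "w") as fa:
    for line in gfa:
        if line.startswith("S"):
            _, name, seq = line.split("\t")[:3]
            fa.write(f">{name}\n{seq}\n")

# 3. Map reads back to the draft, then polish the consensus with racon.
sh(["minimap2", "-x", "map-ont", "draft.fasta", reads], "mapped.paf")
sh(["racon", reads, "mapped.paf", "draft.fasta"], "polished.fasta")
```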

  11. COMPUTING

    2010-01-01

    Introduction Just two months after the “LHC First Physics” event of 30th March, the analysis of the O(200) million 7 TeV collision events in CMS accumulated during the first 60 days is well under way. The consistency of the CMS computing model has been confirmed during these first weeks of data taking. This model is based on a hierarchy of use-cases deployed between the different tiers and, in particular, the distribution of RECO data to T1s, who then serve data on request to T2s, along a topology known as “fat tree”. Indeed, during this period this model was further extended by almost full “mesh” commissioning, meaning that RECO data were shipped to T2s whenever possible, enabling additional physics analyses compared with the “fat tree” model. Computing activities at the CMS Analysis Facility (CAF) have been marked by a good time response for a load almost evenly shared between ALCA (Alignment and Calibration tasks - highest p...

  12. COMPUTING

    Contributions from I. Fisk

    2012-01-01

    Introduction The start of the 2012 run has been busy for Computing. We have reconstructed, archived, and served a larger sample of new data than in 2011, and we are in the process of producing an even larger new sample of simulations at 8 TeV. The running conditions and system performance are largely what was anticipated in the plan, thanks to the hard work and preparation of many people. Heavy ions Heavy Ions has been actively analysing data and preparing for conferences.  Operations Office Figure 6: Transfers from all sites in the last 90 days For ICHEP and the Upgrade efforts, we needed to produce and process record amounts of MC samples while supporting the very successful data-taking. This was a large burden, especially on the team members. Nevertheless the last three months were very successful and the total output was phenomenal, thanks to our dedicated site admins who keep the sites operational and the computing project members who spend countless hours nursing the...

  13. COMPUTING

    M. Kasemann

    Introduction A large fraction of the effort was focused during the last period into the preparation and monitoring of the February tests of the Common VO Computing Readiness Challenge 08. CCRC08 is being run by the WLCG collaboration in two phases, between the centres and all experiments. The February test is dedicated to functionality tests, while the May challenge will consist of running at all centres and with full workflows. For this first period, a number of functionality checks of the computing power, data repositories and archives as well as network links are planned. This will help assess the reliability of the systems under a variety of loads and identify possible bottlenecks. Many tests are scheduled together with other VOs, allowing the full scale stress test. The data rates (writing, accessing and transferring) are being checked under a variety of loads and operating conditions, as well as the reliability and transfer rates of the links between Tier-0 and Tier-1s. In addition, the capa...

  14. COMPUTING

    Matthias Kasemann

    Overview The main focus during the summer was to handle data coming from the detector and to perform Monte Carlo production. The lessons learned during the CCRC and CSA08 challenges in May were addressed by dedicated PADA campaigns led by the Integration team. Big improvements were achieved in the stability and reliability of the CMS Tier1 and Tier2 centres by regular and systematic follow-up of faults and errors with the help of the Savannah bug tracking system. In preparation for data taking, the roles of a Computing Run Coordinator and regular computing shifts monitoring the services and infrastructure as well as interfacing to the data operations tasks are being defined. The shift plan until the end of 2008 is being put together. User support worked on documentation and organized several training sessions. The ECoM task force delivered the report on “Use Cases for Start-up of pp Data-Taking” with recommendations and a set of tests to be performed for trigger rates much higher than the ...

  15. COMPUTING

    P. McBride

    The Computing Software and Analysis Challenge CSA07 has been the main focus of the Computing Project for the past few months. Activities began over the summer with the preparation of the Monte Carlo data sets for the challenge and tests of the new production system at the Tier-0 at CERN. The pre-challenge Monte Carlo production was done in several steps: physics generation, detector simulation, digitization, conversion to RAW format, and running the samples through the High Level Trigger (HLT). The data was then merged into three "Soups": Chowder (ALPGEN), Stew (Filtered Pythia) and Gumbo (Pythia). The challenge officially started when the first Chowder events were reconstructed on the Tier-0 on October 3rd. The data operations teams were very busy during the challenge period. The MC production teams continued with signal production and processing while the Tier-0 and Tier-1 teams worked on splitting the Soups into Primary Data Sets (PDS), reconstruction and skimming. The storage sys...

  16. COMPUTING

    I. Fisk

    2013-01-01

    Computing operation has been lower as the Run 1 samples are completing and smaller samples for upgrades and preparations are ramping up. Much of the computing activity is focusing on preparations for Run 2 and improvements in data access and flexibility of using resources. Operations Office Data processing was slow in the second half of 2013 with only the legacy re-reconstruction pass of 2011 data being processed at the sites.   Figure 1: MC production and processing was more in demand with a peak of over 750 Million GEN-SIM events in a single month.   Figure 2: The transfer system worked reliably and efficiently and transferred on average close to 520 TB per week with peaks at close to 1.2 PB.   Figure 3: The volume of data moved between CMS sites in the last six months   The tape utilisation was a focus for the operation teams with frequent deletion campaigns from deprecated 7 TeV MC GEN-SIM samples to INVALID datasets, which could be cleaned up...

  17. Starpc: a library for communication among tools on a parallel computer cluster. User's and developer's guide to Starpc

    Takemiya, Hiroshi; Yamagishi, Nobuhiro

    2000-02-01

    We report on an RPC (Remote Procedure Call)-based communication library, Starpc, for a parallel computer cluster. Starpc supports communication between Java Applets and C programs as well as between C programs. Starpc has the following three features. (1) It enables communication between Java Applets and C programs on an arbitrary computer without security violation, even though Java Applets are normally allowed to communicate only with programs on one specific computer (the Web server) because of security restrictions. (2) Diverse network communication protocols are available in Starpc, because it uses the Nexus communication library developed at Argonne National Laboratory. (3) It works on many kinds of computers, including eight parallel computers and four workstation servers. In this report, the usage of Starpc and the development of applications using Starpc are described. (author)
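
    Starpc itself targets C programs and Java Applets; as a language-agnostic illustration of the remote procedure call pattern it builds on, the sketch below runs a minimal XML-RPC server and client in a single Python process. The function, host, and port names are invented for the example.

```python
# Conceptual stand-in (not Starpc itself, which targets C and Java Applets):
# a minimal XML-RPC server and client illustrating the RPC pattern that
# Starpc generalizes across a cluster.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def simulate_step(step: int) -> str:
    # Stand-in for a computation that a remote C program would perform.
    return f"step {step} done"

server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
server.register_function(simulate_step)
threading.Thread(target=server.serve_forever, daemon=True).start()

client = ServerProxy("http://localhost:8000")
print(client.simulate_step(1))  # -> "step 1 done"
server.shutdown()
```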

  18. A new strategic neurosurgical planning tool for brainstem cavernous malformations using interactive computer graphics with multimodal fusion images.

    Kin, Taichi; Nakatomi, Hirofumi; Shojima, Masaaki; Tanaka, Minoru; Ino, Kenji; Mori, Harushi; Kunimatsu, Akira; Oyama, Hiroshi; Saito, Nobuhito

    2012-07-01

    In this study, the authors used preoperative simulation employing 3D computer graphics (interactive computer graphics) to fuse all imaging data for brainstem cavernous malformations. The authors evaluated whether interactive computer graphics or 2D imaging correlated better with the actual operative field, particularly in identifying a developmental venous anomaly (DVA). The study population consisted of 10 patients scheduled for surgical treatment of brainstem cavernous malformations. Data from preoperative imaging (MRI, CT, and 3D rotational angiography) were automatically fused using a normalized mutual information method, and then reconstructed by a hybrid method combining surface rendering and volume rendering methods. With surface rendering, multimodality and multithreshold techniques for 1 tissue were applied. The completed interactive computer graphics were used for simulation of surgical approaches and assumed surgical fields. Preoperative diagnostic rates for a DVA associated with brainstem cavernous malformation were compared between conventional 2D imaging and interactive computer graphics employing receiver operating characteristic (ROC) analysis. The time required for reconstruction of 3D images was 3-6 hours for interactive computer graphics. Observation in interactive mode required approximately 15 minutes. Detailed anatomical information for operative procedures, from the craniotomy to microsurgical operations, could be visualized and simulated three-dimensionally as 1 computer graphic using interactive computer graphics. Virtual surgical views were consistent with actual operative views. This technique was very useful for examining various surgical approaches. Mean (±SEM) area under the ROC curve for the rate of DVA diagnosis was significantly better for interactive computer graphics (1.000±0.000) than for 2D imaging (0.766±0.091; p < 0.05); DVAs were identified more accurately with interactive computer graphics than with 2D images. Interactive computer graphics was also useful in helping to plan the surgical
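
    The automatic fusion step relies on a normalized mutual information measure. As a minimal sketch of that measure, using the common definition NMI(A, B) = (H(A) + H(B)) / H(A, B) over a joint intensity histogram (the abstract itself does not spell out the formula), the snippet below computes NMI for a pair of synthetic "modalities" and shows that alignment raises the score.

```python
# Minimal sketch of the normalized mutual information (NMI) similarity
# measure used for automatic image fusion, following the common definition
# NMI(A, B) = (H(A) + H(B)) / H(A, B) over a joint intensity histogram.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def normalized_mutual_information(a, b, bins=64):
    joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pab = joint / joint.sum()
    pa, pb = pab.sum(axis=1), pab.sum(axis=0)
    return (entropy(pa) + entropy(pb)) / entropy(pab.ravel())

# Registration-style check with synthetic images: NMI is higher for an
# aligned pair than for a shuffled (misaligned) one.
rng = np.random.default_rng(0)
mri = rng.normal(size=(64, 64))
ct = 2.0 * mri + rng.normal(scale=0.1, size=mri.shape)  # related "modality"
print(normalized_mutual_information(mri, ct))                            # high
print(normalized_mutual_information(mri, rng.permutation(ct.ravel())))   # low
```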

  19. COMPUTING

    I. Fisk

    2011-01-01

    Introduction The Computing Team successfully completed the storage, initial processing, and distribution for analysis of proton-proton data in 2011. There are still a variety of activities ongoing to support winter conference activities and preparations for 2012. Heavy ions The heavy-ion run for 2011 started in early November and has already demonstrated good machine performance and success of some of the more advanced workflows planned for 2011. Data collection will continue until early December. Facilities and Infrastructure Operations Operational and deployment support for the WMAgent and WorkQueue+Request Manager components, routinely used in production by Data Operations, is provided. The GlideInWMS and components installation are now deployed at CERN, which adds to the GlideInWMS factory placed in the US. There has been new operational collaboration between the CERN team and the UCSD GlideIn factory operators, covering each other's time zones by monitoring/debugging pilot jobs sent from the facto...

  20. COMPUTING

    M. Kasemann

    CMS relies on a well-functioning, distributed computing infrastructure. The Site Availability Monitoring (SAM) and the Job Robot submission have been very instrumental for site commissioning, increasing the number of sites that are available to participate in CSA07 and ready to be used for analysis. The commissioning process has been further developed, including "lessons learned" documentation via the CMS twiki. Recently the visualization, presentation and summarizing of SAM tests for sites has been redesigned; it is now developed by the central ARDA project of WLCG. Work to test the new gLite Workload Management System was performed; a fourfold increase in throughput with respect to the LCG Resource Broker was observed. CMS has designed and launched a new-generation traffic load generator called "LoadTest" to commission and to keep exercised all data transfer routes in the CMS PhEDEx topology. Since mid-February, a transfer volume of about 12 P...