WorldWideScience

Sample records for rendering utility toolkit

  1. Development and evaluation of an RN/RPN utilization toolkit.

    Science.gov (United States)

    Blastorah, Margaret; Alvarado, Kim; Duhn, Lenora; Flint, Frances; McGrath, Petrina; Vandevelde-Coke, Susan

    2010-05-01

To develop and evaluate a toolkit for Registered Nurse/Registered Practical Nurse (RN/RPN) staff mix decision-making based on the College of Nurses of Ontario's practice standard for utilization of RNs and RPNs. A descriptive exploratory design was used. The toolkit was tested in a sample of 2,069 inpatients on 36 medical/surgical units in five academic and two community acute care hospitals in southern Ontario. Survey and focus group data were used to evaluate the toolkit's psychometric properties, feasibility of use and utility. Results support the validity and reliability of the Patient Care Needs Assessment (PCNA) tool and the consensus-based process for conducting patient care reviews. Review participants valued the consensus approach. There was limited evidence for the validity and utility of the Unit Environmental Profile (UEP) tool. Nursing unit leaders reported confidence in planning unit staff mix ratios based on information generated through application of the toolkit, specifically the PCNA, although they were less clear about how to incorporate environmental data into staff mix decisions. Results confirm that the toolkit consistently measured the constructs that it was intended to measure and was useful in informing RN/RPN staff mix decision-making. Further refinement and testing of the UEP is required. Future research is needed to evaluate the quality of decisions resulting from the application of the toolkit, illuminate processes for integrating data into decisions and adapt the toolkit for application in other sectors.

  2. Flight-appropriate 3D Terrain-rendering Toolkit for Synthetic Vision Project

    Data.gov (United States)

    National Aeronautics and Space Administration — TerraMetrics proposes an SBIR Phase I R/R&D effort to develop a key 3D terrain-rendering technology that provides the basis for successful commercial deployment...

  3. Flight-appropriate 3D Terrain-rendering Toolkit for Synthetic Vision Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The TerraBlocks™ 3D terrain data format and terrain-block-rendering methodology provides an enabling basis for successful commercial deployment of...

  4. PHISICS TOOLKIT: MULTI-REACTOR TRANSMUTATION ANALYSIS UTILITY - MRTAU

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Alfonsi; Cristian Rabiti; Aaron S. Epiney; Yaqi Wang; Joshua Cogliati

    2012-04-01

The principal aim of this paper is to present the new capabilities available in the PHISICS toolkit in connection with the implementation of the depletion code MRTAU, a generic depletion/decay/burn-up code developed at the Idaho National Laboratory. It is programmed in a modular structure in modern FORTRAN 95/2003. The code tracks the time evolution of the isotopic concentration of a given material, accounting for nuclear reactions occurring in the presence of a neutron flux as well as natural decay. MRTAU offers two different methods for performing the depletion calculation, letting the user choose the one best suited to their needs. Both methodologies and some significant results are reported in this paper.

  5. Information from Searching Content with an Ontology-Utilizing Toolkit (iSCOUT).

    Science.gov (United States)

    Lacson, Ronilda; Andriole, Katherine P; Prevedello, Luciano M; Khorasani, Ramin

    2012-08-01

    Radiology reports are permanent legal documents that serve as official interpretation of imaging tests. Manual analysis of textual information contained in these reports requires significant time and effort. This study describes the development and initial evaluation of a toolkit that enables automated identification of relevant information from within these largely unstructured text reports. We developed and made publicly available a natural language processing toolkit, Information from Searching Content with an Ontology-Utilizing Toolkit (iSCOUT). Core functions are included in the following modules: the Data Loader, Header Extractor, Terminology Interface, Reviewer, and Analyzer. The toolkit enables search for specific terms and retrieval of (radiology) reports containing exact term matches as well as similar or synonymous term matches within the text of the report. The Terminology Interface is the main component of the toolkit. It allows query expansion based on synonyms from a controlled terminology (e.g., RadLex or National Cancer Institute Thesaurus [NCIT]). We evaluated iSCOUT document retrieval of radiology reports that contained liver cysts, and compared precision and recall with and without using NCIT synonyms for query expansion. iSCOUT retrieved radiology reports with documented liver cysts with a precision of 0.92 and recall of 0.96, utilizing NCIT. This recall (i.e., utilizing the Terminology Interface) is significantly better than using each of two search terms alone (0.72, p=0.03 for liver cyst and 0.52, p=0.0002 for hepatic cyst). iSCOUT reliably assembled relevant radiology reports for a cohort of patients with liver cysts with significant improvement in document retrieval when utilizing controlled lexicons.
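The synonym-driven query expansion and the precision/recall evaluation described above can be sketched in a few lines. This is a minimal illustration, not the iSCOUT code: the `SYNONYMS` table, the sample reports, and all function names are invented stand-ins for a controlled terminology such as NCIT.

```python
# Hypothetical synonym table standing in for a controlled terminology (e.g. NCIT).
SYNONYMS = {"liver cyst": ["hepatic cyst", "cystic liver lesion"]}

def expand_query(term):
    """Return the term plus any synonyms from the controlled vocabulary."""
    return [term] + SYNONYMS.get(term, [])

def retrieve(reports, terms):
    """Return indices of reports containing any of the query terms."""
    return {i for i, text in enumerate(reports)
            if any(t in text.lower() for t in terms)}

def precision_recall(retrieved, relevant):
    tp = len(retrieved & relevant)
    precision = tp / len(retrieved) if retrieved else 0.0
    recall = tp / len(relevant) if relevant else 0.0
    return precision, recall

reports = [
    "simple liver cyst in segment IV",   # relevant
    "small hepatic cyst, unchanged",     # relevant, synonym wording
    "no focal liver lesion",             # not relevant
]
relevant = {0, 1}

# A single search term misses the synonym wording ...
p1, r1 = precision_recall(retrieve(reports, ["liver cyst"]), relevant)
# ... while the expanded query recovers it.
p2, r2 = precision_recall(retrieve(reports, expand_query("liver cyst")), relevant)
print(p1, r1, p2, r2)  # 1.0 0.5 1.0 1.0
```

As in the study, recall improves with expansion while precision is preserved on this toy corpus.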

  6. JAVA Stereo Display Toolkit

    Science.gov (United States)

    Edmonds, Karina

    2008-01-01

This toolkit provides a common interface for displaying graphical user interface (GUI) components in stereo using either specialized stereo display hardware (e.g., liquid crystal shutter or polarized glasses) or anaglyph display (red/blue glasses) on standard workstation displays. An application using this toolkit will work without modification in either environment, allowing stereo software to reach a wider audience without sacrificing high-quality display on dedicated hardware. The toolkit is written in Java for use with the Swing GUI Toolkit and has cross-platform compatibility. It hooks into the graphics system, allowing any standard Swing component to be displayed in stereo. It uses the OpenGL graphics library to control the stereo hardware and to perform the rendering. It also supports anaglyph and special stereo hardware using the same API (application-program interface), and can simulate color stereo in anaglyph mode by combining the red band of the left image with the green/blue bands of the right image. This is a low-level toolkit that handles only the display of components (including the JadeDisplay image display component). It does not include higher-level functions such as disparity adjustment, 3D cursors, or overlays, all of which can be built using this toolkit.
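The color-anaglyph trick mentioned above (red band of the left image combined with the green/blue bands of the right image) reduces to a per-pixel channel merge. A minimal sketch, with images modelled as lists of (r, g, b) tuples rather than the toolkit's actual Java/OpenGL types:

```python
def anaglyph(left, right):
    """Merge a stereo pair into one red/cyan anaglyph image:
    red channel from the left eye, green/blue from the right eye."""
    return [(l[0], r[1], r[2]) for l, r in zip(left, right)]

left  = [(200, 10, 20), (90, 80, 70)]
right = [(40, 150, 160), (30, 60, 50)]
print(anaglyph(left, right))  # [(200, 150, 160), (90, 60, 50)]
```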

  7. TOOLKIT, Version 2.0

    Energy Technology Data Exchange (ETDEWEB)

    Schroeder, E.; Bagot, B.; McNeill, R.L.

    1990-05-09

The purpose of this User's Guide is to show by example many of the features of Toolkit II. Some examples will be copies of screens as they appear while running the Toolkit. Other examples will show what the user should enter in various situations; in these instances, what the computer asserts will be in boldface and what the user responds will be in regular type. The User's Guide is divided into four sections. The first section, "FOCUS Databases", will give a broad overview of the Focus administrative databases that are available on the VAX; easy-to-use reports are available for most of them in the Toolkit. The second section, "Getting Started", will cover the steps necessary to log onto the Computer Center VAX cluster and how to start Focus and the Toolkit. The third section, "Using the Toolkit", will discuss some of the features in the Toolkit: the available reports and how to access them, as well as some utilities. The fourth section, "Helpful Hints", will cover some useful facts about the VAX and Focus, as well as some of the more common problems that can occur. The Toolkit is not set in concrete but is continually being revised and improved. If you have any opinions as to changes that you would like to see made to the Toolkit, or new features that you would like included, please let us know. Since we do try to respond to the needs of the user and make periodic improvements to the Toolkit, this User's Guide may not correspond exactly to what is available on the computer. In general, changes are made to provide new options or features; rarely is an existing feature deleted.

  8. Quantum rendering

    Science.gov (United States)

    Lanzagorta, Marco O.; Gomez, Richard B.; Uhlmann, Jeffrey K.

    2003-08-01

    In recent years, computer graphics has emerged as a critical component of the scientific and engineering process, and it is recognized as an important computer science research area. Computer graphics are extensively used for a variety of aerospace and defense training systems and by Hollywood's special effects companies. All these applications require the computer graphics systems to produce high quality renderings of extremely large data sets in short periods of time. Much research has been done in "classical computing" toward the development of efficient methods and techniques to reduce the rendering time required for large datasets. Quantum Computing's unique algorithmic features offer the possibility of speeding up some of the known rendering algorithms currently used in computer graphics. In this paper we discuss possible implementations of quantum rendering algorithms. In particular, we concentrate on the implementation of Grover's quantum search algorithm for Z-buffering, ray-tracing, radiosity, and scene management techniques. We also compare the theoretical performance between the classical and quantum versions of the algorithms.
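The speedup Grover's algorithm offers for the search steps above can be made concrete with a back-of-the-envelope count: an unstructured classical search over N candidates costs O(N) probes, while Grover needs roughly (π/4)·√(N/M) iterations to find one of M marked items. The numbers below are illustrative, not taken from the paper:

```python
import math

def grover_iterations(n, m=1):
    """Approximate optimal number of Grover iterations to find one of
    m marked items among n candidates: floor((pi/4) * sqrt(n/m))."""
    return math.floor(math.pi / 4 * math.sqrt(n / m))

n = 1_000_000  # e.g. candidate fragments examined in a z-buffer search
print(n, "classical probes vs", grover_iterations(n), "Grover iterations")
# prints: 1000000 classical probes vs 785 Grover iterations
```

The quadratic gap (10^6 vs ~785) is why even modest quantum hardware could, in principle, accelerate these rendering sub-searches.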

  9. Tribal Green Building Toolkit

    Science.gov (United States)

    This Tribal Green Building Toolkit (Toolkit) is designed to help tribal officials, community members, planners, developers, and architects develop and adopt building codes to support green building practices. Anyone can use this toolkit!

  10. Antenna toolkit

    CERN Document Server

    Carr, Joseph

    2006-01-01

Joe Carr has provided radio amateurs and short-wave listeners with the definitive design guide for sending and receiving radio signals with Antenna Toolkit 2nd edition. Together with the powerful suite of CD software, the reader will have a complete solution for constructing or using an antenna - bar the actual hardware! The software provides a simple Windows-based aid to carrying out the design calculations at the heart of successful antenna design. All the user needs to do is select the antenna type and set the frequency - a much more fun and less error-prone method than using a con

  11. The MIS Pipeline Toolkit

    Science.gov (United States)

    Teuben, Peter J.; Pound, M. W.; Storm, S.; Mundy, L. G.; Salter, D. M.; Lee, K.; Kwon, W.; Fernandez Lopez, M.; Plunkett, A.

    2013-01-01

A pipeline toolkit was developed to help organize, reduce, and analyze a large number of near-identical datasets. This is a very general problem, for which many different solutions have been implemented. In this poster we present one such solution that lends itself to users of the Unix command line, using the Unix "make" utility, and adapts itself easily to observational as well as theoretical projects. Two examples are given, one from the CARMA CLASSy survey, and another from a simulated kinematic survey of early galaxy-forming disks. The CLASSy survey (discussed in more detail in three accompanying posters) consists of 5 different star-forming regions, observed with CARMA, each containing roughly 10-20 datasets in continuum and 3 different molecular lines, that need to be combined into final data cubes and maps. The strength of such a pipeline toolkit shows itself as new data are accumulated and the data reduction steps are improved and easily re-applied to previously taken data. For this we employed a master script that was run nightly, and collaborators submitted improved scripts and/or pipeline parameters that control these scripts. MIS is freely available for download.
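The "make"-style behaviour the toolkit relies on (re-run a reduction step only when its inputs are newer than its products) can be mimicked in a few lines. This is a toy scheduler, not the MIS toolkit itself; the file names and the simulated clock are invented for illustration:

```python
mtimes = {}      # simulated file -> modification time
clock = [0]      # monotonically increasing fake clock

def touch(name):
    clock[0] += 1
    mtimes[name] = clock[0]

def rule(target, deps, action):
    """Run action only if target is missing or older than any dependency,
    mirroring how a make rule decides whether to fire."""
    if target not in mtimes or any(mtimes[d] > mtimes[target] for d in deps):
        action()
        touch(target)
        return True
    return False

log = []
touch("raw.dat")
rule("cube.fits", ["raw.dat"], lambda: log.append("reduce"))  # builds the cube
rule("cube.fits", ["raw.dat"], lambda: log.append("reduce"))  # up to date: skipped
touch("raw.dat")                                              # new data arrive
rule("cube.fits", ["raw.dat"], lambda: log.append("reduce"))  # re-runs the step
print(log)  # ['reduce', 'reduce']
```

This is exactly the property the poster highlights: improved scripts or new data trigger only the reductions that are actually stale.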

  12. NAIF Toolkit - Extended

    Science.gov (United States)

    Acton, Charles H., Jr.; Bachman, Nathaniel J.; Semenov, Boris V.; Wright, Edward D.

    2010-01-01

The Navigation Ancillary Information Facility (NAIF) at JPL, acting under the direction of NASA's Office of Space Science, has built a data system named SPICE (Spacecraft Planet Instrument C-matrix Events) to assist scientists in planning and interpreting scientific observations (see figure). SPICE provides geometric and some other ancillary information needed to recover the full value of science instrument data, including correlation of individual instrument data sets with data from other instruments on the same or other spacecraft. This data system is used to produce space mission observation geometry data sets known as SPICE kernels. It is also used to read SPICE kernels and to compute derived quantities such as positions, orientations, lighting angles, etc. The SPICE toolkit consists of a subroutine/function library, executable programs (both large applications and simple utilities that focus on kernel management), and simple examples of using SPICE toolkit subroutines. This software is very accurate, thoroughly tested, and portable to all computers. It is extremely stable and reusable on all missions. Since the previous version, three significant capabilities have been added: an Interactive Data Language (IDL) interface, a MATLAB interface, and a geometric event finder subsystem.

  13. Practical Parallel Rendering

    CERN Document Server

    Chalmers, Alan

    2002-01-01

    Meeting the growing demands for speed and quality in rendering computer graphics images requires new techniques. Practical parallel rendering provides one of the most practical solutions. This book addresses the basic issues of rendering within a parallel or distributed computing environment, and considers the strengths and weaknesses of multiprocessor machines and networked render farms for graphics rendering. Case studies of working applications demonstrate, in detail, practical ways of dealing with complex issues involved in parallel processing.

  14. Geant4 - A Simulation Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Wright, Dennis H

    2002-08-09

GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilized, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.
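The core idea of such transport simulation, sampling how far a particle travels before interacting, can be illustrated with a toy Monte Carlo. This is emphatically not the Geant4 API, just a sketch of the underlying physics: free path lengths drawn from an exponential law with attenuation coefficient mu, compared against the analytic transmission exp(-mu*d):

```python
import math
import random

def fraction_transmitted(mu, thickness, n=20000, seed=42):
    """Estimate the fraction of particles crossing a slab without interacting.
    Each free path is sampled as -ln(U)/mu; the analytic answer is exp(-mu*d)."""
    rng = random.Random(seed)
    survived = sum(1 for _ in range(n)
                   if -math.log(rng.random()) / mu > thickness)
    return survived / n

est = fraction_transmitted(mu=1.0, thickness=1.0)
print(round(est, 3), "vs analytic", round(math.exp(-1.0), 3))
```

Geant4 layers detailed geometry, materials, and competing physics processes on top of this basic sampling loop.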

  15. CANEGROWERS Action Research Toolkit

    NARCIS (Netherlands)

    Mostert, R.H.; Brouwer, J.H.

    2015-01-01

    This toolkit contains a selection of tools to conduct action research, organized around four phases: Identify problems and possibilities; Analyze problems and possibilities; Search for solutions; and Reflection tools. The toolkit is customized for staff of Canegrowers in South Africa, who used the t

  17. Student Success Center Toolkit

    Science.gov (United States)

    Jobs For the Future, 2014

    2014-01-01

    "Student Success Center Toolkit" is a compilation of materials organized to assist Student Success Center directors as they staff, launch, operate, and sustain Centers. The toolkit features materials created and used by existing Centers, such as staffing and budgeting templates, launch materials, sample meeting agendas, and fundraising…

  18. RenderToolbox3: MATLAB tools that facilitate physically based stimulus rendering for vision research.

    Science.gov (United States)

    Heasly, Benjamin S; Cottaris, Nicolas P; Lichtman, Daniel P; Xiao, Bei; Brainard, David H

    2014-02-07

    RenderToolbox3 provides MATLAB utilities and prescribes a workflow that should be useful to researchers who want to employ graphics in the study of vision and perhaps in other endeavors as well. In particular, RenderToolbox3 facilitates rendering scene families in which various scene attributes and renderer behaviors are manipulated parametrically, enables spectral specification of object reflectance and illuminant spectra, enables the use of physically based material specifications, helps validate renderer output, and converts renderer output to physical units of radiance. This paper describes the design and functionality of the toolbox and discusses several examples that demonstrate its use. We have designed RenderToolbox3 to be portable across computer hardware and operating systems and to be free and open source (except for MATLAB itself). RenderToolbox3 is available at https://github.com/DavidBrainard/RenderToolbox3.
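The spectral specification the toolbox enables can be sketched at its simplest: to first order (ignoring geometry and interreflections), a rendered pixel's spectrum is the wavelength-by-wavelength product of the illuminant spectral power and the surface reflectance. The wavelength samples and values below are invented for illustration, not drawn from RenderToolbox3:

```python
wavelengths = [450, 550, 650]   # nm, coarse 3-band sampling for the sketch
illuminant  = [0.8, 1.0, 0.9]   # relative spectral power of the light
reflectance = [0.2, 0.7, 0.4]   # fraction reflected per band

# Per-band product, rounded for display; real renderers carry many more bands.
radiance = [round(e * r, 3) for e, r in zip(illuminant, reflectance)]
print(radiance)  # [0.16, 0.7, 0.36]
```

Manipulating `illuminant` or `reflectance` parametrically across a scene family is exactly the kind of controlled stimulus variation the toolbox is built for.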

  19. Equalizer: a scalable parallel rendering framework.

    Science.gov (United States)

    Eilemann, Stefan; Makhinya, Maxim; Pajarola, Renato

    2009-01-01

Continuing improvements in CPU and GPU performance, as well as increasing multi-core processor and cluster-based parallelism, demand flexible and scalable parallel rendering solutions that can exploit multipipe hardware-accelerated graphics. In fact, to achieve interactive visualization, scalable rendering systems are essential to cope with the rapid growth of data sets. However, parallel rendering systems are non-trivial to develop, and often only application-specific implementations have been proposed. The task of developing a scalable parallel rendering framework is even more difficult if it should be generic enough to support various types of data and visualization applications and, at the same time, work efficiently on a cluster with distributed graphics cards. In this paper we introduce a novel system called Equalizer, a toolkit for scalable parallel rendering based on OpenGL which provides an application programming interface (API) to develop scalable graphics applications for a wide range of systems, ranging from large distributed visualization clusters and multi-processor multipipe graphics systems to single-processor single-pipe desktop machines. We describe the system architecture and the basic API, discuss its advantages over previous approaches, and present example configurations, usage scenarios, and scalability results.
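One common decomposition mode in scalable parallel rendering frameworks of this kind is sort-first (screen-space) partitioning: the viewport is split into tiles, one per render node. The sketch below shows only the tiling idea; real systems (Equalizer included) add dynamic load balancing and other decomposition modes, and the function name here is an invented illustration:

```python
def sort_first_tiles(width, height, n_nodes):
    """Split a width x height viewport into n vertical tiles (x, y, w, h),
    one per render node, for a sort-first decomposition."""
    xs = [round(i * width / n_nodes) for i in range(n_nodes + 1)]
    return [(xs[i], 0, xs[i + 1] - xs[i], height) for i in range(n_nodes)]

tiles = sort_first_tiles(1920, 1080, 3)
print(tiles)  # [(0, 0, 640, 1080), (640, 0, 640, 1080), (1280, 0, 640, 1080)]
```

Each node renders only its tile's frustum, and the tiles are composited back into the final frame.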

  20. Hydropower RAPID Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-12-01

    This fact sheet provides a brief overview of the U.S. Department of Energy (DOE) Hydropower Regulatory and Permitting Information Desktop (RAPID) Toolkit including its capabilities, features, and benefits.

  1. HALE Toolkit (HTK) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Development of a toolkit for optimization and virtual flight test of HALE vehicles is proposed based on extensions of the IHAT system for integrated...

  2. Energy Conservation Behaviour Toolkit

    NARCIS (Netherlands)

    Kalz, Marco; Börner, Dirk; Ternier, Stefaan; Specht, Marcus

    2013-01-01

    Kalz, M., Börner, D., Ternier, S., & Specht, M. (2013, 31 January). Energy Conservation Behaviour Toolkit. Presentation given at the symposium "Groene ICT en Duurzame ontwikkeling: Meters maken in het Hoger Onderwijs", Driebergen, The Netherlands.

  3. Emergency care toolkits.

    Science.gov (United States)

    Black, Steven

    2004-06-01

    Emergency care services are the focus of a series of toolkits developed by the NHS National electronic Library for Health to provide resources for emergency care leads and others involved in modernising emergency care, writes Steven Black.

  4. Wildlife crossings toolkit

    OpenAIRE

    2005-01-01

    Many highways wind their way through excellent wildlife habitat. Florida’s highways slice through rare black bear habitat. Alaska struggles with moose-vehicle collisions. Grizzly bears in the northern Rockies are killed on highways or avoid crossing them, limiting them to smaller areas. Solutions are available, but the information is widely scattered. The Wildlife Crossings Toolkit gathers information in one location on proven solutions and lessons learned. Who can use the toolkit? Profession...

  5. Image Based Rendering under Varying Illumination

    Institute of Scientific and Technical Information of China (English)

    Wang Chengfeng (王城峰); Hu Zhanyi

    2003-01-01

A new approach for photorealistic rendering of a class of objects under arbitrary illumination is presented. The approach relies entirely on image-based rendering techniques. A scheme is utilized for re-illumination of objects based on linear combinations of low-dimensional image representations. The minimum rendering condition of the technique is three sample images of a reference object under varying illumination and a single input image of the object of interest. Important properties of this approach are its simplicity, robustness and speed. Experimental results validate the proposed rendering approach.
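The linear-combination step at the heart of such re-illumination schemes follows from the linearity of image formation: an image under a novel light that is a weighted mix of the sample lights equals the same weighted mix of the sample images. A minimal sketch with tiny grayscale "images" (flat pixel lists); in the actual method the weights would be recovered from the reference object, whereas here they are simply given:

```python
def relight(samples, weights):
    """Pixel-wise linear combination of sample images taken under
    different illuminations."""
    return [sum(w * img[i] for w, img in zip(weights, samples))
            for i in range(len(samples[0]))]

i1 = [10.0, 20.0]   # object under light 1
i2 = [30.0, 10.0]   # object under light 2
i3 = [5.0, 40.0]    # object under light 3
novel = relight([i1, i2, i3], [0.5, 0.25, 0.25])
print(novel)  # [13.75, 22.5]
```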

  6. Video-based rendering

    CERN Document Server

    Magnor, Marcus A

    2005-01-01

    Driven by consumer-market applications that enjoy steadily increasing economic importance, graphics hardware and rendering algorithms are a central focus of computer graphics research. Video-based rendering is an approach that aims to overcome the current bottleneck in the time-consuming modeling process and has applications in areas such as computer games, special effects, and interactive TV. This book offers an in-depth introduction to video-based rendering, a rapidly developing new interdisciplinary topic employing techniques from computer graphics, computer vision, and telecommunication en

  7. First responder tracking and visualization for command and control toolkit

    Science.gov (United States)

    Woodley, Robert; Petrov, Plamen; Meisinger, Roger

    2010-04-01

    In order for First Responder Command and Control personnel to visualize incidents at urban building locations, DHS sponsored a small business research program to develop a tool to visualize 3D building interiors and movement of First Responders on site. 21st Century Systems, Inc. (21CSI), has developed a toolkit called Hierarchical Grid Referenced Normalized Display (HiGRND). HiGRND utilizes three components to provide a full spectrum of visualization tools to the First Responder. First, HiGRND visualizes the structure in 3D. Utilities in the 3D environment allow the user to switch between views (2D floor plans, 3D spatial, evacuation routes, etc.) and manually edit fast changing environments. HiGRND accepts CAD drawings and 3D digital objects and renders these in the 3D space. Second, HiGRND has a First Responder tracker that uses the transponder signals from First Responders to locate them in the virtual space. We use the movements of the First Responder to map the interior of structures. Finally, HiGRND can turn 2D blueprints into 3D objects. The 3D extruder extracts walls, symbols, and text from scanned blueprints to create the 3D mesh of the building. HiGRND increases the situational awareness of First Responders and allows them to make better, faster decisions in critical urban situations.
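The geometric core of the "3D extruder" described above, turning a 2D blueprint wall into 3D geometry, amounts to lifting each wall segment into a vertical quad. A sketch only; the real HiGRND tool also parses symbols and text from scanned drawings, and the function name here is invented:

```python
def extrude_wall(p0, p1, height):
    """Lift a 2D wall segment (x0,y0)-(x1,y1) from a floor plan into a
    vertical quad of four 3D vertices, floor to ceiling."""
    (x0, y0), (x1, y1) = p0, p1
    return [(x0, y0, 0.0), (x1, y1, 0.0),
            (x1, y1, height), (x0, y0, height)]

quad = extrude_wall((0.0, 0.0), (4.0, 0.0), 3.0)
print(quad)  # [(0.0, 0.0, 0.0), (4.0, 0.0, 0.0), (4.0, 0.0, 3.0), (0.0, 0.0, 3.0)]
```

Applying this to every wall segment extracted from the plan yields the building's 3D mesh.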

  8. Volume-rendered hemorrhage-responsible arteriogram created by 64 multidetector-row CT during aortography: utility for catheterization in transcatheter arterial embolization for acute arterial bleeding.

    Science.gov (United States)

    Minamiguchi, Hiroki; Kawai, Nobuyuki; Sato, Morio; Ikoma, Akira; Sanda, Hiroki; Nakata, Kouhei; Tanaka, Fumihiro; Nakai, Motoki; Sonomura, Tetsuo; Murotani, Kazuhiro; Hosokawa, Seiki; Nishioku, Tadayoshi

    2014-01-01

Aortography for detecting hemorrhage is limited when determining the catheter treatment strategy because the artery responsible for hemorrhage commonly overlaps organs and non-responsible arteries. Selective catheterization of untargeted arteries would result in repeated arteriography, large volumes of contrast medium, and extended time. A volume-rendered hemorrhage-responsible arteriogram created with 64 multidetector-row CT (64MDCT) during aortography (MDCTAo) can be used both for hemorrhage mapping and catheter navigation. The MDCTAo depicted hemorrhage in 61 of 71 cases of suspected acute arterial bleeding treated at our institute in the last 3 years. Complete hemostasis by embolization was achieved in all cases. The hemorrhage-responsible arteriogram was used for navigation during catheterization, thus assisting successful embolization. Hemorrhage was not visualized in the remaining 10 patients, of whom 6 had a pseudoaneurysm in a visceral artery; 1 with urinary bladder bleeding and 1 with chest wall hemorrhage had gauze tamponade; and 1 with urinary bladder hemorrhage and 1 with uterine hemorrhage had spastic arteries. Six patients with pseudoaneurysm underwent preventive embolization and the other 4 patients were managed by watchful observation. MDCTAo has the advantage of depicting the arteries responsible for hemoptysis, whether from the bronchial arteries or other systemic arteries, in a single scan. MDCTAo is particularly useful for identifying the source of acute arterial bleeding in the pancreatic arcade area, which is supplied by both the celiac and superior mesenteric arteries. In a case of pelvic hemorrhage, MDCTAo identified the responsible artery from among numerous overlapping visceral arteries that branched from the internal iliac arteries. In conclusion, a hemorrhage-responsible arteriogram created by 64MDCT immediately before catheterization is useful for deciding the catheter treatment strategy for acute arterial bleeding.

  9. Terrain-Toolkit

    DEFF Research Database (Denmark)

    Wang, Qi; Kaul, Manohar; Long, Cheng

    2014-01-01

    , as will be shown, is used heavily for query processing in spatial databases; and (3) they do not provide the surface distance operator which is fundamental for many applications based on terrain data. Motivated by this, we developed a tool called Terrain-Toolkit for terrain data which accepts a comprehensive set...

  10. Automated Generation of Web Services for Visualization Toolkits

    Science.gov (United States)

    Jensen, P. A.; Yuen, D. A.; Erlebacher, G.; Bollig, E. F.; Kigelman, D. G.; Shukh, E. A.

    2005-12-01

The recent explosion in the size and complexity of geophysical data and an increasing trend for collaboration across large geographical areas demand the use of remote, full-featured visualization toolkits. As the scientific community shifts toward grid computing to handle these increased demands, new web services are needed to assemble powerful distributed applications. Recent research has established the possibility of converting toolkits such as VTK [1] and Matlab [2] into remote visualization services. We are investigating an automated system to allow these toolkits to export their functions as web services under the standardized protocols SOAP and WSDL using pre-existing software (gSOAP [3]) and a custom compiler for Tcl-based scripts. The compiler uses a flexible parser and type-inference mechanism to convert the Tcl into a C++ program that allows the desired Tcl procedures to be exported as SOAP-accessible functions and the VTK rendering window to be captured offscreen and encapsulated for forwarding through a web service. Classes for a client-side Java applet to access the rendering window remotely are also generated. We will use this system to demonstrate the streamlined generation of a standards-compliant web service (suitable for grid deployment) from a Tcl script for VTK. References: [1] The Visualization Toolkit, http://www.vtk.org [2] Matlab, http://www.mathworks.com [3] gSOAP, http://www.cs.fsu.edu/~engelen/soap.html

  11. SIERRA Toolkit v. 2.0

    Energy Technology Data Exchange (ETDEWEB)

    2016-09-14

    The SIERRA Toolkit is a collection of libraries to facilitate the development of parallel engineering analysis applications. These libraries supply basic core services that an engineering application may need such as a parallel distributed and dynamic mesh database (for unstructured meshes), mechanics algorithm support (parallel infrastructure only), interfaces to parallel solvers, parallel mesh and data I/O, and various utilities (timers, diagnostic tools, etc.)

  12. The Bio* toolkits--a brief overview.

    Science.gov (United States)

    Mangalam, Harry

    2002-09-01

Bioinformatics research is often difficult to do with commercial software. The Open Source BioPerl, BioPython and Biojava projects provide toolkits with multiple functionality that make it easier to create customised pipelines or analyses. This review briefly compares the quirks of the underlying languages and the functionality, documentation, utility and relative advantages of the Bio counterparts, particularly from the point of view of the beginning biologist programmer.

  13. Alma Data Mining Toolkit

    Science.gov (United States)

    Friedel, Douglas; Looney, Leslie; Teuben, Peter J.; Pound, Marc W.; Rauch, Kevin P.; Mundy, Lee; Harris, Robert J.; Xu, Lisa

    2016-06-01

ADMIT (ALMA Data Mining Toolkit) is a Python-based pipeline toolkit for the creation and analysis of new science products from ALMA data. ADMIT quickly provides users with a detailed overview of their science products, for example: line identifications, line 'cutout' cubes, moment maps, and emission type analysis (e.g., feature detection). Users can download the small ADMIT pipeline product (< 20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT has both a web browser and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining across many astronomical sources and line transitions is possible. Users are also able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. We will present some of the salient features of ADMIT and example use cases.
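One of the science products mentioned above, a moment map, has a simple core: a moment-0 (integrated intensity) map sums each spatial pixel over the spectral axis, weighted by the channel width. A pure-Python sketch with an invented toy cube, not the ADMIT implementation:

```python
def moment0(cube, dv):
    """cube[channel][y][x] -> 2-D map where each pixel is
    sum over channels times the channel width dv (moment 0)."""
    ny, nx = len(cube[0]), len(cube[0][0])
    return [[sum(ch[y][x] for ch in cube) * dv for x in range(nx)]
            for y in range(ny)]

cube = [  # 2 spectral channels of a 2x2-pixel toy cube
    [[1.0, 0.0], [2.0, 1.0]],
    [[3.0, 1.0], [0.0, 1.0]],
]
m0 = moment0(cube, dv=0.5)
print(m0)  # [[2.0, 0.5], [1.0, 1.0]]
```

Higher moments (velocity field, dispersion) follow the same pattern with velocity-weighted sums.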

  14. Newnes electronics toolkit

    CERN Document Server

    Phillips, Geoff

    2013-01-01

    Newnes Electronics Toolkit brings together fundamental facts, concepts, and applications of electronic components and circuits, and presents them in a clear, concise, and unambiguous format, to provide a reference book for engineers. The book contains 10 chapters that discuss the following concepts: resistors, capacitors, inductors, semiconductors, circuit concepts, electromagnetic compatibility, sound, light, heat, and connections. The engineer's job does not end when the circuit diagram is completed; the design for the manufacturing process is just as important if volume production is to be

  15. A toolkit for detecting technical surprise.

    Energy Technology Data Exchange (ETDEWEB)

    Trahan, Michael Wayne; Foehse, Mark C.

    2010-10-01

    The detection of a scientific or technological surprise within a secretive country or institute is very difficult. The ability to detect such surprises would allow analysts to identify the capabilities that could be a military or economic threat to national security. Sandia's current approach utilizing ThreatView has been successful in revealing potential technological surprises. However, as data sets become larger, it becomes critical to use algorithms as filters along with the visualization environments. Our two-year LDRD had two primary goals. First, we developed a tool, a Self-Organizing Map (SOM), to extend ThreatView and improve our understanding of the issues involved in working with textual data sets. Second, we developed a toolkit for detecting indicators of technical surprise in textual data sets. Our toolkit has been successfully used to perform technology assessments for the Science & Technology Intelligence (S&TI) program.
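    The self-organizing map at the heart of the first goal can be sketched in miniature: map nodes compete for each input sample, and the winner plus its immediate neighbours move toward the sample, so similar items end up clustered on nearby nodes. The one-dimensional toy below is illustrative only and is unrelated to Sandia's implementation:

```python
import random

# Minimal 1-D self-organizing map over scalar "document scores".
# Hypothetical toy code; a real SOM for text would use high-dimensional
# term vectors and a decaying neighbourhood radius.

def train_som(data, n_nodes=4, epochs=50, lr=0.5, seed=0):
    rng = random.Random(seed)
    nodes = [rng.random() for _ in range(n_nodes)]
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)  # decaying learning rate
        for x in data:
            # best-matching unit: the node closest to the sample
            bmu = min(range(n_nodes), key=lambda i: abs(nodes[i] - x))
            # pull the BMU and its immediate neighbours toward the sample
            for i in range(max(0, bmu - 1), min(n_nodes, bmu + 2)):
                nodes[i] += rate * (x - nodes[i])
    return sorted(nodes)

# Two clusters of scores; the trained nodes settle near the clusters.
data = [0.1, 0.12, 0.09, 0.9, 0.88, 0.91]
print(train_som(data))
```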

  16. Rendering the Topological Spines

    Energy Technology Data Exchange (ETDEWEB)

    Nieves-Rivera, D. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-05

    Many tools to analyze and represent high-dimensional data already exist, yet most of them are not flexible, informative, and intuitive enough to help scientists carry out the corresponding analyses and predictions, understand the structure and complexity of scientific data, get a complete picture of it, and explore a greater number of hypotheses. With this in mind, N-Dimensional Data Analysis and Visualization (ND²AV) is being developed to serve as an interactive visual analysis platform, with the purpose of coupling together a number of existing tools, ranging from statistics, machine learning, and data mining, with new techniques, in particular new visualization approaches. My task is to create the rendering and implementation of a new concept called topological spines in order to extend ND²AV's scope. Other existing visualization tools create representations that preserve either the topological properties or the structural (geometric) ones, because it is challenging to preserve both simultaneously. To overcome that challenge by striking a balance between the two, topological spines are introduced as a new approach that aims to preserve both. The renderer uses OpenGL and C++ and is currently being tested before being integrated into ND²AV. In this paper I present what topological spines are and how they are rendered.

  17. High Fidelity Haptic Rendering

    CERN Document Server

    Otaduy, Miguel A

    2006-01-01

    The human haptic system, among all senses, provides unique and bidirectional communication between humans and their physical environment. Yet, to date, most human-computer interactive systems have focused primarily on the graphical rendering of visual information and, to a lesser extent, on the display of auditory information. Extending the frontier of visual computing, haptic interfaces, or force feedback devices, have the potential to increase the quality of human-computer interaction by accommodating the sense of touch. They provide an attractive augmentation to visual display and enhance t

  18. Einstein Toolkit for Relativistic Astrophysics

    Science.gov (United States)

    Collaborative Effort

    2011-02-01

    The Einstein Toolkit is a collection of software components and tools for simulating and analyzing general relativistic astrophysical systems. Such systems include gravitational wave space-times, collisions of compact objects such as black holes or neutron stars, accretion onto compact objects, core collapse supernovae and Gamma-Ray Bursts. The Einstein Toolkit builds on numerous software efforts in the numerical relativity community including CactusEinstein, Whisky, and Carpet. The Einstein Toolkit currently uses the Cactus Framework as the underlying computational infrastructure that provides large-scale parallelization, general computational components, and a model for collaborative, portable code development.

  19. Adolescent Relationship Abuse (ARA) Toolkit

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Adolescent Relationship Abuse (ARA) Toolkit provides information and strategies on how to: incorporate abuse prevention into programming; conduct staff training;...

  20. Skills Labs: workshop EMERGO toolkit

    NARCIS (Netherlands)

    Kurvers, Hub; Slootmaker, Aad

    2009-01-01

    Kurvers, H. J., & Slootmaker, A. (2009). Skills Labs: workshop EMERGO toolkit. Presentation given at project members of Skills Labs. March, 31, 2009 and April, 24, 2009, Heerlen, The Netherlands: Open University of the Netherlands.

  1. EMERGO Toolkit 2.0

    NARCIS (Netherlands)

    Slootmaker, Aad; Kurvers, Hub

    2010-01-01

    Slootmaker, A., & Kurvers, H.J. (2009). EMERGO, a Java based toolkit for web based development and deployment of scenario based educational games. Heerlen, The Netherlands: Open University of the Netherlands. License: GNU General Public License

  2. The Lean and Environment Toolkit

    Science.gov (United States)

    This Lean and Environment Toolkit assembles practical experience collected by the U.S. Environmental Protection Agency (EPA) and partner companies and organizations that have experience with coordinating Lean implementation and environmental management.

  3. Business/Employers Influenza Toolkit

    Centers for Disease Control (CDC) Podcasts

    2011-09-06

    This podcast promotes the "Make It Your Business To Fight The Flu" toolkit for Businesses and Employers. The toolkit provides information and recommended strategies to help businesses and employers promote the seasonal flu vaccine. Additionally, employers will find flyers, posters, and other materials to post and distribute in the workplace.  Created: 9/6/2011 by Office of Infectious Diseases, Office of the Director (OD).   Date Released: 9/7/2011.

  4. ARE: Ada Rendering Engine

    Directory of Open Access Journals (Sweden)

    Stefano Penge

    2009-10-01

    In web application development it is now common practice to use templates and powerful template engines to automate the generation of the content presented to the user. However, the power of such engines is sometimes obtained by mixing logic and interface, by introducing languages other than the page-description language, or even by inventing new dedicated languages. ARE (ADA Rendering Engine) is designed to manage the entire flow of creating dynamic HTML/XHTML content, selecting the correct template, CSS, and JavaScript, and producing the output, while completely separating logic from interface. The templates used are pure HTML with no fragments in other languages, so they can be managed and viewed on their own. The generated HTML code is uniform and parameterized. ARE consists of two modules, CORE (Common Output Rendering Engine) and ALE (ADA Layout Engine). The first (CORE) is used for object-oriented generation of DOM elements and is intended to help the developer produce code that is valid with respect to the DTD in use; CORE automatically generates DOM elements according to the DTD set in the configuration. The second (ALE) is used as a template engine that automatically selects, based on parameters such as module, user profile, node type, course, and installation preferences, the appropriate HTML template, CSS, and JavaScript files. ALE supports default templates and recursive microtemplates to simplify the designer's work. The two modules can in any case be used independently of each other: it is possible to generate and render an HTML page using CORE alone, or to send CORE objects to the ALE template engine, which renders the HTML page; conversely, HTML can be generated without CORE and sent to the ALE template engine. CORE is at its first release and is already in use at

  5. Parallel hierarchical radiosity rendering

    Energy Technology Data Exchange (ETDEWEB)

    Carter, M.

    1993-07-01

    In this dissertation, the step-by-step development of a scalable parallel hierarchical radiosity renderer is documented. First, a new look is taken at the traditional radiosity equation, and a new form is presented in which the matrix of linear system coefficients is transformed into a symmetric matrix, thereby simplifying the problem and enabling a new solution technique to be applied. Next, the state-of-the-art hierarchical radiosity methods are examined for their suitability to parallel implementation, and scalability. Significant enhancements are also discovered which both improve their theoretical foundations and improve the images they generate. The resultant hierarchical radiosity algorithm is then examined for sources of parallelism, and for an architectural mapping. Several architectural mappings are discussed. A few key algorithmic changes are suggested during the process of making the algorithm parallel. Next, the performance, efficiency, and scalability of the algorithm are analyzed. The dissertation closes with a discussion of several ideas which have the potential to further enhance the hierarchical radiosity method, or provide an entirely new forum for the application of hierarchical methods.
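    For context, the radiosity equation B_i = E_i + ρ_i Σ_j F_ij B_j is a linear system in the patch radiosities, and an iterative sweep like the Jacobi-style solver below is the kind of kernel such a renderer parallelizes. The three-patch scene and its numbers are invented for illustration and do not come from the dissertation:

```python
# Jacobi-style iteration on the radiosity system B = E + rho * F @ B
# for an invented 3-patch scene. Illustrative sketch, serial, not the
# dissertation's parallel hierarchical algorithm.

def solve_radiosity(E, rho, F, iters=100):
    n = len(E)
    B = E[:]  # start from emitted radiosity only
    for _ in range(iters):
        B = [E[i] + rho[i] * sum(F[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

E   = [1.0, 0.0, 0.0]        # patch 0 is the only emitter
rho = [0.5, 0.5, 0.5]        # diffuse reflectance per patch
F   = [[0.0, 0.5, 0.5],      # form factors (each row sums to <= 1)
       [0.5, 0.0, 0.5],
       [0.5, 0.5, 0.0]]
print([round(b, 4) for b in solve_radiosity(E, rho, F)])  # [1.2, 0.4, 0.4]
```

    By symmetry the exact solution is B = (1.2, 0.4, 0.4), and the iteration converges because the reflectances keep the spectral radius of ρF below one.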

  6. Sea modeling and rendering

    Science.gov (United States)

    Cathala, Thierry; Latger, Jean

    2010-10-01

    More and more defence and civil applications require simulation of the marine synthetic environment. Currently, the "Future Anti-Surface-Guided-Weapon" (FASGW), or "anti-navire léger" (ANL), missile needs this kind of modelling. This paper presents a set of technical enhancements to the SE-Workbench that aim at better representing the sea profile and the interaction with targets. The operational scenario variability is a key criterion: the generic geographical area (e.g. Persian Gulf, coast of Somalia, ...), the type of situation (e.g. peace keeping, peace enforcement, anti-piracy, drug interdiction, ...), the objectives (political, strategic, or military), the description of the mission(s) (e.g. anti-piracy) and operation(s) (e.g. surveillance and reconnaissance, escort, convoying) to achieve the objectives, and the type of environment (weather, time of day, geography [coastlines, islands, hills/mountains]). The paper focuses on several points, such as the dual rendering using either ray tracing [and GPGPU optimization] or rasterization [and GPU shader optimization], the modelling of the sea surface based on hypertextures and shaders, wake modelling, buoyancy models for targets, the interaction of coast and littoral, and the dielectric infrared modelling of water material.

  7. The medical exploration toolkit: an efficient support for visual computing in surgical planning and training.

    Science.gov (United States)

    Mühler, Konrad; Tietjen, Christian; Ritter, Felix; Preim, Bernhard

    2010-01-01

    Application development is often guided by the usage of software libraries and toolkits. For medical applications, the toolkits currently available focus on image analysis and volume rendering. Advanced interactive visualizations and user interface issues are not adequately supported. Hence, we present a toolkit for application development in the field of medical intervention planning, training, and presentation--the Medical Exploration Toolkit (METK). The METK is based on the rapid prototyping platform MeVisLab and offers a large variety of facilities for an easy and efficient application development process. We present dedicated techniques for advanced medical visualizations, exploration, standardized documentation, and interface widgets for common tasks. These include, e.g., advanced animation facilities, viewpoint selection, several illustrative rendering techniques, and new techniques for object selection in 3D surface models. No extended programming skills are needed for application building, since a graphical programming approach can be used. The toolkit is freely available and well documented to facilitate the use and extension of the toolkit.

  8. ParCAT: Parallel Climate Analysis Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Brian E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Steed, Chad A. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Shipman, Galen M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Ricciuto, Daniel M. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Thornton, Peter E. [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Wehner, Michael [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Williams, Dean N. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-01-01

    Climate science is employing increasingly complex models and simulations to analyze the past and predict the future of Earth's climate. This growth in complexity is creating a widening gap between the data being produced and the ability to analyze the datasets. Parallel computing tools are necessary to analyze, compare, and interpret the simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools to efficiently use parallel computing techniques to make analysis of these datasets manageable. The toolkit provides the ability to compute spatio-temporal means, differences between runs or differences between averages of runs, and histograms of the values in a data set. ParCAT is implemented as a command-line utility written in C. This allows for easy integration in other tools and allows for use in scripts. This also makes it possible to run ParCAT on many platforms from laptops to supercomputers. ParCAT outputs NetCDF files so it is compatible with existing utilities such as Panoply and UV-CDAT. This paper describes ParCAT and presents results from some example runs on the Titan system at ORNL.
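    The operations ParCAT provides (spatio-temporal means, run differences, histograms) are simple to state. A serial pure-Python sketch with invented helper names (not ParCAT's interface, which is a C command-line utility operating on NetCDF files) might look like:

```python
# Serial toy versions of two ParCAT-style analyses: a temporal mean of a
# gridded field and a histogram of values. Names and data are invented.

def temporal_mean(series):
    """Mean over time steps of a list of 2D fields series[t][y][x]."""
    n_t, n_y, n_x = len(series), len(series[0]), len(series[0][0])
    out = [[0.0] * n_x for _ in range(n_y)]
    for field in series:
        for y in range(n_y):
            for x in range(n_x):
                out[y][x] += field[y][x] / n_t
    return out

def histogram(values, edges):
    """Count values falling in half-open bins [edges[i], edges[i+1])."""
    counts = [0] * (len(edges) - 1)
    for v in values:
        for i in range(len(counts)):
            if edges[i] <= v < edges[i + 1]:
                counts[i] += 1
                break
    return counts

run_a = [[[1.0, 2.0]], [[3.0, 4.0]]]                 # two steps of a 1x2 field
print(temporal_mean(run_a))                          # [[2.0, 3.0]]
print(histogram([0.2, 0.4, 0.9], [0.0, 0.5, 1.0]))   # [2, 1]
```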

  9. Sierra Toolkit computational mesh conceptual model.

    Energy Technology Data Exchange (ETDEWEB)

    Baur, David G.; Edwards, Harold Carter; Cochran, William K.; Williams, Alan B.; Sjaardema, Gregory D.

    2010-03-01

    The Sierra Toolkit computational mesh is a software library intended to support massively parallel multi-physics computations on dynamically changing unstructured meshes. This domain of intended use is inherently complex due to distributed memory parallelism, parallel scalability, heterogeneity of physics, heterogeneous discretization of an unstructured mesh, and runtime adaptation of the mesh. Management of this inherent complexity begins with a conceptual analysis and modeling of this domain of intended use; i.e., development of a domain model. The Sierra Toolkit computational mesh software library is designed and implemented based upon this domain model. Software developers using, maintaining, or extending the Sierra Toolkit computational mesh library must be familiar with the concepts/domain model presented in this report.

  10. Haptic rendering for dental training system

    Institute of Scientific and Technical Information of China (English)

    WANG DangXiao; ZHANG YuRu; WANG Yong; LÜ PeiJun; ZHOU RenGe; ZHOU WanLin

    2009-01-01

    Immersion and interaction are two key features of virtual reality systems, which are especially important for medical applications. Based on the requirements of motor skill training in dental surgery, a haptic rendering method based on a triangle model is investigated in this paper. A multi-rate haptic rendering architecture is proposed to resolve the contradiction between fidelity and efficiency requirements. A real-time collision detection algorithm based on spatial partitioning and time coherence is utilized to enable fast contact determination. A proxy-based collision response algorithm is proposed to compute the surface contact point. A cutting force model based on a piecewise contact transition model is proposed for dental drilling simulation during tooth preparation. A velocity-driven levels-of-detail haptic rendering algorithm is proposed to maintain a high update rate for complex scenes with a large number of triangles. A haptic-visual collocated dental training prototype is established using a half-mirror solution. Typical dental operations have been realized, including dental caries exploration, detection of boundaries within a dental cross-section plane, and dental drilling during tooth preparation. The haptic rendering method is a fundamental technology for improving immersion and interaction in virtual reality training systems, which is useful not only in dental training but also in other surgical training systems.
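    The proxy-based collision response mentioned above can be reduced to a one-dimensional caricature: a proxy point stays on the constraint surface while the device point penetrates it, and a virtual spring between the two produces the feedback force. This sketch uses arbitrary units and an invented stiffness value; it is not the authors' implementation:

```python
# 1-D proxy-based haptic response: the surface occupies x < surface.
# Toy illustration with arbitrary units, not the paper's algorithm.

def proxy_force(device_pos, surface=0.0, stiffness=500.0):
    """Return (proxy_pos, force) for a device point against a wall."""
    if device_pos >= surface:
        return device_pos, 0.0               # free motion: proxy follows device
    proxy_pos = surface                      # proxy constrained to the surface
    force = stiffness * (proxy_pos - device_pos)   # spring pushes device out
    return proxy_pos, force

print(proxy_force(0.01))    # (0.01, 0.0)   no contact
print(proxy_force(-0.25))   # (0.0, 125.0)  restoring force on penetration
```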

  11. Entropy, color, and color rendering.

    Science.gov (United States)

    Price, Luke L A

    2012-12-01

    The Shannon entropy [Bell Syst. Tech. J. 27, 379 (1948)] of spectral distributions is applied to the problem of color rendering. With this novel approach, calculations for visual white entropy, spectral entropy, and color rendering are proposed: indices that do not rely on the subjectivity inherent in reference spectra and color samples. The indices are tested against real lamp spectra, demonstrating a simple and robust system for color rendering assessment. The discussion considers potential roles for white entropy in several areas of color theory and psychophysics, and nonextensive generalizations of the entropy indices in mathematical color spaces.
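    The raw quantity underlying such indices, the Shannon entropy of a normalized spectral distribution, is straightforward to compute. The toy below uses assumed 8-bin spectra and does not reproduce the paper's actual index definitions:

```python
import math

# Shannon entropy (in bits) of a normalized spectral power distribution.
# Toy 8-bin spectra for illustration; the paper's indices build on this
# quantity but are defined differently.

def spectral_entropy(spectrum):
    total = sum(spectrum)
    probs = [p / total for p in spectrum if p > 0]
    # max() clamps the IEEE -0.0 edge case for single-bin spectra
    return max(0.0, -sum(p * math.log2(p) for p in probs))

flat  = [1.0] * 8           # broadband: equal power in all 8 bins
spiky = [0.0] * 7 + [8.0]   # all power in one narrow band
print(spectral_entropy(flat))   # 3.0  (maximal for 8 bins)
print(spectral_entropy(spiky))  # 0.0  (a single spectral line)
```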

  12. Build an Assistive Technology Toolkit

    Science.gov (United States)

    Ahrens, Kelly

    2011-01-01

    Assistive technology (AT) by its very nature consists of a variety of personal and customized tools for multiple learning styles and physical challenges. The author not only encourages students, parents, and educators to advocate for AT services, she also wants them to go a step further and build their own AT toolkits that can instill independence…

  13. A Toolkit for Teacher Engagement

    Science.gov (United States)

    Grantmakers for Education, 2014

    2014-01-01

    Teachers are critical to the success of education grantmaking strategies, yet in talking with them we discovered that the world of philanthropy is often a mystery. GFE's Toolkit for Teacher Engagement aims to assist funders in authentically and effectively involving teachers in the education reform and innovation process. Built directly from the…

  14. Kekule.js: An Open Source JavaScript Chemoinformatics Toolkit.

    Science.gov (United States)

    Jiang, Chen; Jin, Xi; Dong, Ying; Chen, Ming

    2016-06-27

    Kekule.js is an open-source, object-oriented JavaScript toolkit for chemoinformatics. It provides methods for many common tasks in molecular informatics, including chemical data input/output (I/O), two- and three-dimensional (2D/3D) rendering of chemical structure, stereo identification, ring perception, structure comparison, and substructure search. Encapsulated widgets to display and edit chemical structures directly in web context are also supplied. Developed with web standards, the toolkit is ideal for building chemoinformatics applications over the Internet. Moreover, it is highly platform-independent and can also be used in desktop or mobile environments. Some initial applications, such as plugins for inputting chemical structures on the web and uses in chemistry education, have been developed based on the toolkit.

  15. Energy Savings Measurement and Verification Toolkit Guide: Version 2.96

    Science.gov (United States)

    2004-07-01

    American Society of Heating, Refrigerating, and Air-Conditioning Engineers (ASHRAE) Guideline 14-2002, Measurement of Energy and Demand Savings (ASHRAE, Atlanta GA, 2002). ... calibrated simulation model. The toolkit is based on a collection of Microsoft Excel workbooks, with ... enough to apply to most file compression utilities. 1. Double-click the current *.zip archive containing the toolkit (Figure 1). The archive will

  16. Exposure render: an interactive photo-realistic volume rendering framework.

    Directory of Open Access Journals (Sweden)

    Thomas Kroes

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, the more realistic lighting in particular has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques to direct volume rendering (DVR). With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made our framework available under a permissive open source license.
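    The stochastic sampling that makes MCRT flexible can be shown in its smallest volumetric form: estimating transmittance through a homogeneous medium by sampling exponential free-path lengths and comparing against the analytic Beer-Lambert value. A toy sketch of that sampling idea, not Exposure Render itself:

```python
import math
import random

# Monte Carlo estimate of transmittance through a homogeneous medium:
# sample free-path lengths from an exponential distribution and count
# photons that cross the slab without a collision. Toy sketch only.

def transmittance_mc(sigma, depth, n_samples=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_samples):
        free_path = -math.log(1.0 - rng.random()) / sigma  # exponential sample
        if free_path > depth:           # crossed the slab without interacting
            hits += 1
    return hits / n_samples

sigma, depth = 2.0, 0.5
print(transmittance_mc(sigma, depth))   # close to the analytic value
print(math.exp(-sigma * depth))         # Beer-Lambert: exp(-1) = 0.3678794...
```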

  17. RenderMan design principles

    Science.gov (United States)

    Apodaca, Tony; Porter, Tom

    1989-01-01

    The two worlds of interactive graphics and realistic graphics have remained separate. Fast graphics hardware runs simple algorithms and generates simple-looking images. Photorealistic image synthesis software runs slowly on large, expensive computers. The time has come for these two branches of computer graphics to merge. The speed and expense of graphics hardware is no longer the barrier to the wide acceptance of photorealism. There is every reason to believe that high quality image synthesis will become a standard capability of every graphics machine, from superworkstation to personal computer. The significant barrier has been the lack of a common language, an agreed-upon set of terms and conditions, for 3-D modeling systems to talk to 3-D rendering systems when computing an accurate rendition of a scene. Pixar has introduced RenderMan to serve as that common language. RenderMan, specifically the extensibility it offers in shading calculations, is discussed.

  18. GEANT4 A Simulation toolkit

    CERN Document Server

    Agostinelli, S; Amako, K; Apostolakis, John; Araújo, H M; Arce, P; Asai, M; Axen, D A; Banerjee, S; Barrand, G; Behner, F; Bellagamba, L; Boudreau, J; Broglia, L; Brunengo, A; Chauvie, S; Chuma, J; Chytracek, R; Cooperman, G; Cosmo, G; Degtyarenko, P V; Dell'Acqua, A; De Paola, G O; Dietrich, D D; Enami, R; Feliciello, A; Ferguson, C; Fesefeldt, H S; Folger, G; Foppiano, F; Forti, A C; Garelli, S; Giani, S; Giannitrapani, R; Gibin, D; Gómez-Cadenas, J J; González, I; Gracía-Abríl, G; Greeniaus, L G; Greiner, W; Grichine, V M; Grossheim, A; Gumplinger, P; Hamatsu, R; Hashimoto, K; Hasui, H; Heikkinen, A M; Howard, A; Hutton, A M; Ivanchenko, V N; Johnson, A; Jones, F W; Kallenbach, Jeff; Kanaya, N; Kawabata, M; Kawabata, Y; Kawaguti, M; Kelner, S; Kent, P; Kodama, T; Kokoulin, R P; Kossov, M; Kurashige, H; Lamanna, E; Lampen, T; Lara, V; Lefébure, V; Lei, F; Liendl, M; Lockman, W; Longo, F; Magni, S; Maire, M; Mecking, B A; Medernach, E; Minamimoto, K; Mora de Freitas, P; Morita, Y; Murakami, K; Nagamatu, M; Nartallo, R; Nieminen, P; Nishimura, T; Ohtsubo, K; Okamura, M; O'Neale, S W; O'Ohata, Y; Perl, J; Pfeiffer, A; Pia, M G; Ranjard, F; Rybin, A; Sadilov, S; Di Salvo, E; Santin, G; Sasaki, T; Savvas, N; Sawada, Y; Scherer, S; Sei, S; Sirotenko, V I; Smith, D; Starkov, N; Stöcker, H; Sulkimo, J; Takahata, M; Tanaka, S; Chernyaev, E; Safai-Tehrani, F; Tropeano, M; Truscott, P R; Uno, H; Urbàn, L; Urban, P; Verderi, M; Walkden, A; Wander, W; Weber, H; Wellisch, J P; Wenaus, T; Williams, D C; Wright, D; Yamada, T; Yoshida, H; Zschiesche, D

    2003-01-01

    Geant4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  19. GEANT4--a simulation toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Agostinelli, S.; Allison, J. E-mail: john.allison@man.ac.uk; Amako, K.; Apostolakis, J.; Araujo, H.; Arce, P.; Asai, M.; Axen, D.; Banerjee, S.; Barrand, G.; Behner, F.; Bellagamba, L.; Boudreau, J.; Broglia, L.; Brunengo, A.; Burkhardt, H.; Chauvie, S.; Chuma, J.; Chytracek, R.; Cooperman, G.; Cosmo, G.; Degtyarenko, P.; Dell' Acqua, A.; Depaola, G.; Dietrich, D.; Enami, R.; Feliciello, A.; Ferguson, C.; Fesefeldt, H.; Folger, G.; Foppiano, F.; Forti, A.; Garelli, S.; Giani, S.; Giannitrapani, R.; Gibin, D.; Gomez Cadenas, J.J.; Gonzalez, I.; Gracia Abril, G.; Greeniaus, G.; Greiner, W.; Grichine, V.; Grossheim, A.; Guatelli, S.; Gumplinger, P.; Hamatsu, R.; Hashimoto, K.; Hasui, H.; Heikkinen, A.; Howard, A.; Ivanchenko, V.; Johnson, A.; Jones, F.W.; Kallenbach, J.; Kanaya, N.; Kawabata, M.; Kawabata, Y.; Kawaguti, M.; Kelner, S.; Kent, P.; Kimura, A.; Kodama, T.; Kokoulin, R.; Kossov, M.; Kurashige, H.; Lamanna, E.; Lampen, T.; Lara, V.; Lefebure, V.; Lei, F.; Liendl, M.; Lockman, W.; Longo, F.; Magni, S.; Maire, M.; Medernach, E.; Minamimoto, K.; Mora de Freitas, P.; Morita, Y.; Murakami, K.; Nagamatu, M.; Nartallo, R.; Nieminen, P.; Nishimura, T.; Ohtsubo, K.; Okamura, M.; O' Neale, S.; Oohata, Y.; Paech, K.; Perl, J.; Pfeiffer, A.; Pia, M.G.; Ranjard, F.; Rybin, A.; Sadilov, S.; Di Salvo, E.; Santin, G.; Sasaki, T.; Savvas, N.; Sawada, Y.; Scherer, S.; Sei, S.; Sirotenko, V.; Smith, D.; Starkov, N.; Stoecker, H.; Sulkimo, J.; Takahata, M.; Tanaka, S.; Tcherniaev, E.; Safai Tehrani, E.; Tropeano, M.; Truscott, P.; Uno, H.; Urban, L.; Urban, P.; Verderi, M.; Walkden, A.; Wander, W.; Weber, H.; Wellisch, J.P.; Wenaus, T.; Williams, D.C.; Wright, D.; Yamada, T.; Yoshida, H.; Zschiesche, D

    2003-07-01

    GEANT4 is a toolkit for simulating the passage of particles through matter. It includes a complete range of functionality including tracking, geometry, physics models and hits. The physics processes offered cover a comprehensive range, including electromagnetic, hadronic and optical processes, a large set of long-lived particles, materials and elements, over a wide energy range starting, in some cases, from 250 eV and extending in others to the TeV energy range. It has been designed and constructed to expose the physics models utilised, to handle complex geometries, and to enable its easy adaptation for optimal use in different sets of applications. The toolkit is the result of a worldwide collaboration of physicists and software engineers. It has been created exploiting software engineering and object-oriented technology and implemented in the C++ programming language. It has been used in applications in particle physics, nuclear physics, accelerator design, space engineering and medical physics.

  20. Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework

    NARCIS (Netherlands)

    Kroes, T.; Post, F.H.; Botha, C.P.

    2012-01-01

    The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by i

  1. Light-Field Imaging Toolkit

    Science.gov (United States)

    Bolan, Jeffrey; Hall, Elise; Clifford, Chris; Thurow, Brian

    The Light-Field Imaging Toolkit (LFIT) is a collection of MATLAB functions designed to facilitate the rapid processing of raw light field images captured by a plenoptic camera. An included graphical user interface streamlines the necessary post-processing steps associated with plenoptic images. The generation of perspective-shifted views and computationally refocused images is supported, in both single-image and animated formats. LFIT performs the necessary calibration, interpolation, and structuring steps to enable future applications of this technology.

  2. Immersive volume rendering of blood vessels

    Science.gov (United States)

    Long, Gregory; Kim, Han Suk; Marsden, Alison; Bazilevs, Yuri; Schulze, Jürgen P.

    2012-03-01

    In this paper, we present a novel method of visualizing flow in blood vessels. Our approach reads unstructured tetrahedral data, resamples it, and uses slice-based 3D texture volume rendering. Due to the sparse structure of blood vessels, we utilize an octree to efficiently store the resampled data by discarding empty regions of the volume. We use animation to convey time-series data, a wireframe surface to convey structure, and the StarCAVE, a 3D virtual reality environment, to add a fully immersive element to the visualization. Our tool has great value in interdisciplinary work, helping scientists collaborate with clinicians by improving the understanding of blood flow simulations. Full immersion in the flow field allows for a more intuitive understanding of the flow phenomena and can be a great help to medical experts for treatment planning.
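    The octree-based empty-space skipping described above can be sketched as follows. This pure-Python toy simply discards fully empty regions and keeps nonempty bricks as leaves; a real renderer would store resampled texture bricks on the GPU:

```python
# Build an octree over a dense volume[z][y][x], discarding empty regions.
# Illustrative sketch of empty-space skipping, not the paper's code.

def build_octree(volume, x, y, z, size, min_size=2):
    """Return a node dict, or None for a fully empty cubic region."""
    if all(volume[z + k][y + j][x + i] == 0
           for k in range(size) for j in range(size) for i in range(size)):
        return None                          # discard empty region entirely
    if size <= min_size:
        return {"leaf": (x, y, z, size)}     # nonempty brick: keep it
    half = size // 2
    children = [build_octree(volume, x + i * half, y + j * half,
                             z + k * half, half, min_size)
                for k in (0, 1) for j in (0, 1) for i in (0, 1)]
    return {"children": children}

def count_leaves(node):
    if node is None:
        return 0
    if "leaf" in node:
        return 1
    return sum(count_leaves(c) for c in node["children"])

# A 4x4x4 volume with a single "vessel" voxel: only 1 of 8 bricks is kept.
vol = [[[0] * 4 for _ in range(4)] for _ in range(4)]
vol[0][0][0] = 1
print(count_leaves(build_octree(vol, 0, 0, 0, 4)))  # 1
```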

  3. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit is a toolkit of research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims...

  4. The REACH Youth Program Learning Toolkit

    Science.gov (United States)

    Sierra Health Foundation, 2011

    2011-01-01

    Believing in the value of using video documentaries and data as learning tools, members of the REACH technical assistance team collaborated to develop this toolkit. The learning toolkit was designed using and/or incorporating components of the "Engaging Youth in Community Change: Outcomes and Lessons Learned from Sierra Health Foundation's…

  5. The Data Warehouse Lifecycle Toolkit

    CERN Document Server

    Kimball, Ralph; Thornthwaite, Warren; Mundy, Joy; Becker, Bob

    2011-01-01

    A thorough update to the industry standard for designing, developing, and deploying data warehouse and business intelligence systemsThe world of data warehousing has changed remarkably since the first edition of The Data Warehouse Lifecycle Toolkit was published in 1998. In that time, the data warehouse industry has reached full maturity and acceptance, hardware and software have made staggering advances, and the techniques promoted in the premiere edition of this book have been adopted by nearly all data warehouse vendors and practitioners. In addition, the term "business intelligence" emerge

  6. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  7. An Introduction to the Einstein Toolkit

    CERN Document Server

    Zilhão, Miguel

    2013-01-01

    We give an introduction to the Einstein Toolkit, a mature, open-source computational infrastructure for numerical relativity based on the Cactus Framework, aimed at new users. This toolkit is composed of several different modules, is developed by researchers from different institutions throughout the world, and is in active continuous development. Documentation for the toolkit and its several modules is, however, often scattered across different locations, a difficulty new users may struggle with. Scientific papers exist describing the toolkit and its methods in detail, but they might be overwhelming at first. With these lecture notes we hope to provide an initial overview for new users. We cover how to obtain, compile and run the toolkit, and give an overview of some of the tools and modules provided with it.

  8. View compensated compression of volume rendered images for remote visualization.

    Science.gov (United States)

    Lalgudi, Hariharan G; Marcellin, Michael W; Bilgin, Ali; Oh, Han; Nadar, Mariappan S

    2009-07-01

    Remote visualization of volumetric images has gained importance over the past few years in medical and industrial applications. Volume visualization is a computationally intensive process, often requiring hardware acceleration to achieve a real time viewing experience. One remote visualization model that can accomplish this would transmit rendered images from a server, based on viewpoint requests from a client. For constrained server-client bandwidth, an efficient compression scheme is vital for transmitting high quality rendered images. In this paper, we present a new view compensation scheme that utilizes the geometric relationship between viewpoints to exploit the correlation between successive rendered images. The proposed method obviates motion estimation between rendered images, enabling significant reduction to the complexity of a compressor. Additionally, the view compensation scheme, in conjunction with JPEG2000 performs better than AVC, the state of the art video compression standard.

  9. ParCAT: A Parallel Climate Analysis Toolkit

    Science.gov (United States)

    Haugen, B.; Smith, B.; Steed, C.; Ricciuto, D. M.; Thornton, P. E.; Shipman, G.

    2012-12-01

    Climate science has employed increasingly complex models and simulations to analyze the past and predict the future of our climate. The size and dimensionality of climate simulation data has been growing with the complexity of the models. This growth in data is creating a widening gap between the data being produced and the tools necessary to analyze large, high dimensional data sets. With single run data sets increasing into 10's, 100's and even 1000's of gigabytes, parallel computing tools are becoming a necessity in order to analyze and compare climate simulation data. The Parallel Climate Analysis Toolkit (ParCAT) provides basic tools that efficiently use parallel computing techniques to narrow the gap between data set size and analysis tools. ParCAT was created as a collaborative effort between climate scientists and computer scientists in order to provide efficient parallel implementations of the computing tools that are of use to climate scientists. Some of the basic functionalities included in the toolkit are the ability to compute spatio-temporal means and variances, differences between two runs and histograms of the values in a data set. ParCAT is designed to facilitate the "heavy lifting" that is required for large, multidimensional data sets. The toolkit does not focus on performing the final visualizations and presentation of results but rather, reducing large data sets to smaller, more manageable summaries. The output from ParCAT is provided in commonly used file formats (NetCDF, CSV, ASCII) to allow for simple integration with other tools. The toolkit is currently implemented as a command line utility, but will likely also provide a C library for developers interested in tighter software integration. Elements of the toolkit are already being incorporated into projects such as UV-CDAT and CMDX. There is also an effort underway to implement portions of the CCSM Land Model Diagnostics package using ParCAT in conjunction with Python and gnuplot. Par
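The kinds of reductions ParCAT performs, per-gridpoint temporal means and run-versus-run differences, can be sketched as follows. This is Python rather than the toolkit's actual code, with thread-based workers standing in for its parallel implementation; all names are illustrative.

```python
# Illustrative sketch of ParCAT-style "heavy lifting": reduce a time
# series of 2D fields to temporal means, splitting the spatial domain
# across workers, then difference two runs. Not ParCAT's real code.
from concurrent.futures import ThreadPoolExecutor

def temporal_mean(run, rows):
    """Mean over time for a subset of grid rows. run: [time][row][col]."""
    nt = len(run)
    ncols = len(run[0][0])
    return {r: [sum(run[t][r][c] for t in range(nt)) / nt
                for c in range(ncols)]
            for r in rows}

def parallel_temporal_mean(run, workers=4):
    """Compute the temporal mean of every gridpoint in parallel."""
    nrows = len(run[0])
    chunks = [range(i, nrows, workers) for i in range(workers)]
    out = [None] * nrows
    with ThreadPoolExecutor(max_workers=workers) as ex:
        for part in ex.map(lambda rs: temporal_mean(run, rs), chunks):
            for r, row in part.items():
                out[r] = row
    return out

def run_difference(mean_a, mean_b):
    """Gridpoint difference between two runs' temporal means."""
    return [[a - b for a, b in zip(ra, rb)]
            for ra, rb in zip(mean_a, mean_b)]
```

The output of such reductions is a small 2D field that can be written to NetCDF or CSV for downstream plotting, matching the toolkit's stated goal of shrinking large data sets to manageable summaries.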

  10. Admit: Alma Data Mining Toolkit

    Science.gov (United States)

    Friedel, Douglas; Looney, Leslie; Xu, Lisa; Pound, Marc W.; Teuben, Peter J.; Rauch, Kevin P.; Mundy, Lee; Kern, Jeffrey S.

    2015-06-01

    ADMIT (ALMA Data Mining Toolkit) is a toolkit for the creation and analysis of new science products from ALMA data. ADMIT is an ALMA Development Project written purely in Python. While specifically targeted for ALMA science and production use after the ALMA pipeline, it is designed to be generally applicable to radio-astronomical data. ADMIT quickly provides users with a detailed overview of their science products: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection), etc. Users can download the small ADMIT pipeline product (<20MB), analyze the results, then fine-tune and re-run the ADMIT pipeline (or any part thereof) on their own machines and interactively inspect the results. ADMIT will have both a GUI and command line interface available for this purpose. By analyzing multiple data cubes simultaneously, data mining between many astronomical sources and line transitions will be possible. Users will also be able to enhance the capabilities of ADMIT by creating customized ADMIT tasks satisfying any special processing needs. Future implementations of ADMIT may include EVLA and other instruments.

  11. The SCRAM tool-kit

    Science.gov (United States)

    Tamir, David; Flanigan, Lee A.; Weeks, Jack L.; Siewert, Thomas A.; Kimbrough, Andrew G.; McClure, Sidney R.

    1994-01-01

    This paper proposes a new series of on-orbit capabilities to support the near-term Hubble Space Telescope, Extended Duration Orbiter, Long Duration Orbiter, Space Station Freedom, other orbital platforms, and even the future manned Lunar/Mars missions. These proposed capabilities form a toolkit termed Space Construction, Repair, and Maintenance (SCRAM). SCRAM addresses both Intra-Vehicular Activity (IVA) and Extra-Vehicular Activity (EVA) needs. SCRAM provides a variety of tools which enable welding, brazing, cutting, coating, heating, and cleaning, as well as corresponding nondestructive examination. Near-term IVA-SCRAM applications include repair and modification to fluid lines, structure, and laboratory equipment inside a shirt-sleeve environment (i.e. inside Spacelab or Space Station). Near-term EVA-SCRAM applications include construction of fluid lines and structural members, repair of punctures by orbital debris, and refurbishment of surfaces eroded by contaminants. The SCRAM tool-kit also promises future EVA applications involving mass-production tasks automated by robotics and artificial intelligence, for construction of large truss, aerobrake, and nuclear reactor shadow shield structures. The leading candidate tool processes for SCRAM, currently undergoing research and development, include Electron Beam, Gas Tungsten Arc, Plasma Arc, and Laser Beam. A series of strategic space flight experiments would make SCRAM available to help conquer the space frontier.

  12. Guide to Using the WIND Toolkit Validation Code

    Energy Technology Data Exchange (ETDEWEB)

    Lieberman-Cribbin, W.; Draxl, C.; Clifton, A.

    2014-12-01

    In response to the U.S. Department of Energy's goal of using 20% wind energy by 2030, the Wind Integration National Dataset (WIND) Toolkit was created to provide information on wind speed, wind direction, temperature, surface air pressure, and air density at more than 126,000 locations across the United States from 2007 to 2013. The numerical weather prediction model output, gridded at a 2-km spatial and 5-minute temporal resolution, was further converted to detail the wind power production time series of existing and potential wind facility sites. For users of the dataset it is important that the information presented in the WIND Toolkit is accurate and that errors are known, so that corrective steps can be taken. Therefore, we provide validation code written in R that will be made public to give users tools to validate data at their own locations. Validation is based on statistical analyses of wind speed, using error metrics such as bias, root-mean-square error, centered root-mean-square error, mean absolute error, and percent error. Plots of diurnal cycles, annual cycles, wind roses, histograms of wind speed, and quantile-quantile plots are created to visualize how well observational data compare to model data. Ideally, validation will confirm beneficial locations to utilize wind energy and encourage regional wind integration studies using the WIND Toolkit.
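The error metrics named in this abstract are standard; a sketch in Python (the toolkit itself is written in R) might look like the following. Note that "percent error" is interpreted here as bias relative to the observed mean, which is an assumption since the abstract does not define it.

```python
# Standard validation error metrics for model-vs-observation wind
# speed series, as listed in the WIND Toolkit validation abstract.
import math

def bias(model, obs):
    return sum(m - o for m, o in zip(model, obs)) / len(obs)

def rmse(model, obs):
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, obs)) / len(obs))

def centered_rmse(model, obs):
    # RMSE of the anomalies, i.e. after removing each series' mean.
    mm = sum(model) / len(model)
    mo = sum(obs) / len(obs)
    return math.sqrt(sum(((m - mm) - (o - mo)) ** 2
                         for m, o in zip(model, obs)) / len(obs))

def mae(model, obs):
    return sum(abs(m - o) for m, o in zip(model, obs)) / len(obs)

def percent_error(model, obs):
    # Assumed definition: bias as a percentage of the observed mean.
    return 100.0 * bias(model, obs) / (sum(obs) / len(obs))
```

A model series that is uniformly 1 m/s too fast has bias, RMSE, and MAE of 1 but a centered RMSE of 0, which is why the toolkit reports the metrics together: they separate systematic offset from pattern error.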

  13. [A biomedical signal processing toolkit programmed by Java].

    Science.gov (United States)

    Xie, Haiyuan

    2012-09-01

    According to the characteristics of biomedical signals, a new biomedical signal processing toolkit has been developed. The toolkit is programmed in Java and is used for basic digital signal processing, random signal processing, etc. All the methods in the toolkit have been tested and the program is robust. The features of the toolkit are explained in detail; it is easy to use and of good practicability.

  14. GPU Pro advanced rendering techniques

    CERN Document Server

    Engel, Wolfgang

    2010-01-01

    This book covers essential tools and techniques for programming the graphics processing unit. Brought to you by Wolfgang Engel and the same team of editors who made the ShaderX series a success, this volume covers advanced rendering techniques, engine design, GPGPU techniques, related mathematical techniques, and game postmortems. A special emphasis is placed on handheld programming to account for the increased importance of graphics on mobile devices, especially the iPhone and iPod touch. Example programs and source code can be downloaded from the book's CRC Press web page.

  15. The DLESE Evaluation Toolkit Project

    Science.gov (United States)

    Buhr, S. M.; Barker, L. J.; Marlino, M.

    2002-12-01

    The Evaluation Toolkit and Community project is a new Digital Library for Earth System Education (DLESE) collection designed to raise awareness of project evaluation within the geoscience education community, and to enable principal investigators, teachers, and evaluators to implement project evaluation more readily. This new resource is grounded in the needs of geoscience educators, and will provide a virtual home for a geoscience education evaluation community. The goals of the project are to 1) provide a robust collection of evaluation resources useful for Earth systems educators, 2) establish a forum and community for evaluation dialogue within DLESE, and 3) disseminate the resources through the DLESE infrastructure and through professional society workshops and proceedings. Collaboration and expertise in education, geoscience and evaluation are necessary if we are to conduct the best possible geoscience education. The Toolkit allows users to engage in evaluation at whichever level best suits their needs, get more evaluation professional development if desired, and access the expertise of other segments of the community. To date, a test web site has been built and populated, initial community feedback from the DLESE and broader community is being garnered, and we have begun to heighten awareness of geoscience education evaluation within our community. The web site contains features that allow users to access professional development about evaluation, search and find evaluation resources, submit resources, find or offer evaluation services, sign up for upcoming workshops, take the user survey, and submit calendar items. The evaluation resource matrix currently contains resources that have met our initial review. The resources are currently organized by type; they will become searchable on multiple dimensions of project type, audience, objectives and evaluation resource type as efforts to develop a collection-specific search engine mature. The peer review

  16. Educational RIS/PACS simulator integrated with the HIPAA compliant auditing (HCA) toolkit

    Science.gov (United States)

    Zhou, Zheng; Liu, Brent J.; Huang, H. K.; Zhang, J.

    2005-04-01

    Health Insurance Portability and Accountability Act (HIPAA), a guideline for healthcare privacy and security, has been officially instituted recently. HIPAA mandates healthcare providers to follow its privacy and security rules, one of which is to have the ability to generate audit trails on the data access for any specific patient on demand. Although most current medical imaging systems such as PACS utilize logs to record their activities, there is a lack of formal methodology to interpret these large volumes of log data and generate HIPAA compliant audit trails. In this paper, we present a HIPAA compliant auditing (HCA) toolkit for auditing the image data flow of PACS. The toolkit can extract pertinent auditing information from the logs of various PACS components and store the information in a centralized auditing database. The HIPAA compliant audit trails can be generated based on the database, which can also be utilized for data analysis to facilitate the dynamic monitoring of the data flow of PACS. In order to demonstrate the HCA toolkit in a PACS environment, it was integrated with the PACS Simulator, which was presented as an educational tool at SPIE in 2003 and 2004. With the integration of the HCA toolkit with the PACS Simulator, users can learn HIPAA audit concepts and how to generate audit trails of image data access in PACS, as well as trace the image data flow of the PACS Simulator through the toolkit.

  17. Remote volume rendering pipeline for mHealth applications

    Science.gov (United States)

    Gutenko, Ievgeniia; Petkov, Kaloian; Papadopoulos, Charilaos; Zhao, Xin; Park, Ji Hwan; Kaufman, Arie; Cha, Ronald

    2014-03-01

    We introduce a novel remote volume rendering pipeline for medical visualization targeted for mHealth (mobile health) applications. The necessity of such a pipeline stems from the large size of the medical imaging data produced by current CT and MRI scanners with respect to the complexity of the volumetric rendering algorithms. For example, the resolution of typical CT Angiography (CTA) data easily reaches 512^3 voxels and can exceed 6 gigabytes in size by spanning over the time domain while capturing a beating heart. This explosion in data size makes data transfers to mobile devices challenging, and even when the transfer problem is resolved the rendering performance of the device still remains a bottleneck. To deal with this issue, we propose a thin-client architecture, where the entirety of the data resides on a remote server where the image is rendered and then streamed to the client mobile device. We utilize the display and interaction capabilities of the mobile device, while performing interactive volume rendering on a server capable of handling large datasets. Specifically, upon user interaction the volume is rendered on the server and encoded into an H.264 video stream. H.264 is ubiquitously hardware accelerated, resulting in faster compression and lower power requirements. The choice of low-latency CPU- and GPU-based encoders is particularly important in enabling the interactive nature of our system. We demonstrate a prototype of our framework using various medical datasets on commodity tablet devices.

  18. Audit: Automated Disk Investigation Toolkit

    Directory of Open Access Journals (Sweden)

    Umit Karabiyik

    2014-09-01

    Software tools designed for disk analysis play a critical role today in forensics investigations. However, these digital forensics tools are often difficult to use, usually task specific, and generally require professionally trained users with IT backgrounds. The relevant tools are also often open source, requiring additional technical knowledge and proper configuration. This makes it difficult for investigators without some computer science background to easily conduct the needed disk analysis. In this paper, we present AUDIT, a novel automated disk investigation toolkit that supports investigations conducted by non-expert (in IT and disk technology) and expert investigators. Our proof-of-concept design and implementation of AUDIT intelligently integrates open source tools and guides non-IT professionals while requiring minimal technical knowledge about the disk structures and file systems of the target disk image.

  19. Flightspeed Integral Image Analysis Toolkit

    Science.gov (United States)

    Thompson, David R.

    2009-01-01

    The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image. This integral frame facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles
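The integral-image structure the abstract relies on is easy to sketch: one pass builds cumulative sums, after which any rectangular region sum needs only four lookups. This is a generic Python illustration of the technique, not FIIAT's C code.

```python
# Integral image (summed-area table): O(1) rectangular region sums
# after a single O(h*w) preprocessing pass.

def integral_image(img):
    """img: 2D list of numbers. Returns an (h+1)x(w+1) table with a
    zero border, where ii[y][x] is the sum of img[0:y][0:x]."""
    h, w = len(img), len(img[0])
    ii = [[0] * (w + 1) for _ in range(h + 1)]
    for y in range(h):
        row_sum = 0
        for x in range(w):
            row_sum += img[y][x]
            ii[y + 1][x + 1] = ii[y][x + 1] + row_sum
    return ii

def region_sum(ii, top, left, bottom, right):
    """Sum over img[top:bottom][left:right] using four lookups."""
    return (ii[bottom][right] - ii[top][right]
            - ii[bottom][left] + ii[top][left])
```

Because every region sum costs four integer additions regardless of region size, features built from many rectangle sums (box filters, texture statistics) become cheap enough for the integer-only, real-time setting the abstract describes.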

  20. Adaptive image contrast enhancement algorithm for point-based rendering

    Science.gov (United States)

    Xu, Shaoping; Liu, Xiaoping P.

    2015-03-01

    Surgical simulation is a major application in computer graphics and virtual reality, and most of the existing work indicates that interactive real-time cutting simulation of soft tissue is a fundamental but challenging research problem in virtual surgery simulation systems. More specifically, it is difficult to achieve a fast enough graphic update rate (at least 30 Hz) on commodity PC hardware by utilizing traditional triangle-based rendering algorithms. In recent years, point-based rendering (PBR) has been shown to offer the potential to outperform the traditional triangle-based rendering in speed when it is applied to highly complex soft tissue cutting models. Nevertheless, the PBR algorithms are still limited in visual quality due to inherent contrast distortion. We propose an adaptive image contrast enhancement algorithm as a postprocessing module for PBR, providing high visual rendering quality as well as acceptable rendering efficiency. Our approach is based on a perceptible image quality technique with automatic parameter selection, resulting in a visual quality comparable to existing conventional PBR algorithms. Experimental results show that our adaptive image contrast enhancement algorithm produces encouraging results both visually and numerically compared to representative algorithms, and experiments conducted on the latest hardware demonstrate that the proposed PBR framework with the postprocessing module is superior to the conventional PBR algorithm and that the proposed contrast enhancement algorithm can be utilized in (or compatible with) various variants of the conventional PBR algorithm.
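The abstract does not spell out its perceptual algorithm, so as a stand-in, here is one minimal form of automatic contrast enhancement: a percentile-based stretch whose parameters are selected from the image data itself. Treat it purely as an illustration of the postprocessing idea, not the paper's method.

```python
# Percentile-based contrast stretch with automatic parameter
# selection: the 2nd and 98th percentiles of the input are mapped to
# 0 and 255, expanding a low-contrast grayscale range.

def contrast_stretch(pixels, low_pct=2.0, high_pct=98.0):
    """pixels: flat list of grayscale values. Returns stretched copy."""
    ordered = sorted(pixels)
    n = len(ordered)
    lo = ordered[int(n * low_pct / 100.0)]
    hi = ordered[min(n - 1, int(n * high_pct / 100.0))]
    if hi == lo:
        return list(pixels)  # flat image: nothing to stretch
    return [max(0, min(255, round(255.0 * (p - lo) / (hi - lo))))
            for p in pixels]
```

Applied as a postprocessing pass over a rendered frame, such a stretch restores the dynamic range that point-based splatting tends to compress, at a cost of one sort plus one linear pass per frame.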

  1. ARC Code TI: Crisis Mapping Toolkit

    Data.gov (United States)

    National Aeronautics and Space Administration — The Crisis Mapping Toolkit (CMT) is a collection of tools for processing geospatial data (images, satellite data, etc.) into cartographic products that improve...

  2. NOAA Weather and Climate Toolkit (WCT)

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The NOAA Weather and Climate Toolkit is an application that provides simple visualization and data export of weather and climatological data archived at NCDC. The...

  3. Usability testing of a fall prevention toolkit.

    Science.gov (United States)

    Keuter, Kayla R; Berg, Gina M; Hervey, Ashley M; Rogers, Nicole

    2015-05-01

    This study sought to evaluate a fall prevention toolkit, determine its ease of use and user satisfaction, and determine the preferred venue of distribution. Three forms of assessment were used: focus groups, usability testing, and surveys. Focus group participants were recruited from four locations: two rural health clinics and two urban centers. Usability testing participants were recruited from two rural health clinics. Survey questions included self-reported prior falls, current fall prevention habits, reaction to the toolkit, and demographics. Participants reported the toolkit was attractive, well-organized, and easy to use, but might contain too much information. Most participants admitted they would not actively use the toolkit on their own, preferring to have it introduced by a healthcare provider or in a social setting. Healthcare focuses on customer satisfaction; therefore, providers benefit from knowing patients' preferred methods of learning fall prevention strategies.

  4. Water Quality Trading Toolkit for Permit Writers

    Science.gov (United States)

    The Water Quality Trading Toolkit for Permit Writers is EPA’s first “how-to” manual on designing and implementing water quality trading programs. It helps NPDES permitting authorities incorporate trading provisions into permits.

  5. Development of an Integrated Human Factors Toolkit

    Science.gov (United States)

    Resnick, Marc L.

    2003-01-01

    An effective integration of human abilities and limitations is crucial to the success of all NASA missions. The Integrated Human Factors Toolkit facilitates this integration by assisting system designers and analysts to select the human factors tools that are most appropriate for the needs of each project. The HF Toolkit contains information about a broad variety of human factors tools addressing human requirements in the physical, information processing and human reliability domains. Analysis of each tool includes consideration of the most appropriate design stage, the amount of expertise in human factors that is required, the amount of experience with the tool and the target job tasks that are needed, and other factors that are critical for successful use of the tool. The benefits of the Toolkit include improved safety, reliability and effectiveness of NASA systems throughout the agency. This report outlines the initial stages of development for the Integrated Human Factors Toolkit.

  6. A Geospatial Decision Support System Toolkit Project

    Data.gov (United States)

    National Aeronautics and Space Administration — We propose to design a working prototype Geospatial Decision Support Toolkit (GeoKit) that will enable scientists, agencies, and stakeholders to configure and deploy...

  7. A Modular Toolkit for Distributed Interactions

    Directory of Open Access Journals (Sweden)

    Julien Lange

    2011-10-01

    We discuss the design, architecture, and implementation of a toolkit which supports some theories for distributed interactions. The main design principles of our architecture are flexibility and modularity. Our main goal is to provide an easily extensible workbench to encompass current algorithms and incorporate future developments of the theories. With the help of some examples, we illustrate the main features of our toolkit.

  8. Web applications using the Google Web Toolkit

    OpenAIRE

    von Wenckstern, Michael

    2013-01-01

    This diploma thesis describes how to create or convert traditional Java programs to desktop-like rich internet applications with the Google Web Toolkit. The Google Web Toolkit is an open source development environment, which translates Java code to browser and device independent HTML and JavaScript. Most of the GWT framework parts, including the Java to JavaScript compiler as well as important security issues of websites will be introduced. The famous Agricola board game will be ...

  10. Wetland Resources Action Planning (WRAP) toolkit

    DEFF Research Database (Denmark)

    Bunting, Stuart W.; Smith, Kevin G.; Lund, Søren

    2013-01-01

    The Wetland Resources Action Planning (WRAP) toolkit comprises research methods and better management practices used in HighARCS (Highland Aquatic Resources Conservation and Sustainable Development), an EU-funded project with field experiences in China, Vietnam and India. It aims to communicate best practices in conserving biodiversity and sustaining ecosystem services to potential users, and to promote the wise use of aquatic resources, improve livelihoods and enhance policy information.

  11. Quality improvement projects targeting health care-associated infections: comparing Virtual Collaborative and Toolkit approaches.

    Science.gov (United States)

    Speroff, Theodore; Ely, E Wes; Greevy, Robert; Weinger, Matthew B; Talbot, Thomas R; Wall, Richard J; Deshpande, Jayant K; France, Daniel J; Nwosu, Sam; Burgess, Hayley; Englebright, Jane; Williams, Mark V; Dittus, Robert S

    2011-05-01

    Collaborative and toolkit approaches have gained traction for improving quality in health care. To determine if a quality improvement virtual collaborative intervention would perform better than a toolkit-only approach at preventing central line-associated bloodstream infections (CLABSIs) and ventilator-associated pneumonias (VAPs). Cluster randomized trial with the Intensive Care Units (ICUs) of 60 hospitals assigned to the Toolkit (n=29) or Virtual Collaborative (n=31) group from January 2006 through September 2007. CLABSI and VAP rates. Follow-up survey on improvement interventions, toolkit utilization, and strategies for implementing improvement. A total of 83% of the Collaborative ICUs implemented all CLABSI interventions compared to 64% of those in the Toolkit group (P = 0.13), implemented daily catheter reviews more often (P = 0.04), and began this intervention sooner (P < 0.01). Eighty-six percent of the Collaborative group implemented the VAP bundle compared to 64% of the Toolkit group (P = 0.06). The CLABSI rate was 2.42 infections per 1000 catheter days at baseline and 2.73 at 18 months (P = 0.59). The VAP rate was 3.97 per 1000 ventilator days at baseline and 4.61 at 18 months (P = 0.50). Neither group improved outcomes over time; there was no differential performance between the 2 groups for either CLABSI rates (P = 0.71) or VAP rates (P = 0.80). The intensive collaborative approach outpaced the simpler toolkit approach in changing processes of care, but neither approach improved outcomes. Incorporating quality improvement methods, such as ICU checklists, into routine care processes is complex, highly context-dependent, and may take longer than 18 months to achieve. Copyright © 2011 Society of Hospital Medicine.

  12. Real-time graphics rendering engine

    CERN Document Server

    Bao, Hujun

    2011-01-01

    ""Real-Time Graphics Rendering Engine"" reveals the software architecture of the modern real-time 3D graphics rendering engine and the relevant technologies based on the authors' experience developing this high-performance, real-time system. The relevant knowledge about real-time graphics rendering such as the rendering pipeline, the visual appearance and shading and lighting models are also introduced. This book is intended to offer well-founded guidance for researchers and developers who are interested in building their own rendering engines. Hujun Bao is a professor at the State Key Lab of

  13. Hardware Accelerated Point Rendering of Isosurfaces

    DEFF Research Database (Denmark)

    Bærentzen, Jakob Andreas; Christensen, Niels Jørgen

    2003-01-01

    an approximate technique for point scaling using distance attenuation which makes it possible to render points stored in display lists or vertex arrays. This enables us to render points quickly using OpenGL. Our comparisons show that point generation is significantly faster than triangle generation...... and that the advantage of rendering points as opposed to triangles increases with the size and complexity of the volumes. To gauge the visual quality of future hardware accelerated point rendering schemes, we have implemented a software based point rendering method and compare the quality to both MC and our OpenGL based...

  14. Cluster parallel rendering based on encoded mesh

    Institute of Scientific and Technical Information of China (English)

    QIN Ai-hong; XIONG Hua; PENG Hao-yu; LIU Zhen; SHI Jiao-ying

    2006-01-01

    Use of compressed meshes in parallel rendering architectures is still an unexplored area, the main challenge of which is to partition and sort the encoded mesh in the compression domain. This paper presents a mesh compression scheme, PRMC (Parallel Rendering based Mesh Compression), supplying encoded meshes that can be partitioned and sorted in a parallel rendering system even in the encoded domain. First, we segment the mesh into submeshes and clip the submeshes' boundaries into Runs, and then piecewise compress the submeshes and Runs respectively. With the help of several auxiliary index tables, compressed submeshes and Runs can serve as rendering primitives in a parallel rendering system. Based on PRMC, we design and implement a parallel rendering architecture. Compared with an uncompressed representation, experimental results show that PRMC meshes applied in a cluster parallel rendering system can dramatically reduce the communication requirement.

  15. Binaural Rendering in MPEG Surround

    Directory of Open Access Journals (Sweden)

    Kristofer Kjörling

    2008-04-01

    This paper describes novel methods for evoking a multichannel audio experience over stereo headphones. In contrast to the conventional convolution-based approach where, for example, five input channels are filtered using ten head-related transfer functions, the current approach is based on a parametric representation of the multichannel signal, along with either a parametric representation of the head-related transfer functions or a reduced set of head-related transfer functions. An audio scene with multiple virtual sound sources is represented by a mono or a stereo downmix signal of all sound source signals, accompanied by certain statistical (spatial) properties. These statistical properties of the sound sources are either combined with statistical properties of head-related transfer functions to estimate “binaural parameters” that represent the perceptually relevant aspects of the auditory scene or used to create a limited set of combined head-related transfer functions that can be applied directly on the downmix signal. Subsequently, a binaural rendering stage reinstates the statistical properties of the sound sources by applying the estimated binaural parameters or the reduced set of combined head-related transfer functions directly on the downmix. If combined with parametric multichannel audio coders such as MPEG Surround, the proposed methods are advantageous over conventional methods in terms of perceived quality and computational complexity.

  16. Binaural Rendering in MPEG Surround

    Science.gov (United States)

    Breebaart, Jeroen; Villemoes, Lars; Kjörling, Kristofer

    2008-12-01

    This paper describes novel methods for evoking a multichannel audio experience over stereo headphones. In contrast to the conventional convolution-based approach where, for example, five input channels are filtered using ten head-related transfer functions, the current approach is based on a parametric representation of the multichannel signal, along with either a parametric representation of the head-related transfer functions or a reduced set of head-related transfer functions. An audio scene with multiple virtual sound sources is represented by a mono or a stereo downmix signal of all sound source signals, accompanied by certain statistical (spatial) properties. These statistical properties of the sound sources are either combined with statistical properties of head-related transfer functions to estimate "binaural parameters" that represent the perceptually relevant aspects of the auditory scene or used to create a limited set of combined head-related transfer functions that can be applied directly on the downmix signal. Subsequently, a binaural rendering stage reinstates the statistical properties of the sound sources by applying the estimated binaural parameters or the reduced set of combined head-related transfer functions directly on the downmix. If combined with parametric multichannel audio coders such as MPEG Surround, the proposed methods are advantageous over conventional methods in terms of perceived quality and computational complexity.
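A minimal sketch of the parametric idea: per-ear gains, estimated from source powers and per-source HRTF magnitudes, are applied directly to a mono downmix instead of filtering each source separately. This deliberately ignores phase (ITD) and frequency dependence, and all numbers are illustrative, not from the paper.

```python
import math

def binaural_gains(source_powers, hrtf_left, hrtf_right):
    """Power-weighted per-ear gains: applying (gl, gr) to a mono downmix
    approximates filtering each source with its own HRTF magnitude.
    A simplified sketch of the parametric approach; phase (ITD) and
    frequency dependence are ignored."""
    total = sum(source_powers)
    gl = math.sqrt(sum(p * h * h for p, h in zip(source_powers, hrtf_left)) / total)
    gr = math.sqrt(sum(p * h * h for p, h in zip(source_powers, hrtf_right)) / total)
    return gl, gr

# A loud source panned hard left dominates the estimated binaural parameters.
gl, gr = binaural_gains([4.0, 1.0], [1.0, 0.5], [0.2, 0.5])
```

The computational saving follows directly: two gains per parameter band replace one HRTF convolution per source channel.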

  17. The Weather and Climate Toolkit

    Science.gov (United States)

    Ansari, S.; Del Greco, S.; Hankins, B.

    2010-12-01

    The Weather and Climate Toolkit (WCT) is free, platform-independent software distributed from NOAA’s National Climatic Data Center (NCDC). The WCT allows the visualization and data export of weather and climate data, including Radar, Satellite and Model data. By leveraging the NetCDF for Java library and Common Data Model, the WCT is extremely scalable and capable of supporting many new datasets in the future. Gridded NetCDF files (regular and irregularly spaced, using Climate-Forecast (CF) conventions) are supported, along with many other formats including GRIB. The WCT provides tools for custom data overlays, Web Map Service (WMS) background maps, animations and basic filtering. The export of images and movies is provided in multiple formats. The WCT Data Export Wizard allows for data export in both vector polygon/point (Shapefile, Well-Known Text) and raster (GeoTIFF, ESRI Grid, VTK, Gridded NetCDF) formats. These data export features promote the interoperability of weather and climate information with various scientific communities and common software packages such as ArcGIS, Google Earth, MatLAB, GrADS and R. The WCT also supports an embedded, integrated Google Earth instance. The Google Earth Browser Plugin allows seamless visualization of data on a native 3-D Google Earth instance linked to the standard 2-D map. (Figure: Level-II NEXRAD data for Hurricane Katrina and GPCP (Global Precipitation Product) data, visualized in the 2-D map and the embedded Google Earth view.)

  18. Rendering Caustics on Non-Lambertian Surfaces

    DEFF Research Database (Denmark)

    Jensen, Henrik Wann

    1997-01-01

    This paper presents a new technique for rendering caustics on non-Lambertian surfaces. The method is based on an extension of the photon map which removes previous restrictions limiting the usage to Lambertian surfaces. We add information about the incoming direction to the photons and this allow...... reduces the rendering time. We have used the method to render caustics on surfaces with reflectance functions varying from Lambertian to glossy specular....

  19. Building Interstellar's black hole: the gravitational renderer

    OpenAIRE

    James, Oliver; Dieckmann, Sylvan; Pabst, Simon; Roberts, Paul-George H.; Thorne, Kip S.

    2015-01-01

    Interstellar is the first feature film to attempt depicting a black hole as it would actually be seen by somebody nearby. A close collaboration between the production's Scientific Advisor and the Visual Effects team led to the development of a new renderer, DNGR (Double Negative Gravitational Renderer) which uses novel techniques for rendering in curved space-time. Following the completion of the movie, the code was adapted for scientific research, leading to new insights into gravitational l...

  20. The Topology ToolKit.

    Science.gov (United States)

    Tierny, Julien; Favelier, Guillaume; Levine, Joshua A; Gueunet, Charles; Michaux, Michael

    2017-08-29

    This system paper presents the Topology ToolKit (TTK), a software platform designed for the topological analysis of scalar data in scientific visualization. While topological data analysis has gained in popularity over the last two decades, it has not yet been widely adopted as a standard data analysis tool for end users or developers. TTK aims at addressing this problem by providing a unified, generic, efficient, and robust implementation of key algorithms for the topological analysis of scalar data, including: critical points, integral lines, persistence diagrams, persistence curves, merge trees, contour trees, Morse-Smale complexes, fiber surfaces, continuous scatterplots, Jacobi sets, Reeb spaces, and more. TTK is easily accessible to end users due to a tight integration with ParaView. It is also easily accessible to developers through a variety of bindings (Python, VTK/C++) for fast prototyping or through direct, dependency-free C++, to ease integration into pre-existing complex systems. While developing TTK, we faced several algorithmic and software engineering challenges, which we document in this paper. In particular, we present an algorithm for the construction of a discrete gradient that complies with the critical points extracted in the piecewise-linear setting. This algorithm guarantees a combinatorial consistency across the topological abstractions supported by TTK, and importantly, a unified implementation of topological data simplification for multi-scale exploration and analysis. We also present a cached triangulation data structure that supports time-efficient and generic traversals, self-adjusts its memory usage on demand for input simplicial meshes, and implicitly emulates a triangulation for regular grids with no memory overhead. Finally, we describe an original software architecture, which guarantees memory efficient and direct accesses to TTK features, while still allowing researchers powerful and easy bindings and extensions
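As a small, self-contained illustration of the kind of analysis TTK implements (not TTK's code), the sketch below computes 0-dimensional persistence pairs of a 1-D scalar field with a union-find sweep: each local minimum births a component, and when two components meet, the younger one dies.

```python
def persistence_pairs(values):
    """0-dimensional persistence pairs of a 1-D scalar field.
    Sweeping values in increasing order, each local minimum births a
    component; when two components meet, the younger one dies, yielding
    a (birth, death) pair. A toy stand-in for the merge-tree and
    persistence-diagram computations TTK provides for general data."""
    order = sorted(range(len(values)), key=values.__getitem__)
    parent, birth, pairs = {}, {}, []

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i

    for i in order:
        parent[i], birth[i] = i, values[i]
        for j in (i - 1, i + 1):            # 1-D neighbours already swept
            if j in parent:
                ri, rj = find(i), find(j)
                if ri != rj:
                    young, old = (ri, rj) if birth[ri] > birth[rj] else (rj, ri)
                    pairs.append((birth[young], values[i]))  # younger dies
                    parent[young] = old
    # The component of the global minimum never dies (essential class).
    return sorted(pairs)
```

Zero-persistence pairs (birth equal to death) correspond to regular points and would be filtered out in practice, which is exactly the simplification step mentioned above.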

  1. A toolkit for determining historical eco-hydrological interactions

    Science.gov (United States)

    Singer, M. B.; Sargeant, C. I.; Evans, C. M.; Vallet-Coulomb, C.

    2016-12-01

    Contemporary climate change is predicted to result in perturbations to hydroclimatic regimes across the globe, with some regions forecast to become warmer and drier. Given that water is a primary determinant of vegetative health and productivity, we can expect shifts in the availability of this critical resource to have significant impacts on forested ecosystems. The subject is particularly complex in environments where multiple sources of water are potentially available to vegetation and which may also exhibit spatial and temporal variability. To anticipate how subsurface hydrological partitioning may evolve in the future and impact overlying vegetation, we require well constrained, historical data and a modelling framework for assessing the dynamics of subsurface hydrology. We outline a toolkit to retrospectively investigate dynamic water use by trees. We describe a synergistic approach, which combines isotope dendrochronology of tree ring cellulose with a biomechanical model, detailed climatic and isotopic data in endmember waters to assess the mean isotopic composition of source water used in annual tree rings. We identify the data requirements and suggest three versions of the toolkit based on data availability. We present sensitivity analyses in order to identify the key variables required to constrain model predictions and then develop empirical relationships for constraining these parameters based on climate records. We demonstrate our methodology within a Mediterranean riparian forest site and show how it can be used along with subsurface hydrological modelling to validate source water determinations, which are fundamental to understanding climatic fluctuations and trends in subsurface hydrology. We suggest that the utility of our toolkit is applicable in riparian zones and in a range of forest environments where distinct isotopic endmembers are present.
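The source-water attribution at the core of the toolkit rests on isotopic mass balance. A hedged sketch of a two-endmember mixing calculation follows; the delta values are illustrative and this is only the attribution step, not the toolkit's full biomechanical model.

```python
def source_fraction(delta_sample, delta_a, delta_b):
    """Two-endmember isotopic mass balance: the fraction of source water
    drawn from endmember A (e.g. streamflow) versus endmember B (e.g.
    soil water), given delta values in per mil. Illustrative of the
    attribution step only, not the toolkit's full model."""
    if delta_a == delta_b:
        raise ValueError("endmembers must be isotopically distinct")
    return (delta_sample - delta_b) / (delta_a - delta_b)

# Tree-ring source water at -8 per mil between stream (-12) and soil (-6):
f_stream = source_fraction(-8.0, -12.0, -6.0)
```

The sensitivity analyses described above amount to asking how strongly this fraction shifts as the endmember deltas and the inferred source-water delta vary within their uncertainties.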

  2. The advanced computational testing and simulation toolkit (ACTS)

    Energy Technology Data Exchange (ETDEWEB)

    Drummond, L.A.; Marques, O.

    2002-05-21

    During the past decades there has been a continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Distinctively, a number of these are important scientific problems ranging in scale from the atomic to the cosmic. For example, ionization is a phenomenon as ubiquitous in modern society as the glow of fluorescent lights and the etching on silicon computer chips; but it was not until 1999 that researchers finally achieved a complete numerical solution to the simplest example of ionization, the collision of a hydrogen atom with an electron. On the opposite scale, cosmologists have long wondered whether the expansion of the Universe, which began with the Big Bang, would ever reverse itself, ending the Universe in a Big Crunch. In 2000, analysis of new measurements of the cosmic microwave background radiation showed that the geometry of the Universe is flat, and thus the Universe will continue expanding forever. Both of these discoveries depended on high performance computer simulations that utilized computational tools included in the Advanced Computational Testing and Simulation (ACTS) Toolkit. The ACTS Toolkit is an umbrella project that brought together a number of general purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools, which have been developed independently, mainly at DOE laboratories, make it easier for scientific code developers to write high performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly implementation of numerical algorithms, and support for code development, execution and optimization. The ACTS Toolkit Project enables the use of these tools by a much wider community of computational scientists, and promotes code portability, reusability, and the reduction of duplicated effort.

  3. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    Science.gov (United States)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for non-Gaussian problems with error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distribution as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.
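The desensitized filters described above build on the standard Kalman predict/update cycle. A minimal scalar sketch of that underlying cycle follows; the sensitivity-penalty terms that make the toolkit's variant "desensitized" are omitted, and all noise values are illustrative.

```python
def kalman_1d(measurements, x0, p0, q, r):
    """Standard scalar Kalman filter for a random-walk state: the
    predict/update cycle that desensitized variants extend with
    penalties on sensitivity to uncertain model parameters (omitted
    here). q = process noise variance, r = measurement noise variance."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                 # predict: covariance grows by process noise
        k = p / (p + r)           # Kalman gain
        x = x + k * (z - x)       # update with the measurement residual
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Filtering a constant signal: estimates converge toward the measurement.
est = kalman_1d([5.0] * 20, 0.0, 1.0, 0.01, 1.0)
```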

  4. Integrated Systems Health Management (ISHM) Toolkit

    Science.gov (United States)

    Venkatesh, Meera; Kapadia, Ravi; Walker, Mark; Wilkins, Kim

    2013-01-01

    A framework of software components has been implemented to facilitate the development of ISHM systems according to a methodology based on Reliability Centered Maintenance (RCM). This framework is collectively referred to as the Toolkit and was developed using General Atomics' Health MAP (TM) technology. The toolkit is intended to provide assistance to software developers of mission-critical system health monitoring applications in the specification, implementation, configuration, and deployment of such applications. In addition to software tools designed to facilitate these objectives, the toolkit also provides direction to software developers in accordance with an ISHM specification and development methodology. The development tools are based on an RCM approach for the development of ISHM systems. This approach focuses on defining, detecting, and predicting the likelihood of system functional failures and their undesirable consequences.

  5. An Overview of the Geant4 Toolkit

    CERN Document Server

    Apostolakis, John

    2007-01-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice between different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. App...

  6. 78 FR 58520 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2013-09-24

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... Solutions Toolkit to be used by foreign environmental officials and foreign end-users of environmental.... Environmental Solutions Toolkit should self- identify by November 1, 2013, at 5:00 p.m. Eastern Standard...

  7. 77 FR 73022 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... web- based U.S. Environmental Solutions Toolkit to be used by foreign environmental officials and... participating in the U.S. Environmental Solutions Toolkit should self- identify by December 31, 2012, at 5:00...

  8. "Handy Manny" and the Emergent Literacy Technology Toolkit

    Science.gov (United States)

    Hourcade, Jack J.; Parette, Howard P., Jr.; Boeckmann, Nichole; Blum, Craig

    2010-01-01

    This paper outlines the use of a technology toolkit to support emergent literacy curriculum and instruction in early childhood education settings. Components of the toolkit include hardware and software that can facilitate key emergent literacy skills. Implementation of the comprehensive technology toolkit enhances the development of these…

  9. Image Based Rendering and Virtual Reality

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    The presentation gives an overview of Image Based Rendering approaches and their use in Virtual Reality, including Virtual Photography and Cinematography, and Mobile Robot Navigation.

  10. Physically based rendering: from theory to implementation

    National Research Council Canada - National Science Library

    Pharr, Matt; Humphreys, Greg, Ph. D

    2010-01-01

    ... rendering algorithm variations. This book is not only a textbook for students, but also a useful reference book for practitioners in the field. The second edition has been extended with sections on Metropolis light transport, subsurface scattering, precomputed light transport, and more. Per Christensen Senior Software Developer, RenderMan Products,...

  11. Image Based Rendering and Virtual Reality

    DEFF Research Database (Denmark)

    Livatino, Salvatore

    The presentation gives an overview of Image Based Rendering approaches and their use in Virtual Reality, including Virtual Photography and Cinematography, and Mobile Robot Navigation.

  12. TRSkit: A Simple Digital Library Toolkit

    Science.gov (United States)

    Nelson, Michael L.; Esler, Sandra L.

    1997-01-01

    This paper introduces TRSkit, a simple and effective toolkit for building digital libraries on the World Wide Web. The toolkit was developed for the creation of the Langley Technical Report Server and the NASA Technical Report Server, but is applicable to most simple distribution paradigms. TRSkit contains a handful of freely available software components designed to be run under the UNIX operating system and served via the World Wide Web. The intended customer is the person who must continuously and synchronously distribute anywhere from 100 to 100,000s of information units and does not have extensive resources to devote to the problem.

  13. Autism Speaks Toolkits: Resources for Busy Physicians.

    Science.gov (United States)

    Bellando, Jayne; Fussell, Jill J; Lopez, Maya

    2016-02-01

    Given the increased prevalence of autism spectrum disorders (ASD), it is likely that busy primary care providers (PCPs) are providing care to individuals with ASD in their practice. Autism Speaks provides a wealth of educational, medical, and treatment/intervention information resources for PCPs and families, including at least 32 toolkits. This article serves to familiarize PCPs and families with the different toolkits that are available on the Autism Speaks website. This article is intended to increase physicians' knowledge of the issues that families with children with ASD frequently encounter and to increase their ability to share evidence-based information to guide treatment and care for affected families in their practice.

  14. Moisture movements in render on brick wall

    DEFF Research Database (Denmark)

    Hansen, Kurt Kielsgaard; Munch, Thomas Astrup; Thorsen, Peter Schjørmann

    2003-01-01

    A three-layer render on brick wall used for building facades is studied in the laboratory. The vertical render surface is held in contact with water for 24 hours simulating driving rain while it is measured with non-destructive X-ray equipment every hour in order to follow the moisture front...... through the render and into the brick. The test specimen is placed between the source and the detector. The test specimens are all scanned before they are exposed to water. In that way the loss of counts from the dry scan to the wet scan qualitatively shows the presence of water. The results show nearly...... no penetration of water through the render and into the brick, and the results are independent of the start condition of the test specimens. Also drying experiments are performed. The results show a small difference in the rate of drying, in favour of the bricks without render....

  15. Physically based rendering from theory to implementation

    CERN Document Server

    Pharr, Matt

    2010-01-01

    "Physically Based Rendering, 2nd Edition" describes both the mathematical theory behind a modern photorealistic rendering system as well as its practical implementation. A method - known as 'literate programming'- combines human-readable documentation and source code into a single reference that is specifically designed to aid comprehension. The result is a stunning achievement in graphics education. Through the ideas and software in this book, you will learn to design and employ a full-featured rendering system for creating stunning imagery. This book features new sections on subsurface scattering, Metropolis light transport, precomputed light transport, multispectral rendering, and much more. It includes a companion site complete with source code for the rendering system described in the book, with support for Windows, OS X, and Linux. Code and text are tightly woven together through a unique indexing feature that lists each function, variable, and method on the page that they are first described.

  16. Optimization-Based Wearable Tactile Rendering.

    Science.gov (United States)

    Perez, Alvaro G; Lobo, Daniel; Chinello, Francesco; Cirio, Gabriel; Malvezzi, Monica; San Martin, Jose; Prattichizzo, Domenico; Otaduy, Miguel A

    2016-10-20

    Novel wearable tactile interfaces offer the possibility to simulate tactile interactions with virtual environments directly on our skin. But, unlike kinesthetic interfaces, for which haptic rendering is a well-explored problem, they pose new questions about the formulation of the rendering problem. In this work, we propose a formulation of tactile rendering as an optimization problem, which is general for a large family of tactile interfaces. Based on an accurate simulation of contact between a finger model and the virtual environment, we pose tactile rendering as the optimization of the device configuration, such that the contact surface between the device and the actual finger matches as closely as possible the contact surface in the virtual environment. We describe the optimization formulation in general terms, and we also demonstrate its implementation on a thimble-like wearable device. We validate the tactile rendering formulation by analyzing its force error, and we show that it outperforms other approaches.
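The optimization formulation can be caricatured in one dimension: choose a device configuration that minimizes the squared mismatch between the contact depths the device imposes and the depths simulated in the virtual environment. The sketch below is a toy stand-in for the paper's device-configuration solver; the search bounds and the `device_depth` model are assumptions for illustration.

```python
def optimize_device(contact_depths, device_depth, lo=0.0, hi=5.0, iters=60):
    """Tactile rendering as optimization, collapsed to one dimension:
    choose a scalar device configuration d minimizing the squared
    mismatch between the depths the device imposes (device_depth(d, t))
    and the simulated virtual contact depths t. Ternary search assumes
    a unimodal cost; everything here is a toy stand-in for the paper's
    device-configuration solver."""
    def cost(d):
        return sum((device_depth(d, t) - t) ** 2 for t in contact_depths)
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if cost(m1) < cost(m2):
            hi = m2
        else:
            lo = m1
    return (lo + hi) / 2.0

# A rigid device imposing one depth everywhere: the best fit is the mean depth.
d_best = optimize_device([1.0, 2.0, 3.0], lambda d, t: d)
```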

  17. Marine Debris and Plastic Source Reduction Toolkit

    Science.gov (United States)

    Many plastic food service ware items originate on college and university campuses—in cafeterias, snack rooms, cafés, and eateries with take-out dining options. This Campus Toolkit is a detailed “how to” guide for reducing plastic waste on college campuses.

  18. A Toolkit for the Effective Teaching Assistant

    Science.gov (United States)

    Tyrer, Richard; Gunn, Stuart; Lee, Chris; Parker, Maureen; Pittman, Mary; Townsend, Mark

    2004-01-01

    This book offers the notion of a "toolkit" to allow Teaching Assistants (TAs) and colleagues to review and revise their thinking and practice about real issues and challenges in managing individuals, groups, colleagues and themselves in school. In a rapidly changing educational environment the book focuses on combining the underpinning knowledge…

  19. A Toolkit for Stimulating Productive Thinking

    Science.gov (United States)

    Janssen, Fred; de Hullu, Els

    2008-01-01

    Students need tools, thinking skills, to help them think actively and in depth about biological phenomena. They need to know what kind of questions to ask and how to find answers to those questions. In this article we present a toolkit with 12 "thinking tools" for asking and answering questions about biological phenomena from different…

  20. Ready, Set, Respect! GLSEN's Elementary School Toolkit

    Science.gov (United States)

    Gay, Lesbian and Straight Education Network (GLSEN), 2012

    2012-01-01

    "Ready, Set, Respect!" provides a set of tools to help elementary school educators ensure that all students feel safe and respected and develop respectful attitudes and behaviors. It is not a program to be followed but instead is designed to help educators prepare themselves for teaching about and modeling respect. The toolkit responds to…

  1. Plus 50: Business Community Outreach Toolkit

    Science.gov (United States)

    American Association of Community Colleges (NJ1), 2009

    2009-01-01

    This toolkit is designed to support you in building partnerships with the business community. It includes a series of fact sheets you can distribute to employers that discuss the value in hiring plus 50 workers. Individual sections contain footnotes. (Contains 5 web resources.)

  2. Toolkit of Available EPA Green Infrastructure Modeling ...

    Science.gov (United States)

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementation decisions. It can also be used for low impact development design competitions. Models and tools included: Green Infrastructure Wizard (GIWiz), Watershed Management Optimization Support Tool (WMOST), Visualizing Ecosystem Land Management Assessments (VELMA) Model, Storm Water Management Model (SWMM), and the National Stormwater Calculator (SWC).

  3. GEANT4, the physicists simulation toolkit

    CERN Multimedia

    Perricone, Mike

    2005-01-01

    For one of the biggest projects ever mounted in science, the development of the proposed International Linear Collider, GEANT4 serves as a combination instruction manual, toolkit, and support network: a freely available software package that simulates the passage of particles through scientific instruments (2 pages)

  4. 3D Rendering - Techniques and Challenges

    Directory of Open Access Journals (Sweden)

    Ekta Walia

    2010-04-01

    Computer generated images and animations are getting more and more common. They are used in many different contexts such as movies, mobiles, medical visualization, architectural visualization and CAD. Advanced ways of describing surface and light source properties are important to ensure that artists are able to create realistic and stylish looking images. Even when using advanced rendering algorithms such as ray tracing, the time required for shading may contribute a large part of the image creation time. Therefore both performance and flexibility are important in a rendering system. This paper gives a comparative study of various 3D rendering techniques and their challenges in a complete and systematic manner.
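As a concrete example of the surface and light-source descriptions such rendering systems evaluate at every shading sample, here is a minimal Phong shading function; the material coefficients are illustrative defaults, not values from the survey.

```python
def phong_shade(normal, light_dir, view_dir, kd=0.7, ks=0.3, shininess=32):
    """Classic Phong shading for a single white light: a diffuse term
    from the surface normal and light direction plus a specular lobe
    around the mirror direction. Vectors are assumed unit-length
    3-tuples; kd/ks/shininess are illustrative material parameters."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    n_dot_l = max(0.0, dot(normal, light_dir))
    # Mirror the light direction about the normal: r = 2(n.l)n - l.
    r = tuple(2.0 * n_dot_l * n - l for n, l in zip(normal, light_dir))
    spec = max(0.0, dot(r, view_dir)) ** shininess if n_dot_l > 0.0 else 0.0
    return kd * n_dot_l + ks * spec
```

Evaluating a function like this per pixel, per light is why shading cost dominates even in ray tracers, and why both fast and flexible shading systems matter.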

  5. The doctor-patient relationship as a toolkit for uncertain clinical decisions.

    Science.gov (United States)

    Diamond-Brown, Lauren

    2016-06-01

    Medical uncertainty is a well-recognized problem in healthcare, yet how doctors make decisions in the face of uncertainty remains to be understood. This article draws on interdisciplinary literature on uncertainty and physician decision-making to examine a specific physician response to uncertainty: using the doctor-patient relationship as a toolkit. Additionally, I ask what happens to this process when the doctor-patient relationship becomes fragmented. I answer these questions by examining obstetrician-gynecologists' narratives regarding how they make decisions when faced with uncertainty in childbirth. Between 2013 and 2014, I performed 21 semi-structured interviews with obstetricians in the United States. Obstetricians were selected to maximize variation in relevant physician, hospital, and practice characteristics. I began with grounded theory and moved to analytical coding of themes in relation to relevant literature. My analysis renders it evident that some physicians use the doctor-patient relationship as a toolkit for dealing with uncertainty. I analyze how this process varies for physicians in different models of care by comparing doctors' experiences in models with continuous versus fragmented doctor-patient relationships. My key findings are that obstetricians in both models appealed to the ideal of patient-centered decision-making to cope with uncertain decisions, but in practice physicians in fragmented care faced a number of challenges to using the doctor-patient relationship as a toolkit for decision-making. These challenges led to additional uncertainties and in some cases to poor outcomes for doctors and/or patients; they also raised concerns about the reproduction of inequality. Thus the organization of care delivery mitigates the efficacy of doctors' use of the doctor-patient relationship toolkit for uncertain decisions. These findings have implications for theorizing about decision-making under conditions of medical uncertainty, for understanding

  6. The PROMIS smoking assessment toolkit--background and introduction to supplement.

    Science.gov (United States)

    Edelen, Maria Orlando

    2014-09-01

    The PROMIS Smoking Initiative has developed an assessment toolkit for measuring 6 domains of interest to cigarette smoking research: nicotine dependence, coping expectancies, emotional and sensory expectancies, health expectancies, psychosocial expectancies, and social motivations for smoking. The papers in this supplement describe the methods used to develop these item banks, their psychometric properties, and the preliminary evidence for their validity. This commentary is meant to provide background information for the material in this supplement. After discussing the use of item response theory in behavioral measurement, I will briefly review the initial developmental steps for the smoking assessment toolkit. Finally, I will describe the contents of this supplement and provide some closing remarks. Psychometric evidence strongly supports the utility of the toolkit of item banks, short forms (SFs), and computer adaptive tests (CATs). The item banks for daily smokers produce scores with reliability estimates above 0.90 for a wide range of each cigarette smoking domain continuum, and SF and CAT administrations also achieve high reliability (generally greater than 0.85) using very few items (4-7 items for most banks). Performance of the banks for nondaily smokers is similar. Preliminary evidence supports the concurrent and the discriminant validity of the bank domains. The new smoking assessment toolkit has attractive measurement features that are likely to benefit smoking research as researchers begin to utilize this resource. Information about the toolkit and access to the assessments is available at the project Web site (http://www.rand.org/health/projects/promis-smoking-initiative.html) and can also be accessed via the PROMIS Assessment Center (www.assessmentcenter.net). © The Author 2014. Published by Oxford University Press on behalf of the Society for Research on Nicotine and Tobacco. All rights reserved. 

  7. FAST CROWD RENDERING IN COMPUTER GAMES

    Directory of Open Access Journals (Sweden)

    Kaya OĞUZ

    2010-06-01

    Full Text Available Computer games, with the speed advancements of graphical processors, are coming closer to the quality of the cinema industry. Unlike the offline rendering of scenes in a motion picture, computer games must render at 30 frames per second, so various techniques are used to conserve CPU and memory. This paper is about using the instancing feature of contemporary graphics processors along with level-of-detail (LOD) techniques, which have been in use for a very long time. Using instancing, 15,000 instances were successfully rendered at 30 frames per second with a very low CPU usage of 10%. The application can render 40,000 instances at 13 frames per second.
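
    The batching described above can be sketched in a few lines. The thresholds and function names below are hypothetical illustrations, not taken from the paper:

    ```python
    # Hypothetical sketch of distance-based LOD selection with per-LOD batching,
    # as commonly paired with GPU instancing (one instanced draw call per bucket).

    def select_lod(distance, thresholds=(10.0, 30.0, 80.0)):
        """Return an LOD index: 0 is the finest mesh, len(thresholds) the coarsest."""
        for lod, limit in enumerate(thresholds):
            if distance < limit:
                return lod
        return len(thresholds)

    def batch_by_lod(instance_distances, thresholds=(10.0, 30.0, 80.0)):
        """Group instance indices into buckets; each bucket would be submitted
        as a single instanced draw call using the matching LOD mesh."""
        batches = {}
        for idx, d in enumerate(instance_distances):
            batches.setdefault(select_lod(d, thresholds), []).append(idx)
        return batches
    ```

    Grouping instances this way keeps the draw-call count equal to the number of LOD levels rather than the number of instances, which is what makes tens of thousands of instances feasible at interactive rates.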

  8. Visibility-Aware Direct Volume Rendering

    Institute of Scientific and Technical Information of China (English)

    Wai-Ho Mak; Yingcai Wu; Ming-Yuen Chan; Huamin Qu

    2011-01-01

    Direct volume rendering (DVR) is a powerful visualization technique which allows users to effectively explore and study volumetric datasets. Different transparency settings can be flexibly assigned to different structures such that some valuable information can be revealed in direct volume rendered images (DVRIs). However, end-users often feel that some risks are always associated with DVR because they do not know whether any important information is missing from the transparent regions of DVRIs. In this paper, we investigate how to semi-automatically generate a set of DVRIs and also an animation which can reveal information missed in the original DVRIs and meanwhile satisfy some image quality criteria such as coherence. A complete framework is developed to tackle various problems related to the generation and quality evaluation of visibility-aware DVRIs and animations. Our technique can reduce the risk of using direct volume rendering and thus boost the confidence of users in volume rendering systems.
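
    The per-sample visibility the authors reason about can be made concrete with standard front-to-back compositing along a ray; this is an illustrative sketch, not the paper's algorithm:

    ```python
    # Illustrative sketch: front-to-back compositing along one ray.  A sample's
    # "visibility" is the transmittance accumulated in front of it times its own
    # opacity -- when this is near zero, the sample's information is effectively
    # hidden in the final image, which is the risk the paper addresses.

    def composite_front_to_back(samples):
        """samples: (scalar_color, alpha) pairs ordered near to far.
        Returns the composited color and each sample's visibility."""
        color = 0.0
        transmittance = 1.0  # fraction of light not yet absorbed
        visibilities = []
        for c, a in samples:
            v = transmittance * a
            visibilities.append(v)
            color += v * c
            transmittance *= (1.0 - a)
        return color, visibilities
    ```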

  9. ARC Code TI: SLAB Spatial Audio Renderer

    Data.gov (United States)

    National Aeronautics and Space Administration — SLAB is a software-based, real-time virtual acoustic environment rendering system being developed as a tool for the study of spatial hearing. SLAB is designed to...

  10. Layered Textures for Image-Based Rendering

    Institute of Scientific and Technical Information of China (English)

    en-Cheng Wang; ui-Yu Li; in Zheng; n-Hua Wu

    2004-01-01

    An extension to texture mapping is given in this paper for improving the efficiency of image-based rendering. A depth image with an orthogonal displacement at each pixel is decomposed by displacement into a series of layered textures (LTs), each having the same displacement for all its texels. Meanwhile, some texels of the layered textures are interpolated to obtain a continuous 3D approximation of the model represented in the depth image. Thus, plane-to-plane texture mapping can be used to map these layered textures to produce novel views, with the following advantages: accelerating the rendering speed, supporting 3D surface details and view motion parallax, and avoiding the expensive task of hole-filling in the rendering stage. Experimental results show the new method can produce high-quality images and run faster than many well-known image-based rendering techniques.
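
    The decomposition step can be illustrated with a tiny sketch (the function name and data layout are assumptions for illustration):

    ```python
    # Illustrative sketch: decompose a depth image into layered textures by
    # grouping texels that share the same orthogonal displacement.  Each layer
    # is planar, so it can be drawn with plain plane-to-plane texture mapping.

    def decompose_into_layers(depth_image):
        """depth_image: rows of integer displacements, one per pixel.
        Returns {displacement: [(x, y), ...]} -- one layer per distinct depth."""
        layers = {}
        for y, row in enumerate(depth_image):
            for x, d in enumerate(row):
                layers.setdefault(d, []).append((x, y))
        return layers
    ```

    In the paper's scheme each such layer becomes an ordinary texture, so the per-pixel displacement cost is paid once at decomposition time rather than per rendered frame.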

  11. Composed Scattering Model for Direct Volume Rendering

    Institute of Scientific and Technical Information of China (English)

    蔡文立; 石教英

    1996-01-01

    Based on the equation of transfer in the transport theory of optical physics, a new volume rendering model, called the composed scattering model (CSM), is presented. In calculating the scattering term of the equation, it is decomposed into volume scattering intensity and surface scattering intensity, which are composed with the boundary detection operator as the weight function. The proposed model differs from most current volume rendering models in that CSM treats segmentation and illumination intensity calculation as two coherent parts, whereas existing models regard them as separate. The model has been applied to the direct volume rendering of 3D data sets obtained by CT and MRI. The resultant images show not only rich details but also clear boundary surfaces. CSM is demonstrated to be an accurate volume rendering model suitable for CT and MRI data sets.

  12. Influence of rendering methods on yield and quality of chicken fat recovered from broiler skin

    Directory of Open Access Journals (Sweden)

    Liang-Kun Lin

    2017-06-01

    Full Text Available Objective: In order to utilize fat from broiler byproducts efficiently, it is necessary to develop an appropriate rendering procedure and establish quality information for the rendered fat. A study was therefore undertaken to evaluate the influence of rendering methods on the amounts and general properties of the fat recovered from broiler skin. Methods: The yield and quality of broiler skin fat were compared across high-power and low-power microwave rendering (3.6 W/g for 10 min (HPMR) and 2.4 W/g for 10 min (LPMR), respectively), oven baking (OB, at 180°C for 40 min), and water cooking (WC, boiling for 40 min). Results: Microwave-rendered skin exhibited the highest yields and fat recovery rates, followed by OB and WC fats (p<0.05). HPMR fat had the highest L*, a*, and b* values, whereas WC fat had the highest moisture content, acid values, and thiobarbituric acid (TBA) values (p<0.05). There was no significant difference in the acid, peroxide, and TBA values between HPMR and LPMR fats. Conclusion: Microwave rendering at a power level of 3.6 W/g for 10 min is suggested based on the yield and quality of the chicken fat.

  13. Testing Video and Social Media for Engaging Users of the U.S. Climate Resilience Toolkit

    Science.gov (United States)

    Green, C. J.; Gardiner, N.; Niepold, F., III; Esposito, C.

    2015-12-01

    We developed a custom video production style and a method for analyzing social media behavior so that we may deliberately build and track audience growth for decision-support tools and case studies within the U.S. Climate Resilience Toolkit. The new style of video focuses quickly on decision processes; its 30-second format is well suited for deployment through social media. We measured both traffic and engagement with video using Google Analytics. Each video included an embedded tag, allowing us to measure viewers' behavior: whether or not they entered the toolkit website, the duration of their session on the website, and the number of pages they visited in that session. Results showed that video promotion was more effective on Facebook than on Twitter. Facebook links generated twice the number of visits to the toolkit. Videos also increased Facebook interaction overall. Because most Facebook users are return visitors, this campaign did not substantially draw new site visitors. We continue to research and apply these methods in a targeted engagement and outreach campaign that utilizes the theory of social diffusion and social influence strategies to grow our audience of "influential" decision-makers and people within their social networks. Our goal is to increase access to and use of the U.S. Climate Resilience Toolkit.

  14. The Liquid Argon Software Toolkit (LArSoft): Goals, Status and Plan

    Energy Technology Data Exchange (ETDEWEB)

    Pordes, Rush [Fermilab; Snider, Erica [Fermilab

    2016-08-17

    LArSoft is a toolkit that provides a software infrastructure and algorithms for the simulation, reconstruction and analysis of events in Liquid Argon Time Projection Chambers (LArTPCs). It is used by the ArgoNeuT, LArIAT, MicroBooNE, DUNE (including the 35ton prototype and ProtoDUNE) and SBND experiments. The LArSoft collaboration provides an environment for the development, use, and sharing of code across experiments. The ultimate goal is to develop fully automatic processes for reconstruction and analysis of LArTPC events. The toolkit is based on the art framework and has a well-defined architecture for interfacing to other packages, including the GEANT4 and GENIE simulation software and the Pandora software development kit for pattern recognition. It is designed to facilitate and support the evolution of algorithms, including their transition to new computing platforms. The development of the toolkit is driven by the scientific stakeholders involved. The core infrastructure includes standard definitions of types and constants, means to input experiment geometries as well as metadata and event data in several formats, and relevant general utilities. Examples of algorithms experiments have contributed to date are: photon propagation; particle identification; hit finding; track finding and fitting; and electromagnetic shower identification and reconstruction. We report on the status of the toolkit and plans for future work.

  15. The Identification of Potential Resilient Estuary-based Enterprises to Encourage Economic Empowerment in South Africa: a Toolkit Approach

    Directory of Open Access Journals (Sweden)

    Myles Mander

    2012-09-01

    Full Text Available It has been argued that ecosystem services can be used as the foundation to provide economic opportunities to empower the disadvantaged. The Ecosystem Services Framework (ESF) approach for poverty alleviation, which balances resource conservation and human resource use, has received much attention in the literature. However, few projects have successfully achieved both conservation and economic objectives. This is partly due to a hiatus between theory and practice, owing to the absence of tools that help make the transition from conceptual frameworks and theory to the practical integration of ecosystem services into decision making. To address this hiatus, an existing conceptual framework for analyzing the robustness of social-ecological systems was translated into a practical toolkit to help understand the complexity of social-ecological systems (SES). The toolkit can be used by a diversity of stakeholders as a decision-making aid for assessing ecosystem services supply and demand and associated enterprise opportunities. The toolkit is participatory and combines a generic "top-down" scientific approach with a case-specific "bottom-up" approach. It promotes a shared understanding of the utilization of ecosystem services, which is the foundation of identifying resilient enterprises. The toolkit comprises four steps: (i) ecosystem services supply and demand assessment; (ii) roles identification; (iii) enterprise opportunity identification; and (iv) enterprise risk assessment. It was tested at two estuary study sites. Implementation of the toolkit requires the populating of preprogrammed Excel worksheets through workshops attended by stakeholders associated with the ecosystems. It was concluded that for an enterprise to be resilient, it must be resilient at an external SES level, which the toolkit addresses, and at an internal business functioning level, e.g., social dynamics among personnel, skills, and literacy.

  16. A Toolkit for Scalable Spreadsheet Visualization

    CERN Document Server

    Clermont, Markus

    2008-01-01

    This paper presents a toolkit for spreadsheet visualization based on logical areas, semantic classes and data modules. Logical areas, semantic classes and data modules are abstract representations of spreadsheet programs that are meant to reduce the auditing and comprehension effort, especially for large and regular spreadsheets. The toolkit is integrated as a plug-in in the Gnumeric spreadsheet system for Linux. It can process large, industry scale spreadsheet programs in reasonable time and is tightly integrated with its host spreadsheet system. Users can generate hierarchical and graph-based representations of their spreadsheets. This allows them to spot conceptual similarities in different regions of the spreadsheet, that would otherwise not fit on a screen. As it is assumed that the learning effort for effective use of such a tool should be kept low, we aim for intuitive handling of most of the tool's functions.

  17. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Science.gov (United States)

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-01-01

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit. PMID:28241502

  18. INTELMOD - An Intelligent Satellite Modelling Toolkit

    Science.gov (United States)

    Aynsley, M.; Hiden, H.

    This paper describes the development of an intelligent, generic spacecraft modelling toolkit, INTELMOD (INTELligent MODeller). The system has been designed to provide an environment which can efficiently capture spacecraft engineering and operational expertise, coupled with mission or phase-related knowledge. This knowledge can then be applied to support human flight controllers at ESA (European Space Agency) in performing a number of generic monitoring, analytical and diagnostic tasks. INTELMOD has been developed using a RAD (Rapid Application Development) approach, based on the Dynamic Systems Development Methodology (DSDM) and has made extensive use of Commercial Off-The-Shelf (COTS) software products. INTELMOD also incorporates UNiT (Universal Intelligent Toolkit), to provide automatic execution of recovery procedures following fault detection and isolation. Users of INTELMOD require no formal programming experience, as models can be constructed with user-friendly editors that employ a “drag and drop” approach using pre-defined palettes of key components.

  19. The Interactive Learning Toolkit: supporting interactive classrooms

    Science.gov (United States)

    Dutta, S.; McCauley, V.; Mazur, E.

    2004-05-01

    Research-based interactive learning techniques have dramatically improved student understanding. We have created the 'Interactive Learning Toolkit' (ILT), a web-based learning management system, to help implement two such pedagogies: Just in Time Teaching and Peer Instruction. Our main goal in developing this toolkit is to save the instructor time and effort and to use technology to facilitate the interaction between the students and the instructor (and between students themselves). After a brief review of both pedagogies, we will demonstrate the many exciting new features of the ILT. We will show how technology can not only implement, but also supplement and improve these pedagogies. We would like to acknowledge grants from the NSF and DEAS, Harvard University.

  20. Observation option toolkit for acute otitis media.

    Science.gov (United States)

    Rosenfeld, R M

    2001-04-01

    The observation option for acute otitis media (AOM) refers to deferring antibiotic treatment of selected children for up to 3 days, during which time management is limited to analgesics and symptomatic relief. With appropriate follow-up, complications are not increased, and clinical outcomes compare favorably with routine initial antibiotic therapy. Although used commonly in the Netherlands and certain Scandinavian countries, this approach has not gained wide acceptance in Europe and the United States. This article describes an evidence-based toolkit developed by the New York Region Otitis Project for judicious use of the observation option. The toolkit is not intended to endorse the observation option as a preferred method of management, nor is it intended as a rigid practice guideline to supplant clinician judgement. Rather, it presents busy clinicians with the tools needed to implement the observation option in everyday patient care should they so desire.

  1. ECCE Toolkit: Prototyping Sensor-Based Interaction.

    Science.gov (United States)

    Bellucci, Andrea; Aedo, Ignacio; Díaz, Paloma

    2017-02-23

    Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators). Prototyping physical interaction is hindered by the challenges of: (1) programming interactions among physical sensors/actuators and digital interfaces; (2) implementing functionality for different platforms in different programming languages; and (3) building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems), a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  2. ECCE Toolkit: Prototyping Sensor-Based Interaction

    Directory of Open Access Journals (Sweden)

    Andrea Bellucci

    2017-02-01

    Full Text Available Building and exploring physical user interfaces requires high technical skills and hours of specialized work. The behavior of multiple devices with heterogeneous input/output channels and connectivity has to be programmed in a context where not only the software interface matters, but also the hardware components are critical (e.g., sensors and actuators. Prototyping physical interaction is hindered by the challenges of: (1 programming interactions among physical sensors/actuators and digital interfaces; (2 implementing functionality for different platforms in different programming languages; and (3 building custom electronic-incorporated objects. We present ECCE (Entities, Components, Couplings and Ecosystems, a toolkit for non-programmers that copes with these issues by abstracting from low-level implementations, thus lowering the complexity of prototyping small-scale, sensor-based physical interfaces to support the design process. A user evaluation provides insights and use cases of the kind of applications that can be developed with the toolkit.

  3. Rendering and Compositing Infrastructure Improvements to VisIt for Insitu Rendering

    Energy Technology Data Exchange (ETDEWEB)

    Loring, Burlen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ruebel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2016-01-28

    Compared to posthoc rendering, insitu rendering often generates larger numbers of images; as a result, rendering performance and scalability are critical in the insitu setting. In this work we present improvements to VisIt's rendering and compositing infrastructure that deliver increased performance and scalability in both posthoc and insitu settings. We added the capability for alpha blend compositing and use it with ordered compositing when datasets have a disjoint block domain decomposition to optimize the rendering of transparent geometry. We also made improvements that increase overall efficiency by reducing communication and data movement, and have addressed a number of performance issues. We structured our code to take advantage of SIMD parallelization and use threads to overlap communication and compositing. We tested our improvements on a 20-core workstation using 8 cores to render geometry generated from a $256^3$ cosmology dataset, and on a Cray XC31 using 512 cores to render geometry generated from a $2000^2 \times 800$ plasma dataset. Our results show that ordered compositing provides a speedup of up to $4\times$ over the current sort-first strategy. The other improvements resulted in modest speedups, with one notable exception where we achieve up to a $40\times$ speedup of rendering and compositing of opaque geometry when both opaque and transparent geometry are rendered together. We also investigated the use of depth peeling, but found that the implementation provided by VTK is substantially slower, both with and without GPU acceleration, than a local camera-order sort.
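
    Alpha blend compositing of the kind described here reduces to the Porter-Duff "over" operator applied in depth order. A minimal sketch with premultiplied colors (illustrative only, not VisIt's actual implementation):

    ```python
    # Minimal sketch of ordered alpha compositing with premultiplied RGBA.
    # With a disjoint block domain decomposition, per-block images can be
    # blended in a fixed depth order using 'over'.

    def over(front, back):
        """Porter-Duff 'over' for premultiplied (r, g, b, a) tuples."""
        return tuple(f + (1.0 - front[3]) * b for f, b in zip(front, back))

    def composite_back_to_front(fragments):
        """fragments ordered far to near; each nearer fragment goes 'over'
        the running result."""
        result = (0.0, 0.0, 0.0, 0.0)
        for frag in fragments:
            result = over(frag, result)
        return result
    ```

    Because "over" is associative but not commutative, the blend order must match the depth order, which is why a visibility ordering of the blocks is a precondition for this strategy.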

  4. Brain Image Representation and Rendering: A Survey

    Directory of Open Access Journals (Sweden)

    Mudassar Raza

    2012-09-01

    Full Text Available Brain image representation and rendering processes are used for the evaluation, development, investigation, and experimental examination of brain images from a variety of modalities, including the major types such as MEG, EEG, PET, MRI, CT, and microscopy. There is therefore a need for a study reviewing the existing work in this area. This paper provides a review of different existing techniques and methods for brain image representation and rendering. Image rendering is the method of generating an image from a model by means of computer programs. The basic purpose of brain image representation and rendering processes is to analyze brain images precisely, in order to effectively diagnose and examine diseases and problems. The basic objective of this study is to evaluate and discuss the different techniques and approaches proposed for handling different brain imaging types. The paper provides a short overview of the different methods, in the form of advantages and limitations, presented in the prospect of brain image representation and rendering, along with their subcategories as proposed by different authors.

  5. Standardized rendering from IR surveillance motion imagery

    Science.gov (United States)

    Prokoski, F. J.

    2014-06-01

    Government agencies, including defense and law enforcement, increasingly make use of video from surveillance systems and camera phones owned by non-government entities. Making advanced and standardized motion imaging technology available to private and commercial users at cost-effective prices would benefit all parties. In particular, incorporating thermal infrared into commercial surveillance systems offers substantial benefits beyond night vision capability. Face rendering is a process to facilitate exploitation of thermal infrared surveillance imagery from the general area of a crime scene, to assist investigations with and without cooperating eyewitnesses. Face rendering automatically generates greyscale representations similar to police artist sketches for faces in surveillance imagery collected from locations and times proximate to a crime under investigation. Near-realtime generation of face renderings can provide law enforcement with an investigation tool to assess witness memory and credibility, and to integrate reports from multiple eyewitnesses. Renderings can be quickly disseminated through social media to warn of a person who may pose an immediate threat, and to solicit the public's help in identifying possible suspects and witnesses. Renderings are pose-standardized so as to not divulge the presence and location of eyewitnesses and surveillance cameras. Incorporation of thermal infrared imaging into commercial surveillance systems will significantly improve system performance, and reduce manual review times, at an incremental cost that will continue to decrease. Benefits to criminal justice would include improved reliability of eyewitness testimony and improved accuracy of distinguishing among minority groups in eyewitness and surveillance identifications.

  6. Application experiences with the Globus toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    Brunett, S.

    1998-06-09

    The Globus grid toolkit is a collection of software components designed to support the development of applications for high-performance distributed computing environments, or "computational grids" [14]. The Globus toolkit is an implementation of a "bag of services" architecture, which provides application and tool developers not with a monolithic system but rather with a set of stand-alone services. Each Globus component provides a basic service, such as authentication, resource allocation, information, communication, fault detection, or remote data access. Different applications and tools can combine these services in different ways to construct "grid-enabled" systems. The Globus toolkit has been used to construct the Globus Ubiquitous Supercomputing Testbed, or GUSTO: a large-scale testbed spanning 20 sites and including over 4000 compute nodes for a total compute power of over 2 TFLOPS. Over the past six months, we and others have used this testbed to conduct a variety of application experiments, including multi-user collaborative environments (tele-immersion), computational steering, distributed supercomputing, and high-throughput computing. The goal of this paper is to review what has been learned from these experiments regarding the effectiveness of the toolkit approach. To this end, we describe two of the application experiments in detail, noting what worked well and what worked less well. The two applications are a distributed supercomputing application, SF-Express, in which multiple supercomputers are harnessed to perform large distributed interactive simulations; and a tele-immersion application, CAVERNsoft, in which the focus is on connecting multiple people to a distributed simulated world.

  7. An Overview of the Geant4 Toolkit

    Science.gov (United States)

    Apostolakis, John; Wright, Dennis H.

    2007-03-01

    Geant4 is a toolkit for the simulation of the transport of radiation through matter. With a flexible kernel and a choice among different physics modeling options, it has been tailored to the requirements of a wide range of applications. With the toolkit a user can describe a setup's or detector's geometry and materials, navigate inside it, simulate the physical interactions using a choice of physics engines, underlying physics cross-sections and models, and visualise and store results. Physics models describing electromagnetic and hadronic interactions are provided, as are decays and processes for optical photons. Several models, with different precision and performance, are available for many processes. The toolkit includes coherent physics model configurations, which are called physics lists. Users can choose an existing physics list or create their own, depending on their requirements and the application area. A clear structure and readable code enable the user to investigate the origin of physics results. Application areas include detector simulation and background simulation in high energy physics experiments, simulation of accelerator setups, studies in medical imaging and treatment, and the study of the effects of solar radiation on spacecraft instruments.

  8. A Racial Equity Toolkit for Midwifery Organizations.

    Science.gov (United States)

    Gordon, Wendy M

    2016-11-01

    Midwifery associations are increasing awareness and commitment to racial equity in the profession and in the communities we serve. Moving these commitments from words into action may be facilitated by a racial equity toolkit to help guide midwifery organizations to consider all policies, initiatives, and actions with a racial equity lens. Racial equity impact analyses have been used in recent years by various governmental agencies in the United States and abroad with positive results, and emerging literature indicates that nonprofit organizations are having similarly positive results. This article proposes a framework for midwifery organizations to incorporate a racial equity toolkit, starting with explicit intentions of the organization with regard to racial equity in the profession. Indicators of success are elucidated as the next step, followed by the use of a racial equity impact analysis worksheet. This worksheet is applied by teams or committees when considering new policies or initiatives to examine those actions through a racial equity lens. An organizational change team and equity advisory groups are essential in assisting organizational leadership to forecast potential negative and positive impacts. Examples of the components of a midwifery-specific racial equity toolkit are included. © 2016 by the American College of Nurse-Midwives.

  9. Adaptive Rendering Based on Visual Acuity Equations

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A new method of adaptive rendering for interaction in a Virtual Environment (VE) through different visual acuity equations is proposed. An acuity factor equation for luminance vision is first given. Secondly, five equations that calculate visual acuity from acuity factors are presented, and an adaptive rendering strategy based on the different visual acuity equations is given. The VE system may select one of them on the basis of the host's load, and thereby select an LOD for each model to be rendered. A coarser LOD is selected where the visual acuity is lower, and a finer LOD is used where it is higher. This method is tested through experiments, and the experimental results show that it is effective.
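
    As a toy illustration of the idea of mapping acuity to LOD (the paper's actual acuity equations are not reproduced here; the logarithmic luminance model and all constants below are assumptions):

    ```python
    import math

    # Toy sketch: derive a luminance-based acuity factor, then pick an LOD.
    # The log-luminance model and the parameter values are assumptions for
    # illustration, not the equations from the paper.

    def luminance_acuity_factor(luminance, l_min=0.01, l_max=100.0):
        """Acuity factor in [0, 1], rising with log-luminance (assumed model)."""
        t = (math.log10(max(luminance, l_min)) - math.log10(l_min)) / (
            math.log10(l_max) - math.log10(l_min))
        return min(max(t, 0.0), 1.0)

    def lod_from_acuity(acuity, num_lods=4):
        """Higher acuity -> finer LOD (index 0 is the finest)."""
        return min(int((1.0 - acuity) * num_lods), num_lods - 1)
    ```

    The point of the mapping is the one stated in the abstract: where the computed acuity is low, a coarser LOD suffices, and the host can trade detail for load.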

  10. Rendering Falling Leaves on Graphics Hardware

    Directory of Open Access Journals (Sweden)

    Marcos Balsa

    2008-04-01

    Full Text Available There is a growing interest in simulating natural phenomena in computer graphics applications. Animating natural scenes in real time is one of the most challenging problems due to the inherent complexity of their structure, formed by millions of geometric entities, and the interactions that happen within. An example of a natural scenario needed for games or simulation programs is a forest. Forests are difficult to render because of the huge number of geometric entities and the large amount of detail to be represented. Moreover, the interactions between the objects (grass, leaves) and external forces such as wind are complex to model. In this paper we concentrate on the rendering of falling leaves at low cost. We present a technique that exploits graphics hardware in order to render thousands of leaves with different falling paths in real time and with low memory requirements.

  11. Blender cycles lighting and rendering cookbook

    CERN Document Server

    Iraci, Bernardo

    2013-01-01

    An in-depth guide full of step-by-step recipes to explore the concepts behind the usage of Cycles. Packed with illustrations and lots of tips and tricks, the easy-to-understand nature of the book will help the reader understand even the most complex concepts with ease. If you are a digital artist who already knows your way around Blender, and you want to learn about the new Cycles rendering engine, this is the book for you. Even experts will be able to pick up new tips and tricks to make the most of the rendering capabilities of Cycles.

  12. Volume Rendering for Curvilinear and Unstructured Grids

    Energy Technology Data Exchange (ETDEWEB)

    Max, N; Williams, P; Silva, C; Cook, R

    2003-03-05

    We discuss two volume rendering methods developed at Lawrence Livermore National Laboratory. The first, cell projection, renders the polygons in the projection of each cell. It requires a global visibility sort in order to composite the cells in back to front order, and we discuss several different algorithms for this sort. The second method uses regularly spaced slice planes perpendicular to the X, Y, or Z axes, which slice the cells into polygons. Both methods are supplemented with anti-aliasing techniques to deal with small cells that might fall between pixel samples or slice planes, and both have been parallelized.
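
Both methods end in the same compositing step: once cells (or slices) are ordered back to front, their colors are blended with the standard "over" operator. A minimal sketch, with made-up fragment values:

```python
# Minimal sketch of back-to-front compositing, the step cell projection needs
# after its global visibility sort. Fragment colors/opacities are illustrative.

def composite_back_to_front(fragments):
    """fragments: list of (color, alpha) pairs sorted back (first) to front (last).
    Returns the composited scalar color using the 'over' operator."""
    result = 0.0
    for color, alpha in fragments:
        result = alpha * color + (1.0 - alpha) * result
    return result
```

Because "over" is order-dependent, an incorrect visibility sort changes the result, which is why the abstract devotes several algorithms to that sort.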

  13. GPU Pro 5 advanced rendering techniques

    CERN Document Server

    Engel, Wolfgang

    2014-01-01

    In GPU Pro 5: Advanced Rendering Techniques, section editors Wolfgang Engel, Christopher Oat, Carsten Dachsbacher, Michal Valient, Wessam Bahnassi, and Marius Bjorge have once again assembled a high-quality collection of cutting-edge techniques for advanced graphics processing unit (GPU) programming. Divided into six sections, the book covers rendering, lighting, effects in image space, mobile devices, 3D engine design, and compute. It explores rasterization of liquids, ray tracing of art assets that would otherwise be used in a rasterized engine, physically based area lights, volumetric light

  14. Digital color acquisition, perception, coding and rendering

    CERN Document Server

    Fernandez-Maloigne, Christine; Macaire, Ludovic

    2013-01-01

    In this book the authors identify the basic concepts and recent advances in the acquisition, perception, coding and rendering of color. The fundamental aspects related to the science of colorimetry in relation to physiology (the human visual system) are addressed, as are constancy and color appearance. It also addresses the more technical aspects related to sensors and the color management screen. Particular attention is paid to the notion of color rendering in computer graphics. Beyond color, the authors also look at coding, compression, protection and quality of color images and videos.

  15. Haptic rendering for simulation of fine manipulation

    CERN Document Server

    Wang, Dangxiao; Zhang, Yuru

    2014-01-01

    This book introduces the latest progress in six degrees of freedom (6-DoF) haptic rendering with the focus on a new approach for simulating force/torque feedback in performing tasks that require dexterous manipulation skills. One of the major challenges in 6-DoF haptic rendering is to resolve the conflict between high speed and high fidelity requirements, especially in simulating a tool interacting with both rigid and deformable objects in a narrow space and with fine features. The book presents a configuration-based optimization approach to tackle this challenge. Addressing a key issue in man

  16. TractRender: a new generalized 3D medical image visualization and output platform

    Science.gov (United States)

    Hwang, Darryl H.; Tsao, Sinchai; Gajawelli, Niharika; Law, Meng; Lepore, Natasha

    2015-01-01

    Diffusion MRI provides not only voxel-wise diffusion characteristics but also the potential to delineate neuronal fiber paths through tractography. There is a dearth of flexible open-source tractography software for visualizing these complicated 3D structures. Moreover, rendering these structures with different shading, lighting, and representations produces vastly different graphical results. In addition, the ability to output these objects in various formats increases the utility of the platform. We have created TractRender, which leverages OpenGL features through MATLAB, allowing for maximum ease of use while still maintaining the flexibility of custom scene rendering.

  17. Integrating the visualization concept of the medical imaging interaction toolkit (MITK) into the XIP-Builder visual programming environment

    Science.gov (United States)

    Wolf, Ivo; Nolden, Marco; Schwarz, Tobias; Meinzer, Hans-Peter

    2010-02-01

    The Medical Imaging Interaction Toolkit (MITK) and the eXtensible Imaging Platform (XIP) both aim at facilitating the development of medical imaging applications, but provide support on different levels. MITK offers support from the toolkit level, whereas XIP comes with a visual programming environment. XIP is strongly based on Open Inventor. Open Inventor with its scene graph-based rendering paradigm was not specifically designed for medical imaging, but focuses on creating dedicated visualizations. MITK has a visualization concept with a model-view-controller like design that assists in implementing multiple, consistent views on the same data, which is typically required in medical imaging. In addition, MITK defines a unified means of describing position, orientation, bounds, and (if required) local deformation of data and views, supporting e.g. images acquired with gantry tilt and curved reformations. The actual rendering is largely delegated to the Visualization Toolkit (VTK). This paper presents an approach of how to integrate the visualization concept of MITK with XIP, especially into the XIP-Builder. This is a first step of combining the advantages of both platforms. It enables experimenting with algorithms in the XIP visual programming environment without requiring a detailed understanding of Open Inventor. Using MITK-based add-ons to XIP, any number of data objects (images, surfaces, etc.) produced by algorithms can simply be added to an MITK DataStorage object and rendered into any number of slice-based (2D) or 3D views. Both MITK and XIP are open-source C++ platforms. The extensions presented in this paper will be available from www.mitk.org.

  18. NASA Space Radiation Program Integrative Risk Model Toolkit

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Plante, Ianik; Ponomarev, Artem L.; Sandridge, Chris

    2015-01-01

    NASA Space Radiation Program Element scientists have been actively involved in the development of an integrative risk model toolkit that includes models for acute radiation risk and organ dose projection (ARRBOD), NASA space radiation cancer risk projection (NSCR), hemocyte dose estimation (HemoDose), the GCR event-based risk model code (GERMcode), relativistic ion tracks (RITRACKS), NASA radiation track image (NASARTI), and the On-Line Tool for the Assessment of Radiation in Space (OLTARIS). This session will introduce the components of the risk toolkit, with the opportunity for hands-on demonstrations. Brief descriptions of the tools are: ARRBOD for organ dose projection and acute radiation risk calculation from exposure to a solar particle event; NSCR for projection of cancer risk from exposure to space radiation; HemoDose for retrospective dose estimation using multi-type blood cell counts; GERMcode for basic physical and biophysical properties of an ion beam, and biophysical and radiobiological properties of beam transport to the target in the NASA Space Radiation Laboratory beam line; RITRACKS for simulation of heavy ion and delta-ray track structure, radiation chemistry, DNA structure, and DNA damage at the molecular scale; NASARTI for modeling the effects of space radiation on human cells and tissue by incorporating a physical model of tracks, cell nucleus, and DNA damage foci with image segmentation for automated counting; and OLTARIS, an integrated tool set utilizing HZETRN (High Charge and Energy Transport) intended to help scientists and engineers study the effects of space radiation on shielding materials, electronics, and biological systems.

  19. Rendering Visible: Painting and Sexuate Subjectivity

    Science.gov (United States)

    Daley, Linda

    2015-01-01

    In this essay, I examine Luce Irigaray's aesthetic of sexual difference, which she develops by extrapolating from Paul Klee's idea that the role of painting is to render the non-visible rather than represent the visible. This idea is the premise of her analyses of phenomenology and psychoanalysis and their respective contributions to understanding…

  20. ProteinShader: illustrative rendering of macromolecules

    Directory of Open Access Journals (Sweden)

    Weber Joseph R

    2009-03-01

    Full Text Available Abstract Background Cartoon-style illustrative renderings of proteins can help clarify structural features that are obscured by space filling or balls and sticks style models, and recent advances in programmable graphics cards offer many new opportunities for improving illustrative renderings. Results The ProteinShader program, a new tool for macromolecular visualization, uses information from Protein Data Bank files to produce illustrative renderings of proteins that approximate what an artist might create by hand using pen and ink. A combination of Hermite and spherical linear interpolation is used to draw smooth, gradually rotating three-dimensional tubes and ribbons with a repeating pattern of texture coordinates, which allows the application of texture mapping, real-time halftoning, and smooth edge lines. This free platform-independent open-source program is written primarily in Java, but also makes extensive use of the OpenGL Shading Language to modify the graphics pipeline. Conclusion By programming to the graphics processor unit, ProteinShader is able to produce high quality images and illustrative rendering effects in real-time. The main feature that distinguishes ProteinShader from other free molecular visualization tools is its use of texture mapping techniques that allow two-dimensional images to be mapped onto the curved three-dimensional surfaces of ribbons and tubes with minimum distortion of the images.

  2. Microsoft BizTalk ESB Toolkit 2.1

    CERN Document Server

    Benito, Andrés Del Río

    2013-01-01

    A practical guide into the architecture and features that make up the services and components of the ESB Toolkit.This book is for experienced BizTalk developers, administrators, and architects, as well as IT managers and BizTalk business analysts. Knowledge and experience with the Toolkit is not a requirement.

  3. Designing and Delivering Intensive Interventions: A Teacher's Toolkit

    Science.gov (United States)

    Murray, Christy S.; Coleman, Meghan A.; Vaughn, Sharon; Wanzek, Jeanne; Roberts, Greg

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  4. RAY TRACING RENDERING USING FRAGMENT ANTI-ALIASING

    Directory of Open Access Journals (Sweden)

    Febriliyan Samopa

    2008-07-01

    Full Text Available Rendering is generating surface and three-dimensional effects on an object displayed on a monitor screen. Ray tracing, a rendering method that traces a ray for each image pixel, has a drawback: aliasing (the jaggies effect). There are several methods for performing anti-aliasing. One of them is OGSS (Ordered Grid Super Sampling). OGSS performs anti-aliasing well; however, it requires more computation time, since sampling is increased for every pixel in the image. Fragment Anti-Aliasing (FAA) is a new alternative method that can cope with this drawback. FAA checks the image while rendering the scene. The jaggies effect appears only on curved and gradient objects, so only those parts of an object undergo sampling magnification. After the samples are magnified and the pixel values computed, downsampling is performed to retrieve the final pixel values. Experimental results show that the software implements ray tracing correctly to form images, and that it can apply both the FAA and OGSS techniques for anti-aliasing. In general, rendering using FAA is faster than using OGSS.
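
The core FAA idea, supersampling only the flagged pixels and then downsampling, can be sketched as follows. Here `shade` and `is_edge` are hypothetical stand-ins for the real renderer's per-sample shading and edge-fragment detection:

```python
# Illustrative sketch of fragment-selective anti-aliasing: supersample only
# pixels flagged as lying on curved/gradient silhouettes, then average back
# to one value per pixel. shade() and is_edge() are hypothetical callbacks.

def render_adaptive(width, height, shade, is_edge, factor=2):
    """shade(x, y) returns a scalar sample; is_edge(px, py) says whether the
    pixel needs supersampling. Returns a row-major list of pixel values."""
    image = []
    for py in range(height):
        for px in range(width):
            if is_edge(px, py):
                # Supersample on a factor x factor subgrid, then average
                total = 0.0
                for sy in range(factor):
                    for sx in range(factor):
                        total += shade(px + (sx + 0.5) / factor,
                                       py + (sy + 0.5) / factor)
                image.append(total / (factor * factor))
            else:
                # Non-edge pixels keep a single centered sample, as in FAA
                image.append(shade(px + 0.5, py + 0.5))
    return image
```

The speedup over OGSS comes from the `is_edge` test: most pixels take one sample instead of `factor * factor`.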

  5. Livermore Big Artificial Neural Network Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    2016-07-01

    LBANN is a toolkit designed to train artificial neural networks efficiently on high performance computing architectures. It is optimized to take advantage of key High Performance Computing features to accelerate neural network training, specifically low-latency, high-bandwidth interconnects, node-local NVRAM, node-local GPU accelerators, and high-bandwidth parallel file systems. It is built on top of the open source Elemental distributed-memory dense and sparse-direct linear algebra and optimization library, which is released under the BSD license. The algorithms contained within LBANN are drawn from the academic literature and implemented to work within a distributed-memory framework.

  6. GENFIT - a Generic Track-Fitting Toolkit

    CERN Document Server

    Rauch, Johannes

    2014-01-01

    GENFIT is an experiment-independent track-fitting toolkit that combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation, alignment, and storage.

  7. GENFIT - a generic track-fitting toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Rauch, Johannes [Technische Universitaet Muenchen (Germany); Schlueter, Tobias [Ludwig-Maximilians-Universitaet Muenchen (Germany)

    2014-07-01

    GENFIT is an experiment-independent track-fitting toolkit, which combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, PANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation and alignment.

  8. Design Optimization Toolkit: Users' Manual

    Energy Technology Data Exchange (ETDEWEB)

    Aguilo Valentin, Miguel Alejandro [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States). Computational Solid Mechanics and Structural Dynamics

    2014-07-01

    The Design Optimization Toolkit (DOTk) is a stand-alone C++ software package intended to solve complex design optimization problems. The DOTk software package provides a range of solution methods suited for gradient/nongradient-based optimization, large-scale constrained optimization, and topology optimization. DOTk was designed to have a flexible user interface allowing easy access to DOTk solution methods from external engineering software packages. This inherent flexibility makes DOTk minimally intrusive to other engineering software packages. As part of this flexibility, the DOTk software package provides an easy-to-use MATLAB interface that enables users to call DOTk solution methods directly from the MATLAB command window.

  9. National eHealth strategy toolkit

    CERN Document Server

    2012-01-01

    Worldwide, the application of information and communication technologies to support national health-care services is rapidly expanding and increasingly important. This is especially so at a time when all health systems face stringent economic challenges and greater demands to provide more and better care, especially to those most in need. The National eHealth Strategy Toolkit is an expert practical guide that provides governments, their ministries, and stakeholders with a solid foundation and method for the development and implementation of a national eHealth vision, action plan, and monitoring framework.

  10. Graph algorithms in the titan toolkit.

    Energy Technology Data Exchange (ETDEWEB)

    McLendon, William Clarence, III; Wylie, Brian Neil

    2009-10-01

    Graph algorithms are a key component in a wide variety of intelligence analysis activities. The Graph-Based Informatics for Non-Proliferation and Counter-Terrorism project addresses the critical need of making these graph algorithms accessible to Sandia analysts in a manner that is both intuitive and effective. Specifically we describe the design and implementation of an open source toolkit for doing graph analysis, informatics, and visualization that provides Sandia with novel analysis capability for non-proliferation and counter-terrorism.

  11. Montage: An Astronomical Image Mosaicking Toolkit

    Science.gov (United States)

    Jacob, Joseph C.; Katz, Daniel S.; Berriman, G. Bruce; Good, John; Laity, Anastasia C.; Deelman, Ewa; Kesselman, Carl; Singh, Gurmeet; Su, Mei-Hui; Prince, Thomas A.; Williams, Roy

    2010-10-01

    Montage is an open source code toolkit for assembling Flexible Image Transport System (FITS) images into custom mosaics. It runs on all common Linux/Unix platforms, on desktops, clusters and computational grids, and supports all World Coordinate System (WCS) projections and common coordinate systems. Montage preserves spatial and calibration fidelity of input images, processes 40 million pixels in up to 32 minutes on 128 nodes on a Linux cluster, and provides independent engines for analyzing the geometry of images on the sky, re-projecting images, rectifying background emission to a common level, and co-adding images. It offers convenient tools for managing and manipulating large image files.

  12. Now We Know: Assessing Sexual Assault Criminal Justice Case Processing in an Urban Community Using the Sexual Assault Nurse Practitioner Evaluation Toolkit.

    Science.gov (United States)

    Valentine, Julie L; Shaw, Jessica; Lark, Alyssa; Campbell, Rebecca

    2016-01-01

    Campbell and colleagues developed an evaluation Toolkit for use by sexual assault nurse examiners (SANEs) to assess criminal case outcomes in adult sexual assault cases seen by SANE programs (Campbell, Townsend, Shaw, Karim, & Markowitz, 2014; Campbell, Bybee, et al., 2014). The Toolkit provides step-by-step directions and an easy-to-use statistical program. This study describes implementation of the Toolkit in Salt Lake County, the first site outside the pilot sites to utilize the program. The Toolkit revealed that, in Salt Lake County from 2003 to 2011, only 6% of adult sexual assault cases were successfully prosecuted. These findings prompted multiple community discussions, media attention, and a call to action to improve the investigation and prosecution of adult sexual assault cases. The primary purpose of this case report is to encourage other SANE teams and communities to use the Toolkit by sharing the successful experience of Salt Lake County in implementing the Toolkit. Video Abstract available for additional insights from Dr. Valentine (see Supplemental Digital Content 1, http://links.lww.com/JFN/A19).

  13. The Einstein Toolkit: A Community Computational Infrastructure for Relativistic Astrophysics

    CERN Document Server

    Löffler, Frank; Bentivegna, Eloisa; Bode, Tanja; Diener, Peter; Haas, Roland; Hinder, Ian; Mundim, Bruno C; Ott, Christian D; Schnetter, Erik; Allen, Gabrielle; Campanelli, Manuela; Laguna, Pablo

    2011-01-01

    We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The Toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus Framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general-relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this article. We discuss the motivation behind the release of the toolkit, the philosophy underlying its de...

  14. Automatic Image-Based Pencil Sketch Rendering

    Institute of Scientific and Technical Information of China (English)

    王进; 鲍虎军; 周伟华; 彭群生; 徐迎庆

    2002-01-01

    This paper presents an automatic image-based approach for converting greyscale images to pencil sketches, in which strokes follow the image features. The algorithm first extracts a dense direction field automatically using Logical/Linear operators which embody the drawing mechanism. Next, a reconstruction approach based on a sampling-and-interpolation scheme is introduced to generate stroke paths from the direction field. Finally, pencil strokes are rendered along the specified paths with consideration of image tone and artificial illumination. As an important application, the technique is applied to render portraits from images with little user interaction. The experimental results demonstrate that the approach can automatically achieve compelling pencil sketches from reference images.
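
A minimal sketch of the first stage, estimating a per-pixel stroke direction field from a greyscale image. This uses simple central differences rather than the paper's Logical/Linear operators, and rotates the gradient by 90 degrees so strokes run along image features rather than across them:

```python
# Hedged sketch: per-pixel stroke directions from intensity gradients.
# Stand-in for the Logical/Linear operators used in the paper.
import math

def direction_field(img):
    """img: 2D list of grey levels. Returns per-pixel stroke angles (radians),
    perpendicular to the intensity gradient so strokes follow edges."""
    h, w = len(img), len(img[0])
    field = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Central differences with clamped borders
            gx = img[y][min(x + 1, w - 1)] - img[y][max(x - 1, 0)]
            gy = img[min(y + 1, h - 1)][x] - img[max(y - 1, 0)][x]
            # Rotate the gradient 90 degrees to get the stroke direction
            field[y][x] = math.atan2(gy, gx) + math.pi / 2
    return field
```

Stroke paths would then be traced through this field by the sampling-and-interpolation stage the abstract describes.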

  15. Anti-Aliased Rendering of Water Surface

    Institute of Scientific and Technical Information of China (English)

    Xue-Ying Qin; Eihachiro Nakamae; Wei Hua; Yasuo Nagai; Qun-Sheng Peng

    2004-01-01

    Water surface is one of the most important components of landscape scenes, but it is difficult to render without aliasing when a spacious water surface extends far from the viewpoint. This is because the water surface consists of stochastic water waves, which are usually modeled by periodic bump mapping. The incident rays on the water surface are scattered by the bumped wave pattern; we estimate the solid angle of the reflected rays and trace these rays. An image-based accelerating method is adopted so that the contribution of each reflected ray can be quickly obtained without elaborate intersection calculation. We also demonstrate anti-aliased shadows of sunlight and skylight on the water surface. Both the rendered images and animations show excellent effects on the water surface of a reservoir.

  16. Optimization techniques for computationally expensive rendering algorithms

    OpenAIRE

    Navarro Gil, Fernando; Gutiérrez Pérez, Diego; Serón Arbeloa, Francisco José

    2012-01-01

    Realistic rendering in computer graphics simulates the interactions of light and surfaces. Many accurate models for surface reflection and lighting, covering both solid surfaces and participating media, have been described, but most of them rely on intensive computation. Common practices such as adding constraints and assumptions can increase performance; however, they may compromise the quality of the resulting images or the variety of phenomena that can be accurately represented. In this thesi...

  17. Visualization of Medpor implants using surface rendering

    Institute of Scientific and Technical Information of China (English)

    WANG Meng; GUI Lai; LIU Xiao-jing

    2011-01-01

    Background The Medpor surgical implant is one of the easiest implants to use in clinical practice, especially in craniomaxillofacial surgery. It is often used as a bone substitute material for the repair of skull defects and facial deformities. The Medpor implant has several advantages, but its use is limited because it is radiolucent in both direct radiography and conventional computed tomography, causing serious problems with visualization. Methods In this study, a new technique for visualizing Medpor implants was evaluated in 10 patients who had undergone facial reconstruction using the material. Continuous volume scans were made using a 16-channel tomographic scanner, and 3D reconstruction software was used to create surface renderings. The threshold values for surface renderings of the implant ranged from -70 HU to -20 HU, with bone as the default. Results The shape of the implants and the spatial relationship between bone and implant could both be displayed. Conclusion Surface rendering allows successful visualization of Medpor implants in the body.
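
The threshold step behind this surface rendering amounts to selecting voxels inside the reported Hounsfield window (-70 HU to -20 HU) before extracting a surface. A pure-Python sketch, with a toy nested-list stand-in for a CT volume:

```python
# Sketch of the HU-window threshold used before surface extraction.
# The window defaults come from the abstract; the volume format is a toy.

def medpor_mask(volume_hu, lo=-70, hi=-20):
    """volume_hu: nested lists of Hounsfield units (one 2D slice here).
    Returns a same-shaped boolean mask marking voxels inside the window."""
    return [[lo <= v <= hi for v in row] for row in volume_hu]
```

A real pipeline would feed such a mask (over the full 3D volume) to an isosurface extractor such as marching cubes.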

  18. [Research on Three-dimensional Medical Image Reconstruction and Interaction Based on HTML5 and Visualization Toolkit].

    Science.gov (United States)

    Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang

    2015-04-01

    Integrating the Visualization Toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and experimented on the feasibility of remote medical image reconstruction and interaction purely in the Web. We propose a server-centric method that does not require downloading large medical data to local machines, avoiding both network transmission pressure and reliance on the three-dimensional (3D) rendering capability of client hardware. The method integrates remote medical image reconstruction and interaction seamlessly into the Web, making it applicable to lower-end computers and mobile devices. Finally, we tested this method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across Internet terminals and performance-limited devices, may be useful for remote medical assistance.

  19. ADMIT: The ALMA Data Mining Toolkit

    Science.gov (United States)

    Teuben, P.; Pound, M.; Mundy, L.; Rauch, K.; Friedel, D.; Looney, L.; Xu, L.; Kern, J.

    2015-09-01

    ADMIT (ALMA Data Mining ToolkiT), a toolkit for the creation of new science products from ALMA data, is being developed as an ALMA Development Project. It is written in Python and, while specifically targeted for a uniform analysis of the ALMA science products that come out of the ALMA pipeline, it is designed to be generally applicable to (radio) astronomical data. It first provides users with a detailed view of their science products created by ADMIT inside the ALMA pipeline: line identifications, line 'cutout' cubes, moment maps, emission type analysis (e.g., feature detection). Using descriptor vectors the ALMA data archive is enriched with useful information to make archive data mining possible. Users can also opt to download the (small) ADMIT pipeline product, then fine-tune and re-run the pipeline and inspect their hopefully improved data. By running many projects in a parallel fashion, data mining between many astronomical sources and line transitions will also be possible. Future implementations of ADMIT may include EVLA and other instruments.

  20. MIS: A MIRIAD Interferometry Singledish Toolkit

    Science.gov (United States)

    Pound, M. W.; Teuben, P.

    2012-09-01

    Building on the “drPACS” contribution at ADASS XX of a simple Unix pipeline infrastructure, we implemented a pipeline toolkit using the package MIRIAD to combine Interferometric and Single Dish data (MIS). This was prompted by our observations made with the Combined Array For Research in Millimeter-wave Astronomy (CARMA) interferometer of the star-forming region NGC 1333, a large survey highlighting the new 23-element and singledish observing modes. The project consists of 20 CARMA datasets each containing interferometric as well as simultaneously obtained single dish data, for 3 molecular spectral lines and continuum, in 527 different pointings, covering an area of about 8 by 11 arcminutes. A small group of collaborators then shared this toolkit and their parameters via CVS, and scripts were developed to ensure uniform data reduction across the group. The pipeline was run end-to-end each night as new observations were obtained, producing maps that contained all the data to date. We will show examples of the scripts and data products. This approach could serve as a model for repeated calibration and mapping of large mixed-mode correlation datasets from ALMA.

  1. The Virtual Physiological Human ToolKit.

    Science.gov (United States)

    Cooper, Jonathan; Cervenansky, Frederic; De Fabritiis, Gianni; Fenner, John; Friboulet, Denis; Giorgino, Toni; Manos, Steven; Martelli, Yves; Villà-Freixa, Jordi; Zasada, Stefan; Lloyd, Sharon; McCormack, Keith; Coveney, Peter V

    2010-08-28

    The Virtual Physiological Human (VPH) is a major European e-Science initiative intended to support the development of patient-specific computer models and their application in personalized and predictive healthcare. The VPH Network of Excellence (VPH-NoE) project is tasked with facilitating interaction between the various VPH projects and addressing issues of common concern. A key deliverable is the 'VPH ToolKit'--a collection of tools, methodologies and services to support and enable VPH research, integrating and extending existing work across Europe towards greater interoperability and sustainability. Owing to the diverse nature of the field, a single monolithic 'toolkit' is incapable of addressing the needs of the VPH. Rather, the VPH ToolKit should be considered more as a 'toolbox' of relevant technologies, interacting around a common set of standards. The latter apply to the information used by tools, including any data and the VPH models themselves, and also to the naming and categorizing of entities and concepts involved. Furthermore, the technologies and methodologies available need to be widely disseminated, and relevant tools and services easily found by researchers. The VPH-NoE has thus created an online resource for the VPH community to meet this need. It consists of a database of tools, methods and services for VPH research, with a Web front-end. This has facilities for searching the database, for adding or updating entries, and for providing user feedback on entries. Anyone is welcome to contribute.

  2. Applications Toolkit for Accelerator Control and Analysis

    Science.gov (United States)

    Borland, M.

    1997-05-01

    The Advanced Photon Source (APS) has taken a unique approach to creating high-level software applications for accelerator control and analysis. The approach is based on self-describing data, modular program toolkits, and scripts. Self-describing data provides a communication standard that aids the creation of modular program toolkits by allowing compliant programs to be used in essentially arbitrary combinations. These modular programs can be used as part of an arbitrary number of high-level applications. At APS, a group of about 60 data analysis, manipulation, and display tools is used in concert with about 20 control-system-specific tools to implement applications for commissioning and operations. High-level applications are created using scripts, which are relatively simple interpreted programs. The Tcl/Tk script language is used, allowing creation of graphical user interfaces (GUIs) and a library of algorithms that are separate from the interface. This allows greater automation of control by making it easy to take the human out of the loop. Applications of this methodology to accelerator commissioning and operation such as orbit correction, and data archiving and review will be discussed.

  3. WAVOS: a MATLAB toolkit for wavelet analysis and visualization of oscillatory systems

    Directory of Open Access Journals (Sweden)

    Harang Richard

    2012-03-01

    Background: Wavelets have proven to be a powerful technique for the analysis of periodic data, such as those that arise in the analysis of circadian oscillators. While many implementations of both continuous and discrete wavelet transforms are available, we are aware of no software that has been designed with the nontechnical end user in mind. By developing a toolkit that makes these analyses accessible to end users without significant programming experience, we hope to promote the more widespread use of wavelet analysis. Findings: We have developed the WAVOS toolkit for wavelet analysis and visualization of oscillatory systems. WAVOS features both the continuous (Morlet) and discrete (Daubechies) wavelet transforms, with a simple, user-friendly graphical user interface within MATLAB. The interface allows data to be imported from a number of standard file formats, visualized, processed and analyzed, and exported without use of the command line. Our work has been motivated by the challenges of circadian data, so default settings appropriate to the analysis of such data have been pre-selected in order to minimize the need for fine-tuning. The toolkit is flexible enough, however, to deal with a wide range of oscillatory signals and may be used in more general contexts. Conclusions: We have presented WAVOS: a comprehensive wavelet-based MATLAB toolkit that allows for easy visualization, exploration, and analysis of oscillatory data. WAVOS includes both the Morlet continuous wavelet transform and the Daubechies discrete wavelet transform. We have illustrated the use of WAVOS and demonstrated its utility for the analysis of circadian bioluminescence and wheel-running data. WAVOS is freely available at http://sourceforge.net/projects/wavos/files/
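
    The core analysis that a toolkit like WAVOS wraps in a GUI can be sketched in a few lines. The following is a minimal NumPy implementation of the continuous Morlet wavelet transform applied to a synthetic circadian record; it is an illustrative sketch under stated assumptions, not WAVOS code (WAVOS itself is MATLAB).

```python
import numpy as np

def morlet_cwt(signal, scales, dt=1.0, w0=6.0):
    """Return the wavelet power of `signal` at the given scales (rows)
    via direct convolution with a complex Morlet wavelet."""
    n = len(signal)
    t = (np.arange(n) - n // 2) * dt
    power = np.empty((len(scales), n))
    for i, s in enumerate(scales):
        psi = (np.pi ** -0.25 / np.sqrt(s)) * np.exp(1j * w0 * t / s - (t / s) ** 2 / 2)
        # psi(-t) == conj(psi(t)), so plain convolution correlates the
        # signal with the conjugate wavelet at every shift
        power[i] = np.abs(np.convolve(signal, psi, mode="same")) ** 2
    return power

dt = 1.0                                    # one sample per hour
t = np.arange(240) * dt                     # ten days of hourly data
x = np.sin(2 * np.pi * t / 24.0)            # a clean 24 h rhythm
scales = np.array([6.0, 12.0, 24.0, 48.0])  # Morlet period is roughly 1.05 * scale
power = morlet_cwt(x, scales, dt)
print(scales[power.mean(axis=1).argmax()])  # the 24 h scale dominates
```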

  4. The Image-Guided Surgery ToolKit IGSTK: an open source C++ software toolkit

    Science.gov (United States)

    Cheng, Peng; Ibanez, Luis; Gobbi, David; Gary, Kevin; Aylward, Stephen; Jomier, Julien; Enquobahrie, Andinet; Zhang, Hui; Kim, Hee-su; Blake, M. Brian; Cleary, Kevin

    2007-03-01

    The Image-Guided Surgery Toolkit (IGSTK) is an open source C++ software library that provides the basic components needed to develop image-guided surgery applications. The focus of the toolkit is on robustness using a state machine architecture. This paper presents an overview of the project based on a recent book which can be downloaded from igstk.org. The paper includes an introduction to open source projects, a discussion of our software development process and the best practices that were developed, and an overview of requirements. The paper also presents the architecture framework and main components. This presentation is followed by a discussion of the state machine model that was incorporated and the associated rationale. The paper concludes with an example application.
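
    The state machine architecture that IGSTK uses for robustness can be illustrated with a toy example. The sketch below is Python (IGSTK itself is C++), and the component and event names are hypothetical; the key idea is that every request is an event, and transitions not explicitly listed are ignored rather than allowed to put the component into an invalid state.

```python
# Toy state machine in the spirit of IGSTK's robustness-by-design: invalid
# events are logged and ignored, so the component can never crash into an
# undefined state.

class StateMachine:
    def __init__(self, initial, transitions):
        self.state = initial
        self.transitions = transitions      # {(state, event): next_state}
        self.log = []

    def process(self, event):
        key = (self.state, event)
        if key in self.transitions:
            self.state = self.transitions[key]
        else:
            self.log.append(f"ignored {event!r} in state {self.state!r}")

# Hypothetical tracker component: tracking may only start after the
# communication channel has been opened.
tracker = StateMachine("Idle", {
    ("Idle", "Open"): "Communicating",
    ("Communicating", "StartTracking"): "Tracking",
    ("Tracking", "StopTracking"): "Communicating",
    ("Communicating", "Close"): "Idle",
})

tracker.process("StartTracking")   # invalid in Idle: logged, state unchanged
tracker.process("Open")
tracker.process("StartTracking")
print(tracker.state)
```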

  5. GPU Pro 4 advanced rendering techniques

    CERN Document Server

    Engel, Wolfgang

    2013-01-01

    GPU Pro 4: Advanced Rendering Techniques presents ready-to-use ideas and procedures that can help solve many of your day-to-day graphics programming challenges. Focusing on interactive media and games, the book covers up-to-date methods for producing real-time graphics. Section editors Wolfgang Engel, Christopher Oat, Carsten Dachsbacher, Michal Valient, Wessam Bahnassi, and Sebastien St-Laurent have once again assembled a high-quality collection of cutting-edge techniques for advanced graphics processing unit (GPU) programming. Divided into six sections, the book begins with discussions on the abi

  6. Haptic rendering foundations, algorithms, and applications

    CERN Document Server

    Lin, Ming C

    2008-01-01

    For a long time, human beings have dreamed of a virtual world where it is possible to interact with synthetic entities as if they were real. It has been shown that the ability to touch virtual objects increases the sense of presence in virtual environments. This book provides an authoritative overview of state-of-the-art haptic rendering algorithms and their applications. The authors examine various approaches and techniques for designing touch-enabled interfaces for a number of applications, including medical training, model design, and maintainability analysis for virtual prototyping, scienti

  7. GPU PRO 3 Advanced rendering techniques

    CERN Document Server

    Engel, Wolfgang

    2012-01-01

    GPU Pro 3, the third volume in the GPU Pro book series, offers practical tips and techniques for creating real-time graphics that are useful to beginners and seasoned game and graphics programmers alike. Section editors Wolfgang Engel, Christopher Oat, Carsten Dachsbacher, Wessam Bahnassi, and Sebastien St-Laurent have once again brought together a high-quality collection of cutting-edge techniques for advanced GPU programming. With contributions by more than 50 experts, GPU Pro 3: Advanced Rendering Techniques covers battle-tested tips and tricks for creating interesting geometry, realistic sha

  8. Defects of organization in rendering medical aid

    Directory of Open Access Journals (Sweden)

    Shavkat Islamov

    2010-09-01

    The defects of organization at a medical institution are disturbances of the rules, norms and order of rendering medical aid. The number of organization defects in Uzbekistan increased from 20.42% in 1999 to 25.46% in 2001, with a gradual decrease to 19.9% in 2003 and 16.66% in 2006, and a gradual increase to 21.95% and 28.28% (P<0.05) in 2005 and 2008. Among the essential defects of organization were the following: disturbance of transportation rules, lack of dispensary care, and shortcomings in keeping medical documentation.

  9. Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation

    Science.gov (United States)

    2015-01-22

    MITRE Technical Report MTR140443R2 (January 2015): Methods for Evaluating Text Extraction Toolkits: An Exploratory Investigation. The report discusses an exploratory investigation into a method and a set of tools for evaluating a text extraction toolkit.

  10. VIDE: The Void IDentification and Examination toolkit

    CERN Document Server

    Sutter, P M; Hamaus, Nico; Pisani, Alice; Wandelt, Benjamin D; Warren, Michael S; Villaescusa-Navarro, Francisco; Zivick, Paul; Mao, Qingqing; Thompson, Benjamin B

    2014-01-01

    We present VIDE, the Void IDentification and Examination toolkit, an open-source Python/C++ code for finding cosmic voids in galaxy redshift surveys and N-body simulations, characterizing their properties, and providing a platform for more detailed analysis. At its core, VIDE uses a greatly enhanced version of ZOBOV (Neyrinck 2008) to calculate a Voronoi tessellation for estimating the density field and a watershed transform to construct voids. The watershed levels are used to place voids in a hierarchical tree. VIDE provides significant additional functionality for both pre- and post-processing: for example, VIDE can work with volume- or magnitude-limited galaxy samples with arbitrary survey geometries, or dark matter particles or halo catalogs in a variety of common formats. It can also randomly subsample inputs and includes a Halo Occupation Distribution model for constructing mock galaxy populations. VIDE outputs a summary of void properties in plain ASCII, and provides a Python API to perform many analysi...
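
    The watershed idea VIDE inherits from ZOBOV can be illustrated with a toy one-dimensional sketch: each tracer cell joins the density basin reached by steepest descent, and each basin around a density minimum becomes a void candidate. The real code operates on a 3-D Voronoi density field; the function below is only a hypothetical illustration of the same principle.

```python
# 1-D watershed sketch: label each cell with the index of the local density
# minimum its steepest-descent path ends at; each basin is one "void".

def watershed_1d(density):
    n = len(density)
    labels = [0] * n
    for i in range(n):
        j = i
        while True:
            best = j
            for k in (j - 1, j + 1):          # examine both neighbours
                if 0 <= k < n and density[k] < density[best]:
                    best = k
            if best == j:                      # reached a local minimum
                break
            j = best
        labels[i] = j
    return labels

# Two density dips separated by a ridge yield two basins.
density = [5, 2, 1, 2, 6, 3, 1.5, 3, 5]
labels = watershed_1d(density)
print(sorted(set(labels)))    # indices of the two density minima
```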

  11. NBII-SAIN Data Management Toolkit

    Science.gov (United States)

    Burley, Thomas E.; Peine, John D.

    2009-01-01

    percent of the cost of a spatial information system is associated with spatial data collection and management (U.S. General Accounting Office, 2003). These figures indicate that the resources (time, personnel, money) of many agencies and organizations could be used more efficiently and effectively. Dedicated and conscientious data management coordination and documentation is critical for reducing such redundancy. Substantial cost savings and increased efficiency are direct results of a pro-active data management approach. In addition, details of projects as well as data and information are frequently lost as a result of real-world occurrences such as the passing of time, job turnover, and equipment changes and failure. A standardized, well documented database allows resource managers to identify issues, analyze options, and ultimately make better decisions in the context of adaptive management (National Land and Water Resources Audit and the Australia New Zealand Land Information Council on behalf of the Australian National Government, 2003). Many environmentally focused, scientific, or natural resource management organizations collect and create both spatial and non-spatial data in some form. Data management appropriate for those data will be contingent upon the project goal(s) and objectives and thus will vary on a case-by-case basis. This project and the resulting Data Management Toolkit, hereafter referred to as the Toolkit, is therefore not intended to be comprehensive in terms of addressing all of the data management needs of all projects that contain biological, geospatial, and other types of data. The Toolkit emphasizes the idea of connecting a project's data and the related management needs to the defined project goals and objectives from the outset. In that context, the Toolkit presents and describes the fundamental components of sound data and information management that are common to projects involving biological, geospatial, and other related data

  12. A Cosmology Forecast Toolkit -- CosmoLib

    CERN Document Server

    Huang, Zhiqi

    2012-01-01

    The package CosmoLib is a combination of a cosmological Boltzmann code and a simulation toolkit to forecast the constraints on cosmological parameters from future observations. In this paper we describe the released linear-order part of the package. We discuss the stability and performance of the Boltzmann code, which is written in Newtonian gauge and includes dark energy perturbations. In CosmoLib the integrator that computes the CMB angular power spectrum is optimized for an $\ell$-by-$\ell$ brute-force integration, which is useful for studying inflationary models predicting sharp features in the primordial power spectrum of metric fluctuations. The numerical code and its documentation are available at http://www.cita.utoronto.ca/~zqhuang/CosmoLib.

  13. Introduction to the Geant4 Simulation toolkit

    Science.gov (United States)

    Guatelli, S.; Cutajar, D.; Oborn, B.; Rosenfeld, A. B.

    2011-05-01

    Geant4 is a Monte Carlo simulation Toolkit, describing the interactions of particles with matter. Geant4 is widely used in radiation physics research, from High Energy Physics, to medical physics and space science, thanks to its sophisticated physics component, coupled with advanced functionality in geometry description. Geant4 is widely used at the Centre for Medical Radiation Physics (CMRP), at the University of Wollongong, to characterise and optimise novel detector concepts, radiotherapy treatments, and imaging solutions. This lecture consists of an introduction to Monte Carlo method, and to Geant4. Particular attention will be devoted to the Geant4 physics component, and to the physics models describing electromagnetic and hadronic physics interactions. The second part of the lecture will be focused on the methodology to adopt to develop a Geant4 simulation application.
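
    The Monte Carlo step such an introductory lecture typically begins with is sampling a particle's free path from an exponential attenuation law. The following Python sketch illustrates that textbook step only; it is not Geant4 code (Geant4 is a C++ toolkit), and the parameter values are arbitrary.

```python
import math, random

def sample_free_path(mu, rng):
    """Distance to the next interaction for attenuation coefficient mu
    (1/cm): invert the CDF 1 - exp(-mu * x)."""
    return -math.log(1.0 - rng.random()) / mu

def transmitted_fraction(mu, thickness, n, seed=1):
    """Fraction of photons crossing a slab without interacting."""
    rng = random.Random(seed)
    passed = sum(sample_free_path(mu, rng) > thickness for _ in range(n))
    return passed / n

mu, d = 0.2, 5.0          # attenuation 0.2 /cm, 5 cm slab (arbitrary values)
est = transmitted_fraction(mu, d, 100_000)
print(est, math.exp(-mu * d))   # estimate vs. the analytic value exp(-mu*d)
```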

  14. Grids Portals: Frameworks, Middleware or Toolkit

    Directory of Open Access Journals (Sweden)

    Xavier Medianero-Pasco

    2010-05-01

    Grid portals are interfaces for interconnection between Grid resources and users, supported by components called portlets, servlets and middleware. Frameworks, middleware and/or development toolkits are used in the design and development of grid portals, and each presents distinctive characteristics, as do existing grid portal tools such as GridSphere, GridPort and PortalLab. However, the concepts and components of a grid portal are often applied unclearly, because different models and works are labelled arbitrarily without taking the approach and features of each concept into account. Clarifying them gives insight into each grid portal and its components.

  15. Introducing the Ginga FITS Viewer and Toolkit

    Science.gov (United States)

    Jeschke, E.; Inagaki, T.; Kackley, R.

    2013-10-01

    We introduce Ginga, a new open-source FITS viewer and toolkit based on Python astronomical packages such as pyfits, numpy, scipy, matplotlib, and pywcs. For developers, we present a set of Python classes for viewing FITS files under the modern Gtk and Qt widget sets and a more full-featured viewer that has a plugin architecture. We further describe how plugins can be written to extend the viewer with many different capabilities. The software may be of interest to software developers who are looking for a solution for integrating FITS visualization into their Python programs and end users interested in a new and different FITS viewer that is not based on Tcl/Tk widget technology. The software has been released under a BSD license.

  16. MX: A beamline control system toolkit

    Science.gov (United States)

    Lavender, William M.

    2000-06-01

    The development of experimental and beamline control systems for two Collaborative Access Teams at the Advanced Photon Source has resulted in the creation of a portable data acquisition and control toolkit called MX. MX consists of a set of servers, application programs and libraries that enable the creation of command line and graphical user interface applications that may be easily retargeted to new and different kinds of motor and device controllers. The source code for MX is written in ANSI C and Tcl/Tk with interprocess communication via TCP/IP. MX is available for several versions of Unix, Windows 95/98/NT and DOS. It may be downloaded from the web site http://www.imca.aps.anl.gov/mx/.

  17. A Multiresolution Image Cache for Volume Rendering

    Energy Technology Data Exchange (ETDEWEB)

    LaMar, E; Pascucci, V

    2003-02-27

    The authors discuss the techniques and implementation details of a shared-memory image caching system for volume visualization and iso-surface rendering. One of the goals of the system is to decouple image generation from image display. This is done by maintaining a set of impostors for interactive display while the production of the impostor imagery is performed by a set of parallel, background processes. The system introduces a caching basis that is free of the gap/overlap artifacts of earlier caching techniques. Instead of placing impostors at fixed, pre-defined positions in world space, the technique adaptively places impostors relative to the camera viewpoint. The positions translate with the camera but stay aligned to the data; i.e., the positions translate, but do not rotate, with the camera. The viewing transformation is factored into a translation transformation and a rotation transformation. The impostor imagery is generated using just the translation transformation, and visible impostors are displayed using just the rotation transformation. Displayed image quality is improved by increasing the number of impostors, and the frequency at which impostors are re-rendered is improved by decreasing the number of impostors.
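
    The transform factoring described above can be sketched concretely. Under the assumption of a rigid view matrix V = R @ T (rotation times translation), the split below recovers the translation part used when rendering the impostors and the rotation part used when displaying them. This is a hypothetical NumPy illustration, not the paper's implementation.

```python
import numpy as np

def factor_view(view):
    """Split a 4x4 rigid viewing matrix into rotation and translation
    factors such that view == R @ T."""
    R = np.eye(4)
    R[:3, :3] = view[:3, :3]
    T = np.eye(4)
    T[:3, 3] = R[:3, :3].T @ view[:3, 3]   # since view[:3,3] = R3 @ t
    return R, T

# Build an example view: rotate about z, then translate.
theta = 0.3
Rz = np.eye(4)
Rz[:2, :2] = [[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]]
Tr = np.eye(4)
Tr[:3, 3] = [1.0, 2.0, -5.0]
V = Rz @ Tr

R, T = factor_view(V)
assert np.allclose(R @ T, V)   # the two factors recompose the view exactly
```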

  18. Rendering of 3D Dynamic Virtual Environments

    CERN Document Server

    Catanese, Salvatore; Fiumara, Giacomo; Pagano, Francesco

    2011-01-01

    In this paper we present a framework for the rendering of dynamic 3D virtual environments which can be integrated into the development of video games. It includes methods to manage sounds and particle effects, paged static geometries, the support of a physics engine and various input systems. It has been designed with a modular structure to allow future expansions. We exploited some open-source state-of-the-art components such as OGRE, PhysX, ParticleUniverse, etc.; all of them have been properly integrated to obtain peculiar physical and environmental effects. The stand-alone version of the application is fully compatible with Direct3D and OpenGL APIs and adopts OpenAL APIs to manage audio cards. In conclusion, we describe a showcase demo which reproduces a dynamic 3D environment, including some particular effects: the alternation of day and night influencing the lighting of the scene, the rendering of terrain, water and vegetation, and the reproduction of sounds and atmospheric agents.

  19. STAR: Software Toolkit for Analysis Research

    Energy Technology Data Exchange (ETDEWEB)

    Doak, J.E.; Prommel, J.M.; Whiteson, R.; Hoffbauer, B.L.; Thomas, T.R. [Los Alamos National Lab., NM (United States); Helman, P. [New Mexico Univ., Albuquerque, NM (United States). Dept. of Computer Science

    1993-08-01

    Analyzing vast quantities of data from diverse information sources is an increasingly important element for nonproliferation and arms control analysis. Much of the work in this area has used human analysts to assimilate, integrate, and interpret complex information gathered from various sources. With the advent of fast computers, we now have the capability to automate this process, thereby shifting this burden away from humans. In addition, there now exist huge data storage capabilities which have made it possible to formulate large integrated databases comprising many terabytes of information spanning a variety of subjects. We are currently designing a Software Toolkit for Analysis Research (STAR) to address these issues. The goal of STAR is to produce a research tool that facilitates the development and interchange of algorithms for locating phenomena of interest to nonproliferation and arms control experts. One major component deals with the preparation of information. The ability to manage and effectively transform raw data into a meaningful form is a prerequisite for analysis by any methodology. The relevant information to be analyzed can be unstructured text, structured data, signals, or images. Text can be numerical and/or character, stored in raw data files, databases, or streams of bytes, or compressed into bits, in formats ranging from fixed, to character-delimited, to a count followed by content. The data can be analyzed in real-time or batch mode. Once the data are preprocessed, different analysis techniques can be applied. Some are built using expert knowledge. Others are trained using data collected over a period of time. Currently, we are considering three classes of analyzers for use in our software toolkit: (1) traditional machine learning techniques, (2) purely statistical systems, and (3) expert systems.

  20. A Perl toolkit for LIMS development

    Directory of Open Access Journals (Sweden)

    Jacobs Ian J

    2008-03-01

    Background: High throughput laboratory techniques generate huge quantities of scientific data. Laboratory Information Management Systems (LIMS) are a necessary requirement, dealing with sample tracking, data storage and data reporting. Commercial LIMS solutions are available, but these can be both costly and overly complex for the task. The development of bespoke LIMS solutions offers a number of advantages, including the flexibility to fulfil all a laboratory's requirements at a fraction of the price of a commercial system. The programming language Perl is a perfect development solution for LIMS applications because of Perl's powerful but simple-to-use database and web interaction; it is also well known for enabling rapid application development and deployment, and boasts a very active and helpful developer community. The development of an in-house LIMS from scratch, however, can take considerable time and resources, so programming tools that enable the rapid development of LIMS applications are essential, but there are currently no LIMS development tools for Perl. Results: We have developed ArrayPipeline, a Perl toolkit providing object oriented methods that facilitate the rapid development of bespoke LIMS applications. The toolkit includes Perl objects that encapsulate key components of a LIMS, providing methods for creating interactive web pages, interacting with databases, error tracking and reporting, and user and session management. The MT_Plate object provides methods for manipulation and management of microtitre plates, while a given LIMS can be encapsulated by extension of the core modules, providing system-specific methods for database interaction and web page management. Conclusion: This important addition to the Perl developer's library will make the development of in-house LIMS applications quicker and easier, encouraging laboratories to create bespoke LIMS applications to meet their specific data management requirements.

  1. Targeting protein function: the expanding toolkit for conditional disruption.

    Science.gov (United States)

    Campbell, Amy E; Bennett, Daimark

    2016-09-01

    A major objective in biological research is to understand spatial and temporal requirements for any given gene, especially in dynamic processes acting over short periods, such as catalytically driven reactions, subcellular transport, cell division, cell rearrangement and cell migration. The interrogation of such processes requires the use of rapid and flexible methods of interfering with gene function. However, many of the most widely used interventional approaches, such as RNAi or CRISPR (clustered regularly interspaced short palindromic repeats)-Cas9 (CRISPR-associated 9), operate at the level of the gene or its transcripts, meaning that the effects of gene perturbation are exhibited over longer time frames than the process under investigation. There has been much activity over the last few years to address this fundamental problem. In the present review, we describe recent advances in disruption technologies acting at the level of the expressed protein, involving inducible methods of protein cleavage, (in)activation, protein sequestration or degradation. Drawing on examples from model organisms we illustrate the utility of fast-acting techniques and discuss how different components of the molecular toolkit can be employed to dissect previously intractable biochemical processes and cellular behaviours.

  2. LOV to BLUF: Flavoprotein Contributions to the Optogenetic Toolkit

    Institute of Scientific and Technical Information of China (English)

    John M. Christie; Jayde Gawthorne; Gillian Young; Niall J. Fraser; Andrew J. Roe

    2012-01-01

    Optogenetics is an emerging field that combines optical and genetic approaches to non-invasively interfere with cellular events with exquisite spatiotemporal control. Although it arose originally from neuroscience, optogenetics is widely applicable to the study of many different biological systems and the range of applications arising from this technology continues to increase. Moreover, the repertoire of light-sensitive proteins used for devising new optogenetic tools is rapidly expanding. Light, Oxygen, or Voltage sensing (LOV) and Blue-Light-Utilizing flavin adenine dinucleotide (FAD) (BLUF) domains represent new contributors to the optogenetic toolkit. These small (100-140 amino acid) flavoprotein modules are derived from plant and bacterial photoreceptors that respond to UV-A/blue light. In recent years, considerable progress has been made in uncovering the photoactivation mechanisms of both LOV and BLUF domains. This knowledge has been applied in the design of synthetic photoswitches and fluorescent reporters with applications in cell biology and biotechnology. In this review, we summarize the photochemical properties of LOV and BLUF photosensors and highlight some of the recent advances in how these flavoproteins are being employed to artificially regulate and image a variety of biological processes.

  3. Functionality and Performance Visualization of the Distributed High Quality Volume Renderer (HVR)

    KAUST Repository

    Shaheen, Sara

    2012-07-01

    Volume rendering systems are designed to enable scientists and a variety of experts to interactively explore volume data through 3D views of the volume. However, volume rendering techniques are computationally intensive tasks. Parallel distributed volume rendering systems and multi-threading architectures have been suggested as natural solutions to provide acceptable volume rendering performance for very large volume data sizes, such as electron microscopy (EM) data. This in turn adds another level of complexity when developing and manipulating volume rendering systems. Given that distributed parallel volume rendering systems are among the most complex systems to develop, trace and debug, traditional debugging tools do not provide enough support. As a consequence, there is a great demand for tools that facilitate the manipulation of such systems. This can be achieved by utilizing the power of computer graphics to design visual representations that reflect how the system works and visualize its current performance state. The work presented falls within the field of software visualization, where visualization is used to aid in understanding and analyzing other software. This thesis presents a number of visual representations that reflect functionality and performance aspects of the distributed HVR, a high-quality volume rendering system that uses various techniques to visualize large volumes interactively. These visualizations cover different stages of the parallel volume rendering pipeline of HVR, along with means of performance analysis through a number of flexible and dynamic views that reflect the current state of the system and can be manipulated at runtime. The visualizations are aimed at facilitating debugging, understanding and analyzing the distributed HVR.

  4. Food: Too Good to Waste Implementation Guide and Toolkit

    Science.gov (United States)

    The Food: Too Good to Waste (FTGTW) Implementation Guide and Toolkit is designed for community organizations, local governments, households and others interested in reducing wasteful household food management practices.

  5. A Multi-Physics CFD Toolkit for Reentry Flows Project

    Data.gov (United States)

    National Aeronautics and Space Administration — AeroSoft proposes to develop a full featured CFD toolkit for analysis of the aerothermal environment and its effect on space vehicles. In Phase I, AeroSoft proposes...

  6. Energy Savings Performance Contract Energy Sales Agreement Toolkit

    Energy Technology Data Exchange (ETDEWEB)

    None

    2017-08-14

    FEMP developed the Energy Savings Performance Contracting Energy Sales Agreement (ESPC ESA) Toolkit to provide federal agency contracting officers and other acquisition team members with information that will facilitate the timely execution of ESPC ESA projects.

  7. Toolkit for local decision makers aims to strengthen environmental sustainability

    CSIR Research Space (South Africa)

    Murambadoro, M

    2011-11-01

    Members of the South African Risk and Vulnerability Atlas were involved in a meeting aimed at the development of a toolkit towards improved integration of climate change into local government's integrated development planning (IDP) process.

  8. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an ethnographic easy-to-use toolkit, Contextual Design, by a computer firm in the initial stages of the development of a health care system.

  9. NNCTRL - a CANCSD toolkit for MATLAB(R)

    DEFF Research Database (Denmark)

    Nørgård, Peter Magnus; Ravn, Ole; Poulsen, Niels Kjølstad

    1996-01-01

    A set of tools for computer-aided neuro-control system design (CANCSD) has been developed for the MATLAB environment. The tools can be used for construction and simulation of a variety of neural network based control systems. The design methods featured in the toolkit are: direct inverse control, internal model control, feedforward, feedback linearization, optimal control, instantaneous linearization, and nonlinear predictive control. Furthermore, the toolkit has been given a flexible design which allows for incorporation of the user's personal control algorithms.

  10. Photon Differential Splatting for Rendering Caustics

    DEFF Research Database (Denmark)

    Frisvad, Jeppe Revall; Schjøth, Lars; Erleben, Kenny;

    2014-01-01

    We present a photon splatting technique which reduces noise and blur in the rendering of caustics. Blurring of illumination edges is an inherent problem in photon splatting, as each photon is unaware of its neighbours when being splatted. This means that the splat size is usually based on heuristics rather than knowledge of the local flux density. We use photon differentials to determine the size and shape of the splats such that we achieve adaptive anisotropic flux density estimation in photon splatting. As compared to previous work that uses photon differentials, we present the first method where no photons or beams or differentials need to be stored in a map. We also present improvements in the theory of photon differentials, which give more accurate results and a faster implementation. Our technique has good potential for GPU acceleration, and we limit the number of parameters requiring

  11. Constructing And Rendering Vectorised Photographic Images

    Directory of Open Access Journals (Sweden)

    P. J. Willis

    2013-06-01

    We address the problem of representing captured images in the continuous mathematical space more usually associated with certain forms of drawn ('vector') images. Such an image is resolution-independent and so can be used as a master for varying resolution-specific formats. We briefly describe the main features of a vectorising codec for photographic images, whose significance is that drawing programs can access images and image components as first-class vector objects. This paper focuses on the problem of rendering from the isochromic contour form of a vectorised image and demonstrates a new fill algorithm which could also be used in drawing generally. The fill method is described in terms of level set diffusion equations for clarity. Finally we show that image warping is both simplified and enhanced in the vector form and that we can demonstrate real histogram equalisation with genuinely rectangular histograms straightforwardly.
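
    A diffusion fill of the general kind described can be sketched with a Jacobi relaxation: contour pixels hold fixed values and the interior relaxes toward the solution of Laplace's equation, smoothly interpolating the contour colours. This is a generic illustration of the idea under stated assumptions, not the paper's algorithm.

```python
import numpy as np

def diffusion_fill(values, fixed, iters=2000):
    """values: 2-D array with known entries where `fixed` is True.
    Repeatedly replace each free cell by the mean of its four
    neighbours until the interior is a smooth fill."""
    img = values.copy()
    for _ in range(iters):
        avg = 0.25 * (np.roll(img, 1, 0) + np.roll(img, -1, 0) +
                      np.roll(img, 1, 1) + np.roll(img, -1, 1))
        img = np.where(fixed, values, avg)   # keep contour values pinned
    return img

# Square whose left edge is fixed at 0 and right edge at 1: the fill
# relaxes to a linear ramp across the interior.
n = 17
vals = np.zeros((n, n))
vals[:, -1] = 1.0
fixed = np.zeros((n, n), bool)
fixed[:, 0] = True
fixed[:, -1] = True
filled = diffusion_fill(vals, fixed)
print(round(float(filled[n // 2, n // 2]), 2))   # about 0.5 at the centre
```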

  12. Behavioral Genetic Toolkits: Toward the Evolutionary Origins of Complex Phenotypes.

    Science.gov (United States)

    Rittschof, C C; Robinson, G E

    2016-01-01

    The discovery of toolkit genes, which are highly conserved genes that consistently regulate the development of similar morphological phenotypes across diverse species, is one of the most well-known observations in the field of evolutionary developmental biology. Surprisingly, this phenomenon is also relevant for a wide array of behavioral phenotypes, despite the fact that these phenotypes are highly complex and regulated by many genes operating in diverse tissues. In this chapter, we review the use of the toolkit concept in the context of behavior, noting the challenges of comparing behaviors and genes across diverse species, but emphasizing the successes in identifying genetic toolkits for behavior; these successes are largely attributable to the creative research approaches fueled by advances in behavioral genomics. We have two general goals: (1) to acknowledge the groundbreaking progress in this field, which offers new approaches to the difficult but exciting challenge of understanding the evolutionary genetic basis of behaviors, some of the most complex phenotypes known, and (2) to provide a theoretical framework that encompasses the scope of behavioral genetic toolkit studies in order to clearly articulate the research questions relevant to the toolkit concept. We emphasize areas for growth and highlight the emerging approaches that are being used to drive the field forward. Behavioral genetic toolkit research has elevated the use of integrative and comparative approaches in the study of behavior, with potentially broad implications for evolutionary biologists and behavioral ecologists alike.

  13. VisDock: A Toolkit for Cross-Cutting Interactions in Visualization.

    Science.gov (United States)

    Choi, Jungu; Park, Deok Gun; Wong, Yuet Ling; Fisher, Eli; Elmqvist, Niklas

    2015-09-01

    Standard user applications provide a range of cross-cutting interaction techniques that are common to virtually all such tools: selection, filtering, navigation, layer management, and cut-and-paste. We present VisDock, a JavaScript mixin library that provides a core set of these cross-cutting interaction techniques for visualization, including selection (lasso, paths, shape selection, etc.), layer management (visibility, transparency, set operations, etc.), navigation (pan, zoom, overview, magnifying lenses, etc.), and annotation (point-based, region-based, data-space-based, etc.). To showcase the utility of the library, we have released it as open source and integrated it with a large number of existing web-based visualizations. Furthermore, we have evaluated VisDock in qualitative studies, both with developers using the toolkit to build new web-based visualizations and with end-users using it to explore movie-ratings data. Results from these studies highlight the usability and effectiveness of the toolkit from both developer and end-user perspectives.
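    The mixin idea described above can be illustrated in a language-neutral way. The sketch below uses Python rather than VisDock's JavaScript, with hypothetical class names, to show how cross-cutting behaviours such as selection and layer management compose onto a host visualization class:

```python
class SelectionMixin:
    """Cross-cutting selection state, mixed into any visualization."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.selected = set()

    def select(self, ids):
        self.selected |= set(ids)

class LayerMixin:
    """Cross-cutting layer management, mixed into any visualization."""
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.visible_layers = set()

    def show_layer(self, name):
        self.visible_layers.add(name)

class Scatterplot(SelectionMixin, LayerMixin):
    """A host visualization acquires both behaviours by inheritance."""

plot = Scatterplot()
plot.select([1, 2])
plot.show_layer("grid")
```

    Because each behaviour lives in its own mixin, any visualization class can opt into exactly the interaction techniques it needs.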

  14. Research standardization tools: pregnancy measures in the PhenX Toolkit.

    Science.gov (United States)

    Malinowski, Ann Kinga; Ananth, Cande V; Catalano, Patrick; Hines, Erin P; Kirby, Russell S; Klebanoff, Mark A; Mulvihill, John J; Simhan, Hyagriv; Hamilton, Carol M; Hendershot, Tabitha P; Phillips, Michael J; Kilpatrick, Lisa A; Maiese, Deborah R; Ramos, Erin M; Wright, Rosalind J; Dolan, Siobhan M

    2017-09-01

    aim, investigators whose work includes the perinatal population are encouraged to utilize the PhenX Toolkit in the design and implementation of their studies, thus potentially reducing heterogeneity in data measures across studies. Such an effort will enhance the overall impact of individual studies, increasing the ability to draw more meaningful conclusions that can then be translated into clinical practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Resolution-independent surface rendering using programmable graphics hardware

    Science.gov (United States)

    Loop, Charles T.; Blinn, James Frederick

    2008-12-16

    Surfaces defined by a Bezier tetrahedron, and in particular quadric surfaces, are rendered on programmable graphics hardware. Pixels are rendered through triangular sides of the tetrahedra and locations on the shapes, as well as surface normals for lighting evaluations, are computed using pixel shader computations. Additionally, vertex shaders are used to aid interpolation over a small number of values as input to the pixel shaders. Through this, rendering of the surfaces is performed independently of viewing resolution, allowing for advanced level-of-detail management. By individually rendering tetrahedrally-defined surfaces which together form complex shapes, the complex shapes can be rendered in their entirety.
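    The per-pixel computation described above can be sketched on the CPU. The code below (hypothetical names, not the patented shader code) intersects a ray with a quadric given as a symmetric 4x4 matrix and derives the surface normal from the gradient of the quadratic form:

```python
import math

def quadric_hit(Q, origin, direction):
    """Nearest ray/quadric intersection.  Q is a symmetric 4x4 matrix;
    origin and direction are 3-vectors (homogeneous w = 1 and w = 0)."""
    o = list(origin) + [1.0]
    d = list(direction) + [0.0]

    def form(u, v):  # evaluate u^T Q v
        return sum(u[i] * Q[i][j] * v[j] for i in range(4) for j in range(4))

    a, b, c = form(d, d), 2.0 * form(o, d), form(o, o)
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None  # the ray misses the surface
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer root
    p = [o[i] + t * d[i] for i in range(3)] + [1.0]
    # The surface normal is the gradient of the quadratic form, 2*(Q p)
    g = [2.0 * sum(Q[i][j] * p[j] for j in range(4)) for i in range(3)]
    norm = math.sqrt(sum(x * x for x in g))
    return p[:3], [x / norm for x in g]

# Unit sphere: x^2 + y^2 + z^2 - 1 = 0
SPHERE = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, -1]]
point, normal = quadric_hit(SPHERE, (0, 0, -5), (0, 0, 1))
```

    Because the intersection is solved analytically per pixel rather than from a tessellated mesh, the result is exact at any viewing resolution, which is the basis of the level-of-detail claim in the abstract.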

  16. Efficient and Effective Volume Visualization with Enhanced Isosurface Rendering

    CERN Document Server

    Yang, Fei; Tian, Jie

    2012-01-01

    Compared with full volume rendering, isosurface rendering has several well-recognized advantages in efficiency and accuracy. However, standard isosurface rendering has some limitations in effectiveness. First, it uses a monotone-colored approach and can only visualize the geometric features of an isosurface. The inability to illustrate material properties and the internal structures behind an isosurface has been a major limitation of this method in applications. Another limitation of isosurface rendering is the difficulty of revealing physically meaningful structures that are hidden in one or multiple isosurfaces. As such, the application requirement to extract and recombine structures of interest cannot be met effectively with isosurface rendering. In this work, we develop an enhanced isosurface rendering technique that improves effectiveness while maintaining the performance efficiency of standard isosurface rendering. First, an isosurface color enhancement method is proposed to il...

  17. E-ELT modeling and simulation toolkits: philosophy and progress status

    Science.gov (United States)

    Sedghi, B.; Muller, M.; Bonnet, H.; Esselborn, M.; Le Louarn, M.; Clare, R.; Koch, F.

    2011-09-01

    To predict the performance of the E-ELT, three toolkits have been developed at ESO: i) the main structure and associated optical-unit dynamics and feedback control toolkit, ii) the active optics and phasing toolkit, and iii) the adaptive optics simulation toolkit. There was a deliberate policy not to integrate all of the systems into a single massive model and tool. Instead, the differences in dynamical and control time scales are used to separate the simulation environments and tools. Each toolkit therefore contains an appropriate level of detail of the problem and retains sufficient overlap with the others to ensure the consistency of the results. In this paper, these toolkits are presented together with some examples.

  18. GENFIT - a generic track reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Hoeppner, Christian; Neubert, Sebastian [Technische Universitaet Muenchen, Physik Department E18, 85748 Garching (Germany)

    2008-07-01

    Experiments in high energy physics use a combination of widely different detector systems to achieve an optimal measurement of particle trajectories. The software package GENFIT has been developed to provide a consistent treatment of track parameter estimation with hits from detectors providing different spatial information, e.g. strip projections, 3-D space points, drift distances to wires, etc. The concept is based on the idea of a full separation of parameterizations (hit-measurements and track models) from the algebra of regression algorithms. This implements the possibility to switch between different track propagation algorithms and detector geometries without changing the core fitting classes. Key components of the system are the Kalman filter and so-called virtual detector planes. An interface to the propagation package GEANE has also been realized. The poster illustrates the object-oriented architecture of the toolkit which uses generic programming techniques to realize the flexible and portable design. Some applications in the framework of the PANDA simulation studies are shown.
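    The Kalman filter named as a key component can be sketched in its scalar form (illustrative only; GENFIT's actual C++ classes and interfaces are not shown):

```python
def kalman_update(x, P, z, R, H=1.0):
    """One scalar Kalman measurement update: prior estimate x with
    variance P is combined with a measurement z of variance R under
    the measurement model H."""
    y = z - H * x          # innovation
    S = H * P * H + R      # innovation covariance
    K = P * H / S          # Kalman gain
    return x + K * y, (1.0 - K * H) * P

# Fuse three noisy measurements of a constant quantity
x, P = 0.0, 1.0
for z in [1.2, 0.9, 1.1]:
    x, P = kalman_update(x, P, z, R=0.5)
```

    In a track fitter the same update runs in matrix form, with each detector hit projected onto the track-parameter state through its own measurement model, which is exactly the separation of parameterizations from the fitting algebra that the abstract emphasizes.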

  19. GENFIT - a Generic track reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Neubert, Sebastian; Hoeppner, Christian [Physik Department E18, TU Muenchen, D-85748 Garching (Germany)

    2008-07-01

    Experiments in high energy physics use a combination of widely different detector systems to achieve an optimal measurement of particle trajectories. The software package GENFIT has been developed to provide a consistent treatment of track parameter estimation with hits from detectors providing different spatial information, e.g. strip projections, 3-D space points, drift distances to wires, etc. The concept is based on the idea of a full separation of parameterizations (hit-measurements and track models) from the algebra of regression algorithms. This implements the possibility to switch between different track propagation algorithms and detector geometries without changing the core fitting classes. Key components of the system are the Kalman filter and the so-called virtual detector planes. An interface to the propagation package GEANE has also been realized. The poster illustrates the object-oriented architecture of the toolkit which uses generic programming techniques to realize the flexible and portable design. Some applications in the framework of the PANDA simulation studies are shown.

  20. Security Assessment Simulation Toolkit (SAST) Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Meitzler, Wayne D.; Ouderkirk, Steven J.; Hughes, Chad O.

    2009-11-15

    The Department of Defense Technical Support Working Group (DoD TSWG) investment in the Pacific Northwest National Laboratory (PNNL) Security Assessment Simulation Toolkit (SAST) research planted a technology seed that germinated into a suite of follow-on research and development (R&D) projects, culminating in software that is used by multiple DoD organizations. The DoD TSWG technology transfer goal for SAST is already in progress. The Defense Information Systems Agency (DISA), the Defense-wide Information Assurance Program (DIAP), the Marine Corps, the Office of Naval Research (ONR) National Center for Advanced Secure Systems Research (NCASSR), and the Office of the Secretary of Defense International Exercise Program (OSD NII) are currently investing to take SAST to the next level. PNNL currently distributes the software to more than six government organizations and 30 DoD users. Over the past five DoD-wide Bulwark Defender exercises, the adoption of this new technology created an expanding role for SAST. In 2009, SAST was also used in the OSD NII International Exercise and is currently scheduled for use in 2010.

  1. The microRNA toolkit of insects

    Science.gov (United States)

    Ylla, Guillem; Fromm, Bastian; Piulachs, Maria-Dolors; Belles, Xavier

    2016-01-01

    Is there a correlation between miRNA diversity and levels of organismic complexity? Exhibiting extraordinary levels of morphological and developmental complexity, insects are the most diverse animal class on Earth. Their evolutionary success was shaped in particular by the innovation of holometabolan metamorphosis in endopterygotes. Previously, miRNA evolution had been linked to morphological complexity, but astonishing variation in the currently available miRNA complements of insects made this link unclear. To address this issue, we sequenced the miRNA complement of the hemimetabolan Blattella germanica and reannotated those of two other hemimetabolan species, Locusta migratoria and Acyrthosiphon pisum, and of four holometabolan species, Apis mellifera, Tribolium castaneum, Bombyx mori and Drosophila melanogaster. Our analyses show that the variation of insect miRNAs is an artefact mainly resulting from poor sampling and inaccurate miRNA annotation, and that insects share a conserved microRNA toolkit of 65 families exhibiting very low variation. For example, the evolutionary shift toward a complete metamorphosis was accompanied only by the acquisition of three miRNA families and the loss of one. PMID:27883064

  2. Modelling toolkit for simulation of maglev devices

    Science.gov (United States)

    Peña-Roche, J.; Badía-Majós, A.

    2017-01-01

    A stand-alone App has been developed, focused on obtaining information about relevant engineering properties of magnetic levitation systems. Our modelling toolkit provides real time simulations of 2D magneto-mechanical quantities for superconductor (SC)/permanent magnet structures. The source code is open and may be customised for a variety of configurations. Ultimately, it relies on the variational statement of the critical state model for the superconducting component and has been verified against experimental data for YBaCuO/NdFeB assemblies. On a quantitative basis, the values of the arising forces, induced superconducting currents, as well as a plot of the magnetic field lines are displayed upon selection of an arbitrary trajectory of the magnet in the vicinity of the SC. The stability issues related to the cooling process, as well as the maximum attainable forces for a given material and geometry are immediately observed. Due to the complexity of the problem, a strategy based on cluster computing, database compression, and real-time post-processing on the device has been implemented.

  3. Asteroids Outreach Toolkit Development: Using Iterative Feedback In Informal Education

    Science.gov (United States)

    White, Vivian; Berendsen, M.; Gurton, S.; Dusenbery, P. B.

    2011-01-01

    The Night Sky Network is a collaboration of close to 350 astronomy clubs across the US that actively engage in public outreach within their communities. Since 2004, the Astronomical Society of the Pacific has been creating outreach ToolKits filled with carefully crafted sets of physical materials designed to help these volunteer clubs explain the wonders of the night sky to the public. The effectiveness of the ToolKit activities and demonstrations is the direct result of a thorough testing and vetting process. Find out how this iterative assessment process can help other programs create useful tools for both formal and informal educators. The current Space Rocks Outreach ToolKit focuses on explaining asteroids, comets, and meteorites to the general public using quick, big-picture activities that get audiences involved. Eight previous ToolKits cover a wide range of topics from the Moon to black holes. In each case, amateur astronomers and the public helped direct the development of the activities along the way through surveys, focus groups, and active field-testing. The resulting activities have been embraced by the larger informal learning community and are enthusiastically being delivered to millions of people across the US and around the world. Each ToolKit is delivered free of charge to active Night Sky Network astronomy clubs. All activity write-ups are available free to download at the website listed here. Amateur astronomers receive frequent questions from the public about Earth impacts, meteors, and comets, so this set of activities will help them explain the dynamics of these phenomena to the public. The Space Rocks ToolKit resources complement the Great Balls of Fire museum exhibit produced by Space Science Institute's National Center for Interactive Learning and scheduled for release in 2011. NSF has funded this national traveling exhibition and outreach ToolKit under Grant DRL-0813528.

  4. Direct volume rendering methods for cell structures.

    Science.gov (United States)

    Martišek, Dalibor; Martišek, Karel

    2012-01-01

    The study of the complicated architecture of cell space structures is an important problem in biology and medical research. Optical cuts of cells produced by confocal microscopes enable two-dimensional (2D) and three-dimensional (3D) reconstructions of the observed cells. This paper discusses new possibilities for direct volume rendering of these data. We often encounter 16-bit or deeper images in confocal microscopy of cells, and most of the information contained in these images is insubstantial to human vision. It is therefore necessary to use mathematical algorithms to visualize such images. Current software tools such as OpenGL or DirectX run quickly on graphics workstations with dedicated graphics cards, but they perform very poorly on PCs without these cards, and their outputs are usually inadequate for real data. Moreover, these tools are black boxes to the common user, making it impossible to correct or improve them. With the proposed method, more parameters of the environment can be set, making it possible to apply 3D filters and to set the output image sharpness in relation to the noise. The quality of the output is incomparable to that of the earlier described methods and justifies the increased computing time. We would like to offer mathematical methods of 3D scalar data visualization, describing new algorithms that run very well on standard PCs.
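    A basic building block of any pipeline that displays 16-bit microscopy data is mapping the raw intensities to displayable 8-bit values. A minimal window/level transfer function (an illustrative sketch, not the authors' algorithm) might look like:

```python
def window_level(voxels, level, width):
    """Map 16-bit intensities to 8-bit display values with a
    window/level transfer function; values outside the window clamp."""
    lo = level - width / 2.0
    out = []
    for v in voxels:
        t = (v - lo) / width              # position inside the window
        t = min(max(t, 0.0), 1.0)         # clamp to [0, 1]
        out.append(int(round(t * 255)))
    return out

display = window_level([0, 30000, 32768, 65535], level=32768, width=4096)
```

    Narrowing `width` spends the full 8-bit display range on a small slice of the 16-bit histogram, which is how detail invisible to the eye in the raw data can be brought out.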

  5. PS1-29: Resources to Facilitate Multi-site Collaboration: the PRIMER Research Toolkit

    Science.gov (United States)

    Greene, Sarah; Thompson, Ella; Baldwin, Laura-Mae; Neale, Anne Victoria; Dolor, Rowena

    2010-01-01

    Background and Aims: The national research enterprise has typically functioned in a decentralized fashion, resulting in duplicative or undocumented processes and impeding not only the pace of research but also the diffusion of established best practices. To remedy this, many long-standing networks have begun capturing and documenting proven strategies to streamline and standardize various aspects of the research process. The project, “Partnership-driven Resources to IMprove and Enhance Research” (PRIMER), was funded through the Clinical and Translational Science Awards (CTSA) initiative to leverage the collective expertise of two networks: the HMO Research Network and Practice Based Research Networks (PBRNs). Each network has a shared goal of propagating research resources and best practices. Methods: We created and distributed an online survey to 92 CTSA and PBRN representatives in March 2009 to define critical needs and existing resources that could inform a resource repository. The survey identified barriers and benefits to forming research partnerships and assessed the perceived utility of various tools that could accelerate the research process. The study team identified, reviewed, and organized tools based on the typical research trajectory from design to dissemination. Results: Fifty-five of 92 invitees (59%) completed the survey. Respondents rated the ability to conduct community-relevant research through true academic-community partnerships as the top-rated benefit of multi-site research, followed by the opportunity to accelerate translation of research into practice. The top two perceived barriers to multi-site research were ‘funding opportunities are not adequate’ (e.g., too few, not enough to support true collaborations) and ‘lack of research infrastructure to support [all] partners’ (e.g., no IT support, IRB, dedicated research staff). Respondents’ ratings of the utility of various tools and templates were used to guide development of an online

  6. The MPI Bioinformatics Toolkit for protein sequence analysis.

    Science.gov (United States)

    Biegert, Andreas; Mayer, Christian; Remmert, Michael; Söding, Johannes; Lupas, Andrei N

    2006-07-01

    The MPI Bioinformatics Toolkit is an interactive web service which offers access to a great variety of public and in-house bioinformatics tools. They are grouped into different sections that support sequence searches, multiple alignment, secondary and tertiary structure prediction and classification. Several public tools are offered in customized versions that extend their functionality. For example, PSI-BLAST can be run against regularly updated standard databases, customized user databases or selectable sets of genomes. Another tool, Quick2D, integrates the results of various secondary structure, transmembrane and disorder prediction programs into one view. The Toolkit provides a friendly and intuitive user interface with an online help facility. As a key feature, various tools are interconnected so that the results of one tool can be forwarded to other tools. One could run PSI-BLAST, parse out a multiple alignment of selected hits and send the results to a cluster analysis tool. The Toolkit framework and the tools developed in-house will be packaged and freely available under the GNU Lesser General Public Licence (LGPL). The Toolkit can be accessed at http://toolkit.tuebingen.mpg.de.

  7. Teaching and learning "on the run": ready-to-use toolkits in busy clinical settings.

    Science.gov (United States)

    Cleary, Michelle; Walter, Garry

    2010-06-01

    Clinicians should strongly consider using toolkits in their workplaces with students on clinical placement. These toolkits could include brief quizzes, crossword puzzles, vignettes, role-playing, storytelling, or reflective activities to engage students in context-specific, collaborative learning.

  8. HDlive rendering images of the fetal stomach: a preliminary report.

    Science.gov (United States)

    Inubashiri, Eisuke; Abe, Kiyotaka; Watanabe, Yukio; Akutagawa, Noriyuki; Kuroki, Katumaru; Sugawara, Masaki; Maeda, Nobuhiko; Minami, Kunihiro; Nomura, Yasuhiro

    2015-01-01

    This study aimed to show reconstruction of the fetal stomach using the HDlive rendering mode in ultrasound. Seventeen healthy singleton fetuses at 18-34 weeks' gestational age were observed using the HDlive rendering mode of ultrasound in utero. In all of the fetuses, we identified specific spatial structures, including macroscopic anatomical features (e.g., the pylorus, cardia, fundus, and greater curvature) of the fetal stomach, using the HDlive rendering mode. In particular, HDlive rendering images showed remarkably fine details that appeared as if they were being viewed under an endoscope, with visible rugal folds after 27 weeks' gestational age. Our study suggests that the HDlive rendering mode can be used as an additional method for evaluating the fetal stomach. The HDlive rendering mode shows detailed 3D structural images and anatomically realistic images of the fetal stomach. This technique may be effective in prenatal diagnosis for examining detailed information of fetal organs.

  9. Dosimetry applications in GATE Monte Carlo toolkit.

    Science.gov (United States)

    Papadimitroulas, Panagiotis

    2017-02-21

    Monte Carlo (MC) simulations are a well-established method for studying physical processes in medical physics. The purpose of this review is to present GATE dosimetry applications on diagnostic and therapeutic simulated protocols. There is a significant need for accurate quantification of the absorbed dose in several specific applications such as preclinical and pediatric studies. GATE is an open-source MC toolkit for simulating imaging, radiotherapy (RT) and dosimetry applications in a user-friendly environment, which is well validated and widely accepted by the scientific community. In RT applications, during treatment planning, it is essential to accurately assess the deposited energy and the absorbed dose per tissue/organ of interest, as well as the local statistical uncertainty. Several types of realistic dosimetric applications are described including: molecular imaging, radio-immunotherapy, radiotherapy and brachytherapy. GATE has been efficiently used in several applications, such as Dose Point Kernels, S-values, Brachytherapy parameters, and has been compared against various MC codes which are considered as standard tools for decades. Furthermore, the presented studies show reliable modeling of particle beams when comparing experimental with simulated data. Examples of different dosimetric protocols are reported for individualized dosimetry and simulations combining imaging and therapy dose monitoring, with the use of modern computational phantoms. Personalization of medical protocols can be achieved by combining GATE MC simulations with anthropomorphic computational models and clinical anatomical data. This is a review study, covering several dosimetric applications of GATE, and the different tools used for modeling realistic clinical acquisitions with accurate dose assessment. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
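    The principle of MC dose scoring can be illustrated with a toy model (not GATE itself; all names and parameters below are illustrative): photon free paths are sampled from an exponential attenuation law, and each photon deposits its energy in the depth bin of its first interaction.

```python
import math
import random

def deposit_dose(n_photons, mu, slab_thickness, n_bins, seed=0):
    """Toy 1-D Monte Carlo: each photon's free path is sampled from an
    exponential with attenuation coefficient mu (1/cm); all energy is
    deposited in the depth bin of the first interaction."""
    rng = random.Random(seed)
    bins = [0.0] * n_bins
    for _ in range(n_photons):
        depth = -math.log(1.0 - rng.random()) / mu  # inverse-CDF sampling
        if depth < slab_thickness:
            bins[int(depth / slab_thickness * n_bins)] += 1.0
    return bins

# 20,000 photons into a 10 cm slab, scored in 1 cm depth bins
dose = deposit_dose(20000, mu=0.2, slab_thickness=10.0, n_bins=10)
```

    The binned counts fall off exponentially with depth; real codes such as GATE add realistic physics (scatter, secondary particles, voxelized anatomy) on top of exactly this kind of sampling-and-scoring loop, and the per-bin statistical uncertainty mentioned in the abstract comes from the counting statistics of these bins.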

  10. Cultural Algorithm Toolkit for Interactive Knowledge Discovery

    Directory of Open Access Journals (Sweden)

    Sujatha Srinivasa

    2012-10-01

    Cultural algorithms (CA) are inspired by the cultural evolutionary process in nature and use social intelligence to solve problems. Cultural algorithms are composed of a belief space that draws on different knowledge sources, a population space, and a protocol that enables exchange of knowledge between these sources. Knowledge created in the population space is accepted into the belief space, while the collective knowledge from these sources is combined to influence the decisions of individual agents in solving problems. Classification rules fall under descriptive knowledge discovery in data mining and are the form of knowledge most sought by users, since they represent a highly comprehensible form of knowledge. The rules have certain properties that make them useful forms of actionable knowledge to users, and they are evaluated using these properties expressed as objective and subjective measures. Objective measures are problem-oriented, while subjective measures are more user-oriented. Evolutionary systems allow the user to incorporate different rule metrics into the solution of a multi-objective rule mining problem. However, the algorithms found in the literature allow only certain attributes of the system to be controlled by the user. A research gap exists in providing a fully user-controlled system for experimenting with evolutionary multi-objective classification rule mining. In the current study, a Cultural Algorithm Toolkit for Classification Rule Mining (CAT-CRM) is proposed which allows the user to control three different sets of parameters: evolutionary parameters, rule parameters, and agent parameters. CAT-CRM can therefore be used for experimenting with an evolutionary system, a rule mining system, or an agent-based social system. Results of experiments conducted to observe the effect of different crossover rates and mutation rates on classification accuracy on a benchmark data set are reported.
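    The accept/influence protocol between population space and belief space can be sketched minimally (hypothetical names; this is not CAT-CRM's implementation, and only normative knowledge on a one-dimensional search space is modelled):

```python
import random

def cultural_algorithm(fitness, bounds, pop_size=30, generations=40, seed=1):
    """Minimal cultural algorithm: the elite update a normative belief
    space (the value range they span) via the acceptance step, and the
    belief space guides where new candidates are sampled (influence)."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [rng.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 5]
        belief = (min(elite), max(elite))            # acceptance step
        pop = elite + [rng.uniform(*belief)          # influence step
                       for _ in range(pop_size - len(elite))]
    return max(pop, key=fitness)

# Maximize a simple unimodal objective with its optimum at x = 3
best = cultural_algorithm(lambda x: -(x - 3.0) ** 2, bounds=(-10.0, 10.0))
```

    In a rule-mining setting the individuals would be classification rules and the fitness a combination of the objective and subjective measures described above; the belief space would then store which rule attributes have proven promising.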

  11. Assessing the support provided by a toolkit for rapid prototyping of multimodal systems

    OpenAIRE

    CUENCA LUCERO, Fredy; Vanacken, Davy; Coninx, Karin; Luyten, Kris

    2013-01-01

    Choosing an appropriate toolkit for creating a multimodal interface is a cumbersome task. Several specialized toolkits include fusion and fission engines that allow developers to combine and decompose modalities to capture multimodal input and provide multimodal output. Unfortunately, the extent to which these toolkits can facilitate the creation of a multimodal interface is hard or impossible to estimate, due to the absence of a scale on which a toolkit's capabilities can be measured. In t...

  12. Fast combinative volume rendering by indexed data structure

    Institute of Scientific and Technical Information of China (English)

    孙文武; 王文成; 吴恩华

    2001-01-01

    It is beneficial to study the interesting contents in a data set by combining and rendering various contents of the data. In this regard, an indexed data structure is proposed to facilitate the reorganization of the data so that its contents can be combined conveniently and only the selected contents are processed for rendering. Based on this structure, the cells of different contents can be queued up easily, so that volume rendering can be conducted more accurately and quickly. Experimental results show that the indexed data structure is very efficient in improving combinative volume rendering.
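    The core idea of indexing cells by content, so that any combination of contents can be gathered without rescanning the whole volume, can be sketched as follows (hypothetical data layout, not the paper's actual structure):

```python
def build_content_index(cells):
    """Index the cells of a labelled volume by content, so that any
    combination of contents can be gathered without scanning the whole
    data set.  `cells` maps cell id -> content label."""
    index = {}
    for cell_id, label in cells.items():
        index.setdefault(label, set()).add(cell_id)
    return index

def select_contents(index, labels):
    """Union of cell ids for the requested contents, ready to be
    queued for rendering."""
    selected = set()
    for label in labels:
        selected |= index.get(label, set())
    return selected

cells = {0: "bone", 1: "vessel", 2: "bone", 3: "tissue"}
idx = build_content_index(cells)
picked = select_contents(idx, ["bone", "vessel"])
```

    Selection cost is then proportional to the size of the chosen contents rather than to the full volume, which is where the reported speed-up in combinative rendering comes from.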

  13. An epigenetic toolkit allows for diverse genome architectures in eukaryotes.

    Science.gov (United States)

    Maurer-Alcalá, Xyrus X; Katz, Laura A

    2015-12-01

    Genome architecture varies considerably among eukaryotes in terms of both size and structure (e.g. distribution of sequences within the genome, elimination of DNA during formation of somatic nuclei). The diversity in eukaryotic genome architectures and the dynamic processes are only possible due to the well-developed epigenetic toolkit, which probably existed in the Last Eukaryotic Common Ancestor (LECA). This toolkit may have arisen as a means of navigating the genomic conflict that arose from the expansion of transposable elements within the ancestral eukaryotic genome. This toolkit has been coopted to support the dynamic nature of genomes in lineages across the eukaryotic tree of life. Here we highlight how the changes in genome architecture in diverse eukaryotes are regulated by epigenetic processes, such as DNA elimination, genome rearrangements, and adaptive changes to genome architecture. The ability to epigenetically modify and regulate genomes has contributed greatly to the diversity of eukaryotes observed today.

  14. Windows forensic analysis toolkit advanced analysis techniques for Windows 7

    CERN Document Server

    Carvey, Harlan

    2012-01-01

    Now in its third edition, Harlan Carvey has updated "Windows Forensic Analysis Toolkit" to cover Windows 7 systems. The primary focus of this edition is on analyzing Windows 7 systems and on processes using free and open-source tools. The book covers live response, file analysis, malware detection, timeline, and much more. The author presents real-life experiences from the trenches, making the material realistic and showing the why behind the how. New to this edition, the companion and toolkit materials are now hosted online. This material consists of electronic printable checklists, cheat sheets, free custom tools, and walk-through demos. This edition complements "Windows Forensic Analysis Toolkit, 2nd Edition", (ISBN: 9781597494229), which focuses primarily on XP. It includes complete coverage and examples on Windows 7 systems. It contains Lessons from the Field, Case Studies, and War Stories. It features companion online material, including electronic printable checklists, cheat sheets, free custom tools, ...

  15. Comparison of open-source visual analytics toolkits

    Science.gov (United States)

    Harger, John R.; Crossno, Patricia J.

    2012-01-01

    We present the results of the first stage of a two-stage evaluation of open source visual analytics packages. This stage is a broad feature comparison over a range of open source toolkits. Although we had originally intended to restrict ourselves to comparing visual analytics toolkits, we quickly found that very few were available. So we expanded our study to include information visualization, graph analysis, and statistical packages. We examine three aspects of each toolkit: visualization functions, analysis capabilities, and development environments. With respect to development environments, we look at platforms, language bindings, multi-threading/parallelism, user interface frameworks, ease of installation, documentation, and whether the package is still being actively developed.

  16. The PRIDE (Partnership to Improve Diabetes Education) Toolkit

    Science.gov (United States)

    Wolff, Kathleen; Chambers, Laura; Bumol, Stefan; White, Richard O.; Gregory, Becky Pratt; Davis, Dianne; Rothman, Russell L.

    2016-01-01

    Purpose: Patients with low literacy, low numeracy, and/or linguistic needs can experience challenges understanding diabetes information and applying concepts to their self-management. The authors designed a toolkit of education materials that are sensitive to patients' literacy and numeracy levels, language preferences, and cultural norms and that encourage shared goal setting to improve diabetes self-management and health outcomes. The Partnership to Improve Diabetes Education (PRIDE) toolkit was developed to facilitate diabetes self-management education and support. Methods: The PRIDE toolkit includes a comprehensive set of 30 interactive education modules in English and Spanish to support diabetes self-management activities. The toolkit builds upon the authors' previously validated Diabetes Literacy and Numeracy Education Toolkit (DLNET) by adding a focus on shared goal setting, addressing the needs of Spanish-speaking patients, and including a broader range of diabetes management topics. Each PRIDE module was evaluated using the Suitability Assessment of Materials (SAM) instrument to determine the material's cultural appropriateness and its sensitivity to the needs of patients with low literacy and low numeracy. Reading grade level was also assessed using the Automated Readability Index (ARI), Coleman-Liau, Flesch-Kincaid, Fry, and SMOG formulas. Conclusions: The average reading grade level of the materials was 5.3 (SD 1.0), with a mean SAM of 91.2 (SD 5.4). All of the 30 modules received a “superior” score (SAM >70%) when evaluated by 2 independent raters. The PRIDE toolkit modules can be used by all members of a multidisciplinary team to assist patients with low literacy and low numeracy in managing their diabetes. PMID:26647414
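    The readability formulas named above are standard and easy to reproduce. As an illustration, the Flesch-Kincaid grade level can be computed with a naive syllable counter (both the counter and the sample sentence are hypothetical; production tools use dictionary-based syllabification):

```python
import re

def syllables(word):
    """Naive syllable count: contiguous vowel groups, minus a silent
    trailing 'e', never less than one."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    n = len(groups)
    if word.lower().endswith("e") and n > 1:
        n -= 1
    return max(n, 1)

def flesch_kincaid_grade(text):
    """FK grade = 0.39*(words/sentence) + 11.8*(syllables/word) - 15.59."""
    sentences = max(len(re.findall(r"[.!?]+", text)), 1)
    words = re.findall(r"[A-Za-z']+", text)
    syl = sum(syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syl / len(words) - 15.59

grade = flesch_kincaid_grade("Check your blood sugar. Eat three meals a day.")
```

    Very simple text like the sample can legitimately score below grade 1 (the formula is unbounded below), which is consistent with the low average grade level reported for the PRIDE modules.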

  17. Deconstructing the toolkit: creativity and risk in the NHS workforce.

    Science.gov (United States)

    Allen, Von; Brodzinski, Emma

    2009-12-01

Deconstructing the Toolkit explores the current desire for toolkits that promise failsafe structures for facilitating creative success. The paper examines this cultural phenomenon within the context of the risk-averse workplace, with particular focus on the NHS. The writers draw on Derrida and deconstructionism to reflect upon the principles of creativity and the possibilities for being creative within the workplace. Through reference to The Extra Mile project facilitated by Open Art, the paper examines the importance of engaging with an aesthetic of creativity and embracing a more holistic approach to the problems and potential of the creative process.

  18. System Design Toolkit for Integrated Modular Avionics for Space

    Science.gov (United States)

    Hann, Mark; Balbastre Betoret, Patricia; Simo Ten, Jose Enrique; De Ferluc, Regis; Ramachandran, Jinesh

    2015-09-01

The IMA-SP development process identified that tool support was needed for two activities: i) partitioning and resource allocation and ii) system feasibility assessment. This paper describes the definition, design, implementation and testing of the tool support required to perform these IMA-SP development process activities. This includes the definition of a data model, with associated files and file formats, that describes the complete setup of a partitioned system and allows system feasibility assessment; the development of a prototype tool set, called the IMA-SP System Design Toolkit (SDT); and the demonstration of the toolkit on a case study.
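A partitioned-system data model and feasibility check of the kind the SDT supports might be sketched as follows. This is an illustrative simplification: the field names and the utilization-based test are assumptions, not the SDT's actual model, which would also cover window placement in the major frame, memory budgets, and I/O constraints.

```python
from dataclasses import dataclass

@dataclass
class Partition:
    """One time partition in a hypothetical IMA system model."""
    name: str
    period_ms: float   # activation period of the partition window
    budget_ms: float   # CPU time guaranteed per period

def utilization(partitions):
    """Total fraction of processor time demanded by all partitions."""
    return sum(p.budget_ms / p.period_ms for p in partitions)

def feasible(partitions):
    """Crude feasibility check: total utilization must not exceed 1."""
    return utilization(partitions) <= 1.0
```

With two partitions each claiming 30% and 50% of a 100 ms frame, the system is feasible; adding a third 30% partition pushes utilization to 1.1 and fails the check.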

  19. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2 Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below).
In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License); argparse (Python Software Foundation License); Boost (Boost Software License); Coopr (Revised BSD License); Coverage (BSD License); Distribute (Python Software Foundation License / Zope Public License); EPANET (Public Domain); EPANET-ERD (Revised BSD License); EPANET-MSX (GNU Lesser General Public License (LGPL) v.3); gcovr (Revised BSD License); GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files); LZMA SDK (Public Domain); nose (GNU Lesser General Public License (LGPL) v.2.1); ordereddict (MIT License); pip (MIT License); PLY (BSD License); PyEPANET (Revised BSD License); Pyro (MIT License); PyUtilib (Revised BSD License); Py
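The sensor-location problem (item 1 above) is typically posed as choosing monitoring nodes that minimize the expected contamination impact over an ensemble of scenarios. WST solves it with formal optimization solvers, but a greedy sketch conveys the idea; the data layout and function names below are illustrative, not WST's API:

```python
def expected_impact(impact, chosen):
    """Mean over scenarios of the impact at the first-detecting sensor.
    impact[s][n] = damage if scenario s is first detected at node n."""
    return sum(min(row[n] for n in chosen) for row in impact) / len(impact)

def greedy_sensor_placement(impact, n_sensors):
    """Greedily add the node that most reduces expected impact."""
    nodes = range(len(impact[0]))
    chosen = []
    for _ in range(n_sensors):
        best = min((n for n in nodes if n not in chosen),
                   key=lambda n: expected_impact(impact, chosen + [n]))
        chosen.append(best)
    return chosen
```

Greedy placement is a common heuristic here because the expected-impact objective is submodular, which gives the greedy choice a known approximation guarantee.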

  20. GALSIM: The modular galaxy image simulation toolkit

    Science.gov (United States)

    Rowe, B. T. P.; Jarvis, M.; Mandelbaum, R.; Bernstein, G. M.; Bosch, J.; Simet, M.; Meyers, J. E.; Kacprzak, T.; Nakajima, R.; Zuntz, J.; Miyatake, H.; Dietrich, J. P.; Armstrong, R.; Melchior, P.; Gill, M. S. S.

    2015-04-01

    GALSIM is a collaborative, open-source project aimed at providing an image simulation tool of enduring benefit to the astronomical community. It provides a software library for generating images of astronomical objects such as stars and galaxies in a variety of ways, efficiently handling image transformations and operations such as convolution and rendering at high precision. We describe the GALSIM software and its capabilities, including necessary theoretical background. We demonstrate that the performance of GALSIM meets the stringent requirements of high precision image analysis applications such as weak gravitational lensing, for current datasets and for the Stage IV dark energy surveys of the Large Synoptic Survey Telescope, ESA's Euclid mission, and NASA's WFIRST-AFTA mission. The GALSIM project repository is public and includes the full code history, all open and closed issues, installation instructions, documentation, and wiki pages (including a Frequently Asked Questions section). The GALSIM repository can be found at https://github.com/GalSim-developers/GalSim.
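The convolve-then-render pipeline that GALSIM implements at high precision can be illustrated with the analytic fact that convolving two Gaussians adds their variances. The following pure-Python sketch (not GalSim's actual API) renders a PSF-convolved Gaussian "galaxy" onto a pixel grid and checks flux conservation; the profile, grid size, and pixel scale are arbitrary choices for the example:

```python
import math

def render_gaussian(flux, sigma, n=64, scale=0.2):
    """Render a circular Gaussian surface-brightness profile onto an
    n x n pixel grid with pixel size 'scale' (midpoint sampling)."""
    c = (n - 1) / 2.0
    norm = flux * scale ** 2 / (2.0 * math.pi * sigma ** 2)
    return [[norm * math.exp(-(((ix - c) * scale) ** 2
                               + ((iy - c) * scale) ** 2) / (2.0 * sigma ** 2))
             for ix in range(n)] for iy in range(n)]

# Convolving two Gaussians yields a Gaussian whose variances add, so a
# PSF-convolved Gaussian "galaxy" can be rendered in closed form.
gal_sigma, psf_sigma, flux = 1.0, 0.5, 1000.0
conv_sigma = math.sqrt(gal_sigma ** 2 + psf_sigma ** 2)
image = render_gaussian(flux, conv_sigma)
total_flux = sum(sum(row) for row in image)  # ~flux if the grid is large enough
```

For non-Gaussian profiles and realistic PSFs, no closed form exists, which is why GALSIM performs convolution and rendering numerically (e.g. via Fourier methods).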

  1. Advancements in Wind Integration Study Input Data Modeling: The Wind Integration National Dataset (WIND) Toolkit

    Science.gov (United States)

    Hodge, B.; Orwig, K.; McCaa, J. R.; Harrold, S.; Draxl, C.; Jones, W.; Searight, K.; Getman, D.

    2013-12-01

    Regional wind integration studies in the United States, such as the Western Wind and Solar Integration Study (WWSIS), Eastern Wind Integration and Transmission Study (EWITS), and Eastern Renewable Generation Integration Study (ERGIS), perform detailed simulations of the power system to determine the impact of high wind and solar energy penetrations on power systems operations. Some of the specific aspects examined include: infrastructure requirements, impacts on grid operations and conventional generators, ancillary service requirements, as well as the benefits of geographic diversity and forecasting. These studies require geographically broad and temporally consistent wind and solar power production input datasets that realistically reflect the ramping characteristics, spatial and temporal correlations, and capacity factors of wind and solar power plant production, and are time-synchronous with load profiles. The original western and eastern wind datasets were generated independently for 2004-2006 using numerical weather prediction (NWP) models run on a ~2 km grid with 10-minute resolution. Each utilized its own site selection process to augment existing wind plants with simulated sites of high development potential. The original dataset also included day-ahead simulated forecasts. These datasets were the first of their kind and many lessons were learned from their development. For example, the modeling approach used generated periodic false ramps that later had to be removed due to unrealistic impacts on ancillary service requirements. For several years, stakeholders have been requesting an updated dataset that: 1) covers more recent years; 2) spans four or more years to better evaluate interannual variability; 3) uses improved methods to minimize false ramps and spatial seams; 4) better incorporates solar power production inputs; and 5) is more easily accessible. To address these needs, the U.S. 
Department of Energy (DOE) Wind and Solar Programs have funded two

  2. Realistic Real-Time Outdoor Rendering in Augmented Reality

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome these obstacles, but most relate to non-real-time rendering, so the problem remains, especially for outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step is the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique is introduced to integrate sky colours and shadows through their effects on virtual objects in the AR system. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realism in AR systems. PMID:25268480
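The first phase, generating sky colour from the sun's position, rests on standard solar geometry. A simplified sketch follows; the one-term declination approximation and the toy two-colour ramp are illustrative assumptions, not the paper's actual model:

```python
import math

def sun_elevation(lat_deg, day_of_year, solar_hour):
    """Approximate solar elevation angle (degrees) for a latitude,
    day of year, and local solar time (simplified declination model)."""
    decl = -23.44 * math.cos(math.radians(360.0 / 365.0 * (day_of_year + 10)))
    hour_angle = 15.0 * (solar_hour - 12.0)  # degrees from solar noon
    lat, d, h = map(math.radians, (lat_deg, decl, hour_angle))
    return math.degrees(math.asin(
        math.sin(lat) * math.sin(d) + math.cos(lat) * math.cos(d) * math.cos(h)))

def sky_color(elevation_deg):
    """Blend from day blue to dusk orange as the sun drops (toy gradient)."""
    t = max(0.0, min(1.0, elevation_deg / 60.0))
    day, dusk = (0.35, 0.55, 0.95), (0.95, 0.45, 0.20)
    return tuple(dusk[i] + t * (day[i] - dusk[i]) for i in range(3))
```

A physically based system would replace the toy gradient with an analytic sky model parameterized by sun elevation and atmospheric turbidity.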

  3. Realistic real-time outdoor rendering in augmented reality.

    Science.gov (United States)

    Kolivand, Hoshang; Sunar, Mohd Shahrizal

    2014-01-01

Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome these obstacles, but most relate to non-real-time rendering, so the problem remains, especially for outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step is the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique is introduced to integrate sky colours and shadows through their effects on virtual objects in the AR system. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realism in AR systems.

  4. Method of producing hydrogen, and rendering a contaminated biomass inert

    Science.gov (United States)

    Bingham, Dennis N [Idaho Falls, ID; Klingler, Kerry M [Idaho Falls, ID; Wilding, Bruce M [Idaho Falls, ID

    2010-02-23

    A method for rendering a contaminated biomass inert includes providing a first composition, providing a second composition, reacting the first and second compositions together to form an alkaline hydroxide, providing a contaminated biomass feedstock and reacting the alkaline hydroxide with the contaminated biomass feedstock to render the contaminated biomass feedstock inert and further producing hydrogen gas, and a byproduct that includes the first composition.

  5. Realistic real-time outdoor rendering in augmented reality.

    Directory of Open Access Journals (Sweden)

    Hoshang Kolivand

Full Text Available Realistic rendering for outdoor Augmented Reality (AR) has been an attractive topic for the last two decades, judging by the sizeable number of publications in computer graphics. Realistic virtual objects in outdoor AR systems require sophisticated effects such as shadows, daylight, and interactions between sky colours and both virtual and real objects. A few realistic rendering techniques have been designed to overcome these obstacles, but most relate to non-real-time rendering, so the problem remains, especially for outdoor rendering. This paper proposes a new technique to achieve realistic real-time outdoor rendering that takes into account the interaction between sky colours and objects in AR systems, with respect to shadows at any specific location, date, and time. The approach involves three main phases, which cover different outdoor AR rendering requirements. First, the sky colour is generated with respect to the position of the sun. The second step is the shadow generation algorithm, Z-Partitioning: Gaussian and Fog Shadow Maps (Z-GaF Shadow Maps). Lastly, a technique is introduced to integrate sky colours and shadows through their effects on virtual objects in the AR system. The experimental results reveal that the proposed technique significantly improves the realism of real-time outdoor AR rendering, thus addressing the problem of realism in AR systems.

  6. 7 CFR 54.15 - Advance information concerning service rendered.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Advance information concerning service rendered. 54.15... Service § 54.15 Advance information concerning service rendered. Upon request of any applicant, all or any... SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED)...

  7. A Toolkit of Systems Gaming Techniques

    Science.gov (United States)

    Finnigan, David; McCaughey, Jamie W.

    2017-04-01

Decision-makers facing natural hazard crises need a broad set of cognitive tools to help them grapple with complexity. Systems gaming can act as a kind of 'flight simulator for decision making', enabling us to step through real-life complex scenarios of the kind that beset us in natural disaster situations. Australian science-theatre ensemble Boho Interactive is collaborating with the Earth Observatory Singapore to develop an in-person systems game modelling an unfolding natural hazard crisis (volcanic unrest or an approaching typhoon) impacting an Asian city. Through a combination of interactive mechanisms drawn from boardgaming and participatory theatre, players will make decisions and assign resources in response to the unfolding crisis. In this performance, David Finnigan from Boho will demonstrate some of the participatory techniques that Boho use to illustrate key concepts from complex systems science. These activities are part of a toolkit which can be adapted to fit a range of different contexts and scenarios. In this session, David will present short activities that demonstrate a range of systems principles including common-pool resource challenges (the Tragedy of the Commons), interconnectivity, unintended consequences, tipping points and phase transitions, and resilience. The interactive mechanisms for these games are all deliberately lo-fi rather than digital, for three reasons. First, the experience of a tactile, hands-on game is more immediate and engaging. It brings the focus of the participants into the room and facilitates engagement with the concepts and with each other, rather than with individual devices. Second, the mechanics of the game are laid bare. This is a valuable way to illustrate that complex systems are all around us, and are not merely the domain of hi-tech systems.
Finally, these games can be used in a wide variety of contexts by removing computer hardware requirements and instead using materials and resources that are easily found in

  8. Versatile RNA tetra-U helix linking motif as a toolkit for nucleic acid nanotechnology.

    Science.gov (United States)

    Bui, My N; Brittany Johnson, M; Viard, Mathias; Satterwhite, Emily; Martins, Angelica N; Li, Zhihai; Marriott, Ian; Afonin, Kirill A; Khisamutdinov, Emil F

    2017-04-01

RNA nanotechnology employs synthetically modified ribonucleic acid (RNA) to engineer highly stable nanostructures in one, two, and three dimensions for medical applications. Despite the tremendous advantages of RNA nanotechnology, unmodified RNA itself is fragile and prone to enzymatic degradation. In contrast to using traditionally modified RNA strands, e.g. 2'-fluorine, 2'-amine, or 2'-methyl, we studied the effect of an RNA/DNA hybrid approach, utilizing a computer-assisted RNA tetra-uracil (tetra-U) motif as a toolkit to address questions related to assembly efficiency, versatility, stability, and the production costs of hybrid RNA/DNA nanoparticles. The tetra-U RNA motif was implemented to construct four functional triangles using RNA, DNA and RNA/DNA mixtures, resulting in fine-tunable enzymatic and thermodynamic stabilities, immunostimulatory activity and RNAi capability. Moreover, the tetra-U toolkit has great potential in the fabrication of rectangular, pentagonal, and hexagonal NPs, representing the power and simplicity of the RNA/DNA approach for the RNA nanotechnology and nanomedicine community. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging.

    Science.gov (United States)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver; Keasling, Jay D; Northen, Trent R; Bowen, Benjamin P

    2017-06-06

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI ( http://openmsi.nersc.gov ), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size dependent activities. These results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat .
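The core of arrayed-sample analysis is mapping each spotted sample to its expected position on the acquisition grid (e.g. 384 spots at a 450 μm pitch) and then refining each position against the measured ion intensities. A toy version of both steps, with hypothetical function names rather than OMAAT's API:

```python
def array_positions(rows=16, cols=24, pitch=450.0, origin=(0.0, 0.0)):
    """Expected (x, y) centers in micrometers for a rows x cols arrayed
    sample grid with fixed pitch, e.g. 384 spots at 450 um spacing."""
    ox, oy = origin
    return [(ox + c * pitch, oy + r * pitch)
            for r in range(rows) for c in range(cols)]

def snap_to_maximum(center, intensity, radius=1):
    """Refine a spot center by snapping to the brightest offset within
    +/- radius (a crude stand-in for real spot-finding)."""
    x0, y0 = center
    candidates = [(x0 + dx, y0 + dy)
                  for dx in range(-radius, radius + 1)
                  for dy in range(-radius, radius + 1)]
    return max(candidates, key=lambda p: intensity(p[0], p[1]))
```

Robust spot-finding must also tolerate stage drift and uneven signal, which is where hand-rolled scripts tend to break down and a dedicated toolkit pays off.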

  10. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    Energy Technology Data Exchange (ETDEWEB)

    de Raad, Markus [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); de Rond, Tristan [Univ. of California, Berkeley, CA (United States); Rübel, Oliver [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Keasling, Jay D. [Univ. of California, Berkeley, CA (United States); Joint BioEnergy Inst. (JBEI), Emeryville, CA (United States); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Technical Univ. of Denmark, Lyngby (Denmark); Northen, Trent R. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States); Bowen, Benjamin P. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); USDOE Joint Genome Institute (JGI), Walnut Creek, CA (United States)

    2017-05-03

    Mass spectrometry imaging (MSI) has primarily been applied in localizing biomolecules within biological matrices. Although well-suited, the application of MSI for comparing thousands of spatially defined spotted samples has been limited. One reason for this is a lack of suitable and accessible data processing tools for the analysis of large arrayed MSI sample sets. In this paper, the OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based python notebook (Jupyter), OMAAT is accessible to anyone without programming experience yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis of sugars, revealing a pattern of size dependent activities. Finally, these results introduce OMAAT as an effective toolkit for analyzing spatially defined samples in MSI. OMAAT runs on all major operating systems, and the source code can be obtained from the following GitHub repository: https://github.com/biorack/omaat.

  11. Spike train analysis toolkit: enabling wider application of information-theoretic techniques to neurophysiology.

    Science.gov (United States)

    Goldberg, David H; Victor, Jonathan D; Gardner, Esther P; Gardner, Daniel

    2009-09-01

    Conventional methods widely available for the analysis of spike trains and related neural data include various time- and frequency-domain analyses, such as peri-event and interspike interval histograms, spectral measures, and probability distributions. Information theoretic methods are increasingly recognized as significant tools for the analysis of spike train data. However, developing robust implementations of these methods can be time-consuming, and determining applicability to neural recordings can require expertise. In order to facilitate more widespread adoption of these informative methods by the neuroscience community, we have developed the Spike Train Analysis Toolkit. STAToolkit is a software package which implements, documents, and guides application of several information-theoretic spike train analysis techniques, thus minimizing the effort needed to adopt and use them. This implementation behaves like a typical Matlab toolbox, but the underlying computations are coded in C for portability, optimized for efficiency, and interfaced with Matlab via the MEX framework. STAToolkit runs on any of three major platforms: Windows, Mac OS, and Linux. The toolkit reads input from files with an easy-to-generate text-based, platform-independent format. STAToolkit, including full documentation and test cases, is freely available open source via http://neuroanalysis.org , maintained as a resource for the computational neuroscience and neuroinformatics communities. Use cases drawn from somatosensory and gustatory neurophysiology, and community use of STAToolkit, demonstrate its utility and scope.
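The "direct method" implemented in such toolkits estimates entropy from the empirical distribution of binned spike words. A minimal sketch of the idea (not STAToolkit's interface; real use requires bias correction for limited data):

```python
import math
from collections import Counter

def word_entropy(spike_times, t_start, t_stop, bin_size, word_len):
    """Direct-method entropy (bits per word) of binary spike 'words':
    bin the train, slice into fixed-length words, and estimate H from
    the empirical word distribution."""
    n_bins = int((t_stop - t_start) / bin_size)
    bins = [0] * n_bins
    for t in spike_times:
        i = int((t - t_start) / bin_size)
        if 0 <= i < n_bins:
            bins[i] = 1
    words = [tuple(bins[i:i + word_len])
             for i in range(0, n_bins - word_len + 1, word_len)]
    counts = Counter(words)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())
```

An empty train yields zero entropy, while a train that fills every other bin yields exactly one bit per one-bin word.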

  12. A Toolkit For Storage Qos Provisioning For Data-Intensive Applications

    Directory of Open Access Journals (Sweden)

    Renata Słota

    2012-01-01

Full Text Available This paper describes a programming toolkit developed in the PL-Grid project, named QStorMan, which supports storage QoS provisioning for data-intensive applications in distributed environments. QStorMan exploits knowledge-oriented methods for matching storage resources to the non-functional requirements defined for a data-intensive application. In order to support various usage scenarios, QStorMan provides two interfaces: programming libraries and a web portal. These allow the requirements to be defined either directly in an application's source code or through an intuitive graphical interface. The first way provides finer granularity, e.g., each portion of data processed by an application can define a different set of requirements. The second is aimed at supporting legacy applications, whose source code cannot be modified. The toolkit has been evaluated using synthetic benchmarks and the production infrastructure of PL-Grid, in particular its storage infrastructure, which utilizes the Lustre file system.
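At its simplest, matching storage resources to non-functional requirements is a filter-and-rank step. The sketch below is an illustrative stand-in: the requirement keys, the least-loaded tie-breaking policy, and the function name are all hypothetical, not QStorMan's real interfaces:

```python
def match_storage(requirements, resources):
    """Pick a storage resource satisfying every non-functional requirement.
    requirements maps metric name -> minimum value (e.g. 'read_mbps');
    resources is a list of capability dicts using the same metric names."""
    def satisfies(res):
        return all(res.get(k, 0) >= v for k, v in requirements.items())
    candidates = [r for r in resources if satisfies(r)]
    if not candidates:
        raise LookupError("no storage resource meets the requirements")
    # Hypothetical policy: prefer the least-loaded matching resource.
    return min(candidates, key=lambda r: r.get("load", 0.0))
```

A knowledge-oriented matcher like QStorMan's would additionally reason over monitoring data and semantic descriptions of the resources rather than static capability dicts.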

  13. Local and Global Illumination in the Volume Rendering Integral

    Energy Technology Data Exchange (ETDEWEB)

    Max, N; Chen, M

    2005-10-21

This article is intended as an update of the major survey by Max [1] on optical models for direct volume rendering. It provides a brief overview of the subject scope covered by [1] and brings recent developments, such as new shadow algorithms and refraction rendering, into perspective. In particular, we examine three fundamental aspects of direct volume rendering, namely the volume rendering integral, local illumination models and global illumination models, in a wavelength-independent manner. We review the developments in spectral volume rendering, in which visible light is considered as a form of electromagnetic radiation and optical models are implemented in conjunction with representations of spectral power distributions. This survey can provide a basis for, and encourage, new efforts to develop and use complex illumination models to achieve better realism and perception through optical correctness.
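The discrete front-to-back form of the emission-absorption volume rendering integral discussed here fits in a few lines. This is a wavelength-independent ("grey") sketch with a single scalar colour channel:

```python
import math

def composite_ray(samples, step):
    """Front-to-back compositing of the emission-absorption model.
    Each sample is (emission, extinction) along the ray; 'step' is the
    sampling distance. Returns (accumulated colour, remaining transmittance)."""
    colour, transmittance = 0.0, 1.0
    for emission, extinction in samples:
        alpha = 1.0 - math.exp(-extinction * step)  # opacity of this slab
        colour += transmittance * alpha * emission
        transmittance *= 1.0 - alpha
        if transmittance < 1e-4:  # early ray termination
            break
    return colour, transmittance
```

Local and global illumination models plug in by modulating each sample's emission term, e.g. with a shading function or an incoming-radiance estimate, before compositing.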

  14. Research of global illumination algorithms rendering in glossy scene

    Institute of Scientific and Technical Information of China (English)

    BAI Shuangxue; ZHANG Qiang; ZHOU Dongsheng

    2012-01-01

In computer graphics (CG), the realistic effects produced by illumination rendering of virtual scenes can be striking. Plausible lighting not only conveys the relative positions of objects but also reflects their materials in the visual appearance of the virtual objects. Rendering of diffuse scenes has gradually matured, but global illumination for glossy materials remains a challenge for CG research, because shiny materials reflect energy along highly complex light paths. Whether we trace glossy reflection paths directly or approximate this complex illumination transport with one or several reflections, the problem is difficult. In this paper we gather global illumination algorithms in common use in recent years, together with their extensions and improvements for glossy scenes. We introduce the limitations of the classical algorithms when rendering glossy scenes and some extended solutions. Finally, we summarize illumination rendering for specular scenes, where several open problems remain.
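Tracing a glossy reflection path starts from the mirror direction and weights or perturbs it according to the material's lobe. The Phong lobe used by many of the surveyed algorithms is easy to state; the sketch below is an illustrative grey-scale evaluation, not any specific paper's code:

```python
def reflect(d, n):
    """Mirror reflection of direction d about unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def phong_specular(to_light, to_eye, normal, shininess):
    """Phong specular term (r . v)^s, clamped at zero, where r is the
    mirror reflection of the incident light direction."""
    r = reflect(tuple(-c for c in to_light), normal)
    rv = max(0.0, sum(ri * vi for ri, vi in zip(r, to_eye)))
    return rv ** shininess
```

The difficulty the survey describes arises when this lobe must be integrated over all incoming directions at every bounce, rather than evaluated for a single known light.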

  15. Perception-based transparency optimization for direct volume rendering.

    Science.gov (United States)

    Chan, Ming-Yuen; Wu, Yingcai; Mak, Wai-Ho; Chen, Wei; Qu, Huamin

    2009-01-01

The semi-transparent nature of direct volume rendered images is useful to depict layered structures in a volume. However, obtaining a semi-transparent result with the layers clearly revealed is difficult and may involve tedious adjustment of opacity and other rendering parameters. Furthermore, the visual quality of layers also depends on various perceptual factors. In this paper, we propose an auto-correction method for enhancing the perceived quality of the semi-transparent layers in direct volume rendered images. We introduce a suite of new measures based on psychological principles to evaluate the perceptual quality of transparent structures in the rendered images. By optimizing rendering parameters within an adaptive and intuitive user interaction process, the quality of the images is enhanced such that specific user requirements can be met. Experimental results on various datasets demonstrate the effectiveness and robustness of our method.

  16. A Volume Rendering Algorithm for Sequential 2D Medical Images

    Institute of Scientific and Technical Information of China (English)

    吕忆松; 陈亚珠

    2002-01-01

Volume rendering of 3D data sets composed of sequential 2D medical images has become an important branch of image processing and computer graphics. To help physicians fully understand deep-seated human organs and foci (e.g. a tumour) as 3D structures, in this paper we present a modified volume rendering algorithm to render volumetric data. Using this method, projection images of structures of interest from different viewing directions can be obtained satisfactorily. By rotating the light source and the observer's eyepoint, the method avoids rotating the whole volumetric data set in main memory and thus reduces computational complexity and rendering time. Experiments on CT images suggest that the proposed method is useful and efficient for rendering 3D data sets.

  17. Integrating SVC and HOL with the PROSPER Toolkit

    OpenAIRE

    Stevenson, Alan; Dennis, Louise Abigail

    2000-01-01

    We describe an integration of the SVC decision procedure with the HOL theorem prover. This integration was achieved using the PROSPER toolkit. The SVC decision procedure operates on rational numbers, an axiomatic theory for which was provided in HOL. The decision procedure also returns counterexamples and a framework has been devised for handling counterexamples in a HOL setting.

  18. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, R.J.

    1997-01-01

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  19. Designing a Portable and Low Cost Home Energy Management Toolkit

    NARCIS (Netherlands)

    Keyson, D.V.; Al Mahmud, A.; De Hoogh, M.; Luxen, R.

    2013-01-01

In this paper we describe the design of a home energy and comfort management system. The system has three components: a smart plug with a wireless module, a residential gateway, and a mobile app. The combined system is called a home energy management and comfort toolkit. The design is inspired

  20. Toolkit for Conceptual Modeling (TCM): User's Guide and Reference

    NARCIS (Netherlands)

    Dehne, F.; Wieringa, Roelf J.

    The Toolkit for Conceptual Modeling (TCM) is a suite of graphical editors for a number of graphical notation systems that are used in software specification methods. The notations can be used to represent the conceptual structure of the software - hence the name of the suite. This manual describes

  1. LBTool: A stochastic toolkit for leave-based key updates

    DEFF Research Database (Denmark)

    Yuksel, Ender; Nielson, Hanne Riis; Nielson, Flemming

    2012-01-01

Quantitative techniques have been successfully employed in the verification of information and communication systems. However, the use of such techniques is still rare in the area of security. In this paper, we present a toolkit that implements transient analysis on a key update method for wireless ...

  2. A toolkit for analyzing nonlinear dynamic stochastic models easily

    NARCIS (Netherlands)

    Uhlig, H.F.H.V.S.

    1995-01-01

    Often, researchers wish to analyze nonlinear dynamic discrete-time stochastic models. This paper provides a toolkit for solving such models easily, building on log-linearizing the necessary equations characterizing the equilibrium and solving for the recursive equilibrium law of motion with the meth

  4. Using an Assistive Technology Toolkit to Promote Inclusion

    Science.gov (United States)

    Judge, Sharon; Floyd, Kim; Jeffs, Tara

    2008-01-01

    Although the use of assistive technology for young children is increasing, the lack of awareness and the lack of training continue to act as major barriers to providers using assistive technology. This article describes an assistive technology toolkit designed for use with young children with disabilities that can be easily assembled and…

  5. Toolkit - South Africa's good waste management practices: lessons learned

    CSIR Research Space (South Africa)

    Afrika, M

    2010-02-01

    Full Text Available practices are to be found. This paper reports on the development of a Toolkit for municipal waste management service delivery, based on some of the good waste management practices currently implemented in different municipalities across all the categories...

  6. 77 FR 73023 - U.S. Environmental Solutions Toolkit

    Science.gov (United States)

    2012-12-07

    ... International Trade Administration U.S. Environmental Solutions Toolkit AGENCY: International Trade... compressors. The Department of Commerce continues to develop the web- based U.S. Environmental Solutions... environmental issue outlined above that are interested in participating in the U.S. Environmental...

  7. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Directory of Open Access Journals (Sweden)

    Rescigno R.

    2014-03-01

    Full Text Available Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  8. Simulation toolkit with CMOS detector in the framework of hadrontherapy

    Science.gov (United States)

    Rescigno, R.; Finck, Ch.; Juliani, D.; Baudot, J.; Dauvergne, D.; Dedes, G.; Krimmer, J.; Ray, C.; Reithinger, V.; Rousseau, M.; Testa, E.; Winter, M.

    2014-03-01

    Proton imaging can be seen as a powerful technique for on-line monitoring of ion range during carbon ion therapy irradiation. The proton detection technique uses a set of CMOS sensor planes as a three-dimensional tracking system. A simulation toolkit based on GEANT4 and ROOT is presented, including the detector response and the reconstruction algorithm.

  9. A Beginning Rural Principal's Toolkit: A Guide for Success

    Science.gov (United States)

    Ashton, Brian; Duncan, Heather E.

    2012-01-01

    The purpose of this article is to explore both the challenges and skills needed to effectively assume a leadership position and thus to create an entry plan or "toolkit" for a new rural school leader. The entry plan acts as a guide beginning principals may use to navigate the unavoidable confusion that comes with leadership. It also assists…

  10. Using Toolkits to Achieve STEM Enterprise Learning Outcomes

    Science.gov (United States)

    Watts, Carys A.; Wray, Katie

    2012-01-01

    Purpose: The purpose of this paper is to evaluate the effectiveness of using several commercial tools in science, technology, engineering and maths (STEM) subjects for enterprise education at Newcastle University, UK. Design/methodology/approach: The paper provides an overview of existing toolkit use in higher education, before reviewing where and…

  11. Using AASL's "Health and Wellness" and "Crisis Toolkits"

    Science.gov (United States)

    Logan, Debra Kay

    2009-01-01

    Whether a school library program is the picture of good health in a state that mandates a professionally staffed library media center in every building or is suffering in a low-wealth district that is facing drastic cuts, the recently launched toolkits by the American Association of School Librarians (AASL) are stocked with useful strategies and…

  12. Easy-to-use Software Toolkit for IR Cameras

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    CEDIP Infrared Systems, specialists in thermal IR camera systems, have announced a new toolkit for use with their range of cameras that enables simple set-up and control of a wide range of parameters using National Instruments LabVIEW? programming environment,

  13. A Toolkit to Implement Graduate Attributes in Geography Curricula

    Science.gov (United States)

    Spronken-Smith, Rachel; McLean, Angela; Smith, Nell; Bond, Carol; Jenkins, Martin; Marshall, Stephen; Frielick, Stanley

    2016-01-01

    This article uses findings from a project on engagement with graduate outcomes across higher education institutions in New Zealand to produce a toolkit for implementing graduate attributes in geography curricula. Key facets include strong leadership; academic developers to facilitate conversations about graduate attributes and teaching towards…

  14. Water Security Toolkit User Manual Version 1.2.

    Energy Technology Data Exchange (ETDEWEB)

    Klise, Katherine A.; Siirola, John Daniel; Hart, David; Hart, William Eugene; Phillips, Cynthia Ann; Haxton, Terranna; Murray, Regan; Janke, Robert; Taxon, Thomas; Laird, Carl; Seth, Arpan; Hackebeil, Gabriel; McGee, Shawn; Mann, Angelica

    2014-08-01

    The Water Security Toolkit (WST) is a suite of open source software tools that can be used by water utilities to create response strategies to reduce the impact of contamination in a water distribution network. WST includes hydraulic and water quality modeling software, optimization methodologies, and visualization tools to identify: (1) sensor locations to detect contamination, (2) locations in the network in which the contamination was introduced, (3) hydrants to remove contaminated water from the distribution system, (4) locations in the network to inject decontamination agents to inactivate, remove, or destroy contaminants, (5) locations in the network to take grab samples to help identify the source of contamination, and (6) valves to close in order to isolate contaminated areas of the network. This user manual describes the different components of WST, along with examples and case studies. License Notice: The Water Security Toolkit (WST) v.1.2 Copyright (c) 2012 Sandia Corporation. Under the terms of Contract DE-AC04-94AL85000, there is a non-exclusive license for use of this work by or on behalf of the U.S. government. This software is distributed under the Revised BSD License (see below).
In addition, WST leverages a variety of third-party software packages, which have separate licensing policies: Acro (Revised BSD License), argparse (Python Software Foundation License), Boost (Boost Software License), Coopr (Revised BSD License), Coverage (BSD License), Distribute (Python Software Foundation License / Zope Public License), EPANET (Public Domain), EPANET-ERD (Revised BSD License), EPANET-MSX (GNU Lesser General Public License (LGPL) v.3), gcovr (Revised BSD License), GRASP (AT&T Commercial License for noncommercial use; includes randomsample and sideconstraints executable files), LZMA SDK (Public Domain), nose (GNU Lesser General Public License (LGPL) v.2.1), ordereddict (MIT License), pip (MIT License), PLY (BSD License), PyEPANET (Revised BSD License), Pyro (MIT License), PyUtilib (Revised BSD License), Py
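    The sensor-placement problem in item (1) is commonly attacked with a greedy covering heuristic. The sketch below is a toy illustration of that general idea only; the coverage table and junction names are invented, and this is not WST's actual optimization code:

```python
# Greedy sensor placement sketch (illustrative; not WST's real algorithm):
# repeatedly place a sensor at the node that detects the most contamination
# scenarios not yet covered by previously placed sensors.

def greedy_sensors(coverage, budget):
    """coverage: node -> set of scenario ids a sensor at that node detects."""
    chosen, covered = [], set()
    for _ in range(budget):
        node = max(coverage, key=lambda n: len(coverage[n] - covered))
        if not coverage[node] - covered:
            break                       # nothing new left to detect
        chosen.append(node)
        covered |= coverage[node]
    return chosen, covered

coverage = {
    "junction1": {1, 2, 3},
    "junction2": {3, 4},
    "junction3": {5},
}
print(greedy_sensors(coverage, 2))   # (['junction1', 'junction2'], {1, 2, 3, 4})
```

    Greedy covering of this kind carries a classical approximation guarantee for submodular coverage objectives, which is one reason it is a common baseline for placement problems.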

  15. An improved scheduling algorithm for 3D cluster rendering with platform LSF

    Science.gov (United States)

    Xu, Wenli; Zhu, Yi; Zhang, Liping

    2013-10-01

    High-quality photorealistic rendering of 3D models requires powerful computing systems, and efficient management of cluster resources is essential to exploit them. This paper addresses how to improve the efficiency of 3D rendering tasks in a cluster. It focuses on a dynamic feedback load balance (DFLB) algorithm, the working principle of the Load Sharing Facility (LSF) platform, and optimization of its external scheduler plug-in. The algorithm is applied in the match and allocation phases of a scheduling cycle: candidate hosts are prepared in sequence in the match phase, and the scheduler makes allocation decisions for each job in the allocation phase. With the dynamic feedback mechanism, a new weight is assigned to each candidate host for rearrangement, and the most suitable host is dispatched for rendering. A new plug-in module implementing this algorithm has been designed and integrated into the internal scheduler. Simulation experiments demonstrate that the improved plug-in module is superior to the default one for rendering tasks: it helps avoid load imbalance among servers, increases system throughput and improves system utilization.
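    As a rough illustration of the weight-based host selection described above, the sketch below re-ranks candidate hosts by a dynamic load weight and dispatches the best one. The weight formula, field names and feedback step are assumptions for illustration, not LSF's real plug-in API:

```python
# Hypothetical sketch of a dynamic-feedback host selection step: each
# candidate host gets a weight from fresh load feedback, and the
# best-weighted host is dispatched. Coefficients are made up.

def rank_hosts(hosts):
    """Re-rank candidate hosts by a dynamic load weight (lower is better)."""
    def weight(h):
        # Combine CPU load with memory pressure (illustrative formula).
        return 0.7 * h["cpu_load"] + 0.3 * (1.0 - h["free_mem_frac"])
    return sorted(hosts, key=weight)

def dispatch(job, hosts):
    """Allocation phase: pick the most suitable host for a render job."""
    best = rank_hosts(hosts)[0]
    best["cpu_load"] += job["cost"]   # feedback: account for the new job
    return best["name"]

hosts = [
    {"name": "node1", "cpu_load": 0.9, "free_mem_frac": 0.2},
    {"name": "node2", "cpu_load": 0.3, "free_mem_frac": 0.6},
]
print(dispatch({"cost": 0.2}, hosts))   # node2 (currently least loaded)
```

    The feedback update on `cpu_load` is what keeps successive scheduling cycles from piling jobs onto the same host, which is the load-imbalance failure mode the abstract mentions.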

  16. YT: A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Turk, Matthew J.; /San Diego, CASS; Smith, Britton D.; /Michigan State U.; Oishi, Jeffrey S.; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Skory, Stephen; Skillman, Samuel W.; /Colorado U., CASA; Abel, Tom; /KIPAC, Menlo Park /Stanford U., Phys. Dept.; Norman, Michael L.; /aff San Diego, CASS

    2011-06-23

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. Analysis and visualization with yt are oriented around physically relevant quantities rather than quantities native to astrophysical simulation codes. While originally designed for handling Enzo's structured adaptive mesh refinement data, yt has been extended to work with several different simulation methods and simulation codes including Orion, RAMSES, and FLASH. We report on its methods for reading, handling, and visualizing data, including projections, multivariate volume rendering, multi-dimensional histograms, halo finding, light cone generation, and topologically connected isocontour identification. Furthermore, we discuss the underlying algorithms yt uses for processing and visualizing data, and its mechanisms for parallelization of analysis tasks.

  17. Temporally rendered automatic cloud extraction (TRACE) system

    Science.gov (United States)

    Bodrero, Dennis M.; Yale, James G.; Davis, Roger E.; Rollins, John M.

    1999-10-01

    Smoke/obscurant testing requires that 2D cloud extent be extracted from visible and thermal imagery. These data are used alone or in combination with 2D data from other aspects to make 3D calculations of cloud properties, including dimensions, volume, centroid, travel, and uniformity. Determining cloud extent from imagery has historically been a time-consuming manual process. To reduce time and cost associated with smoke/obscurant data processing, automated methods to extract cloud extent from imagery were investigated. The TRACE system described in this paper was developed and implemented at U.S. Army Dugway Proving Ground, UT by the Science and Technology Corporation--Acuity Imaging Incorporated team with Small Business Innovation Research funding. TRACE uses dynamic background subtraction and 3D fast Fourier transform as primary methods to discriminate the smoke/obscurant cloud from the background. TRACE has been designed to run on a PC-based platform using Windows. The PC-Windows environment was chosen for portability, to give TRACE the maximum flexibility in terms of its interaction with peripheral hardware devices such as video capture boards, removable media drives, network cards, and digital video interfaces. Video for Windows provides all of the necessary tools for the development of the video capture utility in TRACE and allows for interchangeability of video capture boards without any software changes. TRACE is designed to take advantage of future upgrades in all aspects of its component hardware. A comparison of cloud extent determined by TRACE with manual method is included in this paper.
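    The dynamic background subtraction that TRACE uses as a primary discriminator can be sketched as a running-average background model plus a deviation threshold. The toy frames, the smoothing factor and the threshold below are illustrative assumptions, not TRACE's actual parameters:

```python
# Minimal background-subtraction sketch on toy single-channel "frames"
# (nested lists). Pixels that deviate from the evolving background model
# are marked as cloud/obscurant.

def update_background(bg, frame, alpha=0.1):
    """Exponentially weighted running average of the scene background."""
    return [[(1 - alpha) * b + alpha * f for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

def cloud_mask(bg, frame, thresh=20):
    """Mark pixels that deviate from the background as cloud/obscurant."""
    return [[1 if abs(f - b) > thresh else 0 for b, f in zip(brow, frow)]
            for brow, frow in zip(bg, frame)]

bg = [[50, 50], [50, 50]]
frame = [[52, 90], [49, 95]]      # smoke brightens two pixels
print(cloud_mask(bg, frame))      # [[0, 1], [0, 1]]
```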

  18. Multiresolution maximum intensity volume rendering by morphological adjunction pyramids

    NARCIS (Netherlands)

    Roerdink, Jos B.T.M.

    We describe a multiresolution extension to maximum intensity projection (MIP) volume rendering, allowing progressive refinement and perfect reconstruction. The method makes use of morphological adjunction pyramids. The pyramidal analysis and synthesis operators are composed of morphological 3-D

  19. Multiresolution Maximum Intensity Volume Rendering by Morphological Adjunction Pyramids

    NARCIS (Netherlands)

    Roerdink, Jos B.T.M.

    2001-01-01

    We describe a multiresolution extension to maximum intensity projection (MIP) volume rendering, allowing progressive refinement and perfect reconstruction. The method makes use of morphological adjunction pyramids. The pyramidal analysis and synthesis operators are composed of morphological 3-D
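    A hedged one-dimensional-per-ray sketch of why a max-based (adjunction) pyramid suits MIP: taking pairwise maxima along each ray (one analysis step of the pyramid) and then computing the maximum intensity projection gives the same image as the full-resolution MIP. Real adjunction pyramids also downsample across rays and keep detail signals for perfect reconstruction; that is omitted here:

```python
# Toy demonstration that MIP commutes with max-based downsampling along
# the ray axis. A real 3-D volume is replaced by a list of rays.

def mip(volume):
    """Maximum intensity projection along the ray axis (last index)."""
    return [max(ray) for ray in volume]

def downsample_max(volume):
    """One pyramid analysis step: pairwise maxima along each ray."""
    return [[max(ray[i], ray[i + 1]) for i in range(0, len(ray) - 1, 2)]
            for ray in volume]

vol = [[3, 7, 2, 9], [1, 4, 8, 5]]
assert mip(downsample_max(vol)) == mip(vol)   # projection is preserved
print(mip(vol))   # [9, 8]
```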

  20. Foggy Scene Rendering Based on Transmission Map Estimation

    Directory of Open Access Journals (Sweden)

    Fan Guo

    2014-01-01

    Full Text Available Realistic rendering of foggy scenes is important in game development and virtual reality. Traditional methods have many parameters to control or require a long time to compute, and they are usually limited to depicting homogeneous fog, without considering scenes with heterogeneous fog. In this paper, a new rendering method based on transmission map estimation is proposed. We first generate a Perlin noise image as the density distribution texture of heterogeneous fog. Then we estimate the transmission map using the Markov random field (MRF) model and the bilateral filter. Finally, the virtual foggy scene is realistically rendered with the generated Perlin noise image and the transmission map according to the atmospheric scattering model. Experimental results show that the rendered results of our approach are quite satisfactory.
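    The atmospheric scattering model mentioned above composites fog as I = J·t + A·(1 − t), where J is the clear-scene radiance, t the per-pixel transmission and A the airlight. The toy sketch below applies that formula directly; in the paper the transmission map would come from the MRF estimation and the Perlin noise texture, whereas here the values are invented:

```python
# Toy fog compositing with the atmospheric scattering model on a
# grayscale image stored as nested lists. All values are illustrative.

def add_fog(radiance, transmission, airlight=255.0):
    """Composite fog over an image: I = J*t + A*(1 - t), per pixel."""
    return [[j * t + airlight * (1.0 - t)
             for j, t in zip(jrow, trow)]
            for jrow, trow in zip(radiance, transmission)]

scene = [[100.0, 200.0]]
t_map = [[1.0, 0.5]]            # left pixel clear, right pixel half-fogged
print(add_fog(scene, t_map))    # [[100.0, 227.5]]
```

    Heterogeneity comes entirely from spatial variation in the transmission map, which is why the paper focuses on estimating t rather than on the compositing step itself.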

  1. Comparison of Morphological Pyramids for Multiresolution MIP Volume Rendering

    NARCIS (Netherlands)

    Roerdink, Jos B.T.M.

    2002-01-01

    We recently proposed a multiresolution representation for maximum intensity projection (MIP) volume rendering based on morphological adjunction pyramids which allow progressive refinement and have the property of perfect reconstruction. In this algorithm the pyramidal analysis and synthesis

  2. Experiencing "Macbeth": From Text Rendering to Multicultural Performance.

    Science.gov (United States)

    Reisin, Gail

    1993-01-01

    Shows how one teacher used innovative methods in teaching William Shakespeare's "Macbeth." Outlines student assignments including text renderings, rewriting a scene from the play, and creating a multicultural scrapbook for the play. (HB)

  3. High-quality multi-resolution volume rendering in medicine

    Institute of Scientific and Technical Information of China (English)

    XIE Kai; YANG Jie; LI Xiao-liang

    2007-01-01

    In order to perform a high-quality interactive rendering of large medical data sets on a single off-the-shelf PC, a LOD selection algorithm for multi-resolution volume rendering using 3D texture mapping is presented, which uses an adaptive scheme that renders the volume in a region-of-interest at a high resolution and the volume away from this region at lower resolutions. The algorithm is based on several important criteria, and rendering is done adaptively by selecting high-resolution cells close to a center of attention and low-resolution cells away from this area. In addition, our hierarchical level-of-detail representation guarantees consistent interpolation between different resolution levels. Experiments have been applied to a number of large medical data and have produced high quality images at interactive frame rates using standard PC hardware.

  4. Factors affecting extension workers in their rendering of effective ...

    African Journals Online (AJOL)

    Factors affecting extension workers in their rendering of effective service to pre ... the objective of achieving sustainable livelihoods for the poor and commonages. ... marketing and management to adequately service the land reform programs.

  5. does knowledge influence their attitude and comfort in rendering care?

    African Journals Online (AJOL)

    kemrilib

    Physicians and AIDS care: does knowledge influence their attitude and comfort in rendering ... experience, age and being a consultant or a senior resident influenced attitude, while male ..... having or not having children, prior instructions on ...

  6. Accelerating Monte Carlo Renderers by Ray Histogram Fusion

    Directory of Open Access Journals (Sweden)

    Mauricio Delbracio

    2015-03-01

    Full Text Available This paper details the recently introduced Ray Histogram Fusion (RHF) filter for accelerating Monte Carlo renderers [M. Delbracio et al., Boosting Monte Carlo Rendering by Ray Histogram Fusion, ACM Transactions on Graphics, 33 (2014)]. In this filter, each pixel in the image is characterized by the colors of the rays that reach its surface. Pixels are compared using a statistical distance on the associated ray color distributions. Based on this distance, the filter decides whether two pixels can share their rays or not. The RHF filter is consistent: as the number of samples increases, more evidence is required to average two pixels. The algorithm provides a significant gain in PSNR, or equivalently accelerates the rendering process by using many fewer Monte Carlo samples without observable bias. Since the RHF filter depends only on the Monte Carlo samples color values, it can be naturally combined with all rendering effects.
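    The per-pixel comparison at the heart of RHF can be sketched with a chi-square-style distance between ray-color histograms: if the distance is small enough, the two pixels' samples may be averaged. The bin layout and threshold below are illustrative assumptions, not the paper's exact statistic:

```python
# Sketch of histogram-based pixel comparison (illustrative of RHF's idea).

def chi2_distance(h1, h2):
    """Symmetric chi-square-style distance between sample-count histograms."""
    d = 0.0
    for a, b in zip(h1, h2):
        if a + b > 0:
            d += (a - b) ** 2 / (a + b)
    return d

def can_share(h1, h2, thresh=1.0):
    """Decide whether two pixels' ray samples may be averaged."""
    return chi2_distance(h1, h2) <= thresh

hist_a = [4, 3, 1, 0]   # ray-color histograms of two neighboring pixels
hist_b = [3, 4, 1, 0]
print(can_share(hist_a, hist_b))   # True: distributions are close
```

    The consistency property in the abstract corresponds to tightening the sharing decision as histograms accumulate more samples, so fewer pixel pairs pass the test at high sample counts.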

  7. A targeted proteomics toolkit for high-throughput absolute quantification of Escherichia coli proteins.

    Science.gov (United States)

    Batth, Tanveer S; Singh, Pragya; Ramakrishnan, Vikram R; Sousa, Mirta M L; Chan, Leanne Jade G; Tran, Huu M; Luning, Eric G; Pan, Eva H Y; Vuu, Khanh M; Keasling, Jay D; Adams, Paul D; Petzold, Christopher J

    2014-11-01

    Transformation of engineered Escherichia coli into a robust microbial factory is contingent on precise control of metabolism. Yet, the throughput of omics technologies used to characterize cell components has lagged far behind our ability to engineer novel strains. To expand the utility of quantitative proteomics for metabolic engineering, we validated and optimized targeted proteomics methods for over 400 proteins from more than 20 major pathways in E. coli metabolism. Complementing these methods, we constructed a series of synthetic genes to produce concatenated peptides (QconCAT) for absolute quantification of the proteins and made them available through the Addgene plasmid repository (www.addgene.org). To facilitate high sample throughput, we developed a fast, analytical-flow chromatography method using a 5.5-min gradient (10 min total run time). Overall this toolkit provides an invaluable resource for metabolic engineering by increasing sample throughput, minimizing development time and providing peptide standards for absolute quantification of E. coli proteins.

  8. Rethinking the economist’s evaluation toolkit in light of sustainability policy

    Directory of Open Access Journals (Sweden)

    Stefan Hajkowicz

    2008-03-01

    Full Text Available The dominant economic evaluation technique is benefit-cost analysis (BCA). However, sustainability policy must handle outcomes that cannot easily be quantified in monetary units. Multiple criteria analysis (MCA) is emerging as an alternative, and/or complementary, economic evaluation tool. The economics profession has been slow to adopt MCA. This paper first explores the role of MCA within the economist’s evaluation toolkit alongside BCA, cost-effectiveness analysis (CEA) and cost-utility analysis (CUA), and then proposes a process for selecting an appropriate evaluation method. The choice of technique will depend on the extent to which environmental goods can be valued in monetary units. The paper argues that MCA has an expanded role to play alongside BCA (and the other methods) to ensure that sustainability policies are realized.
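    The simplest MCA aggregation, and the kind of non-monetary scoring the paper contrasts with BCA, is a weighted sum over normalized criterion scores. The criteria, weights and option scores below are hypothetical examples, not from the paper:

```python
# Illustrative weighted-sum MCA score. Criterion names, weights and
# normalization to [0, 1] are invented for the example.

def mca_score(performance, weights):
    """Weighted sum of criterion scores; weights must sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[c] * performance[c] for c in weights)

weights = {"cost": 0.4, "biodiversity": 0.35, "amenity": 0.25}
option_a = {"cost": 0.8, "biodiversity": 0.3, "amenity": 0.5}
option_b = {"cost": 0.5, "biodiversity": 0.9, "amenity": 0.6}
print(round(mca_score(option_a, weights), 3))   # 0.55
print(round(mca_score(option_b, weights), 3))   # 0.665
```

    Unlike BCA, nothing here requires converting biodiversity or amenity into monetary units; the trade-offs live in the weights, which is both MCA's flexibility and its main point of contention.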

  9. The Repeat Pattern Toolkit (RPT): Analyzing the structure and evolution of the C. elegans genome

    Energy Technology Data Exchange (ETDEWEB)

    Agarwal, P.; States, D.J. [Washington Univ., St. Louis, MO (United States)

    1994-12-31

    Over 3.6 million bases of DNA sequence from chromosome III of C. elegans have been determined. The availability of this extended region of contiguous sequence has allowed us to analyze the nature and prevalence of repetitive sequences in the genome of a eukaryotic organism with a high gene density. We have assembled a Repeat Pattern Toolkit (RPT) to analyze the patterns of repeats occurring in DNA. The tools include identifying significant local alignments (utilizing both two-way and three-way alignments), dividing the set of alignments into connected components (signifying repeat families), computing evolutionary distance between repeat family members, constructing minimum spanning trees from the connected components, and visualizing the evolution of the repeat families. Over 7000 families of repetitive sequences were identified. The size of the families ranged from isolated pairs to over 1600 segments of similar sequence. Approximately 12.3% of the analyzed sequence participates in a repeat element.
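    The step "dividing the set of alignments into connected components (signifying repeat families)" is a classic union-find computation: sequence segments are nodes and significant local alignments are edges. The segment names below are invented for illustration:

```python
# Union-find grouping of sequence segments into repeat families, where
# each significant pairwise alignment links two segments.

def find(parent, x):
    while parent[x] != x:
        parent[x] = parent[parent[x]]   # path halving
        x = parent[x]
    return x

def families(segments, alignments):
    """Connected components of the alignment graph = repeat families."""
    parent = {s: s for s in segments}
    for a, b in alignments:             # each significant local alignment
        parent[find(parent, a)] = find(parent, b)
    groups = {}
    for s in segments:
        groups.setdefault(find(parent, s), set()).add(s)
    return sorted(map(sorted, groups.values()))

segs = ["s1", "s2", "s3", "s4", "s5"]
pairs = [("s1", "s2"), ("s2", "s3"), ("s4", "s5")]
print(families(segs, pairs))   # [['s1', 's2', 's3'], ['s4', 's5']]
```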

  10. OpenMSI Arrayed Analysis Toolkit: Analyzing Spatially Defined Samples Using Mass Spectrometry Imaging

    DEFF Research Database (Denmark)

    de Raad, Markus; de Rond, Tristan; Rübel, Oliver

    2017-01-01

    processing tools for the analysis of large arrayed MSI sample sets. The OpenMSI Arrayed Analysis Toolkit (OMAAT) is a software package that addresses the challenges of analyzing spatially defined samples in MSI data sets. OMAAT is written in Python and is integrated with OpenMSI (http://openmsi.nersc.gov), a platform for storing, sharing, and analyzing MSI data. By using a web-based Python notebook (Jupyter), OMAAT is accessible to anyone without programming experience, yet allows experienced users to leverage all features. OMAAT was evaluated by analyzing an MSI data set of a high-throughput glycoside hydrolase activity screen comprising 384 samples arrayed onto a NIMS surface at a 450 μm spacing, decreasing analysis time >100-fold while maintaining robust spot-finding. The utility of OMAAT was demonstrated for screening metabolic activities of different sized soil particles, including hydrolysis...

  11. The Trick Simulation Toolkit: A NASA/Opensource Framework for Running Time Based Physics Models

    Science.gov (United States)

    Penn, John M.

    2016-01-01

    The Trick Simulation Toolkit is a simulation development environment used to create high fidelity training and engineering simulations at the NASA Johnson Space Center and many other NASA facilities. Its purpose is to generate a simulation executable from a collection of user-supplied models and a simulation definition file. For each Trick-based simulation, Trick automatically provides job scheduling, numerical integration, the ability to write and restore human readable checkpoints, data recording, interactive variable manipulation, a run-time interpreter, and many other commonly needed capabilities. This allows simulation developers to concentrate on their domain expertise and the algorithms and equations of their models. Also included in Trick are tools for plotting recorded data and various other supporting utilities and libraries. Trick is written in C/C++ and Java and supports both Linux and MacOSX computer operating systems. This paper describes Trick's design and use at NASA Johnson Space Center.

  12. A Mechanogenetic Toolkit for Interrogating Cell Signaling in Space and Time.

    Science.gov (United States)

    Seo, Daeha; Southard, Kaden M; Kim, Ji-Wook; Lee, Hyun Jung; Farlow, Justin; Lee, Jung-Uk; Litt, David B; Haas, Thomas; Alivisatos, A Paul; Cheon, Jinwoo; Gartner, Zev J; Jun, Young-Wook

    2016-06-02

    Tools capable of imaging and perturbing mechanical signaling pathways with fine spatiotemporal resolution have been elusive, despite their importance in diverse cellular processes. The challenge in developing a mechanogenetic toolkit (i.e., selective and quantitative activation of genetically encoded mechanoreceptors) stems from the fact that many mechanically activated processes are localized in space and time yet additionally require mechanical loading to become activated. To address this challenge, we synthesized magnetoplasmonic nanoparticles that can image, localize, and mechanically load targeted proteins with high spatiotemporal resolution. We demonstrate their utility by investigating the cell-surface activation of two mechanoreceptors: Notch and E-cadherin. By measuring cellular responses to a spectrum of spatial, chemical, temporal, and mechanical inputs at the single-molecule and single-cell levels, we reveal how spatial segregation and mechanical force cooperate to direct receptor activation dynamics. This generalizable technique can be used to control and understand diverse mechanosensitive processes in cell signaling. VIDEO ABSTRACT.

  13. Valorization of rendering industry wastes and co-products for industrial chemicals, materials and energy: review.

    Science.gov (United States)

    Mekonnen, Tizazu; Mussone, Paolo; Bressler, David

    2016-01-01

    Over the past decades, strong global demand for industrial chemicals, raw materials and energy has been driven by rapid industrialization and population growth across the world. In this context, long-term environmental sustainability demands the development of sustainable strategies of resource utilization. The agricultural sector is a major source of underutilized or low-value streams that accompany the production of food and other biomass commodities. Animal agriculture in particular constitutes a substantial portion of the overall agricultural sector, with wastes being generated along the supply chain of slaughtering, handling, catering and rendering. The recent emergence of bovine spongiform encephalopathy (BSE) resulted in the elimination of most of the traditional uses of rendered animal meals such as blood meal, meat and bone meal (MBM) as animal feed with significant economic losses for the entire sector. The focus of this review is on the valorization progress achieved on converting protein feedstock into bio-based plastics, flocculants, surfactants and adhesives. The utilization of other rendering streams such as fat and ash rich biomass for the production of renewable fuels, solvents, drop-in chemicals, minerals and fertilizers is also critically reviewed.

  14. A parallel architecture for interactively rendering scattering and refraction effects.

    Science.gov (United States)

    Bernabei, Daniele; Hakke-Patil, Ajit; Banterle, Francesco; Di Benedetto, Marco; Ganovelli, Fabio; Pattanaik, Sumanta; Scopigno, Roberto

    2012-01-01

    A new method for interactive rendering of complex lighting effects combines two algorithms. The first performs accurate ray tracing in heterogeneous refractive media to compute high-frequency phenomena. The second applies lattice-Boltzmann lighting to account for low-frequency multiple-scattering effects. The two algorithms execute in parallel on modern graphics hardware. This article includes a video animation of the authors' real-time algorithm rendering a variety of scenes.

  15. Wavelet subdivision methods gems for rendering curves and surfaces

    CERN Document Server

    Chui, Charles

    2010-01-01

    OVERVIEW: Curve representation and drawing; Free-form parametric curves; From subdivision to basis functions; Wavelet subdivision and editing; Surface subdivision. BASIS FUNCTIONS FOR CURVE REPRESENTATION: Refinability and scaling functions; Generation of smooth basis functions; Cardinal B-splines; Stable bases for integer-shift spaces; Splines and polynomial reproduction. CURVE SUBDIVISION SCHEMES: Subdivision matrices and stencils; B-spline subdivision schemes; Closed curve rendering; Open curve rendering. BASIS FUNCTIONS GENERATED BY SUBDIVISION MATRICES: Subdivision operators; The up-sampling convolution ope

  16. A Sort-Last Rendering System over an Optical Backplane

    Directory of Open Access Journals (Sweden)

    Yasuhiro Kirihata

    2005-06-01

    Full Text Available Sort-Last is a computer graphics technique for rendering extremely large data sets on clusters of computers. Sort-Last works by dividing the data set into even-sized chunks for parallel rendering and then composing the images to form the final result. Since sort-last rendering requires the movement of large amounts of image data among cluster nodes, the network interconnecting the nodes becomes a major bottleneck. In this paper, we describe a sort-last rendering system implemented on a cluster of computers whose nodes are connected by an all-optical switch. The rendering system introduces the notion of the Photonic Computing Engine, a computing system built dynamically by using the optical switch to create dedicated network connections among cluster nodes. The sort-last volume rendering algorithm was implemented on the Photonic Computing Engine, and its performance is evaluated. Preliminary experiments show that performance is affected by the image composition time and average payload size. In an attempt to stabilize the performance of the system, we have designed a flow control mechanism that uses feedback messages to dynamically adjust the data flow rate within the computing engine.
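    The image-composition step that dominates sort-last performance can be sketched as a per-pixel depth comparison across the partial images produced by each node. The data layout (per-pixel depth/color tuples) is an illustrative assumption:

```python
# Toy sort-last compositing: each node renders its chunk into a partial
# image with per-pixel depth; final pixels come from the nearest fragment.

def composite(partials):
    """Merge per-node (depth, color) images by nearest depth per pixel."""
    width = len(partials[0])
    out = []
    for x in range(width):
        depth, color = min(p[x] for p in partials)   # smallest depth wins
        out.append(color)
    return out

node_a = [(0.9, "red"), (0.2, "red")]     # (depth, color) per pixel
node_b = [(0.4, "blue"), (0.7, "blue")]
print(composite([node_a, node_b]))        # ['blue', 'red']
```

    Every pixel of every partial image crosses the network during this step, which is why the paper's all-optical interconnect and flow-control mechanism target precisely this phase.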

  17. Fast DRR splat rendering using common consumer graphics hardware.

    Science.gov (United States)

    Spoerk, Jakob; Bergmann, Helmar; Wanschitz, Felix; Dong, Shuo; Birkfellner, Wolfgang

    2007-11-01

    Digitally rendered radiographs (DRR) are a vital part of various medical image processing applications such as 2D/3D registration for patient pose determination in image-guided radiotherapy procedures. This paper presents a technique to accelerate DRR creation by using conventional graphics hardware for the rendering process. DRR computation itself is done by an efficient volume rendering method named wobbled splatting. For programming the graphics hardware, NVIDIA's C for Graphics (Cg) is used. The description of an algorithm used for rendering DRRs on the graphics hardware is presented, together with a benchmark comparing this technique to a CPU-based wobbled splatting program. Results show a reduction of rendering time by about 70%-90% depending on the amount of data. For instance, rendering a volume of 2 × 10^6 voxels is feasible at an update rate of 38 Hz compared to 6 Hz on a common Intel-based PC using the graphics processing unit (GPU) of a conventional graphics adapter. In addition, wobbled splatting using graphics hardware for DRR computation provides higher resolution DRRs with comparable image quality due to special processing characteristics of the GPU. We conclude that DRR generation on common graphics hardware using the freely available Cg environment is a major step toward 2D/3D registration in clinical routine.
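    At its core a DRR integrates attenuation along rays through the CT volume. The minimal sketch below sums voxels along parallel rays on a tiny nested-list volume; it deliberately ignores the wobbled-splatting and GPU aspects of the paper:

```python
# Minimal parallel-projection DRR: sum voxel attenuation along each ray
# (here the last index of a nested-list "CT volume").

def drr_parallel(volume):
    """Sum attenuation along each ray to form the radiograph."""
    return [[sum(ray) for ray in row] for row in volume]

ct = [[[1, 2, 3], [0, 0, 4]],
      [[2, 2, 2], [1, 0, 1]]]
print(drr_parallel(ct))   # [[6, 4], [6, 2]]
```

    Splatting methods such as the paper's arrive at the same integral from the other direction, by projecting and accumulating voxel footprints onto the image plane, which maps well onto GPU rasterization.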

  18. A Practical Framework for Sharing and Rendering Real-World Bidirectional Scattering Distribution Functions

    Energy Technology Data Exchange (ETDEWEB)

    Ward, Greg [Anywhere Software, Albany, CA (United States); Kurt, Murat [International Computer Institute, Ege University (Turkey); Bonneel, Nicolas [Harvard Univ., Cambridge, MA (United States)

    2012-09-30

    The utilization of real-world materials has been hindered by a lack of standards for sharing and interpreting measured data. This paper presents an XML representation and an Open Source C library to support bidirectional scattering distribution functions (BSDFs) in data-driven lighting simulation and rendering applications. The library provides for the efficient representation, query, and Monte Carlo sampling of arbitrary BSDFs in a model-free framework. Currently, we support two BSDF data representations: one using a fixed subdivision of the hemisphere, and one with adaptive density. The fixed type has advantages for certain matrix operations, while the adaptive type can more accurately represent highly peaked data. We discuss advanced methods for data-driven BSDF rendering for both types, including the proxy of detailed geometry to enhance appearance and accuracy. We also present an advanced interpolation method to reduce measured data into these standard representations. We end with our plan for future extensions and sharing of BSDF data.
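Monte Carlo sampling of a tabulated BSDF over a fixed hemisphere subdivision amounts to inverse-CDF sampling of the patches, weighted by value times solid angle. The sketch below illustrates that idea only; it is not the API of the library described above, and all names are hypothetical:

```python
import numpy as np

def sample_bsdf_bin(bin_values, solid_angles, u):
    """Pick a hemisphere patch with probability proportional to
    BSDF value x patch solid angle, via inverse-CDF sampling.
    Returns the patch index and the sampling pdf (per steradian)."""
    weights = np.asarray(bin_values) * np.asarray(solid_angles)
    cdf = np.cumsum(weights) / weights.sum()
    idx = int(np.searchsorted(cdf, u))
    pdf = weights[idx] / weights.sum() / solid_angles[idx]
    return idx, pdf

# Two equal-solid-angle patches, the second three times brighter:
idx, pdf = sample_bsdf_bin([1.0, 3.0], [1.0, 1.0], u=0.9)
```

An adaptive-density representation would replace the flat arrays with a tree of patches, but the sampling logic is the same.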

  19. Evaluating Complex Interventions and Health Technologies Using Normalization Process Theory: Development of a Simplified Approach and Web-Enabled Toolkit

    LENUS (Irish Health Repository)

    May, Carl R

    2011-09-30

    Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts.
Conclusion Normalization Process Theory has been developed through

  20. Evaluating complex interventions and health technologies using normalization process theory: development of a simplified approach and web-enabled toolkit

    Directory of Open Access Journals (Sweden)

    Murray Elizabeth

    2011-09-01

    Full Text Available Abstract Background Normalization Process Theory (NPT) can be used to explain implementation processes in health care relating to new technologies and complex interventions. This paper describes the processes by which we developed a simplified version of NPT for use by clinicians, managers, and policy makers, and which could be embedded in a web-enabled toolkit and on-line users' manual. Methods Between 2006 and 2010 we undertook four tasks. (i) We presented NPT to potential and actual users in multiple workshops, seminars, and presentations. (ii) Using what we discovered from these meetings, we decided to create a simplified set of statements and explanations expressing core constructs of the theory. (iii) We circulated these statements to a criterion sample of 60 researchers, clinicians and others, using SurveyMonkey to collect qualitative textual data about their criticisms of the statements. (iv) We then reconstructed the statements and explanations to meet users' criticisms, embedded them in a web-enabled toolkit, and beta tested this 'in the wild'. Results On-line data collection was effective: over a four-week period 50/60 participants responded using SurveyMonkey (40/60) or direct phone and email contact (10/60). An additional nine responses were received from people who had been sent the SurveyMonkey form by other respondents. Beta testing of the web-enabled toolkit produced 13 responses, from 327 visits to http://www.normalizationprocess.org. Qualitative analysis of both sets of responses showed a high level of support for the statements but also showed that some statements poorly expressed their underlying constructs or overlapped with others. These were rewritten to take account of users' criticisms and then embedded in a web-enabled toolkit. As a result we were able to translate the core constructs into a simplified set of statements that could be utilized by non-experts. Conclusion Normalization Process Theory has been developed through

  1. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-01-01

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain. PMID:26334275

  2. A toolkit for epithermal neutron beam characterisation in BNCT.

    Science.gov (United States)

    Auterinen, Iiro; Serén, Tom; Uusi-Simola, Jouni; Kosunen, Antti; Savolainen, Sauli

    2004-01-01

    Methods for dosimetry of epithermal neutron beams used in boron neutron capture therapy (BNCT) have been developed and utilised within the Finnish BNCT project as well as within a European project for a code of practice for the dosimetry of BNCT. One outcome has been a travelling toolkit for BNCT dosimetry. It consists of activation detectors and ionisation chambers. The free-beam neutron spectrum is measured with a set of activation foils of different isotopes irradiated both in a Cd-capsule and without it. Neutron flux (thermal and epithermal) distribution in phantoms is measured using activation of Mn and Au foils, and Cu wire. Ionisation chamber (IC) measurements are performed both in-free-beam and in-phantom for determination of the neutron and gamma dose components. This toolkit has also been used at other BNCT facilities in Europe, the USA, Argentina and Japan.

  3. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit.

    Science.gov (United States)

    Mateu, Juan; Lasala, María José; Alamán, Xavier

    2015-08-31

    In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  4. ProtoMD: A prototyping toolkit for multiscale molecular dynamics

    Science.gov (United States)

    Somogyi, Endre; Mansour, Andrew Abi; Ortoleva, Peter J.

    2016-05-01

    ProtoMD is a toolkit that facilitates the development of algorithms for multiscale molecular dynamics (MD) simulations. It is designed for multiscale methods which capture the dynamic transfer of information across multiple spatial scales, such as the atomic to the mesoscopic scale, via coevolving microscopic and coarse-grained (CG) variables. ProtoMD can also be used to calibrate parameters needed in traditional CG-MD methods. The toolkit integrates 'GROMACS wrapper' to initiate MD simulations, and 'MDAnalysis' to analyze and manipulate trajectory files. It facilitates experimentation with a spectrum of coarse-grained variables, prototyping rare events (such as chemical reactions), or simulating nanocharacterization experiments such as terahertz spectroscopy, AFM, nanopore, and time-of-flight mass spectroscopy. ProtoMD is written in python and is freely available under the GNU General Public License from github.com/CTCNano/proto_md.

  5. The Next Generation of the Montage Image Mosaic Toolkit

    CERN Document Server

    Berriman, G Bruce; Rusholme, B; Robitaille, T

    2016-01-01

    The scientific computing landscape has evolved dramatically in the past few years, with new schemes for organizing and storing data that reflect the growth in size and complexity of astronomical data sets. In response to this changing landscape, we are, over the next two years, deploying the next generation of the Montage toolkit ([ascl:1010.036]). The first release (October 2015) supports multi-dimensional data sets ("data cubes"), and insertion of XMP/AVM tags that allow images to "drop in" to the WWT. The same release offers a beta-version of web-based interactive visualization of images; this includes wrappers for visualization in Python. Subsequent releases will support HEALPix (now standard in cosmic background experiments); incorporation of Montage into package managers (which enable automated management of software builds), and support for a library that will enable Montage to be called directly from Python. This next generation toolkit will inherit the architectural benefits of the current engine - ...

  6. RGtk2: A Graphical User Interface Toolkit for R

    Directory of Open Access Journals (Sweden)

    Duncan Temple Lang

    2011-01-01

    Full Text Available Graphical user interfaces (GUIs) are growing in popularity as a complement or alternative to the traditional command line interfaces to R. RGtk2 is an R package for creating GUIs in R. The package provides programmatic access to GTK+ 2.0, an open-source GUI toolkit written in C. To construct a GUI, the R programmer calls RGtk2 functions that map to functions in the underlying GTK+ library. This paper introduces the basic concepts underlying GTK+ and explains how to use RGtk2 to construct GUIs from R. The tutorial is based on simple and practical programming examples. We also provide more complex examples illustrating the advanced features of the package. The design of the RGtk2 API and the low-level interface from R to GTK+ are discussed at length. We compare RGtk2 to alternative GUI toolkits for R.

  7. Geological hazards: from early warning systems to public health toolkits.

    Science.gov (United States)

    Samarasundera, Edgar; Hansell, Anna; Leibovici, Didier; Horwell, Claire J; Anand, Suchith; Oppenheimer, Clive

    2014-11-01

    Extreme geological events, such as earthquakes, are a significant global concern and sometimes their consequences can be devastating. Geographic information plays a critical role in health protection regarding hazards, and there are a range of initiatives using geographic information to communicate risk as well as to support early warning systems operated by geologists. Nevertheless we consider there to remain shortfalls in translating information on extreme geological events into health protection tools, and suggest that social scientists have an important role to play in aiding the development of a new generation of toolkits aimed at public health practitioners. This viewpoint piece reviews the state of the art in this domain and proposes potential contributions different stakeholder groups, including social scientists, could bring to the development of new toolkits.

  8. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    Directory of Open Access Journals (Sweden)

    Juan Mateu

    2015-08-01

    Full Text Available In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, abstracting the creator of educational applications from the technical details involving the use of tangible interfaces and virtual worlds. Therefore, it is specially designed to enable teachers to themselves create educational activities for their students in a simple way, taking into account that teachers generally lack advanced knowledge in computer programming and electronics. The toolkit has been used to develop various educational applications that have been tested in two secondary education high schools in Spain.

  9. XDANNG: XML based Distributed Artificial Neural Network with Globus Toolkit

    CERN Document Server

    Mahini, Hamidreza; Ghofrani, Javad

    2009-01-01

    Artificial Neural Network is one of the most common AI application fields. This field has direct and indirect uses in most sciences. The main goal of ANN is to imitate biological neural networks for solving scientific problems. But the level of parallelism is the main problem of ANN systems in comparison with biological systems. To solve this problem, we have offered an XML-based framework for implementing ANN on the Globus Toolkit platform. Globus Toolkit is well-known management software for multipurpose Grids. Using the Grid for simulating the neural network will lead to a high degree of parallelism in the implementation of ANN. We have used XML to improve flexibility and scalability in our framework.

  10. A Multi-Code Analysis Toolkit for Astrophysical Simulation Data

    OpenAIRE

    Turk, Matthew J.; Smith., Britton D.; Oishi, Jeffrey S.; Skory, Stephen; Skillman, Samuel W.; Abel, Tom; Norman, Michael L.

    2010-01-01

    The analysis of complex multiphysics astrophysical simulations presents a unique and rapidly growing set of challenges: reproducibility, parallelization, and vast increases in data size and complexity chief among them. In order to meet these challenges, and in order to open up new avenues for collaboration between users of multiple simulation platforms, we present yt (available at http://yt.enzotools.org/), an open source, community-developed astrophysical analysis and visualization toolkit. ...

  11. Developing Mixed Reality Educational Applications: The Virtual Touch Toolkit

    OpenAIRE

    Juan Mateu; María José Lasala; Xavier Alamán

    2015-01-01

    © 2015 by MDPI (http://www.mdpi.org). Reproduction is permitted for noncommercial purposes. In this paper, we present Virtual Touch, a toolkit that allows the development of educational activities through a mixed reality environment such that, using various tangible elements, the interconnection of a virtual world with the real world is enabled. The main goal of Virtual Touch is to facilitate the installation, configuration and programming of different types of technologies, ab...

  12. Building user interfaces with Google Web Toolkit: usage properties analysis

    OpenAIRE

    Poklukar, Matej

    2012-01-01

    This Bachelor thesis provides an overview of different techniques for developing web page or web application user interfaces using Google Web Toolkit. During the development process, the programmer and designer face the problem of how the design should be passed from designer to programmer. The designer can sketch the web page design on paper or in an appropriate program and send it to the programmer, or can create the design in a more advanced form, as a multitude of different files. Prog...

  13. Business plans--tips from the toolkit 6.

    Science.gov (United States)

    Steer, Neville

    2010-07-01

    General practice is a business. Most practices can stay afloat by having appointments, billing patients, managing the administration processes and working long hours. What distinguishes the high performance organisation from the average organisation is a business plan. This article examines how to create a simple business plan that can be applied to the general practice setting and is drawn from material contained in The Royal Australian College of General Practitioners' 'General practice management toolkit'.

  14. A framework for a teaching toolkit in entrepreneurship education.

    Science.gov (United States)

    Fellnhofer, Katharina

    2017-01-01

    Despite mounting interest in entrepreneurship education (EE), innovative approaches such as multimedia, web-based toolkits including entrepreneurial storytelling have been largely ignored in the EE discipline. Therefore, this conceptual contribution introduces eight propositions as a fruitful basis for assessing a 'learning-through-real-multimedia-entrepreneurial-narratives' pedagogical approach. These recommendations prepare the grounds for a future, empirical investigation of this currently under-researched topic, which could be essential for multiple domains including academic, business and society.

  15. NVIDIA Releases CUDA Toolkit 3.2

    Institute of Scientific and Technical Information of China (English)

    2010-01-01

    On November 17, NVIDIA officially released the production release of CUDA Toolkit 3.2. This version delivers substantial performance improvements and includes brand-new math libraries as well as advanced cluster management features, aimed at developers building the next generation of GPU-accelerated applications.

  16. IBM Releases Autonomic Computing Toolkit 2.0

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

    IBM recently released the new Autonomic Computing Toolkit 2.0. The toolkit helps developers embed a wide range of self-management capabilities into existing applications and services more quickly.

  17. Expanding the bioluminescent toolkit for in vivo imaging

    OpenAIRE

    Paley, Miranda Amelia

    2014-01-01

    Bioluminescence imaging (BLI) is among the most dynamic imaging modalities for visualizing whole cells and gene expression patterns in vivo. This technique captures light emission from the luciferase-catalyzed oxidation of small molecule luciferins with highly sensitive CCD cameras. While powerful, current options for multiplexed BLI in mice are limited by the number of luciferase/luciferin pairs found in nature. Our lab aims to expand the bioluminescent toolkit by pairing mutant luciferases ...

  18. Risk of resource failure and toolkit variation in small-scale farmers and herders.

    Directory of Open Access Journals (Sweden)

    Mark Collard

    Full Text Available Recent work suggests that global variation in toolkit structure among hunter-gatherers is driven by risk of resource failure such that as risk of resource failure increases, toolkits become more diverse and complex. Here we report a study in which we investigated whether the toolkits of small-scale farmers and herders are influenced by risk of resource failure in the same way. In the study, we applied simple linear and multiple regression analysis to data from 45 small-scale food-producing groups to test the risk hypothesis. Our results were not consistent with the hypothesis; none of the risk variables we examined had a significant impact on toolkit diversity or on toolkit complexity. It appears, therefore, that the drivers of toolkit structure differ between hunter-gatherers and small-scale food-producers.

  19. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Directory of Open Access Journals (Sweden)

    Robert Meyer

    2016-08-01

    Full Text Available pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.

  20. pypet: A Python Toolkit for Data Management of Parameter Explorations.

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines.
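The core idea pypet implements, keeping every parameter combination tightly linked to its result, can be sketched in plain Python. This is a conceptual illustration using a Cartesian-product sweep, not pypet's actual trajectory/HDF5 API; the helper name is hypothetical:

```python
import itertools

def explore(parameters, run):
    """Run a simulation for every combination of parameter values and keep
    each parameter point stored together with its result."""
    names = sorted(parameters)
    records = []
    for combo in itertools.product(*(parameters[n] for n in names)):
        point = dict(zip(names, combo))
        records.append({"params": point, "result": run(**point)})
    return records

# A toy "simulation": explore x in {1, 2} crossed with y in {10, 20}.
runs = explore({"x": [1, 2], "y": [10, 20]}, run=lambda x, y: x * y)
```

pypet generalizes this pattern with arbitrary (non-grid) trajectories, multiprocessing, and persistent storage, so the `records` list above corresponds to a trajectory stored in one HDF5 file.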

  1. pypet: A Python Toolkit for Data Management of Parameter Explorations

    Science.gov (United States)

    Meyer, Robert; Obermayer, Klaus

    2016-01-01

    pypet (Python parameter exploration toolkit) is a new multi-platform Python toolkit for managing numerical simulations. Sampling the space of model parameters is a key aspect of simulations and numerical experiments. pypet is designed to allow easy and arbitrary sampling of trajectories through a parameter space beyond simple grid searches. pypet collects and stores both simulation parameters and results in a single HDF5 file. This collective storage allows fast and convenient loading of data for further analyses. pypet provides various additional features such as multiprocessing and parallelization of simulations, dynamic loading of data, integration of git version control, and supervision of experiments via the electronic lab notebook Sumatra. pypet supports a rich set of data formats, including native Python types, Numpy and Scipy data, Pandas DataFrames, and BRIAN(2) quantities. Besides these formats, users can easily extend the toolkit to allow customized data types. pypet is a flexible tool suited for both short Python scripts and large scale projects. pypet's various features, especially the tight link between parameters and results, promote reproducible research in computational neuroscience and simulation-based disciplines. PMID:27610080

  2. Matlab based Toolkits used to Interface with Optical Design Software for NASA's James Webb Space Telescope

    Science.gov (United States)

    Howard, Joseph

    2007-01-01

    The viewgraph presentation provides an introduction to the James Webb Space Telescope (JWST). The first part provides a brief overview of Matlab toolkits including CodeV, OSLO, and Zemax Toolkits. The toolkit overview examines purpose, layout, how Matlab gets data from CodeV, function layout, and using cvHELP. The second part provides examples of use with JWST, including wavefront sensitivities and alignment simulations.

  3. Cloud-based Monte Carlo modelling of BSSRDF for the rendering of human skin appearance (Conference Presentation)

    Science.gov (United States)

    Doronin, Alexander; Rushmeier, Holly E.; Meglinski, Igor; Bykov, Alexander V.

    2016-03-01

    We present a new Monte Carlo based approach for modelling the Bidirectional Scattering-Surface Reflectance Distribution Function (BSSRDF) for accurate rendering of human skin appearance. Variations in both skin tissue structure and the major chromophores are taken into account, corresponding to different ethnic and age groups. The computational solution utilizes HTML5, accelerated by graphics processing units (GPUs), and is therefore convenient for practical use on most modern computer-based devices and operating systems. Results of the imitation of human skin reflectance spectra, corresponding skin colours, and examples of 3D face rendering are presented and compared with the results of phantom studies.
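The elementary step of any Monte Carlo light transport simulation in tissue is sampling a photon's free path from the Beer-Lambert (exponential) distribution. The following minimal sketch shows that step under stated assumptions (a single homogeneous layer with extinction coefficient mu_t); it is not the paper's cloud/GPU implementation, and the function names are hypothetical:

```python
import math
import random

def free_path(mu_t, rng):
    """Sample a photon's free path length from the exponential distribution
    p(l) = mu_t * exp(-mu_t * l), the Beer-Lambert law."""
    return -math.log(1.0 - rng.random()) / mu_t

def mean_free_path(mu_t, n, seed=0):
    """Monte Carlo estimate of the mean free path; should approach 1/mu_t."""
    rng = random.Random(seed)
    return sum(free_path(mu_t, rng) for _ in range(n)) / n
```

A full skin model repeats this step per scattering event, switching mu_t and the chromophore absorption per layer (epidermis, dermis, etc.) and accumulating the exitant flux into the BSSRDF table used for rendering.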

  4. The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.

    Science.gov (United States)

    Reymond, Dominique

    2016-04-01

    We present an open source software project (GNU public license), named STK: Seismic ToolKit, that is dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the date of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing (in the SAC format, SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and main component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/Eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows.
Useful link: http://sourceforge.net/projects/seismic-toolkit
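The PSD estimate mentioned above (Fourier transform of the signal, squared magnitude) can be sketched as a plain periodogram. This is a generic NumPy illustration of the technique, not STK's C/C++ code; the function name is hypothetical:

```python
import numpy as np

def psd(signal, fs):
    """One-sided periodogram: power spectral density of a real signal
    sampled at fs Hz, via the FFT. Returns (frequencies, power)."""
    n = len(signal)
    spectrum = np.fft.rfft(signal)
    power = (np.abs(spectrum) ** 2) / (fs * n)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    return freqs, power

# A 5 Hz sine sampled at 100 Hz for 1 s: the PSD should peak at 5 Hz.
fs = 100.0
t = np.arange(0, 1.0, 1.0 / fs)
f, p = psd(np.sin(2 * np.pi * 5.0 * t), fs)
peak = f[np.argmax(p)]
```

Production tools like STK typically add windowing and segment averaging (Welch's method) on top of this basic estimate to reduce variance.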

  5. Clustered deep shadow maps for integrated polyhedral and volume rendering

    KAUST Repository

    Bornik, Alexander

    2012-01-01

    This paper presents a hardware-accelerated approach for shadow computation in scenes containing both complex volumetric objects and polyhedral models. Our system is the first hardware-accelerated complete implementation of deep shadow maps, which unifies the computation of volumetric and geometric shadows. Up to now, such unified computation was limited to software-only rendering. Previous hardware-accelerated techniques can handle only geometric or only volumetric scenes, both resulting in the loss of important properties of the original concept. Our approach supports interactive rendering of polyhedrally bounded volumetric objects on the GPU based on ray casting. The ray casting can be conveniently used for both the shadow map computation and the rendering. We show how anti-aliased high-quality shadows are feasible in scenes composed of multiple overlapping translucent objects, and how sparse scenes can be handled efficiently using clustered deep shadow maps. © 2012 Springer-Verlag.
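A deep shadow map stores, per light ray, a visibility function: the fraction of light surviving all occluders in front of each depth. A minimal sketch of building and querying such a function (a conceptual illustration of the deep-shadow-map idea, not the paper's GPU implementation; names are hypothetical):

```python
def visibility_function(samples):
    """Build a piecewise-constant visibility curve from (depth, opacity)
    samples along a light ray: transmittance after each occluder."""
    curve, t = [(0.0, 1.0)], 1.0
    for depth, alpha in sorted(samples):
        t *= (1.0 - alpha)            # each occluder multiplies transmittance
        curve.append((depth, t))
    return curve

def transmittance(curve, depth):
    """Look up the surviving light fraction at a given depth."""
    t = 1.0
    for d, v in curve:
        if d <= depth:
            t = v
    return t

# Two half-opaque occluders at depths 1 and 2 along one light ray.
curve = visibility_function([(2.0, 0.5), (1.0, 0.5)])
```

The "clustered" variant in the paper groups sparse occluders so that empty depth ranges cost nothing; real implementations also compress the curve to a fixed error tolerance.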

  6. Real-Time Rendering of Teeth with No Preprocessing

    DEFF Research Database (Denmark)

    Larsen, Christian Thode; Frisvad, Jeppe Revall; Jensen, Peter Dahl Ejby

    2012-01-01

    We present a technique for real-time rendering of teeth with no need for computational or artistic preprocessing. Teeth constitute a translucent material consisting of several layers: a highly scattering material (dentine) beneath a semitransparent layer (enamel) with a transparent coating (saliva). In this study we examine how light interacts with this multilayered structure. In the past, rendering of teeth has mostly been done using image-based texturing or volumetric scans. We work with surface scans and have therefore developed a simple way of estimating layer thicknesses. We use scattering properties based on measurements reported in the optics literature, and we compare rendered results qualitatively to images of ceramic teeth created by denturists.

  7. [A hybrid volume rendering method using general hardware].

    Science.gov (United States)

    Li, Bin; Tian, Lianfang; Chen, Ping; Mao, Zongyuan

    2008-06-01

    In order to improve the quality and efficiency of the reconstructed image after hybrid volume rendering of different kinds of volume data from medical sequential slices or polygonal models, we propose a hybrid volume rendering method based on Shear-Warp using economical hardware. First, the hybrid volume data are pre-processed by the Z-Buffer method and an RLE (Run-Length Encoded) data structure. Then, during composition of the intermediate image, a resampling method based on dual interpolation and intermediate slice interpolation is used to improve efficiency and quality. Finally, the reconstructed image is rendered by the texture-mapping technology of OpenGL. Experiments demonstrate the good performance of the proposed method.
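The RLE pre-processing step mentioned above lets Shear-Warp skip runs of transparent voxels in constant time per run. A minimal run-length encoder for one voxel scanline (a generic sketch of the data structure, not the paper's implementation; the function name is hypothetical):

```python
def run_length_encode(voxels):
    """Run-length encode a scanline of voxel values: each run is stored as
    (value, length), so empty (transparent) stretches collapse to one entry."""
    runs, i = [], 0
    while i < len(voxels):
        j = i
        while j < len(voxels) and voxels[j] == voxels[i]:
            j += 1                    # extend the current run
        runs.append((voxels[i], j - i))
        i = j
    return runs
```

During compositing, the renderer walks the run list and advances the image pointer by the run length whenever the value is transparent, which is where most of Shear-Warp's speed comes from.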

  8. Universal Rendering Mechanism Supporting Dual-Mode Presentation

    Institute of Scientific and Technical Information of China (English)

    徐鹏; 杨文军; 王克宏

    2003-01-01

    XML is a standard for the exchange of business data that is completely platform and vendor neutral. Because XML data comes in many forms, one of the most important technologies needed for XML applications is the ability to convert the data into visible renderings. This paper focuses on the rendering of XML/XSL documents into a readable and printable format by means of a platform-independent process that enables high-quality printing of the product. This paper introduces the core components of the data rendering engine, the X2P server, and its different levels of object abstraction. The design pattern and the complete formatting and representation of the XSL stylesheet into different types of output formats in the X2P server are also given. The results show that the X2P server simultaneously constructs the formatting object tree and the area tree in a very efficient design that saves execution time and memory.

  9. Virtual try-on through image-based rendering.

    Science.gov (United States)

    Hauswiesner, Stefan; Straka, Matthias; Reitmayr, Gerhard

    2013-09-01

    Virtual try-on applications have become popular because they allow users to watch themselves wearing different clothes without the effort of changing them physically. This helps users to make quick buying decisions and, thus, improves the sales efficiency of retailers. Previous solutions usually involve motion capture, 3D reconstruction or modeling, which are time consuming and not robust for all body poses. Our method avoids these steps by combining image-based renderings of the user and previously recorded garments. It transfers the appearance of a garment recorded from one user to another by matching input and recorded frames, image-based visual hull rendering, and online registration methods. Using images of real garments allows for a realistic rendering quality with high performance. It is suitable for a wide range of clothes and complex appearances, allows arbitrary viewing angles, and requires only little manual input. Our system is particularly useful for virtual try-on applications as well as interactive games.
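The image-based visual hull rendering mentioned above rests on a simple predicate: a 3D point belongs to the hull iff its projection lands inside the silhouette in every recorded view. A toy sketch with two orthographic views (the views and masks are hypothetical):

```python
def inside_hull(point, views):
    """A point lies inside the visual hull iff every view's silhouette mask
    contains its projection. Each view is (project, mask), where project
    maps a 3D point to 2D image coordinates."""
    return all(mask(*project(point)) for project, mask in views)

# Two toy orthographic views of a unit-cube silhouette
views = [
    (lambda p: (p[0], p[1]), lambda x, y: 0 <= x <= 1 and 0 <= y <= 1),  # top view
    (lambda p: (p[0], p[2]), lambda x, z: 0 <= x <= 1 and 0 <= z <= 1),  # front view
]
inside = inside_hull((0.5, 0.5, 0.5), views)    # inside both silhouettes
outside = inside_hull((0.5, 0.5, 2.0), views)   # rejected by the front view
```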

  10. Viewpoint Selection Using Hybrid Simplex Search and Particle Swarm Optimization for Volume Rendering

    Directory of Open Access Journals (Sweden)

    Zhang You-sai

    2012-09-01

    Full Text Available In this paper we proposed a novel method of viewpoint selection using the hybrid Nelder-Mead (NM) simplex search and particle swarm optimization (PSO) to improve the efficiency and the intelligent level of volume rendering. This method constructed the viewpoint quality evaluation function in the form of entropy by utilizing the luminance and structure features of the two-dimensional projective image of volume data. During the process of volume rendering, the hybrid NM-PSO algorithm intended to locate the globally optimal viewpoint or a set of the optimized viewpoints automatically and intelligently. Experimental results have shown that this method avoids redundant interactions and evidently improves the efficiency of volume rendering. The optimized viewpoints can focus on the important structural features or the region of interest in volume data and exhibit definite correlation with the perception character of human visual system. Compared with the methods based on PSO or NM simplex search, our method has the better performance of convergence rate, convergence accuracy and robustness.
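The entropy-form viewpoint quality function can be sketched as the Shannon entropy of the projected image's luminance histogram (an illustrative simplification; the paper additionally weights structure features):

```python
import math

def viewpoint_entropy(luminances, bins=8):
    """Shannon entropy of the luminance histogram of a projected image;
    higher entropy suggests more visual information from this viewpoint."""
    hist = [0] * bins
    for v in luminances:                          # v expected in [0, 1)
        hist[min(int(v * bins), bins - 1)] += 1
    n = len(luminances)
    return -sum((c / n) * math.log2(c / n) for c in hist if c)

flat = viewpoint_entropy([0.1] * 64)              # uniform image: 0 bits
varied = viewpoint_entropy([i / 64 for i in range(64)])
# luminances spread evenly over all 8 bins: log2(8) = 3 bits
```

An optimizer such as NM-PSO would then search camera positions for the maximum of this objective.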

  11. Efficient rendering of breaking waves using MPS method

    Institute of Scientific and Technical Information of China (English)

    WANG Qiang; ZHENG Yao; CHEN Chun; FUJIMOTO Tadahiro; CHIBA Norishige

    2006-01-01

    This paper proposes an approach for rendering breaking waves out of large-scale particle-based simulations. Moving particle semi-implicit (MPS) is used to solve the governing equation, and the 2D simulation is expanded to a 3D representation by adding motion variation using fractional Brownian motion (fBm). The waterbody surface is reconstructed from the outlines of the 2D simulation. The splashing effect is computed according to the properties of the particles. Realistic features of the wave are rendered on the GPU, including reflective and refractive effects and the effect of splash. Experiments showed that the proposed method can simulate large-scale breaking waves efficiently.
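The fBm used above to add motion variation sums self-similar noise octaves, each at higher frequency and lower amplitude than the last. A sketch with a simple hash-based 1D value noise (the hash constants and parameters are arbitrary illustrative choices):

```python
import math

def value_noise(x, seed=0):
    """Cheap deterministic 1D value noise: hash integer lattice points to
    [0, 1], then smoothly interpolate between neighbors."""
    def hash01(i):
        h = (i * 374761393 + seed * 668265263) & 0xFFFFFFFF
        h = (h ^ (h >> 13)) * 1274126177 & 0xFFFFFFFF
        return (h & 0xFFFF) / 0xFFFF
    i0 = math.floor(x)
    t = x - i0
    t = t * t * (3 - 2 * t)                       # smoothstep interpolation
    return hash01(i0) * (1 - t) + hash01(i0 + 1) * t

def fbm(x, octaves=4, lacunarity=2.0, gain=0.5):
    """Fractional Brownian motion: sum octaves of noise, each at double the
    frequency and half the amplitude of the previous one."""
    total, amp, freq = 0.0, 1.0, 1.0
    for _ in range(octaves):
        total += amp * value_noise(x * freq)
        amp *= gain
        freq *= lacunarity
    return total
```

Sampling fbm along the third axis is one way such 2D-to-3D expansion can perturb the extruded wave front coherently.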

  12. Beaming teaching application: recording techniques for spatial xylophone sound rendering

    DEFF Research Database (Denmark)

    Markovic, Milos; Madsen, Esben; Olesen, Søren Krarup;

    2012-01-01

    BEAMING is a telepresence research project aiming at providing a multimodal interaction between two or more participants located at distant locations. One of the BEAMING applications allows a distant teacher to give a xylophone playing lecture to the students. Therefore, rendering of the xylophone...... played at student's location is required at teacher's site. This paper presents a comparison of different recording techniques for a spatial xylophone sound rendering. Directivity pattern of the xylophone was measured and spatial properties of the sound field created by a xylophone as a distributed sound...

  13. Chromium Renderserver: Scalable and Open Source Remote Rendering Infrastructure

    Energy Technology Data Exchange (ETDEWEB)

    Paul, Brian; Ahern, Sean; Bethel, E. Wes; Brugger, Eric; Cook,Rich; Daniel, Jamison; Lewis, Ken; Owen, Jens; Southard, Dale

    2007-12-01

    Chromium Renderserver (CRRS) is software infrastructure that provides the ability for one or more users to run and view image output from unmodified, interactive OpenGL and X11 applications on a remote, parallel computational platform equipped with graphics hardware accelerators, via industry-standard Layer 7 network protocols and client viewers. The new contributions of this work include a solution to the problem of synchronizing X11 and OpenGL command streams, remote delivery of parallel hardware-accelerated rendering, and a performance analysis of several different optimizations that are generally applicable to a variety of rendering architectures. CRRS is fully operational, Open Source software.

  14. Depth of Field Effects for Interactive Direct Volume Rendering

    KAUST Repository

    Schott, Mathias

    2011-06-01

    In this paper, a method for interactive direct volume rendering is proposed for computing depth of field effects, which previously were shown to aid observers in depth and size perception of synthetically generated images. The presented technique extends those benefits to volume rendering visualizations of 3D scalar fields from CT/MRI scanners or numerical simulations. It is based on incremental filtering and as such does not depend on any precomputation, thus allowing interactive explorations of volumetric data sets via on-the-fly editing of the shading model parameters or (multi-dimensional) transfer functions. © 2011 The Author(s).
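The depth of field effect above is governed by the circle of confusion, which vanishes at the focal plane and grows for nearer and farther points. A sketch of the standard thin-lens circle-of-confusion formula (the paper's actual contribution, incremental filtering, is not reproduced here):

```python
def circle_of_confusion(depth, focus_depth, focal_length, aperture):
    """Thin-lens circle-of-confusion diameter (same units as the inputs):
    zero at the focal plane, growing for out-of-focus points."""
    return abs(aperture * focal_length * (depth - focus_depth)
               / (depth * (focus_depth - focal_length)))

in_focus = circle_of_confusion(2.0, 2.0, focal_length=0.05, aperture=0.02)
behind = circle_of_confusion(4.0, 2.0, focal_length=0.05, aperture=0.02)
# in_focus is exactly 0; points behind the focal plane get a positive blur radius
```

In a slice-based volume renderer, this diameter per slab sets the filter width applied while compositing, which is what makes an incremental (no-precomputation) scheme possible.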

  15. Morphological study of transpterional-insula approach using volume rendering.

    Science.gov (United States)

    Jia, Linpei; Su, Lue; Sun, Wei; Wang, Lina; Yao, Jihang; Li, Youqiong; Luo, Qi

    2012-11-01

    This study describes the measurements of the inferior circular insular sulcus (ICIS) and of the shortest distance from the ICIS to the temporal horn, and determines the position of the incision that does the least harm to the temporal stem in the transpterional-insula approach, using the volume-rendering technique. Results showed that the one-third point over the anterior side of the ICIS may be the ideal penetration point during the operation, with no significant difference between the 2 hemispheres. Comparison with ICIS measurements from other Chinese studies demonstrated that volume rendering is a reliable method in insular research that enables mass measurements.

  16. LOD 1 VS. LOD 2 - Preliminary Investigations Into Differences in Mobile Rendering Performance

    Science.gov (United States)

    Ellul, C.; Altenbuchner, J.

    2013-09-01

    The increasing availability, size and detail of 3D City Model datasets has led to a challenge when rendering such data on mobile devices. Understanding the limitations to the usability of such models on these devices is particularly important given the broadening range of applications - such as pollution or noise modelling, tourism, planning, solar potential - for which these datasets and resulting visualisations can be utilized. Much 3D City Model data is created by extrusion of 2D topographic datasets, resulting in what is known as Level of Detail (LoD) 1 buildings - with flat roofs. However, in the UK the National Mapping Agency (the Ordnance Survey, OS) is now releasing test datasets to Level of Detail (LoD) 2 - i.e. including roof structures. These datasets are designed to integrate with the LoD 1 datasets provided by the OS, and provide additional detail in particular on larger buildings and in town centres. The availability of such integrated datasets at two different Levels of Detail permits investigation into the impact of the additional roof structures (and hence the display of a more realistic 3D City Model) on rendering performance on a mobile device. This paper describes preliminary work carried out to investigate this issue, for the test area of the city of Sheffield (in the UK Midlands). The data is stored in a 3D spatial database as triangles and then extracted and served as a web-based data stream which is queried by an App developed on the mobile device (using the Android environment, Java and OpenGL for graphics). Initial tests have been carried out on two dataset sizes, for the city centre and a larger area, rendering the data onto a tablet to compare results. Results of 52 seconds for rendering LoD 1 data, and 72 seconds for LoD 1 mixed with LoD 2 data, show that the impact of LoD 2 is significant.

  17. A GIS Software Toolkit for Monitoring Areal Snow Cover and Producing Daily Hydrologic Forecasts using NASA Satellite Imagery Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Aniuk Consulting, LLC, proposes to create a GIS software toolkit for monitoring areal snow cover extent and producing streamflow forecasts. This toolkit will be...

  18. Remote parallel rendering for high-resolution tiled display walls

    KAUST Repository

    Nachbaur, Daniel

    2014-11-01

    © 2014 IEEE. We present a complete, robust and simple to use hardware and software stack delivering remote parallel rendering of complex geometrical and volumetric models to high resolution tiled display walls in a production environment. We describe the setup and configuration, present preliminary benchmarks showing interactive framerates, and describe our contributions for a seamless integration of all the software components.

  19. Light Field Rendering for Head Mounted Displays using Pixel Reprojection

    DEFF Research Database (Denmark)

    Hansen, Anne Juhler; Kraus, Martin; Klein, Jákup

    2017-01-01

    of the information of the different images is redundant, we use pixel reprojection from the corner cameras to compute the remaining images in the light field. We compare the reprojected images with directly rendered images in a user test. In most cases, the users were unable to distinguish the images. In extreme...

  20. An experiment on the color rendering of different light sources

    Science.gov (United States)

    Fumagalli, Simonetta; Bonanomi, Cristian; Rizzi, Alessandro

    2013-02-01

    The color rendering index (CRI) of a light source attempts to measure how well the color appearance of objects is preserved when they are illuminated by the given light source. This problem is of great importance for various industrial and scientific fields, such as lighting architecture, design, ergonomics, etc. Usually a light source is specified through its Correlated Color Temperature (CCT). However, two (or more) light sources with the same CCT but different spectral power distributions can exist; color samples viewed under two light sources with equal CCTs can therefore appear different. Hence the need for a method to assess the quality of a given illuminant in relation to color. Recently the CRI has seen renewed interest because of new LED-based lighting systems, which usually have a rather low color rendering index but good preservation of color appearance and a pleasant visual appearance (visual appeal). Various attempts to develop a new color rendering index have been made, but research toward a better one continues. This article describes an experiment performed by human observers concerning the preservation of color appearance under several light sources, comparing the outcome with a range of available color rendering indices.
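The CIE color rendering index referred to above is computed from the color shifts of standard test samples between the test source and a reference illuminant of the same CCT: Ri = 100 - 4.6 ΔEi per sample, and Ra is the mean over the first eight samples. A sketch with hypothetical ΔE values (computing ΔE itself requires full spectral and chromatic-adaptation machinery not shown here):

```python
def special_index(delta_e):
    """CIE special color rendering index for one test sample:
    Ri = 100 - 4.6 * (color shift vs. the reference illuminant)."""
    return 100.0 - 4.6 * delta_e

def general_cri(delta_es):
    """General CRI Ra: mean of the special indices over the
    (normally eight) CIE test color samples."""
    return sum(special_index(d) for d in delta_es) / len(delta_es)

# hypothetical color shifts for eight samples under a test LED source
ra = general_cri([2.0, 3.5, 1.0, 4.0, 2.5, 3.0, 1.5, 2.5])
# mean shift 2.5 -> Ra = 100 - 4.6 * 2.5 = 88.5
```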

  1. Interacting with Stroke-Based Rendering on a Wall Display

    NARCIS (Netherlands)

    Grubert, Jens; Hanckock, Mark; Carpendale, Sheelagh; Tse, Edward; Isenberg, Tobias

    2007-01-01

    We introduce two new interaction techniques for creating and interacting with non-photorealistic images using stroke-based rendering. We provide bimanual control of a large interactive canvas through both remote pointing and direct touch. Remote pointing allows people to sit and interact at a distan

  2. Selection of plasters and renders for salt laden masonry substrates

    NARCIS (Netherlands)

    Groot, C.; Hees, R.P.J. van; Wijffels, T.J.

    2009-01-01

    The choice of a repair plaster or render by architects often appears to be the result of fortuitous circumstances, such as prior experience with a plaster or a recommendation by a producer. Seldom is the choice based on a sound assessment of the state of the building and the wall that is to be repai

  3. Depth-Dependent Halos : Illustrative Rendering of Dense Line Data

    NARCIS (Netherlands)

    Everts, Maarten H.; Bekker, Henk; Roerdink, Jos B.T.M.; Isenberg, Tobias

    2009-01-01

    We present a technique for the illustrative rendering of 3D line data at interactive frame rates. We create depth-dependent halos around lines to emphasize tight line bundles while less structured lines are de-emphasized. Moreover, the depth-dependent halos combined with depth cueing via line width

  4. Virtual Environment of Real Sport Hall and Analyzing Rendering Quality

    Directory of Open Access Journals (Sweden)

    Filip Popovski

    2015-02-01

    Full Text Available Here we present a virtual environment of a real sport hall created in Quest3D VR Edition. We analyze the rendering quality, the interaction techniques, and the real-time performance of the system. We made a critical analysis of all these techniques on different machines, with excellent results.


  6. 7 CFR 54.1016 - Advance information concerning service rendered.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Advance information concerning service rendered. 54..., Processing, and Packaging of Livestock and Poultry Products § 54.1016 Advance information concerning service... MARKETING SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE...

  7. 7 CFR 53.17 - Advance information concerning service rendered.

    Science.gov (United States)

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Advance information concerning service rendered. 53.17... (CONTINUED) LIVESTOCK (GRADING, CERTIFICATION, AND STANDARDS) Regulations Service § 53.17 Advance information... SERVICE (Standards, Inspections, Marketing Practices), DEPARTMENT OF AGRICULTURE (CONTINUED)...

  8. Fast Rendering of Realistic Virtual Character in Game Scene

    Directory of Open Access Journals (Sweden)

    Mengzhao Yang

    2013-07-01

    Full Text Available Human skin is made up of multiple translucent layers, and rendering of skin appearance usually requires complex modeling and massive calculation. In practical applications such as 3D game development, we must not only approximate realistic-looking skin but also develop an efficient method that is easy to implement and meets the needs of real-time rendering. In this study, we solve the problem of wrap lighting and introduce a surface-detail approximation method to give realistic rendering of a virtual character. Our method considers that different thicknesses of geometry on the skin surface result in different degrees of scattering of incident light, and so pre-calculates the diffuse falloff into a look-up texture. Also, since scattering is strongly color dependent and small bumps are common on the skin surface, we pre-soften the finer details on the skin surface per R/G/B channel. Finally, we linearly interpolate the diffuse lighting with different scattering degrees from the look-up texture, sampled with the curvature and NdotL. Experimental results show that the proposed approach yields a realistic virtual character and achieves high frame rates in real-time rendering.
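The wrap-lighting and look-up-texture idea above can be sketched as follows: wrap lighting lets diffuse light bleed past the terminator to imitate subsurface scattering, and the falloff is tabulated so the shader needs only one texture fetch per pixel (the wrap parameter and table size here are illustrative; the paper indexes the table by curvature and NdotL):

```python
def wrap_diffuse(n_dot_l, wrap):
    """Wrap lighting: let diffuse light reach slightly past the terminator,
    imitating subsurface scattering; wrap is in [0, 1]."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

def build_falloff_lut(wrap, size=8):
    """Tabulate the falloff over N.L in [-1, 1] so a shader can do a single
    texture fetch instead of evaluating the formula per pixel."""
    return [wrap_diffuse(-1.0 + 2.0 * i / (size - 1), wrap) for i in range(size)]

lut = build_falloff_lut(wrap=0.5)
# N.L = -1 maps to 0.0, N.L = 1 maps to 1.0; the terminator is softened
```

Doing this per R/G/B channel with different wrap values reproduces the reddish terminator characteristic of skin.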

  9. Democratizing rendering for multiple viewers in surround VR systems

    KAUST Repository

    Schulze, Jürgen P.

    2012-03-01

    We present a new approach for how multiple users' views can be rendered in a surround virtual environment without using special multi-view hardware. It is based on the idea that different parts of the screen are often viewed by different users, so that they can be rendered from their own view point, or at least from a point closer to their view point than traditionally expected. The vast majority of 3D virtual reality systems are designed for one head-tracked user, and a number of passive viewers. Only the head tracked user gets to see the correct view of the scene, everybody else sees a distorted image. We reduce this problem by algorithmically democratizing the rendering view point among all tracked users. Researchers have proposed solutions for multiple tracked users, but most of them require major changes to the display hardware of the VR system, such as additional projectors or custom VR glasses. Our approach does not require additional hardware, except the ability to track each participating user. We propose three versions of our multi-viewer algorithm. Each of them balances image distortion and frame rate in different ways, making them more or less suitable for certain application scenarios. Our most sophisticated algorithm renders each pixel from its own, optimized camera perspective, which depends on all tracked users' head positions and orientations. © 2012 IEEE.
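The per-pixel "democratized" viewpoint can be sketched as a blend of the tracked users' eye positions, weighted by how close each user's gaze region is to the pixel (the inverse-distance weighting here is an illustrative stand-in for the paper's optimized per-pixel camera):

```python
def blended_eye(pixel, users, eps=1e-6):
    """Blend tracked users' eye positions into one per-pixel viewpoint,
    weighting each user by inverse distance from the pixel to the screen
    point that user is looking at. users: list of (screen_pt, eye_pos)."""
    weights = []
    for screen_pt, _eye in users:
        d = ((pixel[0] - screen_pt[0]) ** 2 + (pixel[1] - screen_pt[1]) ** 2) ** 0.5
        weights.append(1.0 / (d + eps))
    total = sum(weights)
    return tuple(
        sum(w * eye[k] for w, (_, eye) in zip(weights, users)) / total
        for k in range(3)
    )

# two users looking at opposite sides of the wall display
users = [((0.0, 0.5), (-1.0, 0.0, 2.0)), ((1.0, 0.5), (1.0, 0.0, 2.0))]
left = blended_eye((0.0, 0.5), users)   # dominated by the left user's eye
```

Pixels near each user's gaze region are thus rendered almost exactly from that user's perspective, while in-between pixels transition smoothly.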

  10. Pydpiper: a flexible toolkit for constructing novel registration pipelines

    Science.gov (United States)

    Friedel, Miriam; van Eede, Matthijs C.; Pipitone, Jon; Chakravarty, M. Mallar; Lerch, Jason P.

    2014-01-01

    Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available software package that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines “out-of-the-box.” In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code. PMID:25126069
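One of the five innovations listed, eliminating duplicate stages, can be sketched generically by keying each stage on its command and inputs (a toy illustration of the concept, not Pydpiper's actual API):

```python
class Pipeline:
    """Toy pipeline that eliminates duplicate stages: two stages with the
    same command and inputs are merged into one (not Pydpiper's real API)."""
    def __init__(self):
        self.stages = {}

    def add_stage(self, command, inputs):
        key = (command, tuple(inputs))
        if key not in self.stages:          # duplicate stages are merged
            self.stages[key] = {"command": command, "inputs": list(inputs)}
        return self.stages[key]

p = Pipeline()
p.add_stage("register", ["a.mnc", "b.mnc"])
p.add_stage("register", ["a.mnc", "b.mnc"])   # duplicate: merged away
p.add_stage("average", ["a.mnc", "b.mnc"])
# only two unique stages remain to be executed
```

Deduplicating at construction time means that overlapping sub-pipelines (e.g. several applications sharing a common preprocessing step) never run the same registration twice.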

  11. Pydpiper: A Flexible Toolkit for Constructing Novel Registration Pipelines

    Directory of Open Access Journals (Sweden)

    Miriam eFriedel

    2014-07-01

    Full Text Available Using neuroimaging technologies to elucidate the relationship between genotype and phenotype and brain and behavior will be a key contribution to biomedical research in the twenty-first century. Among the many methods for analyzing neuroimaging data, image registration deserves particular attention due to its wide range of applications. Finding strategies to register together many images and analyze the differences between them can be a challenge, particularly given that different experimental designs require different registration strategies. Moreover, writing software that can handle different types of image registration pipelines in a flexible, reusable and extensible way can be challenging. In response to this challenge, we have created Pydpiper, a neuroimaging registration toolkit written in Python. Pydpiper is an open-source, freely available pipeline framework that provides multiple modules for various image registration applications. Pydpiper offers five key innovations. Specifically: (1) a robust file handling class that allows access to outputs from all stages of registration at any point in the pipeline; (2) the ability of the framework to eliminate duplicate stages; (3) reusable, easy to subclass modules; (4) a development toolkit written for non-developers; (5) four complete applications that run complex image registration pipelines "out-of-the-box." In this paper, we will discuss both the general Pydpiper framework and the various ways in which component modules can be pieced together to easily create new registration pipelines. This will include a discussion of the core principles motivating code development and a comparison of Pydpiper with other available toolkits. We also provide a comprehensive, line-by-line example to orient users with limited programming knowledge and highlight some of the most useful features of Pydpiper. In addition, we will present the four current applications of the code.

  12. Building Emergency Contraception Awareness among Adolescents. A Toolkit for Schools and Community-Based Organizations.

    Science.gov (United States)

    Simkin, Linda; Radosh, Alice; Nelsesteun, Kari; Silverstein, Stacy

    This toolkit presents emergency contraception (EC) as a method to help adolescent women avoid pregnancy and abortion after unprotected sexual intercourse. The sections of this toolkit are designed to help increase your knowledge of EC and stay up to date. They provide suggestions for increasing EC awareness in the workplace, whether it is a school…

  13. Toolkit of Available EPA Green Infrastructure Modeling Software. National Stormwater Calculator

    Science.gov (United States)

    This webinar will present a toolkit consisting of five EPA green infrastructure models and tools, along with communication material. This toolkit can be used as a teaching and quick reference resource for use by planners and developers when making green infrastructure implementat...

  14. Classification of Dermal Exposure Modifiers and Assignment of Values for a Risk Assessment Toolkit

    NARCIS (Netherlands)

    Goede, H.A.; Tijssen, S.C.H.A.; Schipper, H.J.; Warren, N.; Oppl, R.; Kalberlah, F.; Hemmen, J.J. van

    2003-01-01

    This paper describes how default dermal exposure values can be adjusted with modifier values for specific work situations. The work presented here is supplementary to a toolkit developed for the EU RISKOFDERM project. This toolkit is intended for the assessment and management of dermal risks in smal

  15. Language Access Toolkit: An Organizing and Advocacy Resource for Community-Based Youth Programs

    Science.gov (United States)

    Beyersdorf, Mark Ro

    2013-01-01

    Asian American Legal Defense and Education Fund (AALDEF) developed this language access toolkit to share the expertise and experiences of National Asian American Education Advocates Network (NAAEA) member organizations with other community organizations interested in developing language access campaigns. This toolkit includes an overview of…

  16. Practitioner Data Use in Schools: Workshop Toolkit. REL 2015-043

    Science.gov (United States)

    Bocala, Candice; Henry, Susan F.; Mundry, Susan; Morgan, Claire

    2014-01-01

    The "Practitioner Data Use in Schools: Workshop Toolkit" is designed to help practitioners systematically and accurately use data to inform their teaching practice. The toolkit includes an agenda, slide deck, participant workbook, and facilitator's guide and covers the following topics: developing data literacy, engaging in a cycle of…

  17. Toolkit for Evaluating Alignment of Instructional and Assessment Materials to the Common Core State Standards

    Science.gov (United States)

    Achieve, Inc., 2014

    2014-01-01

    In joint partnership, Achieve, The Council of Chief State School Officers, and Student Achievement Partners have developed a Toolkit for Evaluating the Alignment of Instructional and Assessment Materials to the Common Core State Standards. The Toolkit is a set of interrelated, freely available instruments for evaluating alignment to the CCSS; each…

  18. Scoping review of toolkits as a knowledge translation strategy in health

    OpenAIRE

    Barac, Raluca; Stein, Sherry; Bruce, Beth; Barwick, Melanie

    2014-01-01

    Background: Significant resources are invested in the production of research knowledge with the ultimate objective of integrating research evidence into practice. Toolkits are becoming increasingly popular as a knowledge translation (KT) strategy for disseminating health information, to build awareness, inform, and change public and healthcare provider behavior. Toolkits communicate messages aimed at improving health and changing practice to diverse audiences, including healthcare practitioner...

  19. Toolkit for a Workshop on Building a Culture of Data Use. REL 2015-063

    Science.gov (United States)

    Gerzon, Nancy; Guckenburg, Sarah

    2015-01-01

    The Culture of Data Use Workshop Toolkit helps school and district teams apply research to practice as they establish and support a culture of data use in their educational setting. The field-tested workshop toolkit guides teams through a set of structured activities to develop an understanding of data-use research in schools and to analyze…

  20. Capacity Building Indicators & Dissemination Strategies: Designing and Delivering Intensive Interventions--A Teacher's Toolkit

    Science.gov (United States)

    Center on Instruction, 2012

    2012-01-01

    This toolkit provides activities and resources to assist practitioners in designing and delivering intensive interventions in reading and mathematics for K-12 students with significant learning difficulties and disabilities. Grounded in research, this toolkit is based on the Center on Instruction's "Intensive Interventions for Students Struggling…

  1. TECA: A Parallel Toolkit for Extreme Climate Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Prabhat, Mr; Ruebel, Oliver; Byna, Surendra; Wu, Kesheng; Li, Fuyu; Wehner, Michael; Bethel, E. Wes

    2012-03-12

    We present TECA, a parallel toolkit for detecting extreme events in large climate datasets. Modern climate datasets expose parallelism across a number of dimensions: spatial locations, timesteps and ensemble members. We design TECA to exploit these modes of parallelism and demonstrate a prototype implementation for detecting and tracking three classes of extreme events: tropical cyclones, extra-tropical cyclones and atmospheric rivers. We process a modern TB-sized CAM5 simulation dataset with TECA, and demonstrate good runtime performance for the three case studies.
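The parallelism TECA exploits across timesteps can be sketched as a contiguous block partition of work units over ranks (a generic illustration of the decomposition idea, not TECA's API):

```python
def partition(n_items, n_ranks, rank):
    """Contiguous block partition of n_items work units (e.g. timesteps)
    across n_ranks, spreading any remainder over the first ranks."""
    base, extra = divmod(n_items, n_ranks)
    start = rank * base + min(rank, extra)
    size = base + (1 if rank < extra else 0)
    return range(start, start + size)

# 10 timesteps over 4 ranks -> block sizes 3, 3, 2, 2 covering 0..9 exactly once
blocks = [partition(10, 4, r) for r in range(4)]
```

The same partitioning applies along spatial locations or ensemble members, which is how the three modes of parallelism the abstract mentions compose.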

  2. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Science.gov (United States)

    Waltenberger, W.; Mitaroff, W.; Moser, F.; Pflugfelder, B.; Riedel, H. V.

    2008-07-01

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.
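The Kalman-filter fitting at RAVE's core can be illustrated with a scalar toy: each track contributes a measurement of the vertex position, folded sequentially into the running estimate (a 1D sketch of the update step, not RAVE's interface):

```python
def kalman_update(x, P, z, R):
    """One scalar Kalman update: fold measurement z (variance R)
    into the current estimate x (variance P)."""
    K = P / (P + R)                    # Kalman gain
    return x + K * (z - x), (1 - K) * P

def fit_vertex(measurements):
    """Sequentially filter track crossing points (z, R) into a vertex
    position estimate and its variance."""
    x, P = measurements[0]
    for z, R in measurements[1:]:
        x, P = kalman_update(x, P, z, R)
    return x, P

x, P = fit_vertex([(0.0, 1.0), (2.0, 1.0)])
# two equally uncertain measurements -> their mean, with half the variance
```

Robust variants (also offered by RAVE) replace this plain average-like update with one that down-weights outlier tracks.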

  3. RAVE-a Detector-independent vertex reconstruction toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, Wolfgang [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at; Mitaroff, Winfried; Moser, Fabian [Institute of High Energy Physics, Austrian Academy of Sciences A-1050 Vienna (Austria)

    2007-10-21

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state of the art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  4. The RAVE/VERTIGO vertex reconstruction toolkit and framework

    Energy Technology Data Exchange (ETDEWEB)

    Waltenberger, W; Mitaroff, W; Moser, F; Pflugfelder, B; Riedel, H V [Austrian Academy of Sciences, Institute of High Energy Physics, A-1050 Vienna (Austria)], E-mail: walten@hephy.oeaw.ac.at

    2008-07-15

    A detector-independent toolkit for vertex reconstruction (RAVE) is being developed, along with a standalone framework (VERTIGO) for testing, analyzing and debugging. The core algorithms represent state-of-the-art for geometric vertex finding and fitting by both linear (Kalman filter) and robust estimation methods. Main design goals are ease of use, flexibility for embedding into existing software frameworks, extensibility, and openness. The implementation is based on modern object-oriented techniques, is coded in C++ with interfaces for Java and Python, and follows an open-source approach. A beta release is available.

  5. TMVA - The Toolkit for Multivariate Data Analysis with ROOT

    CERN Document Server

    Höcker, Andreas; Stelzer, Jörg; Tegenfeldt, Fredrik; Voss, Helge

    2008-01-01

    Multivariate classification methods based on machine learning techniques have become a fundamental ingredient to most physics analyses. The classification techniques themselves have also significantly evolved in recent years. Statisticians have found new ways to tune and to combine classifiers to further gain in performance. Integrated into the analysis framework ROOT, TMVA is a toolkit offering a large variety of multivariate classification algorithms. TMVA manages the simultaneous training, testing and performance evaluation of all the classifiers with a user-friendly interface, and also steers the application of the trained classifiers to data.
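The workflow TMVA automates (simultaneous training and side-by-side evaluation of several classifiers on common samples) can be sketched with toy stand-in classifiers; nothing below uses TMVA or ROOT, and the data are synthetic.

```python
# Toy stand-ins for the TMVA workflow: train several classifiers on the same
# sample and evaluate them side by side on a common test set.
import random

random.seed(1)
signal = [random.gauss(+1.0, 0.5) for _ in range(200)]
background = [random.gauss(-1.0, 0.5) for _ in range(200)]
train_set = [(x, 1) for x in signal[:100]] + [(x, 0) for x in background[:100]]
test_set = [(x, 1) for x in signal[100:]] + [(x, 0) for x in background[100:]]

def fit_cut(sample):
    """Cut classifier: threshold halfway between the class means."""
    s = [x for x, y in sample if y == 1]
    b = [x for x, y in sample if y == 0]
    cut = 0.5 * (sum(s) / len(s) + sum(b) / len(b))
    return lambda x: 1 if x > cut else 0

def fit_nearest_mean(sample):
    """Assign each point to the class with the closer mean."""
    s = [x for x, y in sample if y == 1]
    b = [x for x, y in sample if y == 0]
    ms, mb = sum(s) / len(s), sum(b) / len(b)
    return lambda x: 1 if abs(x - ms) < abs(x - mb) else 0

def accuracy(clf, sample):
    return sum(clf(x) == y for x, y in sample) / len(sample)

# Simultaneous evaluation of all classifiers on the common test set.
results = {name: accuracy(fit(train_set), test_set)
           for name, fit in [("cut", fit_cut), ("nearest-mean", fit_nearest_mean)]}
```

The side-by-side `results` table is the point: a common training and test split makes classifier performances directly comparable.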

  6. GENFIT — a Generic Track-Fitting Toolkit

    Science.gov (United States)

    Rauch, Johannes; Schlüter, Tobias

    2015-05-01

    GENFIT is an experiment-independent track-fitting toolkit that combines fitting algorithms, track representations, and measurement geometries into a modular framework. We report on a significantly improved version of GENFIT, based on experience gained in the Belle II, P̄ANDA, and FOPI experiments. Improvements concern the implementation of additional track-fitting algorithms, enhanced implementations of Kalman fitters, enhanced visualization capabilities, and additional implementations of measurement types suited for various kinds of tracking detectors. The data model has been revised, allowing for efficient track merging, smoothing, residual calculation, alignment, and storage.

  7. A Machine Learning and Optimization Toolkit for the Swarm

    Science.gov (United States)

    2014-11-17

    [Only fragments of this DTIC record survive extraction: authors include Ilge Akkaya and Shuhei Emoto; dates covered 2014; surviving slide bullets describe supporting design by exploiting component-level interactions in the swarm and restoring the system-level roots.]

  8. Formal verification an essential toolkit for modern VLSI design

    CERN Document Server

    Seligman, Erik; Kumar, M V Achutha Kiran

    2015-01-01

    Formal Verification: An Essential Toolkit for Modern VLSI Design presents practical approaches for design and validation, with hands-on advice for working engineers integrating these techniques into their work. Building on a basic knowledge of System Verilog, this book demystifies FV and presents the practical applications that are bringing it into mainstream design and validation processes at Intel and other companies. The text prepares readers to effectively introduce FV in their organization and deploy FV techniques to increase design and validation productivity. Presents formal verific

  9. YAP. Yet another partial-wave-analysis toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Di Giglio, Paolo; Greenwald, Daniel; Rauch, Johannes [TUM, Munich (Germany)

    2016-07-01

    We present a new C++ library: YAP, the Yet Another Partial-wave-analysis toolkit. The library calculates amplitudes for multibody particle decays in several model frameworks. It is intended for the analysis of spin-0 heavy mesons, but is programmed with the flexibility to handle other decays. The library implements isobar decompositions, K-matrix formalism, and model-independent approaches for mass-dependent amplitudes; and both Wigner rotation and Zemach (for 3 particles) formalism for spin amplitudes. We introduce the software and give example use cases.
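As a flavor of the mass-dependent amplitudes such a toolkit evaluates, here is a hedged sketch of a fixed-width relativistic Breit-Wigner line shape, a common isobar building block; the resonance parameters are illustrative rho(770)-like values, and this is not the YAP API.

```python
# Fixed-width relativistic Breit-Wigner line shape, a common isobar
# building block in partial-wave amplitudes (illustrative, not YAP code).

def breit_wigner(s, m0, gamma0):
    """Complex amplitude at squared invariant mass s for mass m0, width gamma0."""
    return 1.0 / complex(m0 * m0 - s, -m0 * gamma0)

m0, g0 = 0.775, 0.149                            # rho(770)-like values, GeV
on_peak = abs(breit_wigner(m0 * m0, m0, g0))     # magnitude at the resonance
off_peak = abs(breit_wigner((m0 + 0.3) ** 2, m0, g0))  # magnitude off resonance
```

The magnitude peaks at the resonance mass and falls off away from it, which is the qualitative behavior a mass-dependent isobar amplitude contributes to the total decay amplitude.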

  10. Rich Internet Web Application Development using Google Web Toolkit

    Directory of Open Access Journals (Sweden)

    Niriksha Bhojaraj Kabbin

    2015-05-01

    Full Text Available Web applications in today's world have a great impact on businesses and are popular since they provide business benefits and are widely deployable. Developing efficient web applications with leading-edge web technologies that promise an upgraded user interface, greater scalability and interoperability, and improved performance and usability across different systems is a challenge. Google Web Toolkit (GWT) is one such framework that helps to build Rich Internet Applications (RIAs) and enables productive development of high-performance web applications. This paper aims to provide an effective solution to develop quality web-based applications with an added layer of security.

  11. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    Science.gov (United States)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (VV) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The VV effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.
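The first V&V pattern mentioned, checking a numerical result against a classical problem with a known analytic solution, can be sketched as follows; the integrator and tolerance are illustrative choices, not CSALT's.

```python
# V&V sketch: integrate y'' = -y over one period with classical RK4 and
# compare the numerical result against the analytic solution cos(2*pi) = 1.
import math

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for a first-order system y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k1)])
    k3 = f(t + h / 2, [yi + h / 2 * ki for yi, ki in zip(y, k2)])
    k4 = f(t + h, [yi + h * ki for yi, ki in zip(y, k3)])
    return [yi + h / 6 * (a + 2 * b + 2 * c + d)
            for yi, a, b, c, d in zip(y, k1, k2, k3, k4)]

def oscillator(t, y):
    """Harmonic oscillator y'' = -y written as a first-order system."""
    return [y[1], -y[0]]

n = 1000
h = 2 * math.pi / n
t, y = 0.0, [1.0, 0.0]
for i in range(n):
    y = rk4_step(oscillator, t, y, h)
    t = (i + 1) * h

error = abs(y[0] - 1.0)  # deviation from the known analytic solution
```

A pass/fail criterion is then simply a tolerance on `error`, which is how analytic-solution benchmarks are typically scored.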

  12. The Rome Laboratory Reliability Engineer’s Toolkit

    Science.gov (United States)

    1993-04-01

    [The abstract of this record did not survive extraction; only fragments remain of the Toolkit's design-topic tables (fiber-optic cable bend radius and tension ratings, guidance on draining water from equipment to prevent freezing, control of relative humidity) and a vendor address list.]

  13. A flexible open-source toolkit for lava flow simulations

    Science.gov (United States)

    Mossoux, Sophie; Feltz, Adelin; Poppe, Sam; Canters, Frank; Kervyn, Matthieu

    2014-05-01

    Lava flow hazard modeling is a useful tool for scientists and stakeholders confronted with imminent or long-term hazard from basaltic volcanoes. It can improve their understanding of the spatial distribution of volcanic hazard, influence their land use decisions and improve city evacuation during a volcanic crisis. Although a range of empirical, stochastic and physically-based lava flow models exists, these models are rarely available or require a large number of physical constraints. We present a GIS toolkit which models lava flow propagation from one or multiple eruptive vents, defined interactively on a Digital Elevation Model (DEM). It combines existing probabilistic (VORIS) and deterministic (FLOWGO) models in order to improve the simulation of lava flow spatial spread and terminal length. The toolkit is open-source and written in Python, which allows users to adapt the code to their needs and to combine the included models in different ways. The lava flow paths are determined based on the probabilistic steepest slope (VORIS model - Felpeto et al., 2001), which can be constrained in order to favour concentrated or dispersed flow fields. Moreover, the toolkit allows including a corrective factor so that the lava can overcome small topographical obstacles or pits. The lava flow terminal length can be constrained using a fixed length value, a Gaussian probability density function, or can be calculated from the thermo-rheological properties of the open-channel lava flow (FLOWGO model - Harris and Rowland, 2001). These slope-constrained properties allow estimating the velocity of the flow and its heat losses. The lava flow stops when its velocity is zero or the lava temperature reaches the solidus. Recent lava flows of Karthala volcano (Comoros islands) are used to demonstrate the quality of lava flow simulations with the toolkit, using a quantitative assessment of the match of the simulation with the real lava flows.
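The probabilistic steepest-slope idea (following the VORIS model) can be sketched as a weighted random descent on a DEM grid; this is a minimal illustration under simplifying assumptions, not the toolkit's code.

```python
# Weighted random steepest-descent on a DEM: from each cell, move to a lower
# neighbour with probability proportional to the height drop; stop at a local
# minimum. Minimal sketch of the VORIS-style idea, not the toolkit's code.
import random

def lava_path(dem, start, rng=None):
    rng = rng or random.Random(0)
    rows, cols = len(dem), len(dem[0])
    path = [start]
    r, c = start
    while True:
        drops = []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                rr, cc = r + dr, c + dc
                if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                    drop = dem[r][c] - dem[rr][cc]
                    if drop > 0:
                        drops.append(((rr, cc), drop))
        if not drops:
            return path              # local minimum: the flow stops
        x = rng.uniform(0, sum(d for _, d in drops))
        for cell, d in drops:        # weighted random choice of next cell
            x -= d
            if x <= 0:
                r, c = cell
                break
        else:
            r, c = drops[-1][0]      # numerical safety fallback
        path.append((r, c))

dem = [[4, 3, 2],
       [3, 2, 1],
       [2, 1, 0]]                    # toy DEM sloping toward (2, 2)
path = lava_path(dem, (0, 0))
```

Running many such stochastic paths from the same vent and accumulating cell visit counts yields the concentrated or dispersed flow fields the abstract describes.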

  14. Tips from the toolkit: 2--assessing organisational strengths.

    Science.gov (United States)

    Steer, Neville

    2010-03-01

    'SWOT' is a familiar term used in the development of business strategy. It is based on the identification of strengths, weaknesses, opportunities and threats as part of a strategic analysis approach. While there are a range of more sophisticated models for analysing and developing business strategy, it is a useful model for general practice as it is less time consuming than other approaches. The following article discusses some ways to apply this framework to assess organisational strengths (and weaknesses). It is based on The Royal Australian College of General Practitioners' "General practice management toolkit".

  15. The toolkit for multivariate data analysis TMVA 4

    CERN Document Server

    Speckmayer, P; Stelzer, J; Voss, H

    2010-01-01

    The toolkit for multivariate analysis, TMVA, provides a large set of advanced multivariate analysis techniques for signal/background classification. In addition, TMVA now also contains regression analysis, all embedded in a framework capable of handling the preprocessing of the data and the evaluation of the output, thus allowing a simple and convenient use of multivariate techniques. The analysis techniques implemented in TMVA can be invoked easily and the direct comparison of their performance allows the user to choose the most appropriate for a particular data analysis. This article gives an overview of the TMVA package and presents recently developed features.

  16. Evaluation of an Extension-Delivered Resource for Accelerating Progress in Childhood Obesity Prevention: The BEPA-Toolkit

    Science.gov (United States)

    Gunter, Katherine B.; Abi Nader, Patrick; Armington, Amanda; Hicks, John C.; John, Deborah

    2017-01-01

    The Balanced Energy Physical Activity Toolkit, or the BEPA-Toolkit, supports physical activity (PA) programming via Extension in elementary schools. In a pilot study, we evaluated the effectiveness of the BEPA-Toolkit as used by teachers through Supplemental Nutrition Assistance Program Education partnerships. We surveyed teachers (n = 57)…

  17. Obesity and Tobacco Cessation Toolkits: Practical Tips and Tools to Save Lives.

    Science.gov (United States)

    Crowe, Susan D; Gregg, Laurie C; DeFrancesco, Mark S

    2016-12-01

    Both obesity and smoking are public health burdens that together contribute to approximately one third of the deaths annually in the United States. In 2015, under the direction of Dr. Mark DeFrancesco, the American College of Obstetricians and Gynecologists convened two workgroups with the purpose of creating toolkits that bring together information that the obstetrician-gynecologist can use to address these preventable health problems. An Obesity Prevention and Treatment Workgroup and a Tobacco and Nicotine Cessation Workgroup developed toolkits on Obesity Prevention and Treatment (www.acog.org/ObesityToolkit) and Tobacco and Nicotine Cessation (www.acog.org/TobaccoToolkit). The toolkits contain specific talking points, counseling methods, and algorithms to address these health concerns in a supportive, efficient, and effective manner. By including these methods in practice, clinicians can help prevent the tragedy of early deaths caused by obesity, tobacco, and nicotine use.

  18. Experience with a distributed control system toolkit in a batch application; Erfahrungen mit einem Prozessleitsystem Toolkit in einer Batch-Anwendung

    Energy Technology Data Exchange (ETDEWEB)

    Eisenbach, B. [Bayer AG, Leverkusen (Germany); Hennecke, H. [Lang und Peitler Automation GmbH, Leverkusen (Germany)

    1999-07-01

    At Bayer AG structured automation solutions have been implemented for years on Simatic S5 systems using a software specially developed for this purpose. Due to the benefits gained during implementation and especially during commissioning as well as due to the uniform documentation, this structure was transferred to the new Siemens distributed control system PCS7 in form of the Bayer S7 ToolKit. The S7 ToolKit has been applied successfully in various projects. The following article describes the application of the Bayer S7 ToolKit with regard to basic functions and batch applications. (orig.)

  19. GPU-Based Volume Rendering of Noisy Multi-Spectral Astronomical Data

    CERN Document Server

    Hassan, Amr H; Barnes, David G

    2010-01-01

    Traditional analysis techniques may not be sufficient for astronomers to make the best use of the data sets that current and future instruments, such as the Square Kilometre Array and its Pathfinders, will produce. By utilizing the incredible pattern-recognition ability of the human mind, scientific visualization provides an excellent opportunity for astronomers to gain valuable new insight and understanding of their data, particularly when used interactively in 3D. The goal of our work is to establish the feasibility of a real-time 3D monitoring system for data going into the Australian SKA Pathfinder archive. Based on CUDA, an increasingly popular development tool, our work utilizes the massively parallel architecture of modern graphics processing units (GPUs) to provide astronomers with an interactive 3D volume rendering for multi-spectral data sets. Unlike other approaches, we are targeting real time interactive visualization of datasets larger than GPU memory while giving special attention to data with l...

  20. 9 CFR 315.1 - Carcasses and parts passed for cooking; rendering into lard or tallow.

    Science.gov (United States)

    2010-01-01

    ...; rendering into lard or tallow. 315.1 Section 315.1 Animals and Animal Products FOOD SAFETY AND INSPECTION... PARTS PASSED FOR COOKING § 315.1 Carcasses and parts passed for cooking; rendering into lard or tallow... subchapter or rendered into tallow, provided such rendering is done in the following manner: (a) When...

  1. Realistic Haptic Rendering of Interacting Deformable Objects in Virtual Environments

    CERN Document Server

    Duriez, Christian; Kheddar, Abderrahmane; Andriot, Claude

    2008-01-01

    A new computer haptics algorithm to be used in general interactive manipulations of deformable virtual objects is presented. In multimodal interactive simulations, haptic feedback computation often comes from contact forces. Subsequently, the fidelity of haptic rendering depends significantly on contact space modeling. Contact and friction laws between deformable models are often simplified in up-to-date methods. They do not allow a "realistic" rendering of the subtleties of contact space physical phenomena (such as slip and stick effects due to friction or mechanical coupling between contacts). In this paper, we use Signorini's contact law and Coulomb's friction law as a computer haptics basis. Real-time performance is made possible thanks to a linearization of the behavior in the contact space, formulated as the so-called Delassus operator, and iteratively solved by a Gauss-Seidel type algorithm. Dynamic deformation uses corotational global formulation to obtain the Delassus operator in which the mass and s...
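The contact-space solve described above can be sketched as a projected Gauss-Seidel iteration on the Delassus operator; the version below handles only frictionless unilateral (Signorini-type) contacts and uses made-up numbers, so it illustrates the solver family rather than the paper's full Signorini-Coulomb algorithm.

```python
# Projected Gauss-Seidel sketch for frictionless unilateral contacts: given
# the Delassus operator A and free-motion gaps b (negative = penetration),
# iterate for non-negative contact forces f so contacts separate or just touch.

def projected_gauss_seidel(A, b, iterations=100):
    n = len(b)
    f = [0.0] * n
    for _ in range(iterations):
        for i in range(n):
            residual = b[i] + sum(A[i][j] * f[j] for j in range(n) if j != i)
            f[i] = max(0.0, -residual / A[i][i])  # Signorini: forces only push
    return f

A = [[2.0, 0.5],
     [0.5, 2.0]]   # made-up symmetric positive-definite Delassus operator
b = [-1.0, -1.0]   # both contacts start penetrating
f = projected_gauss_seidel(A, b)
```

For this symmetric two-contact example the iteration converges to equal forces of 0.4 at each contact, satisfying A f + b = 0 with f >= 0.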

  2. Hybrid fur rendering: combining volumetric fur with explicit hair strands

    DEFF Research Database (Denmark)

    Andersen, Tobias Grønbeck; Falster, Viggo; Frisvad, Jeppe Revall

    2016-01-01

    Hair is typically modeled and rendered using either explicitly defined hair strand geometry or a volume texture of hair densities. Taken each on their own, these two hair representations have difficulties in the case of animal fur as it consists of very dense and thin undercoat hairs in combination with coarse guard hairs. Explicit hair strand geometry is not well-suited for the undercoat hairs, while volume textures are not well-suited for the guard hairs. To efficiently model and render both guard hairs and undercoat hairs, we present a hybrid technique that combines rasterization of explicitly defined guard hairs with ray marching of a prismatic shell volume with dynamic resolution. The latter is the key to practical combination of the two techniques, and it also enables a high degree of detail in the undercoat. We demonstrate that our hybrid technique creates a more detailed and soft fur...

  3. Chromium Renderserver: scalable and open remote rendering infrastructure.

    Science.gov (United States)

    Paul, Brian; Ahern, Sean; Bethel, E Wes; Brugger, Eric; Cook, Rich; Daniel, Jamison; Lewis, Ken; Owen, Jens; Southard, Dale

    2008-01-01

    Chromium Renderserver (CRRS) is software infrastructure that provides the ability for one or more users to run and view image output from unmodified, interactive OpenGL and X11 applications on a remote, parallel computational platform equipped with graphics hardware accelerators via industry-standard Layer 7 network protocols and client viewers. The new contributions of this work include a solution to the problem of synchronizing X11 and OpenGL command streams, remote delivery of parallel hardware-accelerated rendering, and a performance analysis of several different optimizations that are generally applicable to a variety of rendering architectures. CRRS is fully operational, Open Source software.

  5. Tactile display for virtual 3D shape rendering

    CERN Document Server

    Mansutti, Alessandro; Bordegoni, Monica; Cugini, Umberto

    2017-01-01

    This book describes a novel system for the simultaneous visual and tactile rendering of product shapes which allows designers to simultaneously touch and see new product shapes during the conceptual phase of product development. This system offers important advantages, including potential cost and time savings, compared with the standard product design process in which digital 3D models and physical prototypes are often repeatedly modified until an optimal design is achieved. The system consists of a tactile display that is able to represent, within a real environment, the shape of a product. Designers can explore the rendered surface by touching curves lying on the product shape, selecting those curves that can be considered style features and evaluating their aesthetic quality. In order to physically represent these selected curves, a flexible surface is modeled by means of servo-actuated modules controlling a physical deforming strip. The tactile display is designed so as to be portable, low cost, modular,...

  6. Binaural technology for e.g. rendering auditory virtual environments

    DEFF Research Database (Denmark)

    Hammershøi, Dorte

    2008-01-01

    … helped mediate the understanding that if the transfer functions could be mastered, then important dimensions of the auditory percept could also be controlled. He early understood the potential of using the HRTFs and numerical sound transmission analysis programs for rendering auditory virtual environments. Jens Blauert participated in many European cooperation projects exploring this field (and others), among others the SCATIS project addressing the auditory-tactile dimensions in the absence of visual information.

  7. Haptic Rendering Techniques for Non-Physical, Command Decision Support

    Science.gov (United States)

    2004-04-01

    [Only fragments of this DTIC record survive extraction. Recoverable text discusses tactile and haptic rendering techniques, noting that visualizing the battlefield usually implies maps and computer screens filled with information; the remainder is presentation residue on display modalities (2-D screens, 3-D stereo glasses, HMD, CAVE, audio, haptics) and a glossary entry defining haptic feedback as the sensation of weight or resistance in a virtual world.]

  8. Rendering Optical Effects Based on Spectra Representation in Complex Scenes

    OpenAIRE

    Dong, Weiming

    2006-01-01

    http://www.springerlink.com/; Rendering the structural color of natural objects or modern industrial products in a 3D environment is not possible with RGB-based graphics platforms and software, and it is very time consuming even with the most efficient spectra-representation-based methods previously proposed. Our framework allows computing full-spectra light-object interactions only when needed, i.e. for the part of the scene that requires simulating special spectra-sensitive phenomena. Ach...

  9. Anisotropic 3D texture synthesis with application to volume rendering

    DEFF Research Database (Denmark)

    Laursen, Lasse Farnung; Ersbøll, Bjarne Kjær; Bærentzen, Jakob Andreas

    2011-01-01

    We present a novel approach to improving volume rendering by using synthesized textures in combination with a custom transfer function. First, we use existing knowledge to synthesize anisotropic solid textures to fit our volumetric data. As input to the synthesis method, we acquire high quality … This method is applied to a high-quality visualization of a pig carcass, where samples of meat, bone, and fat have been used to produce the anisotropic 3D textures.

  10. Capturing, processing, and rendering real-world scenes

    Science.gov (United States)

    Nyland, Lars S.; Lastra, Anselmo A.; McAllister, David K.; Popescu, Voicu; McCue, Chris; Fuchs, Henry

    2000-12-01

    While photographs vividly capture a scene from a single viewpoint, it is our goal to capture a scene in such a way that a viewer can freely move to any viewpoint, just as he or she would in an actual scene. We have built a prototype system to quickly digitize a scene using a laser rangefinder and a high-resolution digital camera that accurately captures a panorama of high-resolution range and color information. With real-world scenes, we have provided data to fuel research in many area, including representation, registration, data fusion, polygonization, rendering, simplification, and reillumination. The real-world scene data can be used for many purposes, including immersive environments, immersive training, re-engineering and engineering verification, renovation, crime-scene and accident capture and reconstruction, archaeology and historic preservation, sports and entertainment, surveillance, remote tourism and remote sales. We will describe our acquisition system, the necessary processing to merge data from the multiple input devices and positions. We will also describe high quality rendering using the data we have collected. Issues about specific rendering accelerators and algorithms will also be presented. We will conclude by describing future uses and methods of collection for real- world scene data.

  11. Real-time rendering of optical effects using spatial convolution

    Science.gov (United States)

    Rokita, Przemyslaw

    1998-03-01

    Simulation of special effects such as: defocus effect, depth-of-field effect, raindrops or water film falling on the windshield, may be very useful in visual simulators and in all computer graphics applications that need realistic images of outdoor scenery. Those effects are especially important in rendering poor visibility conditions in flight and driving simulators, but can also be applied, for example, in composing computer graphics and video sequences--i.e. in Augmented Reality systems. This paper proposes a new approach to the rendering of those optical effects by iterative adaptive filtering using spatial convolution. The advantage of this solution is that the adaptive convolution can be done in real-time by existing hardware. Optical effects mentioned above can be introduced into the image computed using a conventional camera model by applying to the intensity of each pixel the convolution filter having an appropriate point spread function. The algorithms described in this paper can be easily implemented in the visualization pipeline--the final effect may be obtained by iterative filtering using a single hardware convolution filter or with a pipeline composed of identical 3 x 3 filters placed as the stages of this pipeline. Another advantage of the proposed solution is that the extension based on the proposed algorithm can be added to existing rendering systems as a final stage of the visualization pipeline.
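The iterative small-kernel filtering idea can be sketched in miniature: repeated 3x3 convolutions widen the effective point spread function, approximating a defocus blur. The kernel and image below are illustrative choices, not the paper's filters.

```python
# Iterative 3x3 convolution sketch: each pass widens the effective point
# spread function, approximating a defocus blur (illustrative kernel/image).

KERNEL = [[1, 2, 1],
          [2, 4, 2],
          [1, 2, 1]]   # binomial approximation of a small Gaussian
NORM = 16              # sum of the kernel weights

def convolve3x3(img):
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = 0.0
            for ky in range(3):
                for kx in range(3):
                    yy = min(max(y + ky - 1, 0), h - 1)  # clamp at borders
                    xx = min(max(x + kx - 1, 0), w - 1)
                    acc += KERNEL[ky][kx] * img[yy][xx]
            out[y][x] = acc / NORM
    return out

def defocus(img, passes):
    for _ in range(passes):
        img = convolve3x3(img)
    return img

img = [[0.0] * 5 for _ in range(5)]
img[2][2] = 1.0                      # a single bright pixel
blurred = defocus(img, 3)            # energy spreads outward each pass
```

Chaining identical small filters like this is exactly why the pipeline of 3x3 stages described above can emulate a wider point spread function in hardware.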

  12. High Performance GPU-Based Fourier Volume Rendering.

    Science.gov (United States)

    Abdellah, Marwan; Eldeib, Ayman; Sharawi, Amr

    2015-01-01

    Fourier volume rendering (FVR) is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N² log N) time complexity, it provides a faster alternative to spatial domain volume rendering algorithms that are O(N³) computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU) became an attractive competent platform that can deliver giant computational raw power compared to the central processing unit (CPU) on a per-dollar basis. The introduction of the compute unified device architecture (CUDA) technology enables embarrassingly-parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.
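The Fourier projection-slice theorem that FVR relies on can be demonstrated on a tiny 2D example: the 1D DFT of an image's projection equals the k_y = 0 slice of its 2D DFT. This is a pure-Python illustration, unrelated to the paper's CUDA implementation.

```python
# Projection-slice demo: the 1D DFT of the row-sum projection of an image
# equals the k_y = 0 row of the image's 2D DFT. Tiny pure-Python example.
import cmath

def dft1(x):
    """Plain O(n^2) one-dimensional discrete Fourier transform."""
    n = len(x)
    return [sum(x[j] * cmath.exp(-2j * cmath.pi * k * j / n) for j in range(n))
            for k in range(n)]

img = [[1, 2, 0, 1],
       [0, 3, 1, 2],
       [2, 1, 1, 0],
       [1, 0, 2, 3]]

# Spatial domain: collapse the image along y to get a 1D projection.
projection = [sum(img[y][x] for y in range(4)) for x in range(4)]

# Frequency domain: the 2D DFT evaluated on the slice k_y = 0.
slice_ky0 = [sum(img[y][x] * cmath.exp(-2j * cmath.pi * kx * x / 4)
                 for x in range(4) for y in range(4))
             for kx in range(4)]

difference = max(abs(a - b) for a, b in zip(dft1(projection), slice_ky0))
```

Because extracting a slice and doing a 1D inverse FFT is cheaper than integrating through the whole volume, this identity is what buys FVR its O(N² log N) projections.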

  13. High Performance GPU-Based Fourier Volume Rendering

    Directory of Open Access Journals (Sweden)

    Marwan Abdellah

    2015-01-01

    Full Text Available Fourier volume rendering (FVR is a significant visualization technique that has been used widely in digital radiography. As a result of its O(N2log⁡N time complexity, it provides a faster alternative to spatial domain volume rendering algorithms that are O(N3 computationally complex. Relying on the Fourier projection-slice theorem, this technique operates on the spectral representation of a 3D volume instead of processing its spatial representation to generate attenuation-only projections that look like X-ray radiographs. Due to the rapid evolution of its underlying architecture, the graphics processing unit (GPU became an attractive competent platform that can deliver giant computational raw power compared to the central processing unit (CPU on a per-dollar-basis. The introduction of the compute unified device architecture (CUDA technology enables embarrassingly-parallel algorithms to run efficiently on CUDA-capable GPU architectures. In this work, a high performance GPU-accelerated implementation of the FVR pipeline on CUDA-enabled GPUs is presented. This proposed implementation can achieve a speed-up of 117x compared to a single-threaded hybrid implementation that uses the CPU and GPU together by taking advantage of executing the rendering pipeline entirely on recent GPU architectures.

  14. High dynamic range (HDR) virtual bronchoscopy rendering for video tracking

    Science.gov (United States)

    Popa, Teo; Choi, Jae

    2007-03-01

    In this paper, we present the design and implementation of a new rendering method based on high dynamic range (HDR) lighting and exposure control. This rendering method is applied to create video images for a 3D virtual bronchoscopy system. One of the main optical parameters of a bronchoscope's camera is the sensor exposure. The exposure adjustment is needed since the dynamic range of most digital video cameras is narrower than the high dynamic range of real scenes. The dynamic range of a camera is defined as the ratio of the brightest point of an image to the darkest point of the same image where details are present. In a video camera exposure is controlled by shutter speed and the lens aperture. To create the virtual bronchoscopic images, we first rendered a raw image in absolute units (luminance); then, we simulated exposure by mapping the computed values to the values appropriate for video-acquired images using a tone mapping operator. We generated several images with HDR and others with low dynamic range (LDR), and then compared their quality by applying them to a 2D/3D video-based tracking system. We conclude that images with HDR are closer to real bronchoscopy images than those with LDR, and thus, that HDR lighting can improve the accuracy of image-based tracking.
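The exposure-plus-tone-mapping step can be sketched with a simple global operator; the Reinhard-style mapping L/(1+L) below is a stand-in assumption (the paper does not specify its operator here), and the exposure value is invented.

```python
# Exposure + global tone mapping sketch: scale absolute luminance by an
# exposure factor, then compress with L/(1+L) into the displayable range.
# (Reinhard-style operator as a stand-in; exposure value is invented.)

def tone_map(luminances, exposure=0.5):
    out = []
    for L in luminances:
        Ls = exposure * L            # simulate the camera's exposure control
        out.append(Ls / (1.0 + Ls))  # compress HDR luminance into [0, 1)
    return out

hdr = [0.01, 1.0, 100.0, 10000.0]    # scene luminances spanning six decades
ldr = tone_map(hdr)
```

Varying `exposure` shifts which part of the luminance range keeps detail, mimicking the shutter-speed and aperture control the abstract describes.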

  15. Windows .NET Network Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST

    Directory of Open Access Journals (Sweden)

    Oliver Melvin J

    2005-04-01

    Full Text Available Abstract Background BLAST is one of the most common and useful tools for Genetic Research. This paper describes a software application we have termed Windows .NET Distributed Basic Local Alignment Search Toolkit (W.ND-BLAST), which enhances the BLAST utility by improving usability, fault recovery, and scalability in a Windows desktop environment. Our goal was to develop an easy to use, fault tolerant, high-throughput BLAST solution that incorporates a comprehensive BLAST result viewer with curation and annotation functionality. Results W.ND-BLAST is a comprehensive Windows-based software toolkit that targets researchers, including those with minimal computer skills, and provides the ability to increase the performance of BLAST by distributing BLAST queries to any number of Windows-based machines across local area networks (LAN). W.ND-BLAST provides intuitive Graphic User Interfaces (GUI) for BLAST database creation, BLAST execution, BLAST output evaluation and BLAST result exportation. This software also provides several layers of fault tolerance and fault recovery to prevent loss of data if nodes or master machines fail. This paper lays out the functionality of W.ND-BLAST. W.ND-BLAST displays close to 100% performance efficiency when distributing tasks to 12 remote computers of the same performance class. A high-throughput BLAST job which took 662.68 minutes (11 hours) on one average machine was completed in 44.97 minutes when distributed to 17 nodes, which included lower performance class machines. Finally, there are comprehensive high-throughput BLAST Output Viewer (BOV) and Annotation Engine components, which provide comprehensive exportation of BLAST hits to text files, annotated fasta files, tables, or association files. Conclusion W.ND-BLAST provides an interactive tool that allows scientists to easily utilize their available computing resources for high-throughput and comprehensive sequence analyses. The install package for W.ND-BLAST is
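The core distribution step, partitioning a batch of BLAST queries across worker nodes, can be sketched as round-robin chunking; this is an illustration of the idea, not W.ND-BLAST's scheduler.

```python
# Round-robin distribution sketch: split query sequences into one chunk per
# worker node (illustrative only, not W.ND-BLAST's code).

def chunk_queries(queries, n_nodes):
    """Assign queries to nodes in round-robin order."""
    chunks = [[] for _ in range(n_nodes)]
    for i, q in enumerate(queries):
        chunks[i % n_nodes].append(q)
    return chunks

queries = ["seq%d" % i for i in range(10)]
chunks = chunk_queries(queries, 3)   # e.g. three worker machines
```

Because BLAST queries are independent, near-equal chunks like these are what make the close-to-linear speed-up reported above plausible.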

  16. Partition of biocides between water and inorganic phases of renders with organic binder

    DEFF Research Database (Denmark)

    Urbanczyk, Michal M; Bollmann, Ulla E; Bester, Kai

    2016-01-01

    , the partition of biocides between water and inorganic phases of render with organic binder was investigated. The partition constants of carbendazim, diuron, iodocarb, isoproturon, cybutryn (irgarol), octylisothiazolinone, terbutryn, and tebuconazole towards minerals typically used in renders, e.g. barite...... with render-water distribution constants of two artificially made renders showed that the distribution constants can be estimated based on partition constants of compounds for individual components of the render....
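    The estimation idea in the conclusion, deriving a whole-render distribution constant from the partition constants of the render's individual mineral components, can be sketched as a mass-fraction-weighted sum (an illustrative assumption; the paper's exact estimation procedure is not reproduced here, and the function name and example values are hypothetical):

    ```python
    def render_distribution_constant(mineral_kd, mass_fractions):
        """Estimate a render-water distribution constant (L/kg) as the
        mass-fraction-weighted sum of the partition constants of the
        render's individual mineral components.

        mineral_kd:     {mineral name: partition constant (L/kg)}
        mass_fractions: {mineral name: mass fraction of the dry render}
        """
        return sum(mass_fractions[m] * kd for m, kd in mineral_kd.items())

    # Hypothetical two-component render: half barite, half calcite.
    kd_estimate = render_distribution_constant(
        {"barite": 2.0, "calcite": 1.0},
        {"barite": 0.5, "calcite": 0.5},
    )
    ```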

  17. Are CEOs Expected Utility Maximizers?

    OpenAIRE

    John List; Charles Mason

    2009-01-01

    Are individuals expected utility maximizers? This question represents much more than academic curiosity. In a normative sense, at stake are the fundamental underpinnings of the bulk of the last half-century's models of choice under uncertainty. From a positive perspective, the ubiquitous use of benefit-cost analysis across government agencies renders the expected utility maximization paradigm literally the only game in town. In this study, we advance the literature by exploring CEO's preferen...

  18. Implementing a user-driven online quality improvement toolkit for cancer care.

    Science.gov (United States)

    Luck, Jeff; York, Laura S; Bowman, Candice; Gale, Randall C; Smith, Nina; Asch, Steven M

    2015-05-01

    Peer-to-peer collaboration within integrated health systems requires a mechanism for sharing quality improvement lessons. The Veterans Health Administration (VA) developed online compendia of tools linked to specific cancer quality indicators. We evaluated awareness and use of the toolkits, variation across facilities, impact of social marketing, and factors influencing toolkit use. A diffusion of innovations conceptual framework guided the collection of user activity data from the Toolkit Series SharePoint site and an online survey of potential Lung Cancer Care Toolkit users. The VA Toolkit Series site had 5,088 unique visitors in its first 22 months; 5% of users accounted for 40% of page views. Social marketing communications were correlated with site usage. Of survey respondents (n = 355), 54% had visited the site, of whom 24% downloaded at least one tool. Respondents' awareness of the lung cancer quality performance of their facility, and facility participation in quality improvement collaboratives, were positively associated with Toolkit Series site use. Facility-level lung cancer tool implementation varied widely across tool types. The VA Toolkit Series achieved widespread use and a high degree of user engagement, although use varied widely across facilities. The most active users were aware of and active in cancer care quality improvement. Toolkit use seemed to be reinforced by other quality improvement activities. A combination of user-driven tool creation and centralized toolkit development seemed to be effective for leveraging health information technology to spread disease-specific quality improvement tools within an integrated health care system. Copyright © 2015 by American Society of Clinical Oncology.

  19. Overview and Meteorological Validation of the Wind Integration National Dataset toolkit

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, C. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hodge, B. M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Clifton, A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); McCaa, J. [3TIER by Vaisala, Seattle, WA (United States)

    2015-04-13

    The Wind Integration National Dataset (WIND) Toolkit described in this report is a state-of-the-art national wind resource data set covering the contiguous United States from 2007 to 2013, for use in a variety of next-generation wind integration analyses and wind power planning. The toolkit comprises a wind resource data set, a wind forecast data set, and a wind power production and forecast data set derived from the Weather Research and Forecasting (WRF) numerical weather prediction model. WIND Toolkit data are available online for over 116,000 land-based and 10,000 offshore sites representing existing and potential wind facilities.

  20. A Fast Ray-Tracing Using Bounding Spheres and Frustum Rays for Dynamic Scene Rendering

    Science.gov (United States)

    Suzuki, Ken-Ichi; Kaeriyama, Yoshiyuki; Komatsu, Kazuhiko; Egawa, Ryusuke; Ohba, Nobuyuki; Kobayashi, Hiroaki

    Ray tracing is one of the most popular techniques for generating photo-realistic images. Extensive research and development work has made interactive rendering of static scenes practical. This paper deals with interactive dynamic scene rendering, in which not only the eye point but also the objects in the scene change their 3D locations every frame. In order to realize interactive dynamic scene rendering, RTRPS (Ray Tracing based on Ray Plane and Bounding Sphere), which utilizes the coherency in rays, objects, and grouped rays, is introduced. RTRPS uses bounding spheres as the spatial data structure, which exploits the coherency in objects. By using bounding spheres, RTRPS can ignore the rotation of moving objects within a sphere and shorten the update time between frames. RTRPS utilizes the coherency in rays by merging rays into a ray-plane, assuming that the secondary rays and shadow rays are shot through an aligned grid. Since a pair of ray-planes shares an original ray, the intersection for the ray can be completed using the coherency in the ray-planes. Because of these three kinds of coherency, RTRPS can significantly reduce the number of intersection tests for ray tracing. Further acceleration techniques for ray-plane-sphere and ray-triangle intersection are also presented. A parallel projection technique converts a 3D vector inner product operation into a 2D operation and reduces the number of floating-point operations. Techniques based on frustum culling and binary-tree-structured ray-planes optimize the order of intersection tests between ray-planes and a sphere, resulting in a 50% to 90% reduction of intersection tests. Two ray-triangle intersection techniques are also introduced, which are effective when a large number of rays are packed into a ray-plane. Our performance evaluations indicate that RTRPS gives a 13 to 392 times speedup in comparison with a ray tracing algorithm without organized rays and spheres. We found that RTRPS also provides competitive
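    The bounding-sphere culling that RTRPS builds on reduces, at its core, to a ray-sphere intersection test. A minimal sketch of that generic test (textbook ray tracing math, not the paper's optimized ray-plane variant):

    ```python
    import math

    def ray_sphere_hit(origin, direction, center, radius):
        """Return the nearest non-negative hit parameter t for the ray
        p(t) = origin + t * direction against a sphere, or None on a miss.
        Solves |p(t) - center|^2 = radius^2 as a quadratic in t."""
        ox, oy, oz = origin
        dx, dy, dz = direction
        lx, ly, lz = center[0] - ox, center[1] - oy, center[2] - oz
        a = dx * dx + dy * dy + dz * dz          # |d|^2
        b = lx * dx + ly * dy + lz * dz          # L . d
        c = lx * lx + ly * ly + lz * lz - radius * radius
        disc = b * b - a * c
        if disc < 0.0:
            return None                          # ray misses the sphere
        t = (b - math.sqrt(disc)) / a            # nearer root
        if t < 0.0:
            t = (b + math.sqrt(disc)) / a        # origin inside the sphere
        return t if t >= 0.0 else None
    ```

    A bounding-volume hierarchy would call a test like this before attempting any per-triangle intersections inside the sphere.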

  1. A toolkit for MSDs prevention--WHO and IEA context.

    Science.gov (United States)

    Caple, David C

    2012-01-01

    Many simple MSD risk management tools have been developed by ergonomists for use by workers and employers with little or no training to undertake injury prevention programs in their workplace. However, currently there is no "toolkit" which places such tools within a holistic, participative ergonomics framework and provides guidance on how best to use individual tools. It is proposed that such a holistic approach should entail initial analysis and evaluation of underlying systems of work and related health and performance indicators, before focusing on assessment of MSD risks stemming from particular hazards. Depending on the context, more narrowly focused tools might then be selected to assess risk associated with jobs or tasks identified as problematic. This approach ensures that biomechanical risk factors are considered within a broad context of organizational and psychosocial risk factors. This is consistent with current research evidence on work-related causes of MSDs.

  2. The interactive learning toolkit: technology and the classroom

    Science.gov (United States)

    Lukoff, Brian; Tucker, Laura

    2011-04-01

    Peer Instruction (PI) and Just-in-Time-Teaching (JiTT) have been shown to increase both students' conceptual understanding and problem-solving skills. However, the time investment for the instructor to prepare appropriate conceptual questions and manage student JiTT responses is one of the main implementation hurdles. To overcome this we have developed the Interactive Learning Toolkit (ILT), a course management system specifically designed to support PI and JiTT. We are working to integrate the ILT with a fully interactive classroom system where students can use their laptops and smartphones to respond to ConcepTests in class. The goal is to use technology to engage students in conceptual thinking both in and out of the classroom.

  3. HemI: a toolkit for illustrating heatmaps.

    Directory of Open Access Journals (Sweden)

    Wankun Deng

    Full Text Available Recent high-throughput techniques have generated a flood of biological data in all aspects. The transformation and visualization of multi-dimensional and numerical gene or protein expression data in a single heatmap can provide a concise but comprehensive presentation of molecular dynamics under different conditions. In this work, we developed an easy-to-use tool named HemI (Heat map Illustrator), which can visualize either gene or protein expression data in heatmaps. Additionally, the heatmaps can be recolored, rescaled or rotated in a customized manner. In addition, HemI provides multiple clustering strategies for analyzing the data. Publication-quality figures can be exported directly. We propose that HemI can be a useful toolkit for conveniently visualizing and manipulating heatmaps. The stand-alone packages of HemI were implemented in Java and can be accessed at http://hemi.biocuckoo.org/down.php.

  4. HemI: a toolkit for illustrating heatmaps.

    Science.gov (United States)

    Deng, Wankun; Wang, Yongbo; Liu, Zexian; Cheng, Han; Xue, Yu

    2014-01-01

    Recent high-throughput techniques have generated a flood of biological data in all aspects. The transformation and visualization of multi-dimensional and numerical gene or protein expression data in a single heatmap can provide a concise but comprehensive presentation of molecular dynamics under different conditions. In this work, we developed an easy-to-use tool named HemI (Heat map Illustrator), which can visualize either gene or protein expression data in heatmaps. Additionally, the heatmaps can be recolored, rescaled or rotated in a customized manner. In addition, HemI provides multiple clustering strategies for analyzing the data. Publication-quality figures can be exported directly. We propose that HemI can be a useful toolkit for conveniently visualizing and manipulating heatmaps. The stand-alone packages of HemI were implemented in Java and can be accessed at http://hemi.biocuckoo.org/down.php.
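    The core operation of any heatmap illustrator, rescaling numeric values onto a color gradient, can be sketched as follows (a generic illustration in Python; HemI itself is a Java application and its gradients and clustering are configurable):

    ```python
    def heatmap_colors(matrix, low=(0, 0, 255), high=(255, 0, 0)):
        """Map each value in a 2D matrix to an (r, g, b) color by linearly
        rescaling the matrix range onto a low-to-high gradient (here,
        blue to red)."""
        flat = [v for row in matrix for v in row]
        lo, hi = min(flat), max(flat)
        span = (hi - lo) or 1.0          # avoid division by zero for flat data

        def mix(v):
            t = (v - lo) / span          # 0.0 at the minimum, 1.0 at the maximum
            return tuple(round((1 - t) * a + t * b) for a, b in zip(low, high))

        return [[mix(v) for v in row] for row in matrix]
    ```

    Tools like HemI layer clustering, labels, and export on top of exactly this kind of value-to-color mapping.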

  5. An expanded nuclear phylogenomic PCR toolkit for Sapindales1

    Science.gov (United States)

    Collins, Elizabeth S.; Gostel, Morgan R.; Weeks, Andrea

    2016-01-01

    Premise of the study: We tested PCR amplification of 91 low-copy nuclear gene loci in taxa from Sapindales using primers developed for Bursera simaruba (Burseraceae). Methods and Results: Cross-amplification of these markers among 10 taxa tested was related to their phylogenetic distance from B. simaruba. On average, each Sapindalean taxon yielded product for 53 gene regions (range: 16–90). Arabidopsis thaliana (Brassicales), by contrast, yielded product for two. Single representatives of Anacardiaceae and Rutaceae yielded 34 and 26 products, respectively. Twenty-six primer pairs worked for all Burseraceae species tested if highly divergent Aucoumea klaineana is excluded, and eight of these amplified product in every Sapindalean taxon. Conclusions: Our study demonstrates that customized primers for Bursera can amplify product in a range of Sapindalean taxa. This collection of primer pairs, therefore, is a valuable addition to the toolkit for nuclear phylogenomic analyses of Sapindales and warrants further investigation. PMID:28101434

  6. Globus Toolkit Version 4: Software for Service-Oriented Systems

    Institute of Scientific and Technical Information of China (English)

    Ian Foster

    2006-01-01

    The Globus Toolkit (GT) has been developed since the late 1990s to support the development of service-oriented distributed computing applications and infrastructures. Core GT components address, within a common framework, fundamental issues relating to security, resource access, resource management, data movement, resource discovery, and so forth. These components enable a broader "Globus ecosystem" of tools and components that build on, or interoperate with, GT functionality to provide a wide range of useful application-level functions. These tools have in turn been used to develop a wide range of both "Grid" infrastructures and distributed applications. I summarize here the principal characteristics of the recent Web Services-based GT4 release, which provides significant improvements over previous releases in terms of robustness, performance, usability, documentation, standards compliance, and functionality. I also introduce the new "dev.globus" community development process, which allows a larger community to contribute to the development of Globus software.

  7. Integrating surgical robots into the next medical toolkit.

    Science.gov (United States)

    Lai, Fuji; Entin, Eileen

    2006-01-01

    Surgical robots hold much promise for revolutionizing the field of surgery and improving surgical care. However, despite the potential advantages they offer, there are multiple barriers to adoption and integration into practice that may prevent these systems from realizing their full potential benefit. This study elucidated some of the most salient considerations that need to be addressed for integration of new technologies such as robotic systems into the operating room of the future as it evolves into a complex system of systems. We conducted in-depth interviews with operating room team members and other stakeholders to identify potential barriers in areas of workflow, teamwork, training, clinical acceptance, and human-system interaction. The findings of this study will inform an approach for the design and integration of robotics and related computer-assisted technologies into the next medical toolkit for "computer-enhanced surgery" to improve patient safety and healthcare quality.

  8. A personal health information toolkit for health intervention research.

    Science.gov (United States)

    Kizakevich, Paul N; Eckhoff, Randall; Weger, Stacey; Weeks, Adam; Brown, Janice; Bryant, Stephanie; Bakalov, Vesselina; Zhang, Yuying; Lyden, Jennifer; Spira, James

    2014-01-01

    With the emergence of mobile health (mHealth) apps, there is a growing demand for better tools for developing and evaluating mobile health interventions. Recently we developed the Personal Health Intervention Toolkit (PHIT), a software framework which eases app implementation and facilitates scientific evaluation. PHIT integrates self-report and physiological sensor instruments, evidence-based advisor logic, and self-help interventions such as meditation, health education, and cognitive behavior change. PHIT can be used to facilitate research, interventions for chronic diseases, risky behaviors, sleep, medication adherence, environmental monitoring, momentary data collection, health screening, and clinical decision support. In a series of usability evaluations, participants reported an overall usability score of 4.5 on a 1-5 Likert scale and a score of 85 on the System Usability Scale, corresponding to a percentile rank of 95%.

  9. Managing Fieldwork Data with Toolbox and the Natural Language Toolkit

    Directory of Open Access Journals (Sweden)

    Stuart Robinson

    2007-06-01

    Full Text Available This paper shows how fieldwork data can be managed using the program Toolbox together with the Natural Language Toolkit (NLTK) for the Python programming language. It provides background information about Toolbox and describes how it can be downloaded and installed. The basic functionality of the program for lexicons and texts is described, and its strengths and weaknesses are reviewed. Its underlying data format is briefly discussed, and the Toolbox processing capabilities of NLTK are introduced, showing ways in which it can be used to extend the functionality of Toolbox. This is illustrated with a few simple scripts that demonstrate basic data management tasks relevant to language documentation, such as printing out the contents of a lexicon as HTML.
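    The kind of task described, reading a backslash-coded Toolbox lexicon and printing it as HTML, can be sketched even without NLTK. The `\lx`, `\ps`, and `\ge` markers follow common MDF lexicon conventions; the sample record and helper names below are illustrative, not the paper's actual scripts:

    ```python
    def parse_toolbox_lexicon(text, record_marker="lx"):
        """Minimal parser for Toolbox's backslash-coded "standard format"
        files. Each line starts with a marker such as \\lx (lexeme),
        \\ps (part of speech), or \\ge (gloss); a new \\lx line begins
        a new record. Returns a list of {marker: [values]} dicts."""
        records, current = [], None
        for line in text.splitlines():
            if not line.startswith("\\"):
                continue                       # skip blank/continuation lines
            marker, _, value = line[1:].partition(" ")
            if marker == record_marker:
                current = {}
                records.append(current)
            if current is not None:
                current.setdefault(marker, []).append(value.strip())
        return records

    def lexicon_to_html(records):
        """Render parsed records as a simple HTML list, echoing the
        paper's lexicon-as-HTML demonstration."""
        items = []
        for rec in records:
            lx = ", ".join(rec.get("lx", []))
            ge = ", ".join(rec.get("ge", []))
            items.append(f"<li><b>{lx}</b>: {ge}</li>")
        return "<ul>\n" + "\n".join(items) + "\n</ul>"
    ```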

  10. Globus Toolkit Support for Distributed Data—Intensive Science

    Institute of Scientific and Technical Information of China (English)

    W. Alcock; A. Chervenak; et al.

    2001-01-01

    In high-energy physics, terabyte and soon petabyte-scale data collections are emerging as critical community resources. A new class of "Data Grid" infrastructure is required to support distributed access to and analysis of these datasets by potentially thousands of users. Data Grid technology is being deployed in numerous experiments through collaborations such as the EU DataGrid, the Grid Physics Network, and the Particle Physics Data Grid [1]. The Globus Toolkit is a widely used set of services designed to support the creation of these Grid infrastructures and applications. In this paper we survey the Globus technologies that will play a major role in the development and deployment of these Grids.

  11. Upgrading the safety toolkit: Initiatives of the accident analysis subgroup

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K.R.; Chung, D.Y.

    1999-07-01

    Since its inception, the Accident Analysis Subgroup (AAS) of the Energy Facility Contractors Group (EFCOG) has been a leading organization promoting development and application of appropriate methodologies for safety analysis of US Department of Energy (DOE) installations. The AAS, one of seven subgroups chartered by the EFCOG Safety Analysis Working Group, has performed an oversight function and provided direction to several technical groups. These efforts have been instrumental in the formal evaluation of computer models, in improving the pedigree of high-use computer models, and in development of the user-friendly Accident Analysis Guidebook (AAG). Together they have improved the analytical toolkit for complying with the DOE orders and standards that shape safety analysis reports (SARs) and related documentation. Major support for these objectives has come through DOE/DP-45.

  12. Migration of 1970s Minicomputer Controls to Modern Toolkit Software

    Energy Technology Data Exchange (ETDEWEB)

    Juras, R.C.; Meigs, M.J.; Sinclair, J.A.; Tatum, B.A.

    1999-11-13

    Controls for accelerators and associated systems at the Holifield Radioactive Ion Beam Facility (HRIBF) at Oak Ridge National Laboratory have been migrated from 1970s-vintage minicomputers to a modern system based on Vista and EPICS toolkit software. Stability and capabilities of EPICS software have motivated increasing use of EPICS for accelerator controls. In addition, very inexpensive subsystems based on EPICS and the EPICS portable CA server running on Linux PCs have been implemented to control an ion source test facility and to control a building-access badge reader system. A new object-oriented, extensible display manager has been developed for EPICS to facilitate the transition to EPICS and will be used in place of MEDM. EPICS device support has been developed for CAMAC serial highway controls.

  13. A survey on hair modeling: styling, simulation, and rendering.

    Science.gov (United States)

    Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C

    2007-01-01

    Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.

  14. Interactive View-Dependent Rendering of Large Isosurfaces

    Energy Technology Data Exchange (ETDEWEB)

    Gregorski, B; Duchaineau, M; Lindstrom, P; Pascucci, V; Joy, K I

    2002-11-19

    We present an algorithm for interactively extracting and rendering isosurfaces of large volume datasets in a view-dependent fashion. A recursive tetrahedral mesh refinement scheme, based on longest edge bisection, is used to hierarchically decompose the data into a multiresolution structure. This data structure allows fast extraction of arbitrary isosurfaces to within user specified view-dependent error bounds. A data layout scheme based on hierarchical space filling curves provides access to the data in a cache coherent manner that follows the data access pattern indicated by the mesh refinement.
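    The refinement primitive behind this scheme, longest-edge bisection, is easy to state. Here is a 2D triangle version for intuition (the paper applies the analogous split to tetrahedra; this sketch is illustrative, not the authors' implementation):

    ```python
    import math

    def bisect_longest_edge(tri):
        """Split a 2D triangle across the midpoint of its longest edge,
        returning the two child triangles. Repeated application yields
        the kind of nested refinement hierarchy used for multiresolution
        meshes."""
        a, b, c = tri
        # Enumerate (edge, opposite vertex) pairs and pick the longest edge.
        edges = [((a, b), c), ((b, c), a), ((c, a), b)]
        (p, q), apex = max(edges, key=lambda e: math.dist(*e[0]))
        mid = ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)
        return [(p, mid, apex), (mid, q, apex)]
    ```

    Each split introduces one new vertex at the midpoint, which is what lets view-dependent schemes refine selectively where the error bound demands it.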

  15. Software System for Vocal Rendering of Printed Documents

    Directory of Open Access Journals (Sweden)

    Marian DARDALA

    2008-01-01

    Full Text Available The objective of this paper is to present a software system architecture developed to render printed documents in vocal form. The paper also describes the software solutions that exist as software components and are necessary for document processing, as well as for controlling the multimedia devices used by the system. This system is useful for people with visual disabilities, who can access the contents of documents without the documents having to be printed in Braille or recorded in audio form.

  16. An example of quantum imaging: rendering an object undetectable

    CERN Document Server

    Ataman, Stefan

    2016-01-01

    In this paper we propose and analyse a Gedankenexperiment involving three non-linear crystals and two objects inserted in the idler beams. We show that, besides the behaviour that can be extrapolated from previous experiments involving two crystals and one object, we are able to predict a new effect: under certain circumstances, one of the objects can be rendered undetectable to any single detection rate on the signal photons with discarded idler photons. This effect could find applications in future developments of quantum imaging techniques.

  17. An example of quantum imaging: rendering an object undetectable

    Science.gov (United States)

    Ataman, Stefan

    2016-06-01

    In this paper we propose and analyse a Gedankenexperiment involving three non-linear crystals and two objects inserted in the idler beams. We show that, besides the behaviour that can be extrapolated from previous experiments involving two crystals and one object, we are able to predict a new effect: under certain circumstances, one of the objects can be rendered undetectable to any single detection rate on the signal photons with discarded idler photons. This effect could find applications in future developments of quantum imaging techniques.

  18. Horse-shoe lung-rediscovered via volume rendered images

    Directory of Open Access Journals (Sweden)

    Alpa Bharati

    2013-01-01

    Full Text Available Horseshoe lung, usually associated with pulmonary venolobar syndrome, is a rare congenital anomaly involving the fusion of the postero-basal segments of the right and left lungs across the midline. The fused segment, or isthmus, lies posterior to the pericardium and anterior to the aorta. The associated pulmonary venolobar syndrome involves anomalous systemic arterial supply and anomalous systemic venous drainage of the right lung. With the advent of MDCT imaging, we can diagnose this rare condition, as well as all its associated anomalies, non-invasively. Volume-rendered techniques greatly simplify the complex anatomy and provide easy understanding of it.

  19. Partitioning of biocides between water and inorganic phases of render

    DEFF Research Database (Denmark)

    Urbanczyk, Michal; Bollmann, Ulla E.; Bester, Kai

    The use of biocides as additives for building materials has gained importance in recent years. These biocides are, e.g., applied to renders and paints to prevent them from microbial spoilage. However, these biocides can leach out into the environment. In order to better understand this leaching...... compared. The partitioning constants for calcium carbonate varied between 0.1 (isoproturon) and 1.1 (iodocarb) and 84.6 (dichlorooctylisothiazolinone), respectively. The results for barite, kaolinite and mica were in a similar range and usually the compounds with high partitioning constants for one mineral...

  20. Practical rendering and computation with Direct3D 11

    CERN Document Server

    Zink, Jason; Hoxley, Jack

    2011-01-01

    Practical Rendering and Computation with Direct3D 11 packs in documentation and in-depth coverage of basic and high-level concepts related to using Direct3D 11, and is a top pick for any serious programming collection. … perfect for a wide range of users. Anyone interested in computation and multicore models will find this packed with examples and technical applications. -Midwest Book Review, October 2011. The authors have generously provided us with an optimal blend of concepts and philosophy, illustrative figures to clarify the more difficult points, and source code fragments to make the ideas con

  1. State of the Art in Transfer Functions for Direct Volume Rendering

    KAUST Repository

    Ljung, Patric

    2016-07-04

    A central topic in scientific visualization is the transfer function (TF) for volume rendering. The TF serves a fundamental role in translating scalar and multivariate data into color and opacity to express and reveal the relevant features present in the data studied. Beyond this core functionality, TFs also serve as a tool for encoding and utilizing domain knowledge and as an expression for visual design of material appearances. TFs also enable interactive volumetric exploration of complex data. The purpose of this state-of-the-art report (STAR) is to provide an overview of research into the various aspects of TFs, which lead to interpretation of the underlying data through the use of meaningful visual representations. The STAR classifies TF research into the following aspects: dimensionality, derived attributes, aggregated attributes, rendering aspects, automation, and user interfaces. The STAR concludes with some interesting research challenges that form the basis of an agenda for the development of next generation TF tools and methodologies. © 2016 The Author(s) Computer Graphics Forum © 2016 The Eurographics Association and John Wiley & Sons Ltd. Published by John Wiley & Sons Ltd.
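    The core mechanism the survey covers, a 1D transfer function mapping scalar samples to color and opacity, can be sketched as a piecewise-linear lookup over control points (a generic illustration of the concept, not any particular system's API):

    ```python
    import bisect

    def apply_tf(scalar, control_points):
        """Evaluate a piecewise-linear 1D transfer function.

        control_points: sorted list of (value, (r, g, b, a)) pairs.
        Returns the interpolated (r, g, b, a) for the given scalar;
        values outside the range clamp to the end points."""
        values = [v for v, _ in control_points]
        if scalar <= values[0]:
            return control_points[0][1]
        if scalar >= values[-1]:
            return control_points[-1][1]
        i = bisect.bisect_right(values, scalar)
        v0, c0 = control_points[i - 1]
        v1, c1 = control_points[i]
        t = (scalar - v0) / (v1 - v0)
        # Linearly blend each channel between the bracketing control points.
        return tuple((1 - t) * a + t * b for a, b in zip(c0, c1))
    ```

    Most of the research directions the STAR classifies (multi-dimensional TFs, derived attributes, automation) generalize exactly this lookup to richer domains than a single scalar.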

  2. Signal Processing Implementation and Comparison of Automotive Spatial Sound Rendering Strategies

    Directory of Open Access Journals (Sweden)

    Bai, Mingsian R.

    2009-01-01

    Full Text Available Design and implementation strategies of spatial sound rendering are investigated in this paper for automotive scenarios. Six design methods are implemented for various rendering modes with different numbers of passengers. Specifically, downmixing algorithms aimed at balancing the front and back reproductions are developed for the 5.1-channel input. The other five algorithms, based on inverse filtering, are implemented in two approaches. The first approach utilizes binaural Head-Related Transfer Functions (HRTFs) measured in the car interior, whereas the second approach, named the point-receiver model, targets a point receiver positioned at the center of the passenger's head. The proposed processing algorithms were compared via objective and subjective experiments under various listening conditions. Test data were processed by the multivariate analysis of variance (MANOVA) method and the least significant difference method (Fisher's LSD) as a post hoc test to justify the statistical significance of the experimental data. The results indicate that inverse filtering algorithms are preferred for the single-passenger mode. For the multipassenger mode, however, downmixing algorithms generally outperformed the other processing techniques.
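    One of the simplest downmixing strategies in the family compared above is a passive 5.1-to-stereo fold-down using the common -3 dB (0.7071) coefficients of ITU-R BS.775. This is a generic sketch, not the paper's front/back-balancing algorithm; here the LFE channel is simply discarded, as is common in passive downmixes:

    ```python
    def downmix_51_to_stereo(l, r, c, lfe, ls, rs, k=0.7071):
        """Fold a 5.1 signal down to stereo per sample:
            L' = L + k*C + k*Ls
            R' = R + k*C + k*Rs
        where k = 0.7071 (-3 dB). Inputs are equal-length sequences of
        per-channel samples; the LFE channel is intentionally dropped."""
        left = [lf + k * cf + k * lsf for lf, cf, lsf in zip(l, c, ls)]
        right = [rf + k * cf + k * rsf for rf, cf, rsf in zip(r, c, rs)]
        return left, right
    ```

    The front/back balancing studied in the paper amounts to choosing (and possibly frequency-shaping) these mixing weights rather than fixing them at -3 dB.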

  3. DeepInfer: open-source deep learning deployment toolkit for image-guided therapy

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A.; Kapur, Tina; Wells, William M.; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-03-01

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  4. Water Resources Risks and the Climate Resilience Toolkit: Tools, Case Studies, and Partnerships

    Science.gov (United States)

    Read, E. K.; Blodgett, D. L.; Booth, N.

    2014-12-01

    The Water Resources Risk topic of the Climate Resilience Toolkit (CRT) is designed to provide decision support, technical, and educational resources to communities, water resource managers, policy analysts, and water utilities working to increase the resilience of water resources to climate change. We highlight the partnerships (between federal and state agencies, non-governmental organizations, and private partners), tools (e.g., downscaled climate products, historical and real-time water data, and decision support) and success stories that are informing the CRT Water Resources Risks Theme content, and identify remaining needs in available resources for building resilience of water resources to climate change. The following questions will frame the content of the Water Resources Risk CRT: How are human and natural components of the hydrologic cycle changing? How can communities and water managers plan for uncertain future conditions? How will changing water resources impact food production, energy resources, ecosystems, and human health? What water resources data are of high value to society and are they easily accessible? Input on existing tools, resources, or potential partnerships that could be used to further develop content and fill gaps in the Water Resources CRT is welcome. We also invite ideas for water resources 'innovation challenges', in which technology developers work to create tools that enhance the capacity of communities and managers to increase resilience of water resources at the local and regional scales.

  5. A Powerful Toolkit for Synthetic Biology: Over 3.8 Billion Years of Evolution

    Science.gov (United States)

    Rothschild, Lynn J.

    2010-01-01

    The combination of evolutionary with engineering principles will enhance synthetic biology. Conversely, synthetic biology has the potential to enrich evolutionary biology by explaining why some adaptive space is empty, on Earth or elsewhere. Synthetic biology, the design and construction of artificial biological systems, substitutes bio-engineering for evolution, which is seen as an obstacle. But because evolution has produced the complexity and diversity of life, it provides a proven toolkit of genetic materials and principles available to synthetic biology. Evolution operates on the population level, with the populations composed of unique individuals that are historical entities. The source of genetic novelty includes mutation, gene regulation, sex, symbiosis, and interspecies gene transfer. At a phenotypic level, variation derives from regulatory control, replication and diversification of components, compartmentalization, sexual selection and speciation, among others. Variation is limited by physical constraints such as diffusion, and chemical constraints such as reaction rates and membrane fluidity. While some of these tools of evolution are currently in use in synthetic biology, all ought to be examined for utility. A hybrid approach of synthetic biology coupled with fine-tuning through evolution is suggested.

  6. DeepInfer: Open-Source Deep Learning Deployment Toolkit for Image-Guided Therapy.

    Science.gov (United States)

    Mehrtash, Alireza; Pesteie, Mehran; Hetherington, Jorden; Behringer, Peter A; Kapur, Tina; Wells, William M; Rohling, Robert; Fedorov, Andriy; Abolmaesumi, Purang

    2017-02-11

    Deep learning models have outperformed some of the previous state-of-the-art approaches in medical image analysis. Instead of using hand-engineered features, deep models attempt to automatically extract hierarchical representations at multiple levels of abstraction from the data. Therefore, deep models are usually considered to be more flexible and robust solutions for image analysis problems compared to conventional computer vision models. They have demonstrated significant improvements in computer-aided diagnosis and automatic medical image analysis applied to such tasks as image segmentation, classification and registration. However, deploying deep learning models often has a steep learning curve and requires detailed knowledge of various software packages. Thus, many deep models have not been integrated into the clinical research workflows causing a gap between the state-of-the-art machine learning in medical applications and evaluation in clinical research procedures. In this paper, we propose "DeepInfer" - an open-source toolkit for developing and deploying deep learning models within the 3D Slicer medical image analysis platform. Utilizing a repository of task-specific models, DeepInfer allows clinical researchers and biomedical engineers to deploy a trained model selected from the public registry, and apply it to new data without the need for software development or configuration. As two practical use cases, we demonstrate the application of DeepInfer in prostate segmentation for targeted MRI-guided biopsy and identification of the target plane in 3D ultrasound for spinal injections.

  7. FASTAptamer: A Bioinformatic Toolkit for High-throughput Sequence Analysis of Combinatorial Selections

    Directory of Open Access Journals (Sweden)

    Khalid K Alam

    2015-01-01

    Full Text Available High-throughput sequence (HTS) analysis of combinatorial selection populations accelerates lead discovery and optimization and offers dynamic insight into selection processes. An underlying principle is that selection enriches high-fitness sequences as a fraction of the population, whereas low-fitness sequences are depleted. HTS analysis readily provides the requisite numerical information by tracking the evolutionary trajectory of individual sequences in response to selection pressures. Unlike genomic data, for which a number of software solutions exist, user-friendly tools are not readily available for the combinatorial selections field, leading many users to create custom software. FASTAptamer was designed to address the sequence-level analysis needs of the field. The open source FASTAptamer toolkit counts, normalizes and ranks read counts in a FASTQ file, compares populations for sequence distribution, generates clusters of sequence families, calculates fold-enrichment of sequences throughout the course of a selection and searches for degenerate sequence motifs. While originally designed for aptamer selections, FASTAptamer can be applied to any selection strategy that can utilize next-generation DNA sequencing, such as ribozyme or deoxyribozyme selections, in vivo mutagenesis and various surface display technologies (peptide, antibody fragment, mRNA, etc.). FASTAptamer software, sample data and a user's guide are available for download at http://burkelab.missouri.edu/fastaptamer.html.
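
    The count-normalize-rank idea at the core of such toolkits can be illustrated in a few lines. This is a minimal sketch, not FASTAptamer's actual interface: the helper names are hypothetical, reads are plain strings rather than FASTQ records, and normalization is to reads per million (RPM) as in the toolkit's description.

```python
from collections import Counter

def rpm_counts(reads):
    """Count identical sequences and normalize to reads per million (RPM)."""
    counts = Counter(reads)
    total = sum(counts.values())
    return {seq: n * 1_000_000 / total for seq, n in counts.items()}

def fold_enrichment(rpm_early, rpm_late):
    """Per-sequence fold change between two selection rounds (RPM ratio),
    computed only for sequences observed in both rounds."""
    return {seq: rpm_late[seq] / rpm_early[seq]
            for seq in rpm_late if seq in rpm_early}

# Toy populations from two rounds of selection:
round1 = ["ACGT", "ACGT", "TTTT", "GGGG"]
round2 = ["ACGT", "ACGT", "ACGT", "TTTT"]
enrich = fold_enrichment(rpm_counts(round1), rpm_counts(round2))
# "ACGT" rose from 2/4 to 3/4 of the population: fold enrichment 1.5
```

    Ranking is then just a sort of the RPM dictionary by descending value; clustering and motif search need sequence-distance machinery beyond this sketch.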

  8. Field tests of a participatory ergonomics toolkit for Total Worker Health.

    Science.gov (United States)

    Nobrega, Suzanne; Kernan, Laura; Plaku-Alakbarova, Bora; Robertson, Michelle; Warren, Nicholas; Henning, Robert

    2017-04-01

    Growing interest in Total Worker Health® (TWH) programs to advance worker safety, health and well-being motivated development of a toolkit to guide their implementation. Iterative design of a program toolkit occurred in which participatory ergonomics (PE) served as the primary basis to plan integrated TWH interventions in four diverse organizations. The toolkit provided start-up guides for committee formation and training, and a structured PE process for generating integrated TWH interventions. Process data from program facilitators and participants throughout program implementation were used for iterative toolkit design. Program success depended on organizational commitment to regular design team meetings with a trained facilitator, the availability of subject matter experts on ergonomics and health to support the design process, and retraining whenever committee turnover occurred. A two-committee structure (employee Design Team, management Steering Committee) provided advantages over a single, multilevel committee structure, and enhanced the planning, communication, and teamwork skills of participants.

  9. Pyradi: an open-source toolkit for infrared calculation and data processing

    CSIR Research Space (South Africa)

    Willers, CJ

    2012-09-01

    Full Text Available Electro-optical system design, data analysis and modelling involve a significant amount of calculation and processing. Many of these calculations are of a repetitive and general nature, suitable for including in a generic toolkit. The availability...

  10. Mocapy++ - a toolkit for inference and learning in dynamic Bayesian networks

    DEFF Research Database (Denmark)

    Paluszewski, Martin; Hamelryck, Thomas Wim

    2010-01-01

    Background Mocapy++ is a toolkit for parameter learning and inference in dynamic Bayesian networks (DBNs). It supports a wide range of DBN architectures and probability distributions, including distributions from directional statistics (the statistics of angles, directions and orientations...

  11. ACES Model Composition and Development Toolkit to Support NGATS Concepts Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The key innovation proposed in this effort is the development of a model composition toolkit that will enable NASA Airspace Concept Evaluation System (ACES) users to...

  12. Energy Conservation Behaviour Toolkit. Incentive Mechanisms for Effective Decrease of Energy Consumption at the Workplace

    NARCIS (Netherlands)

    Kalz, Marco; Börner, Dirk; Specht, Marcus

    2012-01-01

    Kalz, M., Börner, D., & Specht, M. (2012, 18 September). Energy Conservation Behaviour Toolkit. Incentive Mechanisms for Effective Decrease of Energy Consumption at the Workplace. Presentation at the 'Tussenbijeenkomst SURFnet Innovatieregeling Duurzaamheid & ICT', Utrecht, The Netherlands.

  13. A Teacher Tablet Toolkit to meet the challenges posed by 21st ...

    African Journals Online (AJOL)

    Adele @

    2015-11-25

    Nov 25, 2015 ... participating teachers with a toolkit consisting of technology hardware, pragmatic pedagogical and technology ... new concepts and new processes related to teaching ... space to experiment and multiple opportunities to.

  14. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering

    KAUST Repository

    Sicat, Ronell Barrera

    2014-12-31

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.

  15. Sparse PDF Volumes for Consistent Multi-Resolution Volume Rendering.

    Science.gov (United States)

    Sicat, Ronell; Krüger, Jens; Möller, Torsten; Hadwiger, Markus

    2014-12-01

    This paper presents a new multi-resolution volume representation called sparse pdf volumes, which enables consistent multi-resolution volume rendering based on probability density functions (pdfs) of voxel neighborhoods. These pdfs are defined in the 4D domain jointly comprising the 3D volume and its 1D intensity range. Crucially, the computation of sparse pdf volumes exploits data coherence in 4D, resulting in a sparse representation with surprisingly low storage requirements. At run time, we dynamically apply transfer functions to the pdfs using simple and fast convolutions. Whereas standard low-pass filtering and down-sampling incur visible differences between resolution levels, the use of pdfs facilitates consistent results independent of the resolution level used. We describe the efficient out-of-core computation of large-scale sparse pdf volumes, using a novel iterative simplification procedure of a mixture of 4D Gaussians. Finally, our data structure is optimized to facilitate interactive multi-resolution volume rendering on GPUs.
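
    The key difference between pdf-based classification and standard down-sampling can be shown with a toy example. The sketch below is illustrative only: it represents a voxel neighborhood's pdf as a small dictionary rather than the paper's sparse 4D Gaussian mixture, and applies a transfer function by taking its expectation under the pdf.

```python
def apply_transfer_function(pdf, tf):
    """Classify a low-resolution voxel by averaging the transfer function
    over the intensity pdf of its neighborhood (an expectation, the
    convolution-style evaluation the abstract describes)."""
    return sum(p * tf(intensity) for intensity, p in pdf.items())

# A neighborhood whose voxels are half intensity 10 and half intensity 200:
pdf = {10: 0.5, 200: 0.5}
tf = lambda i: 1.0 if i > 100 else 0.0  # step transfer function (opacity)

opacity = apply_transfer_function(pdf, tf)  # 0.5: half the voxels are opaque
# Filtering intensities first (mean = 105) and then applying tf would give
# 1.0, an inconsistency with the full-resolution result that pdfs avoid.
```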

  16. Fast polyhedral cell sorting for interactive rendering of unstructured grids

    Energy Technology Data Exchange (ETDEWEB)

    Combra, J; Klosowski, J T; Max, N; Silva, C T; Williams, P L

    1998-10-30

    Direct volume rendering based on projective methods works by projecting, in visibility order, the polyhedral cells of a mesh onto the image plane, and incrementally compositing the cell's color and opacity into the final image. Crucial to this method is the computation of a visibility ordering of the cells. If the mesh is "well-behaved" (acyclic and convex), then the MPVO method of Williams provides a very fast sorting algorithm; however, this method only computes an approximate ordering in general datasets, resulting in visual artifacts when rendered. A recent method of Silva et al. removed the assumption that the mesh is convex, by means of a sweep algorithm used in conjunction with the MPVO method; their algorithm is substantially faster than previous exact methods for general meshes. In this paper we propose a new technique, which we call BSP-XMPVO, which is based on a fast and simple way of using binary space partitions on the boundary elements of the mesh to augment the ordering produced by MPVO. Our results are shown to be orders of magnitude better than previous exact methods of sorting cells.
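
    At the heart of the MPVO family is the reduction of visibility ordering to a topological sort of behind-relations between cells. The sketch below uses Python's stdlib graphlib on a hypothetical four-cell mesh; the real algorithms derive these edges from mesh geometry and the viewpoint, and BSP-XMPVO additionally orders boundary cells via a BSP tree, none of which is shown here.

```python
from graphlib import TopologicalSorter

# Hypothetical behind-relations for a given viewpoint: each cell maps to the
# set of cells that must be composited BEFORE it (i.e., cells behind it).
must_precede = {
    "D": {"B", "C"},  # B and C lie behind D
    "B": {"A"},
    "C": {"A"},
    "A": set(),       # A is the farthest cell
}

# Back-to-front visibility order: predecessors (farther cells) come first.
order = list(TopologicalSorter(must_precede).static_order())
# order begins with "A" and ends with "D"; B and C may appear in either order
```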

  17. Non-Photorealistic Rendering in Chinese Painting of Animals

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    A set of algorithms is proposed in this paper to automatically transform 3D animal models to Chinese painting style. Inspired by real painting process in Chinese painting of animals, we divide the whole rendering process into two parts: borderline stroke making and interior shading. In borderline stroke making process we first find 3D model silhouettes in real-time depending on the viewing direction of a user. After retrieving silhouette information from all model edges, a stroke linking mechanism is applied to link these independent edges into a long stroke. Finally we grow a plain thin silhouette line to a stylus stroke with various widths at each control point and a 2D brush model is combined with it to simulate a Chinese painting stroke. In the interior shading pipeline, three stages are used to convert a Gouraud-shading image to a Chinese painting style image: color quantization, ink diffusion and box filtering. The color quantization stage assigns all pixels in an image into four color levels and each level represents a color layer in a Chinese painting. Ink diffusion stage is used to transfer inks and water between different levels and to grow areas in an irregular way. The box filtering stage blurs sharp borders between different levels to embellish the appearance of final interior shading image. In addition to automatic rendering, an interactive Chinese painting system which is equipped with friendly input devices can be also combined to generate more artistic Chinese painting images manually.
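
    Of the three interior-shading stages, color quantization is the easiest to make concrete. Below is a minimal sketch of the four-level assignment on grayscale values; the paper operates on a Gouraud-shaded color image, and the ink diffusion and box filtering stages are not shown.

```python
def quantize(pixels, levels=4):
    """Assign each 8-bit pixel to one of `levels` ink layers, as in the
    color quantization stage of the interior-shading pipeline."""
    step = 256 / levels
    return [min(int(p // step), levels - 1) for p in pixels]

layers = quantize([0, 70, 130, 250])  # four pixels land in layers 0, 1, 2, 3
```

    Each resulting level then plays the role of one ink layer in the painting; diffusion later transfers ink and water between adjacent levels to soften the boundaries.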

  18. Protein and mineral characterisation of rendered meat and bone meal.

    Science.gov (United States)

    Buckley, M; Penkman, K E H; Wess, T J; Reaney, S; Collins, M J

    2012-10-01

    We report the characterisation of meat and bone meal (MBM) standards (Set B-EFPRA) derived from cattle, sheep, pig and chicken, each rendered at four different temperatures (133, 137, 141 and 145 °C). The standards, prepared for an EU programme STRATFEED (to develop new methodologies for the detection and quantification of illegal addition of mammalian tissues in feeding stuffs), have been widely circulated and used to assess a range of methods for identification of the species composition of MBM. The overall state of mineral alteration and protein preservation as a function of temperature was monitored using small angle X-ray diffraction (SAXS), amino acid composition and racemization analyses. Progressive increases in protein damage and mineral alteration in chicken and cattle standards were observed. In the case of sheep and pig, there was greater damage to the proteins and alteration of the minerals at the lowest treatment temperature (133 °C), suggesting that the thermal treatments must have been compromised in some way. This problem has probably impacted upon the numerous studies which tested methods against these heat treatments. We use protein mass spectrometric methods to explore whether thermostable proteins could be used to identify rendered MBM. In more thermally altered samples, so-called 'thermostable' proteins such as osteocalcin, which has been proposed as an ideal target to speciate MBM, were no longer detectable, but the structural protein type I collagen could be used to differentiate all four species, even in the most thermally altered samples.

  19. Perception-based 3D tactile rendering from a single image for human skin examinations by dynamic touch.

    Science.gov (United States)

    Kim, K; Lee, S

    2015-05-01

    Diagnosis of skin conditions depends on the assessment of skin surface properties that are represented more by tactile properties such as stiffness, roughness, and friction than by visual information. For this reason, adding tactile feedback to existing vision-based diagnosis systems can help dermatologists diagnose skin diseases or disorders more accurately. The goal of our research was therefore to develop a tactile rendering system for skin examinations by dynamic touch. Our development consists of two stages: converting a single image to a 3D haptic surface and rendering the generated haptic surface in real-time. Conversion from single 2D images to 3D surfaces was implemented using human perception data collected in a psychophysical experiment that measured human visual and haptic sensibility to 3D skin surface changes. For the second stage, we utilized real skin biomechanical properties found by prior studies. Our tactile rendering system is a standalone system that can be used with any single camera and haptic feedback device. We evaluated the performance of our system by conducting an identification experiment with three different skin images and five subjects. The participants had to identify one of the three skin surfaces by using a haptic device (Falcon) only. No visual cue was provided in the experiment. The results indicate that our system provides sufficient performance to render discernible tactile feedback for different skin surfaces. Our system uses only a single skin image and automatically generates a 3D haptic surface based on human haptic perception. Realistic skin interactions can be provided in real-time for the purpose of skin diagnosis, simulations, or training. Our system can also be used for other applications like virtual reality and cosmetic applications. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  20. TChem - A Software Toolkit for the Analysis of Complex Kinetic Models

    Energy Technology Data Exchange (ETDEWEB)

    Safta, Cosmin [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Najm, Habib N. [Sandia National Lab. (SNL-CA), Livermore, CA (United States); Knio, Omar [Johns Hopkins Univ., Baltimore, MD (United States)

    2011-05-01

    The TChem toolkit is a software library that enables numerical simulations using complex chemistry and facilitates the analysis of detailed kinetic models. The toolkit provides capabilities for thermodynamic properties based on NASA polynomials and species production/consumption rates. It incorporates methods that can selectively modify reaction parameters for sensitivity analysis. The library contains several functions that provide analytically computed Jacobian matrices necessary for the efficient time advancement and analysis of detailed kinetic models.
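
    The rate evaluations in libraries of this kind build on standard kinetic forms. As a generic illustration (not TChem's API), the modified Arrhenius expression k(T) = A * T^b * exp(-Ea/(R*T)) used throughout detailed kinetic models can be evaluated as follows; the reaction parameters below are hypothetical.

```python
import math

R = 8.314462618  # universal gas constant, J/(mol*K)

def arrhenius(A, b, Ea, T):
    """Modified Arrhenius rate constant k(T) = A * T**b * exp(-Ea/(R*T))."""
    return A * T**b * math.exp(-Ea / (R * T))

# Hypothetical parameters: pre-exponential factor A, temperature exponent b,
# activation energy Ea in J/mol, temperature T in K.
k_1500 = arrhenius(A=1.0e10, b=0.5, Ea=50_000.0, T=1500.0)
k_2000 = arrhenius(A=1.0e10, b=0.5, Ea=50_000.0, T=2000.0)  # faster when hotter
```

    Sensitivity analysis of the kind the abstract mentions amounts to perturbing A, b, or Ea and observing the effect on k(T) and on downstream species production rates.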

  1. Creation of an SWMM Toolkit for Its Application in Urban Drainage Networks Optimization

    OpenAIRE

    2016-01-01

    The Storm Water Management Model (SWMM) is a dynamic simulation engine of flow in sewer systems developed by the USEPA. It has been successfully used for analyzing and designing both storm water and waste water systems. However, despite including some interfacing functions, these functions are insufficient for certain simulations. This paper describes some new functions that have been added to the existing ones to form a library of functions (Toolkit). The Toolkit presented her...

  2. BlockyTalky: A Physical and Distributed Computer Music Toolkit for Kids

    OpenAIRE

    Shapiro, R. Benjamin; Kelly, Annie; Ahrens, Matthew; Fiebrink, Rebecca

    2016-01-01

    NIME research realizes a vision of performance by means of computational expression, linking body and space to sound and imagery through eclectic forms of sensing and interaction. This vision could dramatically impact computer science education, simultaneously modernizing the field and drawing in diverse new participants. We describe our work creating a NIME-inspired computer music toolkit for kids called BlockyTalky; the toolkit enables users to create networks of sensing devices and synthes...

  3. Wind Integration National Dataset (WIND) Toolkit; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Draxl, Caroline; Hodge, Bri-Mathias

    2015-07-14

    A webinar about the Wind Integration National Dataset (WIND) Toolkit was presented by Bri-Mathias Hodge and Caroline Draxl on July 14, 2015. It was hosted by the Southern Alliance for Clean Energy. The toolkit is a grid integration data set that contains meteorological and power data at a 5-minute resolution across the continental United States for 7 years and hourly power forecasts.

  4. Development and Pilot Test of the RAND Suicide Prevention Program Evaluation Toolkit

    Science.gov (United States)

    2013-01-01

    chaplains, and front-desk attendants, at installation gyms to identify individuals at increased risk for suicide and to actively refer those in...articles to supplement existing information in the toolkit chapters. Because information in these articles varied widely, we could not use a standard...improved by supplementing the toolkit with training and technical assistance, which could be provided in conjunction with dissemination efforts. This

  5. Linear systems toolkit in Matlab: structural decompositions and their applications

    Institute of Scientific and Technical Information of China (English)

    Xinmin LIU; Ben M.CHEN; Zongli LIN

    2005-01-01

    This paper presents a brief description of the software toolbox, linear systems toolkit, developed in the Matlab environment. The toolkit contains 66 m-functions, including structural decompositions of linear autonomous systems, unforced/unsensed systems, proper systems, and singular systems, along with their applications to system factorizations, sensor/actuator selection, H-two and H-infinity control, and disturbance decoupling problems.

  6. The nursing human resource planning best practice toolkit: creating a best practice resource for nursing managers.

    Science.gov (United States)

    Vincent, Leslie; Beduz, Mary Agnes

    2010-05-01

    Evidence of acute nursing shortages in urban hospitals has been surfacing since 2000. Further, new graduate nurses account for more than 50% of total nurse turnover in some hospitals and between 35% and 60% of new graduates change workplace during the first year. Critical to organizational success, first line nurse managers must have the knowledge and skills to ensure the accurate projection of nursing resource requirements and to develop proactive recruitment and retention programs that are effective, promote positive nursing socialization, and provide early exposure to the clinical setting. The Nursing Human Resource Planning Best Practice Toolkit project supported the creation of a network of teaching and community hospitals to develop a best practice toolkit in nursing human resource planning targeted at first line nursing managers. The toolkit includes the development of a framework including the conceptual building blocks of planning tools, manager interventions, retention and recruitment and professional practice models. The development of the toolkit involved conducting a review of the literature for best practices in nursing human resource planning, using a mixed method approach to data collection including a survey and extensive interviews of managers and completing a comprehensive scan of human resource practices in the participating organizations. This paper will provide an overview of the process used to develop the toolkit, a description of the toolkit contents and a reflection on the outcomes of the project.

  7. Designing a Composable Geometric Toolkit for Versatility in Applications to Simulation Development

    Science.gov (United States)

    Reed, Gregory S.; Campbell, Thomas

    2008-01-01

    Conceived and implemented through the development of probabilistic risk assessment simulations for Project Constellation, the Geometric Toolkit allows users to create, analyze, and visualize relationships between geometric shapes in three-space using the MATLAB computing environment. The key output of the toolkit is an analysis of how emanations from one "source" geometry (e.g., a leak in a pipe) will affect another "target" geometry (e.g., another heat-sensitive component). It can import computer-aided design (CAD) depictions of a system to be analyzed, allowing the user to reliably and easily represent components within the design and determine the relationships between them, ultimately supporting more technical or physics-based simulations that use the toolkit. We opted to develop a variety of modular, interconnecting software tools to extend the scope of the toolkit, providing the capability to support a range of applications. This concept of simulation composability allows specially-developed tools to be reused by assembling them in various combinations. As a result, the concepts described here and implemented in this toolkit have a wide range of applications outside the domain of risk assessment. To that end, the Geometric Toolkit has been evaluated for use in other unrelated applications due to the advantages provided by its underlying design.

  8. Comparative analysis of video processing and 3D rendering for cloud video games using different virtualization technologies

    Science.gov (United States)

    Bada, Adedayo; Alcaraz-Calero, Jose M.; Wang, Qi; Grecos, Christos

    2014-05-01

    This paper describes a comprehensive empirical performance evaluation of 3D video processing employing the physical/virtual architecture implemented in a cloud environment. Different virtualization technologies, virtual video cards and various 3D benchmark tools have been utilized in order to analyse the optimal performance in the context of 3D online gaming applications. This study highlights 3D video rendering performance under each type of hypervisor, along with other factors including network I/O, disk I/O and memory usage. Comparisons of these factors under well-known virtual display technologies such as VNC, Spice and Virtual 3D adaptors reveal the strengths and weaknesses of the various hypervisors with respect to 3D video rendering and streaming.

  9. Excellent color rendering indexes of multi-package white LEDs.

    Science.gov (United States)

    Oh, Ji Hye; Yang, Su Ji; Sung, Yeon-Goog; Do, Y R

    2012-08-27

    This study introduces a multi-package white light-emitting diode (LED) system with the ability to realize high luminous efficacy and an excellent color rendering index (CRI, R_a) using the R_B,M A_B,M G_B,M C_B system (R_B,M, A_B,M and G_B,M denote long-pass dichroic filter (LPDF)-capped, monochromatic red, amber and green phosphor-converted LEDs (pc-LEDs) pumped by a blue LED chip, and C_B denotes a cyan- and blue-mixed pc-LED pumped by a blue LED). The luminous efficacy and color rendering index (CRI) of multi-package white LED systems are compared while changing the concentration of the cyan phosphor used in the paste of the cyan-blue LED package and the driving current of individual LEDs in multi-package white LEDs at correlated color temperatures (CCTs) ranging from 6,500 K (cold white) to 2,700 K (warm white), using a set of eight CCTs as specified by the American National Standards Institute (ANSI) standard number C78.377-2008. The R_B,M A_B,M G_B,M C_B white LED system provides high luminous efficacy (≥ 96 lm/W) and a color rendering index (≥ 91) encompassing the complete CCT range. We also compare the optical properties of the R_B,M A_B,M G_B,M C_B system with those of the R_B,M A_B,M G_B,M B and RAGB (red, amber, green, and blue semiconductor-type narrow-spectrum-band LEDs) systems. It can be expected that the cyan color added to a blue LED in multi-package white LEDs based on LPDF-capped, phosphor-converted monochromatic LEDs will meet the needs of the high-quality, highly efficient, full-color white LED lighting market in the near future.

  10. 9 CFR 319.703 - Rendered animal fat or mixture thereof.

    Science.gov (United States)

    2010-01-01

    9 Animals and Animal Products 2 (2010-01-01): ...INSPECTION AND CERTIFICATION; DEFINITIONS AND STANDARDS OF IDENTITY OR COMPOSITION; Fats, Oils, Shortenings; § 319.703 Rendered animal fat or mixture thereof. “Rendered Animal Fat,” or any mixture of...

  11. Efficient Unbiased Rendering using Enlightened Local Path Sampling

    DEFF Research Database (Denmark)

    Kristensen, Anders Wang

    The downside to using these algorithms is that they can be slow to converge. Due to the nature of Monte Carlo methods, the results are random variables subject to variance. This manifests itself as noise in the images, which can only be reduced by generating more samples. The reason these methods are slow is a lack of effective methods of importance sampling. Most global illumination algorithms are based on local path sampling, which is essentially a recipe for constructing random walks. Using this procedure, paths are built based on information given explicitly as part of the scene description, such as the location of the light sources or cameras, or the reflection models at each point. In this work we explore new methods of importance sampling paths. Our idea is to analyze the scene before rendering and compute various statistics that we use to improve importance sampling. The first of these are adjoint...
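
    The variance argument can be made concrete with a one-dimensional toy integral. The sketch below is a generic importance-sampling estimator, not the thesis's scene-statistics method: a proposal density shaped like the integrand yields a noticeably lower-variance estimate than uniform sampling at the same sample count.

```python
import math
import random

def importance_estimate(f, sample, pdf, n=100_000, seed=1):
    """Unbiased Monte Carlo estimator of the integral of f: average f(X)/p(X)
    over samples X drawn from the proposal density p."""
    rng = random.Random(seed)
    return sum(f(x) / pdf(x) for x in (sample(rng) for _ in range(n))) / n

# Integrate f(x) = 3x^2 over [0, 1]; the exact value is 1.
f = lambda x: 3.0 * x * x

# Uniform proposal, p(x) = 1:
uniform = importance_estimate(f, lambda rng: rng.random(), lambda x: 1.0)

# Proposal shaped like the integrand, p(x) = 2x (sampled via sqrt of a uniform):
shaped = importance_estimate(f, lambda rng: math.sqrt(rng.random()),
                             lambda x: 2.0 * x)
# Both estimates are close to 1, but the shaped proposal's per-sample
# integrand f/p = 1.5x varies far less, so its estimate is less noisy.
```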

  12. Real-time Flame Rendering with GPU and CUDA

    Directory of Open Access Journals (Sweden)

    Wei Wei

    2011-02-01

    Full Text Available This paper proposes a method of flame simulation based on the Lagrange process and chemical composition; the method is grid-free and thus avoids the problems associated with grids. The turbulent movement of the flame is described by the Lagrange process, and chemical composition is added into the flame simulation, which increases the realism of the flame. For real-time applications, this paper simplifies the EMST model. A GPU-based particle system combined with OpenGL VBO and PBO techniques is used for acceleration: the speed of vertex and pixel data interaction between CPU and GPU increased by two orders of magnitude, and the rendering frame rate increased by 30%, achieving fast, dynamic, real-time flame simulation. For further real-time applications, this paper presents a strategy to implement flame simulation with CUDA on the GPU, which achieves a speedup of 2.5 times over the previous implementation.
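
    A grid-free, Lagrangian treatment tracks individual particles rather than grid cells. The sketch below is a hypothetical, drastically simplified particle update (buoyant rise, random turbulent jitter, exponential cooling); the paper's EMST-based chemistry and GPU/CUDA acceleration are not represented.

```python
import random

def step(particles, rng, dt=0.033, buoyancy=2.0):
    """Advance flame particles one frame in a grid-free, Lagrangian fashion."""
    for p in particles:
        p["vy"] += buoyancy * dt                            # hot gas rises
        p["x"] += (p["vx"] + rng.uniform(-0.1, 0.1)) * dt   # turbulent jitter
        p["y"] += p["vy"] * dt
        p["temp"] *= 0.98                                   # cooling
    # Retire particles that have cooled below a visibility threshold.
    particles[:] = [p for p in particles if p["temp"] > 0.1]

rng = random.Random(0)
flame = [{"x": 0.0, "y": 0.0, "vx": 0.0, "vy": 0.0, "temp": 1.0}]
for _ in range(10):
    step(flame, rng)
# After ten frames the particle has risen (y > 0) and cooled (temp < 1).
```

    In a renderer, each particle's temperature would drive its color and opacity; on a GPU the same per-particle update maps naturally onto one CUDA thread per particle.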

  13. Latency in Distributed Acquisition and Rendering for Telepresence Systems.

    Science.gov (United States)

    Ohl, Stephan; Willert, Malte; Staadt, Oliver

    2015-12-01

    Telepresence systems use 3D techniques to create a more natural human-centered communication over long distances. This work concentrates on the analysis of latency in telepresence systems where acquisition and rendering are distributed. Keeping latency low is important to immerse users in the virtual environment. To better understand latency problems and to identify the source of such latency, we focus on the decomposition of system latency into sub-latencies. We contribute a model of latency and show how it can be used to estimate latencies in a complex telepresence dataflow network. To compare the estimates with real latencies in our prototype, we modify two common latency measurement methods. This presented methodology enables the developer to optimize the design, find implementation issues and gain deeper knowledge about specific sources of latency.
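
    A latency model of this kind composes sub-latencies along a dataflow network: each stage becomes ready when its slowest input has arrived, plus its own processing time. The sketch below is a generic critical-path computation over a hypothetical capture-to-render pipeline, not the paper's measured decomposition.

```python
from graphlib import TopologicalSorter

def end_to_end_latency(successors, stage_latency, sink):
    """End-to-end latency of a dataflow network: a node's output time is the
    maximum arrival time over its inputs plus its own stage latency."""
    preds = {n: set() for n in successors}
    for node, outs in successors.items():
        for out in outs:
            preds.setdefault(out, set()).add(node)
    ready = {}
    for node in TopologicalSorter(preds).static_order():
        arrival = max((ready[p] for p in preds[node]), default=0.0)
        ready[node] = arrival + stage_latency[node]
    return ready[sink]

# Hypothetical telepresence pipeline with per-stage latencies in milliseconds:
successors = {"capture": ["encode"], "encode": ["network"],
              "network": ["decode"], "decode": ["render"], "render": []}
ms = {"capture": 15, "encode": 10, "network": 25, "decode": 8, "render": 12}
total = end_to_end_latency(successors, ms, "render")  # 70 ms end to end
```

    With parallel acquisition branches (e.g., several cameras feeding one fusion stage), the max over incoming branches is what makes the slowest branch dominate, which is exactly why decomposing latency per stage helps locate the bottleneck.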

  14. Distributed Dimensonality-Based Rendering of LIDAR Point Clouds

    Science.gov (United States)

    Brédif, M.; Vallet, B.; Ferrand, B.

    2015-08-01

    Mobile Mapping Systems (MMS) now commonly acquire lidar scans of urban environments for a growing number of applications such as 3D reconstruction and mapping, urban planning, urban furniture monitoring, and practicability assessment for persons with reduced mobility (PRM). MMS acquisitions are usually large enough to create a usability bottleneck for the increasing number of non-expert users who are not trained to process and visualize these huge datasets with specialized software. The vast majority of their current needs call for a simple 2D visualization that is both legible on screen and printable on a static 2D medium, while still conveying an understanding of the 3D scene and minimizing the disturbances of the lidar acquisition geometry (such as lidar shadows). The users that motivated this research are, by law, bound to precisely georeference underground networks, for which they currently have schematics with no or poor absolute georeferencing. A solution that may fit their needs is thus a 2D visualization of the MMS dataset that they can easily interpret and on which they can accurately match features with the user datasets they would like to georeference. Our main contribution is two-fold. First, we propose a 3D point cloud stylization for 2D static visualization that leverages a Principal Component Analysis (PCA)-like local geometry analysis. By skipping the usual and error-prone estimation of ground elevation, this rendering is robust to non-flat areas and has no hard-to-tune parameters such as height thresholds. Second, we implemented the corresponding rendering pipeline so that it scales to arbitrarily large datasets by leveraging the Spark framework and its Resilient Distributed Dataset (RDD) and Dataframe abstractions.
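
    A PCA-like local geometry analysis of this kind typically derives linearity, planarity and scattering scores from the sorted eigenvalues l1 >= l2 >= l3 of each neighborhood's covariance matrix. The sketch below illustrates the idea (generic dimensionality formulas, not the paper's exact implementation); it uses an axis-aligned planar patch so the covariance is diagonal and its diagonal entries serve directly as eigenvalues, whereas a general implementation would call an eigensolver.

```python
def covariance(points):
    # 3x3 covariance matrix of a local point neighborhood.
    n = len(points)
    mean = [sum(p[i] for p in points) / n for i in range(3)]
    return [[sum((p[i] - mean[i]) * (p[j] - mean[j]) for p in points) / n
             for j in range(3)] for i in range(3)]

def dimensionality(l1, l2, l3):
    # Linearity, planarity and scattering from eigenvalues l1 >= l2 >= l3.
    return (l1 - l2) / l1, (l2 - l3) / l1, l3 / l1

# Points sampled on the plane z = 0: the covariance is diagonal, so its
# diagonal gives the eigenvalues directly (the general case needs an eigensolver).
patch = [(x, y, 0.0) for x in range(3) for y in range(3)]
cov = covariance(patch)
eig = sorted((cov[0][0], cov[1][1], cov[2][2]), reverse=True)
linearity, planarity, scattering = dimensionality(*eig)
print(planarity)  # 1.0 for a perfectly planar patch
```

A stylization can then map these three scores to colors or strokes, highlighting poles and facades (linear/planar structures) against vegetation (scattered points).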

  15. On-the-Fly Decompression and Rendering of Multiresolution Terrain

    Energy Technology Data Exchange (ETDEWEB)

    Lindstrom, P; Cohen, J D

    2009-04-02

    We present a streaming geometry compression codec for multiresolution, uniformly-gridded, triangular terrain patches that supports very fast decompression. Our method is based on linear prediction and residual coding for lossless compression of the full-resolution data. As simplified patches on coarser levels in the hierarchy already incur some data loss, we optionally allow further quantization for more lossy compression. The quantization levels are adaptive on a per-patch basis, while still permitting seamless, adaptive tessellations of the terrain. Our geometry compression on such a hierarchy achieves compression ratios of 3:1 to 12:1. Our scheme is not only suitable for fast decompression on the CPU, but also for parallel decoding on the GPU with peak throughput over 2 billion triangles per second. Each terrain patch is independently decompressed on the fly from a variable-rate bitstream by a GPU geometry program with no branches or conditionals. Thus we can store the geometry compressed on the GPU, reducing storage and bandwidth requirements throughout the system. In our rendering approach, only compressed bitstreams and the decoded height values in the view-dependent 'cut' are explicitly stored on the GPU. Normal vectors are computed in a streaming fashion, and remaining geometry and texture coordinates, as well as mesh connectivity, are shared and re-used for all patches. We demonstrate and evaluate our algorithms on a small prototype system in which all compressed geometry fits in the GPU memory and decompression occurs on the fly every rendering frame without any cache maintenance.
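
    The linear-prediction-plus-residual idea can be sketched in one dimension (the codec above operates on 2-D gridded patches and follows prediction with residual entropy coding, omitted here): smooth terrain yields small residuals, which compress far better than raw heights.

```python
def encode(heights):
    # Delta prediction: predict each height from its predecessor and
    # store only the residual.
    return [heights[0]] + [heights[i] - heights[i - 1]
                           for i in range(1, len(heights))]

def decode(residuals):
    # Invert the prediction by accumulating residuals.
    heights = [residuals[0]]
    for r in residuals[1:]:
        heights.append(heights[-1] + r)
    return heights

row = [1200, 1202, 1205, 1204, 1207]
res = encode(row)
print(res)                 # [1200, 2, 3, -1, 3]
assert decode(res) == row  # lossless round trip
```

Optional quantization of the residuals, as in the paper, trades this losslessness for higher compression ratios.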

  16. Technologies Render Views of Earth for Virtual Navigation

    Science.gov (United States)

    2012-01-01

    On a December night in 1995, 159 passengers and crewmembers died when American Airlines Flight 965 flew into the side of a mountain while en route to Cali, Colombia. A key factor in the tragedy: the pilots had lost situational awareness in the dark, unfamiliar terrain. They had no idea the plane was approaching a mountain until the ground proximity warning system sounded an alarm only seconds before impact. "The accident was of the kind most common at the time: CFIT, or controlled flight into terrain," says Trey Arthur, research aerospace engineer in the Crew Systems and Aviation Operations Branch at NASA's Langley Research Center. In situations such as bad weather, fog, or nighttime flights, pilots would rely on airspeed, altitude, and other readings to get an accurate sense of location. Miscalculations and rapidly changing conditions could contribute to a fully functioning, in-control airplane flying into the ground. To improve aviation safety by enhancing pilots' situational awareness even in poor visibility, NASA began exploring the possibilities of synthetic vision: creating a graphical display of the outside terrain on a screen inside the cockpit. "How do you display a mountain in the cockpit? You have to have a graphics-powered computer, a terrain database you can render, and an accurate navigation solution," says Arthur. In the mid-1990s, developing GPS technology offered a means for determining an aircraft's position in space with high accuracy, Arthur explains. As the necessary technologies to enable synthetic vision emerged, NASA turned to an industry partner to develop the terrain graphical engine and database for creating the virtual rendering of the outside environment.

  17. The Seismic Tool-Kit (STK): An Open Source Software For Learning the Basis of Signal Processing and Seismology.

    Science.gov (United States)

    Reymond, D.

    2016-12-01

    We present an open source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to learning signal processing and seismology. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 20,000 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of signals in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage to the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along time. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Second, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and principal component axes, radiation patterns of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux, and has also been partially implemented under MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and as a practical support for teaching the basics of signal processing. Useful link: http://sourceforge.net
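
    Azimuth and distance in spherical geometry, as computed by such utility programs, follow the standard great-circle formulas. The sketch below is a generic implementation of those formulas (haversine distance plus initial bearing), not STK's actual code:

```python
import math

def distance_azimuth(lat1, lon1, lat2, lon2, radius_km=6371.0):
    # Great-circle distance (km) and initial azimuth (degrees clockwise
    # from north) from point 1 to point 2; coordinates in degrees.
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the central angle.
    a = (math.sin((p2 - p1) / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    dist = 2 * radius_km * math.asin(math.sqrt(a))
    # Initial bearing.
    az = math.degrees(math.atan2(
        math.sin(dlon) * math.cos(p2),
        math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)))
    return dist, az % 360.0

# A quarter of the equator: 90 degrees of longitude, azimuth due east.
d, az = distance_azimuth(0.0, 0.0, 0.0, 90.0)
print(round(d), az)  # 10008 90.0
```

For teleseismic work the same angles are usually expressed as epicentral distance in degrees rather than kilometers.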

  18. Installation and Configuration of the Grid Development Tool Globus Toolkit 4

    Institute of Scientific and Technical Information of China (English)

    焦鸿斌; 刘晓彦; 李新蕾; 矫亮; 陈桂芬

    2010-01-01

    The Globus Toolkit is one of the core tools currently used to build grid systems. This paper introduces the origin, functions and composition of the Globus toolset, and describes in detail the installation and configuration of Globus Toolkit 4 in a Linux environment.

  19. Toward a VPH/Physiome ToolKit.

    Science.gov (United States)

    Garny, Alan; Cooper, Jonathan; Hunter, Peter J

    2010-01-01

    The Physiome Project was officially launched in 1997 and has since brought together teams from around the world to work on the development of a computational framework for the modeling of the human body. At the European level, this effort is focused around patient-specific solutions and is known as the Virtual Physiological Human (VPH) Initiative. Such modeling is both multiscale (in space and time) and multiphysics. This, therefore, requires careful interaction and collaboration between the teams involved in the VPH/Physiome effort, if we are to produce computer models that are not only quantitative, but also integrative and predictive. In that context, several technologies and solutions are already available, developed both by groups involved in the VPH/Physiome effort, and by others. They address areas such as data handling/fusion, markup languages, model repositories, ontologies, tools (for simulation, imaging, data fitting, etc.), as well as grid, middleware, and workflow. Here, we provide an overview of resources that should be considered for inclusion in the VPH/Physiome ToolKit (i.e., the set of tools that addresses the needs and requirements of the Physiome Project and VPH Initiative) and discuss some of the challenges that we are still facing.

  20. TMVA - Tool-kit for Multivariate Data Analysis in ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Therhaag, Jan; Von Toerne, Eckhard [Univ. Bonn, Physikalisches Institut, Nussallee 12, 53115 Bonn (Germany); Hoecker, Andreas; Speckmayer, Peter [European Organization for Nuclear Research - CERN, CH-1211 Geneve 23 (Switzerland); Stelzer, Joerg [Deutsches Elektronen-Synchrotron - DESY, Platanenallee 6, D-15738 Zeuthen (Germany); Voss, Helge [Max-Planck-Institut fuer Kernphysik - MPI, Postfach 10 39 80, Saupfercheckweg 1, DE-69117 Heidelberg (Germany)

    2010-07-01

    Given the ever-increasing complexity of modern HEP data analysis, multivariate analysis techniques have proven an indispensable tool in extracting the most valuable information from the data. TMVA, the Tool-kit for Multivariate Data Analysis, provides a large variety of advanced multivariate analysis techniques for both signal/background classification and regression problems. In TMVA, all methods are embedded in a user-friendly framework capable of handling the pre-processing of the data as well as the evaluation of the results, thus allowing for a simple use of even the most sophisticated multivariate techniques. Convenient assessment and comparison of different analysis techniques enable the user to choose the most efficient approach for any particular data analysis task. TMVA is an integral part of the ROOT data analysis framework and is widely-used in the LHC experiments. In this talk I will review recent developments in TMVA, discuss typical use-cases in HEP and present the performance of our most important multivariate techniques on example data by comparing it to theoretical performance limits. (authors)

  1. Microgrid Design Toolkit (MDT) Technical Documentation and Component Summaries

    Energy Technology Data Exchange (ETDEWEB)

    Arguello, Bryan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Gearhart, Jared Lee [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Jones, Katherine A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Eddy, John P. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-09-01

    The Microgrid Design Toolkit (MDT) is a decision support software tool for microgrid designers to use during the microgrid design process. The models that support the two main capabilities in MDT are described. The first capability, the Microgrid Sizing Capability (MSC), is used to determine the size and composition of a new microgrid in the early stages of the design process. MSC is a mixed-integer linear program that is focused on developing a microgrid that is economically viable when connected to the grid. The second capability is focused on refining a microgrid design for operation in islanded mode. This second capability relies on two models: the Technology Management Optimization (TMO) model and the Performance Reliability Model (PRM). TMO uses a genetic algorithm to create and refine a collection of candidate microgrid designs. It uses PRM, a simulation-based reliability model, to assess the performance of these designs. TMO produces a collection of microgrid designs that perform well with respect to one or more performance metrics.
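
    The design-search problem behind TMO can be illustrated with a toy version: pick a subset of candidate technologies that meets a required load at minimum cost. The brute-force enumeration below is only tractable for tiny candidate sets; TMO uses a genetic algorithm, and PRM a reliability simulation, to handle realistic ones. The component names, capacities and costs here are invented for illustration.

```python
from itertools import combinations

# Hypothetical candidate generators: (name, capacity_kW, cost).
candidates = [("diesel", 500, 300), ("pv", 200, 120),
              ("battery", 150, 90), ("microturbine", 250, 160)]
required_load_kw = 600

def best_design(candidates, load):
    # Enumerate every non-empty subset; keep the cheapest one whose total
    # capacity covers the load.
    best = None
    for r in range(1, len(candidates) + 1):
        for combo in combinations(candidates, r):
            cap = sum(c[1] for c in combo)
            cost = sum(c[2] for c in combo)
            if cap >= load and (best is None or cost < best[1]):
                best = (combo, cost)
    return best

design, cost = best_design(candidates, required_load_kw)
print([c[0] for c in design], cost)  # ['pv', 'battery', 'microturbine'] 370
```

A genetic algorithm replaces the exhaustive loop with selection, crossover and mutation over a population of such subsets, which is what makes large design spaces searchable.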

  2. Machine learning for a Toolkit for Image Mining

    Science.gov (United States)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes discriminate between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  3. Integrating the microbiome as a resource in the forensics toolkit.

    Science.gov (United States)

    Clarke, Thomas H; Gomez, Andres; Singh, Harinder; Nelson, Karen E; Brinkac, Lauren M

    2017-09-01

    The introduction of DNA fingerprinting to forensic science rapidly expanded the evidence that could be garnered from a crime scene and used in court cases. Next generation sequencing technologies increased the available genetic data that could be used as evidence by orders of magnitude, and as such, significant additional genetic information is now available for use in forensic science. This includes DNA from the bacteria that live in and on humans, known as the human microbiome. Next generation sequencing of the human microbiome demonstrates that its bacterial DNA can be used to uniquely identify an individual, provide information about their life and behavioral patterns, determine the body site where a sample came from, and estimate postmortem intervals. Bacterial samples from the environment and objects can also be leveraged to address similar questions about the individual(s) who interacted with them. However, the application of this new field in the forensic sciences raises concerns about current methods used in sample processing, including sample collection and storage, and about the statistical power of published studies. These areas of human microbiome research need to be fully addressed before microbiome data can become a regularly incorporated evidence type and a routine procedure of the forensic toolkit. Here, we summarize the current status of microbiome research as it applies to the forensic field, the mathematical models used to make predictions, and the possible legal and practical difficulties that can limit the application of microbiomes in forensic science. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  4. A Gateway MultiSite recombination cloning toolkit.

    Directory of Open Access Journals (Sweden)

    Lena K Petersen

    Full Text Available The generation of DNA constructs is often a rate-limiting step in conducting biological experiments. Recombination cloning of single DNA fragments using the Gateway system provided an advance over traditional restriction enzyme cloning due to increases in efficiency and reliability. Here we introduce a series of entry clones and a destination vector for use in two-, three-, and four-fragment Gateway MultiSite recombination cloning, whose advantages include increased flexibility and versatility. In contrast to Gateway single-fragment cloning approaches, where variations are typically incorporated into model system-specific destination vectors, our Gateway MultiSite cloning strategy incorporates variations in easily generated entry clones that are model system-independent. In particular, we present entry clones containing insertions of GAL4, QF, UAS, QUAS, eGFP, and mCherry, among others, and demonstrate their in vivo functionality in Drosophila by using them to generate expression clones including GAL4 and QF drivers for various trp ion channel family members, UAS and QUAS excitatory and inhibitory light-gated ion channels, and QUAS red and green fluorescent synaptic vesicle markers. We thus establish a starter toolkit of modular Gateway MultiSite entry clones potentially adaptable to any model system. An inventory of entry clones and destination vectors for Gateway MultiSite cloning has also been established (www.gatewaymultisite.org).

  5. RNAi screening of developmental toolkit genes: a search for novel wing genes in the red flour beetle, Tribolium castaneum.

    Science.gov (United States)

    Linz, David M; Tomoyasu, Yoshinori

    2015-01-01

    The amazing array of diversity among insect wings offers a powerful opportunity to study the mechanisms guiding morphological evolution. Studies in Drosophila (the fruit fly) have identified dozens of genes important for wing development. These genes are often called candidate genes, serving as an ideal starting point to study wing development in other insects. However, we also need to explore beyond the candidate genes to gain a more comprehensive view of insect wing evolution. As a first step away from the traditional candidate genes, we utilized Tribolium (the red flour beetle) as a model and assessed the potential involvement of a group of developmental toolkit genes (embryonic patterning genes) in beetle wing development. We hypothesized that the highly pleiotropic nature of these developmental genes would increase the likelihood of finding novel wing genes in Tribolium. Through RNA interference screening, we found that Tc-cactus has a less characterized (but potentially evolutionarily conserved) role in wing development. We also found that the odd-skipped family genes are essential for the formation of the thoracic pleural plates, including the recently discovered wing serial homologs in Tribolium. In addition, we obtained several novel insights into the function of these developmental genes, such as the involvement of mille-pattes and Tc-odd-paired in metamorphosis. Despite these findings, no gene we examined was found to have novel wing-related roles unique to Tribolium. These results suggest a relatively conserved nature of developmental toolkit genes and highlight the limited degree to which these genes are co-opted during insect wing evolution.

  6. High-performance GPU-based rendering for real-time, rigid 2D/3D-image registration and motion prediction in radiation oncology

    Science.gov (United States)

    Spoerk, Jakob; Gendrin, Christelle; Weber, Christoph; Figl, Michael; Pawiro, Supriyanto Ardjo; Furtado, Hugo; Fabri, Daniella; Bloch, Christoph; Bergmann, Helmar; Gröller, Eduard; Birkfellner, Wolfgang

    2012-01-01

    A common problem in image-guided radiation therapy (IGRT) of lung cancer, as well as other malignant diseases, is the compensation of periodic and aperiodic motion during dose delivery. Modern systems for image-guided radiation oncology allow for the acquisition of cone-beam computed tomography data in the treatment room as well as the acquisition of planar radiographs during the treatment. A mid-term research goal is the compensation of tumor target volume motion by 2D/3D registration. In 2D/3D registration, spatial information on organ location is derived by an iterative comparison of perspective volume renderings, so-called digitally rendered radiographs (DRRs), from computed tomography volume data and planar reference x-rays. Currently, this rendering process is very time consuming, and real-time registration, which should at least provide data on organ position in less than a second, has not yet been achieved. We present two GPU-based rendering algorithms which generate a DRR of 512 × 512 pixels from a CT dataset of 53 MB at a pace of almost 100 Hz. This rendering rate is made feasible by a number of algorithmic simplifications, ranging from alternative volume-driven rendering approaches, namely so-called wobbled splatting, to sub-sampling of the DRR image by means of specialized raycasting techniques. Furthermore, general purpose graphics processing unit (GPGPU) programming paradigms were consistently utilized. Rendering quality and performance, as well as the influence on the quality and performance of the overall registration process, were measured and analyzed in detail. The results show that both methods are competitive and pave the way for fast motion compensation by rigid and possibly even non-rigid 2D/3D registration and, beyond that, adaptive filtering of motion models in IGRT. PMID:21782399
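
    At its core, DRR generation is a line integral of CT attenuation along each viewing ray. A minimal orthographic sketch of that idea follows (summing along one volume axis with a toy nested-list volume; the paper's methods use perspective projection, wobbled splatting and GPU raycasting, none of which is reproduced here):

```python
def render_drr(volume):
    # Orthographic DRR: integrate (sum) attenuation along the z axis for
    # every (x, y) pixel. `volume` is indexed as volume[z][y][x].
    nz, ny, nx = len(volume), len(volume[0]), len(volume[0][0])
    return [[sum(volume[z][y][x] for z in range(nz)) for x in range(nx)]
            for y in range(ny)]

# Tiny 2x2x2 "CT volume" with a dense column at (x=1, y=0).
volume = [[[0, 5], [0, 0]],
          [[0, 5], [1, 0]]]
print(render_drr(volume))  # [[0, 10], [1, 0]]
```

In 2D/3D registration this rendering sits in the inner loop, which is why reducing its cost (as the two GPU algorithms above do) dominates overall registration speed.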

  7. Age, Health and Attractiveness Perception of Virtual (Rendered) Human Hair.

    Science.gov (United States)

    Fink, Bernhard; Hufschmidt, Carla; Hirn, Thomas; Will, Susanne; McKelvey, Graham; Lankhof, John

    2016-01-01

    The social significance of physical appearance and beauty has been documented in many studies. It is known that even subtle manipulations of facial morphology and skin condition can alter people's perception of a person's age, health and attractiveness. While the variation in facial morphology and skin condition cues has been studied quite extensively, comparably little is known on the effect of hair on social perception. This has been partly caused by the technical difficulty of creating appropriate stimuli for investigations of people's response to systematic variation of certain hair characteristics, such as color and style, while keeping other features constant. Here, we present a modeling approach to the investigation of human hair perception using computer-generated, virtual (rendered) human hair. In three experiments, we manipulated hair diameter (Experiment 1), hair density (Experiment 2), and hair style (Experiment 3) of human (female) head hair and studied perceptions of age, health and attractiveness. Our results show that even subtle changes in these features have an impact on hair perception. We discuss our findings with reference to previous studies on condition-dependent quality cues in women that influence human social perception, thereby suggesting that hair is a salient feature of human physical appearance, which contributes to the perception of beauty.

  8. VITRAIL: Acquisition, Modeling, and Rendering of Stained Glass.

    Science.gov (United States)

    Thanikachalam, Niranjan; Baboulaz, Loic; Prandoni, Paolo; Trumpler, Stefan; Wolf, Sophie; Vetterli, Martin

    2016-10-01

    Stained glass windows are designed to reveal their powerful artistry under diverse and time-varying lighting conditions; virtual relighting of stained glass therefore represents an exceptional tool for the appreciation of this age-old art form. However, as opposed to most other artifacts, stained glass windows are extremely difficult, if not impossible, to analyze using controlled illumination because of their size and position. In this paper, we present novel methods built upon image-based priors to perform virtual relighting of stained glass artwork by acquiring the actual light transport properties of a given artifact. In a preprocessing step, we build a material-dependent dictionary for light transport by studying the scattering properties of glass samples in a laboratory setup. We can then use the dictionary to recover a light transport matrix in two ways: under controlled illumination, the dictionary constitutes a sparsifying basis for a compressive sensing acquisition, while in the case of uncontrolled illumination, the dictionary is used to perform sparse regularization. The proposed basis preserves volume impurities, and we show that the retrieved light transport matrix is heterogeneous, as in the case of real-world objects. We present rendering results for several stained glass artifacts, including the Rose Window of the Cathedral of Lausanne, digitized using the presented methods.
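
    Sparse regularization against a dictionary is typically solved with iterative shrinkage methods, whose elementary step is the soft-thresholding (l1 proximal) operator. The sketch below shows that generic operator only, not the authors' specific solver:

```python
def soft_threshold(x, lam):
    # Proximal operator of lam * |x|: shrink toward zero by lam. Applied
    # coordinate-wise each iteration of ISTA-style sparse coding, it drives
    # small coefficients exactly to zero, yielding a sparse solution.
    if x > lam:
        return x - lam
    if x < -lam:
        return x + lam
    return 0.0

coeffs = [3.0, -0.4, 1.5, -2.5]
print([soft_threshold(c, 1.0) for c in coeffs])  # [2.0, 0.0, 0.5, -1.5]
```

Coefficients smaller in magnitude than the threshold vanish, which is what makes the recovered light transport representation sparse in the material dictionary.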

  9. Moisture Transfer through Facades Covered with Organic Binder Renders

    Directory of Open Access Journals (Sweden)

    Carmen DICO

    2013-07-01

    Full Text Available Year after year we witness the negative effect of water on buildings, caused by direct or indirect actions. This situation is obvious in the case of old, historical buildings, subjected to this aggression for a long period of time, but new buildings are also affected. Moisture in building materials causes not only structural damage, but also reduces the thermal insulation capacity of building components. Materials like plasters or paints have been used historically for a long period of time, fulfilling two basic functions: decoration and protection. The most acute demands are made on exterior plasters, as they, besides being an important decorative element of the facade, must perform two different functions simultaneously: protect the substrate against weathering and moisture without sealing it, providing it a certain ability to "breathe" (Heilen, 2005). In order to accomplish this aim, the first step is to understand the hygrothermal behavior of coating and substrate and define the fundamental principles of moisture transfer. According to Künzel's Facade Protection Theory, two material properties play the most important role: water absorption and vapor permeability. In the context of the recent adoption (2009) of the "harmonized" European standard EN 15824, "Specifications for external renders and internal plasters based on organic binders", this paper deals extensively with the interaction of the two properties mentioned above for the coating materials covered by EN 15824.

  10. Light field rendering with omni-directional camera

    Science.gov (United States)

    Todoroki, Hiroshi; Saito, Hideo

    2003-06-01

    This paper presents an approach to capturing the visual appearance of a real environment such as the interior of a room. We propose a method for generating arbitrary viewpoint images by building a light field with an omni-directional camera, which can capture wide surroundings. The omni-directional camera used in this technique is a special camera with a hyperbolic mirror in its upper part, so that we can capture the luminosity of the environment over 360 degrees of the surroundings in one image. We apply the light field method, a technique of Image-Based Rendering (IBR), for generating the arbitrary viewpoint images. The light field is a kind of database that records the luminosity information in the object space. We employ the omni-directional camera for constructing the light field, so that we can collect images of many view directions in the light field. Thus our method allows the user to explore a wide scene, achieving a realistic representation of the virtual environment. To demonstrate the proposed method, we captured an image sequence of our lab's interior environment with an omni-directional camera, and successfully generated arbitrary viewpoint images for a virtual tour of the environment.

  11. WikiPrints: rendering enterprise Wiki content for printing

    Science.gov (United States)

    Berkner, Kathrin

    2010-02-01

    Wikis have become a tool of choice for collaborative, informative communication. In contrast to the immense Wikipedia, which serves as a reference web site and typically covers only one topic per web page, enterprise wikis are often used as project management tools and contain several closely related pages authored by members of one project. In that scenario it is useful to print closely related content for review or teaching purposes. In this paper we propose a novel technique for rendering enterprise wiki content for printing, called WikiPrints, which creates a linearized version of wiki content formatted as a mixture between web layout and conventional document layout suitable for printing. Compared to existing print options for wiki content, WikiPrints automatically selects content from different wiki pages given user preferences and usage scenarios. Metadata such as content authors or time of content editing are considered. A preview of the linearized content is shown to the user and an interface for making manual formatting changes is provided.

  13. Differentiation renders susceptibility to excitotoxicity in HT22 neurons

    Institute of Scientific and Technical Information of China (English)

    Minchao He; Jun Liu; Shaowu Cheng; Yigang Xing; William Z Suo

    2013-01-01

    HT22 is an immortalized mouse hippocampal neuronal cell line that does not express cholinergic and glutamate receptors like mature hippocampal neurons in vivo. This in part prevents its use as a model for mature hippocampal neurons in memory-related studies. We now report that HT22 cells were appropriately induced to differentiate and possess properties similar to those of mature hippocampal neurons in vivo, such as becoming more glutamate-receptive and excitatory. Results showed that the sensitivity of HT22 cells to glutamate-induced toxicity changed dramatically when comparing undifferentiated with differentiated cells, with the half-effective concentration for differentiated cells decreasing by approximately two orders of magnitude. Moreover, glutamate-induced toxicity in differentiated cells, but not undifferentiated cells, was inhibited by the N-methyl-D-aspartate receptor antagonists MK-801 and memantine. Evidently, differentiated HT22 cells expressed N-methyl-D-aspartate receptors, while undifferentiated cells did not. Our experimental findings indicate that differentiation is important for immortalized cell lines to render post-mitotic neuronal properties, and that differentiated HT22 neurons represent a better model of hippocampal neurons than undifferentiated cells.

  14. PsyToolkit: a software package for programming psychological experiments using Linux.

    Science.gov (United States)

    Stoet, Gijsbert

    2010-11-01

    PsyToolkit is a set of software tools for programming psychological experiments on Linux computers. Given that PsyToolkit is freely available under the GNU General Public License, open source, and designed such that it can easily be modified and extended for individual needs, it is suitable not only for technically oriented Linux users, but also for students, researchers on small budgets, and universities in developing countries. The software includes a high-level scripting language, a library for the programming language C, and a questionnaire presenter. The software easily integrates with other open source tools, such as the statistical software package R. PsyToolkit is designed to work with external hardware (including IoLab and Cedrus response keyboards and two common digital input/output boards) and to support millisecond timing precision. Four in-depth examples explain the basic functionality of PsyToolkit. Example 1 demonstrates a stimulus-response compatibility experiment. Example 2 demonstrates a novel mouse-controlled visual search experiment. Example 3 shows how to control light emitting diodes using PsyToolkit, and Example 4 shows how to build a light-detection sensor. The last two examples explain the electronic hardware setup such that they can even be used with other software packages.

  15. Cinfony – combining Open Source cheminformatics toolkits behind a common interface

    Directory of Open Access Journals (Sweden)

    Hutchison Geoffrey R

    2008-12-01

    Full Text Available Abstract Background Open Source cheminformatics toolkits such as OpenBabel, the CDK and the RDKit share the same core functionality but support different sets of file formats and forcefields, and calculate different fingerprints and descriptors. Despite their complementary features, using these toolkits in the same program is difficult as they are implemented in different languages (C++ versus Java), have different underlying chemical models and have different application programming interfaces (APIs). Results We describe Cinfony, a Python module that presents a common interface to all three of these toolkits, allowing the user to easily combine methods and results from any of the toolkits. In general, the run time of the Cinfony modules is almost as fast as accessing the underlying toolkits directly from C++ or Java, but Cinfony makes it much easier to carry out common tasks in cheminformatics such as reading file formats and calculating descriptors. Conclusion By providing a simplified interface and improving interoperability, Cinfony makes it easy to combine complementary features of OpenBabel, the CDK and the RDKit.
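    The common-interface idea the abstract describes can be sketched in pure Python. The two backend classes below are illustrative stand-ins, not Cinfony code; they only show how a shared minimal API lets a caller combine results from different toolkits.

    ```python
    # Sketch of the common-interface pattern Cinfony uses: each backend
    # exposes the same minimal API, so callers can mix toolkits freely.
    # Both backend classes are hypothetical stand-ins, not Cinfony code.

    class ToolkitA:
        """Stand-in for one cheminformatics backend (e.g. an OpenBabel wrapper)."""
        def readstring(self, fmt, s):
            return {"backend": "A", "format": fmt, "data": s}
        def descriptor(self, mol):
            # Toy "descriptor": just the length of the stored string.
            return len(mol["data"])

    class ToolkitB:
        """Stand-in for a second backend offering the same interface."""
        def readstring(self, fmt, s):
            return {"backend": "B", "format": fmt, "data": s.upper()}
        def descriptor(self, mol):
            return len(mol["data"])

    def compare_descriptors(smiles):
        """Combine results from both backends through the shared interface."""
        results = {}
        for toolkit in (ToolkitA(), ToolkitB()):
            mol = toolkit.readstring("smi", smiles)
            results[mol["backend"]] = toolkit.descriptor(mol)
        return results

    print(compare_descriptors("CCO"))  # {'A': 3, 'B': 3}
    ```

    Because every backend honors the same two methods, the caller never branches on which toolkit produced a molecule, which is the interoperability point the abstract makes.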

  16. Pybel: a Python wrapper for the OpenBabel cheminformatics toolkit

    Directory of Open Access Journals (Sweden)

    Morley Chris

    2008-03-01

    Full Text Available Abstract Background Scripting languages such as Python are ideally suited to common programming tasks in cheminformatics such as data analysis and parsing information from files. However, for reasons of efficiency, cheminformatics toolkits such as the OpenBabel toolkit are often implemented in compiled languages such as C++. We describe Pybel, a Python module that provides access to the OpenBabel toolkit. Results Pybel wraps the direct toolkit bindings to simplify common tasks such as reading and writing molecular files and calculating fingerprints. Extensive use is made of Python iterators to simplify loops such as that over all the molecules in a file. A Pybel Molecule can be easily interconverted to an OpenBabel OBMol to access those methods or attributes not wrapped by Pybel. Conclusion Pybel allows cheminformaticians to rapidly develop Python scripts that manipulate chemical information. It is open source, available cross-platform, and offers the power of the OpenBabel toolkit to Python programmers.
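    The iterator style the abstract highlights can be illustrated with a stdlib-only sketch; `Molecule` and `readfile_like` here are simplified stand-ins for Pybel's `Molecule` and `readfile`, not the real OpenBabel bindings.

    ```python
    # Sketch of the iterator pattern Pybel applies to molecular files:
    # a generator wraps a lower-level reader so callers can loop with a
    # plain `for`. The record format and Molecule class are simplified
    # stand-ins, not the real Pybel/OpenBabel API.
    import io

    class Molecule:
        def __init__(self, smiles, title):
            self.smiles = smiles
            self.title = title

    def readfile_like(handle):
        """Yield Molecule objects from a SMILES-style stream, one per line."""
        for line in handle:
            line = line.strip()
            if not line:
                continue
            smiles, _, title = line.partition(" ")
            yield Molecule(smiles, title or "untitled")

    data = io.StringIO("CCO ethanol\nc1ccccc1 benzene\n")
    titles = [mol.title for mol in readfile_like(data)]
    print(titles)  # ['ethanol', 'benzene']
    ```

    The generator keeps only one record in memory at a time, which is why this style scales to large molecular files.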

  17. The development of an artificial organic networks toolkit for LabVIEW.

    Science.gov (United States)

    Ponce, Hiram; Ponce, Pedro; Molina, Arturo

    2015-03-15

    Two of the most challenging problems that scientists and researchers face when they want to experiment with new cutting-edge algorithms are the time consumed in encoding and the difficulty of linking them with other technologies and devices. In that sense, this article introduces the artificial organic networks toolkit for LabVIEW™ (AON-TL) from the implementation point of view. The toolkit is based on the framework provided by the artificial organic networks technique, giving it the potential to add new algorithms in the future based on this technique. Moreover, the toolkit inherits both the rapid prototyping and the easy-to-use characteristics of the LabVIEW™ software (e.g., graphical programming, transparent use of other software and devices, built-in event-driven programming for user interfaces), to make it simple for the end-user. In fact, the article describes the global architecture of the toolkit, with particular emphasis on the software implementation of the so-called artificial hydrocarbon networks algorithm. Lastly, the article includes two case studies for engineering purposes (i.e., sensor characterization) and chemistry applications (i.e., blood-brain barrier partitioning data model) to show the usage of the toolkit and the potential scalability of the artificial organic networks technique.

  18. Pareto utility

    NARCIS (Netherlands)

    Masako, I.; Laeven, R.J.A.; Magnus, J.R.; Muris, C.

    2013-01-01

    In searching for an appropriate utility function in the expected utility framework, we formulate four properties that we want the utility function to satisfy. We conduct a search for such a function, and we identify Pareto utility as a function satisfying all four desired properties. Pareto utility

  19. The e-Health Implementation Toolkit: Qualitative evaluation across four European countries

    LENUS (Irish Health Repository)

    MacFarlane, Anne

    2011-11-19

    Abstract Background Implementation researchers have attempted to overcome the research-practice gap in e-health by developing tools that summarize and synthesize research evidence of factors that impede or facilitate implementation of innovation in healthcare settings. The e-Health Implementation Toolkit (e-HIT) is an example of such a tool that was designed within the context of the United Kingdom National Health Service to promote implementation of e-health services. Its utility in international settings is unknown. Methods We conducted a qualitative evaluation of the e-HIT in use across four countries--Finland, Norway, Scotland, and Sweden. Data were generated using a combination of interview approaches (n = 22) to document e-HIT users' experiences of the tool to guide decision making about the selection of e-health pilot services and to monitor their progress over time. Results e-HIT users evaluated the tool positively in terms of its scope to organize and enhance their critical thinking about their implementation work and, importantly, to facilitate discussion between those involved in that work. It was easy to use in either its paper- or web-based format, and its visual elements were positively received. There were some minor criticisms of the e-HIT with some suggestions for content changes and comments about its design as a generic tool (rather than specific to sites and e-health services). However, overall, e-HIT users considered it to be a highly workable tool that they found useful, which they would use again, and which they would recommend to other e-health implementers. Conclusion The use of the e-HIT is feasible and acceptable in a range of international contexts by a range of professionals for a range of different e-health systems.

  20. The e-health implementation toolkit: qualitative evaluation across four European countries

    Directory of Open Access Journals (Sweden)

    MacFarlane Anne

    2011-11-01

    Full Text Available Abstract Background Implementation researchers have attempted to overcome the research-practice gap in e-health by developing tools that summarize and synthesize research evidence of factors that impede or facilitate implementation of innovation in healthcare settings. The e-Health Implementation Toolkit (e-HIT) is an example of such a tool that was designed within the context of the United Kingdom National Health Service to promote implementation of e-health services. Its utility in international settings is unknown. Methods We conducted a qualitative evaluation of the e-HIT in use across four countries--Finland, Norway, Scotland, and Sweden. Data were generated using a combination of interview approaches (n = 22) to document e-HIT users' experiences of the tool to guide decision making about the selection of e-health pilot services and to monitor their progress over time. Results e-HIT users evaluated the tool positively in terms of its scope to organize and enhance their critical thinking about their implementation work and, importantly, to facilitate discussion between those involved in that work. It was easy to use in either its paper- or web-based format, and its visual elements were positively received. There were some minor criticisms of the e-HIT with some suggestions for content changes and comments about its design as a generic tool (rather than specific to sites and e-health services). However, overall, e-HIT users considered it to be a highly workable tool that they found useful, which they would use again, and which they would recommend to other e-health implementers. Conclusion The use of the e-HIT is feasible and acceptable in a range of international contexts by a range of professionals for a range of different e-health systems.

  1. Mission Analysis, Operations, and Navigation Toolkit Environment (Monte) Version 040

    Science.gov (United States)

    Sunseri, Richard F.; Wu, Hsi-Cheng; Evans, Scott E.; Evans, James R.; Drain, Theodore R.; Guevara, Michelle M.

    2012-01-01

    Monte is a software set designed for use in mission design and spacecraft navigation operations. The system can process measurement data, design optimal trajectories and maneuvers, and do orbit determination, all in one application. For the first time, a single software set can be used for mission design and navigation operations. This eliminates problems due to different models and fidelities used in legacy mission design and navigation software. The unique features of Monte 040 include a blowdown thruster model for GRAIL (Gravity Recovery and Interior Laboratory) with associated pressure models, as well as an updated optimal-search capability (COSMIC) that facilitated mission design for ARTEMIS. Existing legacy software lacked the capabilities necessary for these two missions. There is also a mean orbital element propagator and an osculating to mean element converter that allows long-term orbital stability analysis for the first time in compiled code. The optimized trajectory search tool COSMIC allows users to place constraints and controls on their searches without any restrictions. Constraints may be user-defined and depend on trajectory information either forward or backwards in time. In addition, a long-term orbit stability analysis tool (morbiter) existed previously as a set of scripts on top of Monte. Monte is becoming the primary tool for navigation operations, a core competency at JPL. The mission design capabilities in Monte are becoming mature enough for use in project proposals as well as post-phase A mission design. Monte has three distinct advantages over existing software. First, it is being developed in a modern paradigm: object-oriented C++ and Python. Second, the software has been developed as a toolkit, which allows users to customize their own applications and allows the development team to implement requirements quickly, efficiently, and with minimal bugs.
Finally, the software is managed in accordance with the CMMI (Capability Maturity Model

  2. BioWarehouse: a bioinformatics database warehouse toolkit

    Directory of Open Access Journals (Sweden)

    Stringer-Calvert David WJ

    2006-03-01

    Full Text Available Abstract Background This article addresses the problem of interoperation of heterogeneous bioinformatics databases. Results We introduce BioWarehouse, an open source toolkit for constructing bioinformatics database warehouses using the MySQL and Oracle relational database managers. BioWarehouse integrates its component databases into a common representational framework within a single database management system, thus enabling multi-database queries using the Structured Query Language (SQL), but also facilitating a variety of database integration tasks such as comparative analysis and data mining. BioWarehouse currently supports the integration of a pathway-centric set of databases including ENZYME, KEGG, and BioCyc, and in addition the UniProt, GenBank, NCBI Taxonomy, and CMR databases, and the Gene Ontology. Loader tools, written in the C and Java languages, parse and load these databases into a relational database schema. The loaders also apply a degree of semantic normalization to their respective source data, decreasing semantic heterogeneity. The schema supports the following bioinformatics datatypes: chemical compounds, biochemical reactions, metabolic pathways, proteins, genes, nucleic acid sequences, features on protein and nucleic-acid sequences, organisms, organism taxonomies, and controlled vocabularies. As an application example, we applied BioWarehouse to determine the fraction of biochemically characterized enzyme activities for which no sequences exist in the public sequence databases. The answer is that no sequence exists for 36% of enzyme activities for which EC numbers have been assigned. These gaps in sequence data significantly limit the accuracy of genome annotation and metabolic pathway prediction, and are a barrier for metabolic engineering. Complex queries of this type provide examples of the value of the data warehousing approach to bioinformatics research. Conclusion BioWarehouse embodies significant progress on the
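    The application example in the abstract, finding enzyme activities with no associated sequence, can be mimicked with an in-memory SQLite database; the table and column names below are illustrative, not the actual BioWarehouse schema.

    ```python
    # A toy version of the multi-database query the abstract describes:
    # list enzyme activities (EC numbers) that have no associated sequence.
    # Table and column names are hypothetical, not the BioWarehouse schema.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE enzyme (ec TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE sequence (id INTEGER PRIMARY KEY, ec TEXT);
    INSERT INTO enzyme VALUES ('1.1.1.1', 'alcohol dehydrogenase'),
                              ('9.9.9.9', 'orphan activity');
    INSERT INTO sequence VALUES (1, '1.1.1.1');
    """)

    # LEFT JOIN keeps every enzyme row; a NULL on the sequence side
    # marks an activity with no known sequence.
    orphans = conn.execute("""
        SELECT e.ec FROM enzyme e
        LEFT JOIN sequence s ON s.ec = e.ec
        WHERE s.id IS NULL
    """).fetchall()
    print(orphans)  # [('9.9.9.9',)]
    ```

    Once the component databases share one relational schema, such cross-database questions reduce to a single SQL join, which is the value of the warehouse approach.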

  3. Cal-Adapt: California's Climate Data Resource and Interactive Toolkit

    Science.gov (United States)

    Thomas, N.; Mukhtyar, S.; Wilhelm, S.; Galey, B.; Lehmer, E.

    2016-12-01

    Cal-Adapt is a web-based application that provides an interactive toolkit and information clearinghouse to help agencies, communities, local planners, resource managers, and the public understand climate change risks and impacts at the local level. The website offers interactive, visually compelling, and useful data visualization tools that show how climate change might affect California using downscaled continental climate data. Cal-Adapt is supporting California's Fourth Climate Change Assessment through providing access to the wealth of modeled and observed data and adaptation-related information produced by California's scientific community. The site has been developed by UC Berkeley's Geospatial Innovation Facility (GIF) in collaboration with the California Energy Commission's (CEC) Research Program. The Cal-Adapt website allows decision makers, scientists and residents of California to turn research results and climate projections into effective adaptation decisions and policies. Since its release to the public in June 2011, Cal-Adapt has been visited by more than 94,000 unique visitors from over 180 countries, all 50 U.S. states, and 689 California localities. We will present several key visualizations that have been employed by Cal-Adapt's users to support their efforts to understand local impacts of climate change, indicate the breadth of data available, and delineate specific use cases. Recently, CEC and GIF have been developing and releasing Cal-Adapt 2.0, which includes updates and enhancements that are increasing its ease of use, information value, visualization tools, and data accessibility. We showcase how Cal-Adapt is evolving in response to feedback from a variety of sources to present finer-resolution downscaled data, and offer an open API that allows other organizations to access Cal-Adapt climate data and build domain-specific visualization and planning tools. 
Through a combination of locally relevant information, visualization tools, and access to

  4. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    Science.gov (United States)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and
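    The interface contract that lets a Python model "automatically work" with a generic driver can be sketched with a toy model; the class, variable name, and decay rule below are hypothetical, and the real BMI specification includes more methods than shown.

    ```python
    # Minimal sketch of the Basic Model Interface (BMI) pattern the CDI
    # relies on: any model exposing initialize / update / finalize /
    # get_value can be driven generically. ToyModel and its "load"
    # variable are illustrative stand-ins, not HydroTrend.

    class ToyModel:
        """A trivial BMI-style model: the tracked value halves each step."""
        def initialize(self, start=100.0):
            self.value = start
            self.time = 0
        def update(self):
            self.value *= 0.5
            self.time += 1
        def get_value(self, name):
            assert name == "load"
            return self.value
        def finalize(self):
            self.value = None

    def run(model, steps):
        """Generic driver: works for any object with these BMI-style methods."""
        model.initialize()
        history = []
        for _ in range(steps):
            model.update()
            history.append(model.get_value("load"))
        model.finalize()
        return history

    print(run(ToyModel(), 3))  # [50.0, 25.0, 12.5]
    ```

    Because the driver touches only the shared method names, the same loop could run any conforming model, which is what makes a generic analysis driver possible.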

  5. An economic toolkit for identifying the cost of emergency medical services (EMS) systems: detailed methodology of the EMS Cost Analysis Project (EMSCAP).

    Science.gov (United States)

    Lerner, E Brooke; Garrison, Herbert G; Nichol, Graham; Maio, Ronald F; Lookman, Hunaid A; Sheahan, William D; Franz, Timothy R; Austad, James D; Ginster, Aaron M; Spaite, Daniel W

    2012-02-01

    Calculating the cost of an emergency medical services (EMS) system using a standardized method is important for determining the value of EMS. This article describes the development of a methodology for calculating the cost of an EMS system to its community. This includes a tool for calculating the cost of EMS (the "cost workbook") and detailed directions for determining cost (the "cost guide"). The 12-step process that was developed is consistent with current theories of health economics, applicable to prehospital care, flexible enough to be used in varying sizes and types of EMS systems, and comprehensive enough to provide meaningful conclusions. It was developed by an expert panel (the EMS Cost Analysis Project [EMSCAP] investigator team) in an iterative process that included pilot testing the process in three diverse communities. The iterative process allowed ongoing modification of the toolkit during the development phase, based upon direct, practical, ongoing interaction with the EMS systems that were using the toolkit. The resulting methodology estimates EMS system costs within a user-defined community, allowing either the number of patients treated or the estimated number of lives saved by EMS to be assessed in light of the cost of those efforts. Much controversy exists about the cost of EMS and whether the resources spent for this purpose are justified. However, the existence of a validated toolkit that provides a standardized process will allow meaningful assessments and comparisons to be made and will supply objective information to inform EMS and community officials who are tasked with determining the utilization of scarce societal resources.

  6. Implementing Project Based Survey Research Skills to Grade Six ELP Students with "The Survey Toolkit" and "TinkerPlots"[R]

    Science.gov (United States)

    Walsh, Thomas, Jr.

    2011-01-01

    "Survey Toolkit Collecting Information, Analyzing Data and Writing Reports" (Walsh, 2009a) is discussed as a survey research curriculum used by the author's sixth grade students. The report describes the implementation of "The Survey Toolkit" curriculum and "TinkerPlots"[R] software to provide instruction to students learning a project based…

  8. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 4: Engaging All in Data Conversations

    Science.gov (United States)

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  9. Toolkit of Resources for Engaging Parents and Community as Partners in Education. Part 2: Building a Cultural Bridge

    Science.gov (United States)

    Regional Educational Laboratory Pacific, 2015

    2015-01-01

    This toolkit is designed to guide school staff in strengthening partnerships with families and community members to support student learning. This toolkit offers an integrated approach to family and community engagement, bringing together research, promising practices, and a wide range of useful tools and resources with explanations and directions…

  10. The MITK image guided therapy toolkit and its application for augmented reality in laparoscopic prostate surgery

    Science.gov (United States)

    Baumhauer, Matthias; Neuhaus, Jochen; Fritzsche, Klaus; Meinzer, Hans-Peter

    2010-02-01

    Image Guided Therapy (IGT) confronts researchers with high demands and effort in system design, prototype implementation, and evaluation. The lack of standardized software tools, such as algorithm implementations, tracking device and tool setups, and data processing methods, escalates the labor of system development and sustainable system evaluation. In this paper, a new toolkit component of the Medical Imaging and Interaction Toolkit (MITK), the MITK-IGT, and its exemplary application for computer-assisted prostate surgery are presented. MITK-IGT aims at integrating software tools, algorithms and tracking device interfaces into the MITK toolkit to provide a comprehensive software framework for computer-aided diagnosis support, therapy planning, treatment support, and radiological follow-up. An exemplary application of the MITK-IGT framework is introduced with a surgical navigation system for laparoscopic prostate surgery. It illustrates the broad range of application possibilities provided by the framework, as well as its simple extensibility with custom algorithms and other software modules.

  11. AutoMicromanager: a microscopy scripting toolkit for LABVIEW and other programming environments.

    Science.gov (United States)

    Ashcroft, Brian Alan; Oosterkamp, Tjerk

    2010-11-01

    We present a scripting toolkit for the acquisition and analysis of a wide variety of imaging data by integrating the ease of use of various programming environments such as LABVIEW, IGOR PRO, MATLAB, SCILAB, and others. This toolkit is designed to allow the user to quickly program a variety of standard microscopy components for custom microscopy applications, allowing much more flexibility than other packages. Included are both programming tools as well as graphical user interface classes, allowing a standard, consistent, and easy-to-maintain scripting environment. This programming toolkit allows easy access to most commonly used cameras, stages, and shutters through the Micromanager project so the scripter can focus on their custom application instead of boilerplate code generation.

  12. Development of a Human Physiologically Based Pharmacokinetic (PBPK) Toolkit for Environmental Pollutants

    Directory of Open Access Journals (Sweden)

    Patricia Ruiz

    2011-10-01

    Full Text Available Physiologically Based Pharmacokinetic (PBPK) models can be used to determine the internal dose and strengthen exposure assessment. Many PBPK models are available, but they are not easily accessible for field use. The Agency for Toxic Substances and Disease Registry (ATSDR) has conducted translational research to develop a human PBPK model toolkit by recoding published PBPK models. This toolkit, when fully developed, will provide a platform that consists of a series of priority PBPK models of environmental pollutants. Presented here is work on recoded PBPK models for volatile organic compounds (VOCs) and metals. Good agreement was generally obtained between the original and the recoded models. This toolkit will be available for ATSDR scientists and public health assessors to perform simulations of exposures from contaminated environmental media at sites of concern and to help interpret biomonitoring data. It can be used as a screening tool that can provide useful information for the protection of the public.

  13. New light field camera based on physical based rendering tracing

    Science.gov (United States)

    Chung, Ming-Han; Chang, Shan-Ching; Lee, Chih-Kung

    2014-03-01

    Even though light field technology was first invented more than 50 years ago, it did not gain popularity due to the limitations imposed by the computation technology of the time. With the rapid advancement of computer technology over the last decade, those limitations have been lifted and light field technology has quickly returned to the research spotlight. In this paper, PBRT (Physical Based Rendering Tracing) was introduced to overcome the limitation of using a traditional optical simulation approach to study light field camera technology. More specifically, the traditional optical simulation approach can only present light energy distribution but typically lacks the capability to present pictures of realistic scenes. By using PBRT, which was developed to create virtual scenes, 4D light field information was obtained to conduct initial data analysis and calculation. This PBRT approach was also used to explore the potential of light field data calculation in creating realistic photos. Furthermore, we integrated optical experimental measurement results with PBRT in order to place the real measurement results into the virtually created scenes. In other words, our approach provided us with a way to link a virtual scene with real measurement results. Several images developed based on the above-mentioned approaches were analyzed and discussed to verify the pros and cons of the newly developed PBRT-based light field camera technology. It will be shown that this newly developed light field camera approach can circumvent the loss of spatial resolution associated with adopting a micro-lens array in front of the image sensors. Detailed operational constraints, performance metrics, required computation resources, etc., associated with this newly developed light field camera technique are presented.

  14. Patient-Centered Personal Health Record and Portal Implementation Toolkit for Ambulatory Clinics: A Feasibility Study.

    Science.gov (United States)

    Nahm, Eun-Shim; Diblasi, Catherine; Gonzales, Eva; Silver, Kristi; Zhu, Shijun; Sagherian, Knar; Kongs, Katherine

    2017-04-01

    Personal health records and patient portals have been shown to be effective in managing chronic illnesses. Despite recent nationwide implementation efforts, the personal health record and patient portal adoption rates among patients are low, and the lack of support for patients using the programs remains a critical gap in most implementation processes. In this study, we implemented the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit in a large diabetes/endocrinology center and assessed its preliminary impact on personal health record and patient portal knowledge, self-efficacy, patient-provider communication, and adherence to treatment plans. The Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit is composed of Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General, clinic-level resources for clinicians, staff, and patients, and Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit Plus, an optional 4-week online resource program for patients ("MyHealthPortal"). First, Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit-General was implemented, and all clinicians and staff were educated about the center's personal health record and patient portal. Then general patient education was initiated, while a randomized controlled trial was conducted to test the preliminary effects of "MyHealthPortal" using a small sample (n = 74) with three observations (baseline and 4 and 12 weeks). The intervention group showed significantly greater improvement than the control group in patient-provider communication at 4 weeks (t(56) = 3.00, P = .004). For other variables, the intervention group tended to show greater improvement; however, the differences were not significant. In this preliminary study, the Patient-Centered Personal Health Record and Patient Portal Implementation Toolkit showed potential for filling the gap in the current

  15. Design and Implementation of an Application. Programming Interface for Volume Rendering

    OpenAIRE

    Selldin, Håkan

    2002-01-01

    To efficiently examine volumetric data sets from CT or MRI scans, good volume rendering applications are needed. This thesis describes the design and implementation of an application programming interface (API) to be used when developing volume-rendering applications. A complete application programming interface has been designed. The interface is designed so that it makes writing application programs containing volume rendering fast and easy. The interface also makes created application progr...

  16. The IGUANA Interactive Graphics Toolkit with Examples from CMS and D0

    Institute of Scientific and Technical Information of China (English)

    GeorgeAlverson; IannaOsborne; 等

    2001-01-01

    IGUANA (Interactive Graphics for User ANAlysis) is a C++ toolkit for developing graphical user interfaces and high performance 2-D and 3-D graphics applications, such as data browsers and detector and event visualisation programs. The IGUANA strategy is to use freely available software (e.g. Qt, SoQt, OpenInventor, OpenGL, HEPVis) and package and extend it to provide a general-purpose and experiment-independent toolkit. We describe the evaluation and choices of publicly available GUI/graphics software and the additional functionality currently provided by IGUANA. We demonstrate the use of IGUANA with several applications built for CMS and D0.

  17. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    Directory of Open Access Journals (Sweden)

    Jared Adolf-Bryfogle

    Full Text Available The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
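The abstract notes that the toolkit is extensible so developers can add new protocols and analysis tools. A common way to structure that kind of extensibility is a protocol registry that the GUI's menus are built from. The sketch below is a generic illustration of that pattern under that assumption; the names are hypothetical and are not the PyRosetta Toolkit's actual API:

```python
# Hypothetical protocol registry such as an extensible modeling GUI
# might use; names are illustrative, not the PyRosetta Toolkit's API.
PROTOCOLS = {}

def register_protocol(name):
    """Decorator that adds a protocol callable to the GUI's menu registry."""
    def wrap(fn):
        PROTOCOLS[name] = fn
        return fn
    return wrap

@register_protocol("minimize")
def run_minimize(pose):
    # A real protocol would call into the modeling suite here.
    return f"minimized {pose}"

# The GUI would list PROTOCOLS.keys() in a menu and dispatch on selection:
result = PROTOCOLS["minimize"]("1abc.pdb")
```

With this design, adding a protocol is a single decorated function; the GUI discovers it without changes to its own code.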

  18. Analysis of the Toolkit method for the time-dependent Schrödinger equation

    CERN Document Server

    Baudouin, Lucie; Turinici, Gabriel

    2009-01-01

    The goal of this paper is to provide an analysis of the "toolkit" method used in the numerical approximation of the time-dependent Schrödinger equation. The "toolkit" method is based on precomputation of elementary propagators and was seen to be very efficient in the optimal control framework. Our analysis shows that this method provides better results than the second-order Strang operator splitting. In addition, we present two improvements of the method in the limits of low- and large-intensity control fields.
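The precomputation idea can be sketched in NumPy: tabulate the short-time propagators exp(-i(H0 + εV)Δt) over a grid of control-field amplitudes ε, then propagate by table lookup instead of re-exponentiating at every step. The matrices, grid, and field below are small illustrative stand-ins, not the paper's actual discretization:

```python
import numpy as np

def expmi(H, dt):
    """exp(-i H dt) for a Hermitian H via eigendecomposition (NumPy only)."""
    w, v = np.linalg.eigh(H)
    return (v * np.exp(-1j * w * dt)) @ v.conj().T

rng = np.random.default_rng(0)
n = 8
H0 = rng.standard_normal((n, n)); H0 = (H0 + H0.T) / 2  # Hermitian free Hamiltonian
V = rng.standard_normal((n, n)); V = (V + V.T) / 2      # Hermitian coupling operator
dt = 0.01
eps_grid = np.linspace(-1.0, 1.0, 21)                   # discretized field amplitudes

# Precomputation: one elementary propagator per field value in the grid
toolkit = [expmi(H0 + e * V, dt) for e in eps_grid]

def propagate(psi, field):
    """Apply, at each step, the precomputed propagator nearest the field value."""
    for e in field:
        j = int(np.argmin(np.abs(eps_grid - e)))
        psi = toolkit[j] @ psi
    return psi

psi0 = np.zeros(n, dtype=complex); psi0[0] = 1.0
field = 0.5 * np.sin(np.linspace(0.0, 2.0 * np.pi, 200))
psi_T = propagate(psi0, field)
# Each stored propagator is unitary, so the norm of psi is preserved.
```

The payoff in optimal control is that thousands of candidate fields can be propagated with only matrix-vector products, since the expensive exponentials are computed once.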

  19. Analysis of the Toolkit method for the time-dependent Schrödinger equation

    CERN Document Server

    Baudouin, Lucie; Turinici, Gabriel

    2010-01-01

    The goal of this paper is to provide an analysis of the "toolkit" method used in the numerical approximation of the time-dependent Schrödinger equation. The "toolkit" method is based on precomputation of elementary propagators and was seen to be very efficient in the optimal control framework. Our analysis shows that this method provides better results than the second-order Strang operator splitting. In addition, we present two improvements of the method in the limits of low- and large-intensity control fields.

  20. A toolkit for forward/inverse problems in electrocardiography within the SCIRun problem solving environment.

    Science.gov (United States)

    Burton, Brett M; Tate, Jess D; Erem, Burak; Swenson, Darrell J; Wang, Dafang F; Steffen, Michael; Brooks, Dana H; van Dam, Peter M; Macleod, Rob S

    2011-01-01

    Computational modeling in electrocardiography often requires the examination of cardiac forward and inverse problems in order to non-invasively analyze physiological events that are otherwise inaccessible or unethical to explore. The study of these models can be performed in the open-source SCIRun problem solving environment developed at the Center for Integrative Biomedical Computing (CIBC). A new toolkit within SCIRun provides researchers with essential frameworks for constructing and manipulating electrocardiographic forward and inverse models in a highly efficient and interactive way. The toolkit contains sample networks, tutorials and documentation which direct users through SCIRun-specific approaches in the assembly and execution of these specific problems.
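As a rough illustration of the forward/inverse pattern such a toolkit supports (not SCIRun's actual modules or data), the forward problem maps cardiac source potentials to torso potentials through a transfer matrix, and the ill-posed inverse is commonly stabilized with zeroth-order Tikhonov regularization. The matrix sizes and regularization weight below are arbitrary stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((120, 60))        # torso electrodes x heart sources (stand-in)
x_true = rng.standard_normal(60)          # epicardial potentials (unknown in practice)
b = A @ x_true + 0.01 * rng.standard_normal(120)  # forward problem plus measurement noise

def tikhonov(A, b, lam):
    """Zeroth-order Tikhonov-regularized inverse: argmin ||Ax - b||^2 + lam ||x||^2."""
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

x_est = tikhonov(A, b, lam=1e-2)
```

Real electrocardiographic transfer matrices are far worse conditioned than this random stand-in, which is why regularization-parameter selection is a central concern in inverse toolkits.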