Sample records for admin tool detail

  1. Modernising the CERN Admin e-guide

    DG-RPC-PA (Administrative Process Section)


    In just a few years, the CERN Admin e-guide has become the essential guide for CERN's administrative procedures, intended not just for administrative services but for all members of the personnel of CERN. The guide is edited by the Administrative Process Section (DG-RPC-PA) and allows users to look up administrative procedures and forms relating to the application of the Staff Rules and Regulations easily in both French and English. Site visit statistics show that 5000 people on average consult the Admin e-guide each month. The most consulted procedures are those relating to green licence plates, taxes, Swiss and French cards and leave. The site has just been moved over to Drupal, which will make it easier to update the information provided and will improve content management. This system change meant migrating 220 procedures and more than 500 frequently asked questions. The structure has been revised for clarity, but the principles that made the guide so successful have not changed, namely the e...

  2. Large Terrain Continuous Level of Detail 3D Visualization Tool

    Myint, Steven; Jain, Abhinandan


    This software solved the problem of displaying terrains that are usually too large to be displayed on standard workstations in real time. The software can visualize terrain data sets composed of billions of vertices, and can display these data sets at greater than 30 frames per second. The Large Terrain Continuous Level of Detail 3D Visualization Tool allows large terrains, which can be composed of billions of vertices, to be visualized in real time. It utilizes a continuous level of detail technique called clipmapping to support this. It offloads much of the work involved in breaking up the terrain into levels of details onto the GPU (graphics processing unit) for faster processing.
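    The clipmapping scheme described above keeps a stack of nested terrain levels whose resolution halves as distance from the viewer grows. As a minimal, hedged sketch of the level-selection step (the function name and parameters are illustrative, not taken from the tool):

```python
import math

def clipmap_level(distance, base_resolution=1.0, num_levels=8):
    """Pick a clipmap level-of-detail index from viewer distance.

    Each successive level halves the terrain resolution, so the level
    index grows with log2 of the distance. `base_resolution` (the finest
    grid spacing near the viewer) and `num_levels` are hypothetical.
    """
    if distance <= base_resolution:
        return 0  # finest level right under the viewer
    level = int(math.log2(distance / base_resolution))
    return min(level, num_levels - 1)  # clamp to the coarsest level
```

    In a real clipmap renderer this selection, and the re-sampling of each ring of terrain, would run on the GPU; the sketch only shows the distance-to-level mapping.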

  3. Framework for Detailed Comparison of Building Environmental Assessment Tools

    Ola Eriksson


    Full Text Available Understanding how Building Environmental Assessment Tools (BEATs) measure and define “environmental” building is of great interest to many stakeholders, but it is difficult to understand how BEATs relate to each other and to make detailed, systematic tool comparisons. A framework for comparing BEATs is presented in the following, which facilitates an understanding and comparison of similarities and differences in terms of structure, content, aggregation, and scope. The framework was tested by comparing three distinctly different assessment tools: LEED-NC v3, the Code for Sustainable Homes (CSH), and EcoEffect. Illustrations of the hierarchical structure of the tools gave a clear overview of their structural differences. When using the framework, the analysis showed that all three tools treat issues related to the main assessment categories: Energy and Pollution, Indoor Environment, and Materials and Waste. However, the environmental issues addressed, and the parameters defining the object of study, differ and, subsequently, so do rating, results, categories, issues, input data, aggregation methodology, and weighting. This means that BEATs measure “environmental” building differently and push “environmental” design in different directions. Therefore, tool comparisons are important, and the framework can be used to make these comparisons in a more detailed and systematic way.

  4. Salesforce CRM the definitive admin handbook

    Goodey, Paul


    A practical guide which will help you discover how to set up and configure the Salesforce CRM application. It offers solutions and practical examples on how to further improve and maintain its functionality, with clear, systematic instructions. Highly organized and compact, this book contains detailed instructions with screenshots, diagrams, and tips that clearly describe how you can administer and configure complex Salesforce CRM functionality with absolute ease. This book is for administrators who want to develop and strengthen their Salesforce CRM skills in the areas of configuration and s

  5. Launch of the new CERN Admin e-guide

    Laëtitia Pedroso


    The CERN Admin e-guide is a new guide to the Organization's administrative procedures, which has been drawn up for the benefit of members of the personnel and the various administrative services alike and replaces the old "Administrative Procedures Manual". All the different procedures currently available on separate department sites will henceforth be accessible at a single website.   Home page of the new CERN Admin e-guide. The goal of creating a compendium of CERN's administrative procedures, available at a single website and accessible with a simple click of the mouse, has now been realised.  "It had become difficult to know where to find the relevant up-to-date information on administrative procedures", says Yaël Grange-Lavigne of the HR-SPS-OP Section (Organisation and Procedures), coordinator of the working group that compiled the e-guide. The team, which comprised members of the HR Department, observers from other department...

  6. Creating Simple Admin Tools Using Info*Engine and Java

    Jones, Corey; Kapatos, Dennis; Skradski, Cory; Felkins, J. D.


    PTC has provided a simple way to dynamically interact with Windchill using Info*Engine. This presentation will describe how to create simple Info*Engine tasks capable of saving Windchill 10.0 administrators tedious work.

  7. Information Technology Administrator’s Instruction Manual for the Personal Academic Strategies for Success (PASS) Tool, With Subcomponent Academic Class Composite Tool (AC2T)


    information was generated through evidence-based research and statistical regression modeling with data from 579 68W AIT students. The same tool contains a...the best fit model of academic achievement predictor variables was identified. The final regression model then formed the basis for both the PASS and...passwords: (a) IT Admin: PassTool10!! (b) VBA Window (Code): PASStoolAdmin (c) Back End: PASStoolBE2010!! (d) First Researcher Admin: SuperAdmin12%^ 2


    M. Taranenko


    Full Text Available Directions for reducing bus body production costs through the stamping of large-size facing details on electric-hydraulic presses are considered. The engineering of unified large-size facing detail blanks, the manufacturing processes, and variants of unified tooling for producing bus body facing details, including single- and multi-type details, are described.

  9. Introducción a phpMyAdmin, ejercicio


    The purpose of the following exercise is to replicate, more or less, the database used in the introduction-to-phpMyAdmin videos and to check that you are able to create a database and a few tables related to one another. The solution is at the end of this document but, of course, it is better if you try it yourself first. Resource for the MOOC "Introducción al Desarrollo Web", BDgite (GITE-11014-UA), departamento de Lenguajes y Sistemas Informáticos, Uni...

  10. Improving Windows desktop security - the 'Non-Admin' Project

    NICE Team


    From 16 January 2006, Windows XP NICE installations (both new computers installed and old computers re-installed) will no longer grant administrative privileges by default to the main user or to the person responsible for the computer. Administrative privileges allow the user to perform administrative actions on the computer, such as installing new applications or changing system settings. Until now, these privileges have been granted each time machines are rebooted, but this creates a risk of compromising the computer every time code from an unknown source (e.g. e-mail attachments or web browsing) is executed. So that users can continue to install software and change system settings on their computers, a shortcut called 'NiceAdmin' in the Start | Programmes menu will offer a means of performing certain tasks requiring administrative privileges, but only on demand. Users with valid reasons to be a permanent administrator for their machine will still have this option. However, users wishing to benefit fr...

  11. Introducción a phpMyAdmin (3/4)

    Suárez Cueto, Armando


    Course "Introducción al desarrollo web": inserting data, creating a many-to-many (N:N) relationship, foreign keys, referential integrity, the phpMyAdmin Designer. More information:
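    The many-to-many (N:N) pattern with foreign keys and referential integrity that this video covers can be sketched outside phpMyAdmin too. Below is a minimal illustration using Python's built-in sqlite3 instead of MySQL; all table and column names are invented for the example:

```python
import sqlite3

# A junction table turns one N:N relationship into two 1:N relationships,
# each enforced by a foreign key. SQLite checks them once the pragma is on.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enable referential integrity
conn.executescript("""
CREATE TABLE student (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
CREATE TABLE course  (id INTEGER PRIMARY KEY, title TEXT NOT NULL);
CREATE TABLE enrolment (
    student_id INTEGER NOT NULL REFERENCES student(id),
    course_id  INTEGER NOT NULL REFERENCES course(id),
    PRIMARY KEY (student_id, course_id)
);
INSERT INTO student VALUES (1, 'Ana'), (2, 'Luis');
INSERT INTO course  VALUES (1, 'Web'), (2, 'SQL');
INSERT INTO enrolment VALUES (1, 1), (1, 2), (2, 1);
""")
rows = conn.execute("""
    SELECT s.name, c.title FROM enrolment e
    JOIN student s ON s.id = e.student_id
    JOIN course  c ON c.id = e.course_id
    ORDER BY s.name, c.title
""").fetchall()
```

    Inserting an enrolment that references a non-existent student now fails with an integrity error, which is exactly the referential-integrity behaviour the course demonstrates in MySQL.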

  12. MySQL Admin Cookbook LITE Replication and Indexing

    Schneller, Daniel


    This cookbook presents solutions to problems in the form of recipes. Each recipe provides the reader with easy step-by-step descriptions of the actions necessary to accomplish a specific task. Example values and code samples are used throughout the recipes, which makes adaptation for individual needs easy. This book is for ambitious MySQL users as well as professional data center database administrators. Beginners as well as experienced administrators will benefit from this cookbook and get fresh ideas to improve their MySQL environments. Detailed background information will enable them to wid

  13. Introducción a phpMyAdmin (2/4)

    Suárez Cueto, Armando


    Course "Introducción al desarrollo web": phpMyAdmin, creating tables, data types, creating indexes, MySQL storage engines (INNODB, MyISAM, MEMORY), inserting data, tracking changes to a table (history), exporting a table (backup). More information:

  14. Mastering phpMyAdmin 3.4 for Effective MySQL Management

    Delisle, Marc


    This is a step-by-step instructional guide to get you started easily with phpMyAdmin and teach you to manage and perform database functions on your database. You will first be introduced to the interface and then build basic tables and perform both simple and advanced functions on the created database. The book progresses gradually and you will follow it best by reading it sequentially. If you are a developer, system administrator, or web designer who wants to manage MySQL databases and tables efficiently, then this book is for you. This book assumes that you are already well acquainted with My

  15. Introducción a phpMyAdmin (1/4)

    Suárez Cueto, Armando


    Course "Introducción al desarrollo web": explanation of the learning scenario, review of the basic concepts of relational databases (primary key, foreign key, relationship), review of the SQL query and data-manipulation language (select, insert, delete, update), presentation of the phpMyAdmin interface and workspace, creation of a test user, user privileges (permissions). More information:

  16. Advanced computational tools for PEM fuel cell design. Part 2. Detailed experimental validation and parametric study

    Sui, P. C.; Kumar, S.; Djilali, N.

    This paper reports on the systematic experimental validation of the comprehensive 3D CFD-based computational model presented and documented in Part 1. Simulations for unit cells with straight channels, similar to the Ballard Mk902 hardware, are performed and analyzed in conjunction with detailed current mapping measurements and water mass distributions in the membrane-electrode assembly. The experiments were designed to probe the sensitivity of the cell over a range of operating parameters, including current density, humidification, and coolant temperature, making the data particularly well suited for systematic validation. Based on the validation and analysis of the predictions, values of model parameters, including the electro-osmotic drag coefficient, capillary diffusion coefficient, and catalyst specific surface area, are adjusted to fit the experimental data of current density and MEA water content. The predicted net water flux out of the anode (normalized by the total water generated) increases as the anode humidification water flow rate is increased, in agreement with experimental results. A modification of the constitutive equation for the capillary diffusivity of water in the porous electrodes, which attempts to incorporate the experimentally observed immobile (or irreducible) saturation, yields a better fit of the predicted MEA water mass to experimental data. The specific surface area parameter used in the catalyst layer model is found to be effective in tuning the simulations to predict the correct cell voltage over a range of stoichiometries.
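    The parameter-tuning step described above (adjusting, for instance, the catalyst specific surface area until predictions match measured current density) amounts to a one-parameter least-squares search. The sketch below illustrates that generic practice only; the linear model, grid, and data are synthetic, not the paper's code:

```python
import numpy as np

def fit_parameter(model, param_grid, x_data, y_data):
    """Return the grid value of the parameter minimizing squared error.

    `model(x, p)` is any vectorized model of the data; this mimics tuning
    one model parameter so predictions match measurements. Illustrative.
    """
    errors = [np.sum((model(x_data, p) - y_data) ** 2) for p in param_grid]
    return param_grid[int(np.argmin(errors))]

# Synthetic check: data generated with p = 2.0 is recovered from the grid.
model = lambda x, p: p * x
x = np.linspace(0.0, 1.0, 20)
y = model(x, 2.0)
best = fit_parameter(model, np.linspace(0.0, 4.0, 41), x, y)
```

    A real calibration would use a gradient-based optimizer and measured current-density maps, but the structure (propose a parameter, simulate, compare with data) is the same.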

  17. Introducción a phpMyAdmin (parte 0 o resumen)

    Suárez Cueto, Armando


    Course "Introducción al desarrollo web": this video serves a double purpose: it acts as an introduction to the rest of the videos, but also as a summary, so it can be watched either at the beginning or at the end. The video summarises: - Goal of the videos, introduction to phpMyAdmin. - Part 1: general database concepts, creating users and privileges. - Part 2: creating tables, data types, storage engines, inserting rows, exporting (structure and ...

  18. MySQL Admin Cookbook LITE Configuration, Server Monitoring, Managing Users

    Schneller, Daniel


    This cookbook presents solutions to problems in the form of recipes. Each recipe provides the reader with easy step-by-step descriptions of the actions necessary to accomplish a specific task. Example values and code samples are used throughout the recipes, which makes adaptation for individual needs easy. This book is for ambitious MySQL users as well as professional data center database administrators. Beginners as well as experienced administrators will benefit from this cookbook and get fresh ideas to improve their MySQL environments. Detailed background information will enable them to wid

  19. Comprehensive tool for calculation of radiative fluxes: illustration of shortwave aerosol radiative effect sensitivities to the details in aerosol and underlying surface characteristics

    Derimian, Yevgeny; Dubovik, Oleg; Huang, Xin; Lapyonok, Tatyana; Litvinov, Pavel; Kostinski, Alex B.; Dubuisson, Philippe; Ducos, Fabrice


    The evaluation of aerosol radiative effect on broadband hemispherical solar flux is often performed using simplified spectral and directional scattering characteristics of atmospheric aerosol and underlying surface reflectance. In this study we present a rigorous yet fast computational tool that accurately accounts for detailed variability of both spectral and angular scattering properties of aerosol and surface reflectance in calculation of direct aerosol radiative effect. The tool is developed as part of the GRASP (Generalized Retrieval of Aerosol and Surface Properties) project. We use the tool to evaluate instantaneous and daily average radiative efficiencies (radiative effect per unit aerosol optical thickness) of several key atmospheric aerosol models over different surface types. We then examine the differences due to neglect of surface reflectance anisotropy, nonsphericity of aerosol particle shape, and accounting only for aerosol angular scattering asymmetry instead of using the full phase function. For example, it is shown that neglecting aerosol particle nonsphericity causes mainly overestimation of the aerosol cooling effect and that the magnitude of this overestimate changes significantly as a function of solar zenith angle (SZA) if the asymmetry parameter is used instead of the detailed phase function. It was also found that the nonspherical-spherical differences in the calculated aerosol radiative effect are not modified significantly if a detailed BRDF (bidirectional reflectance distribution function) is used instead of a Lambertian approximation of surface reflectance. Additionally, calculations show that using only the angular scattering asymmetry, even for the case of spherical aerosols, modifies the dependence of instantaneous aerosol radiative effect on SZA. This effect can be canceled for daily average values, but only if the sun reaches the zenith; otherwise a systematic bias remains. Since the daily average radiative effect is obtained by integration over a range
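    The daily averaging mentioned above, integrating the instantaneous effect over the range of solar zenith angles the sun traverses, can be sketched as follows. This is a simple spherical-geometry illustration, not the GRASP implementation; the function and its arguments are invented for the example:

```python
import numpy as np

def daily_average(effect_of_mu0, latitude_deg, declination_deg=0.0, steps=1440):
    """Average an instantaneous radiative effect over one 24 h day.

    `effect_of_mu0(mu0)` gives the instantaneous effect as a function of
    the cosine of the solar zenith angle; night-time (mu0 <= 0) contributes
    zero. Simple spherical geometry, illustrative only.
    """
    lat = np.radians(latitude_deg)
    dec = np.radians(declination_deg)
    # hour angle swept uniformly over a full day
    hour_angle = np.linspace(-np.pi, np.pi, steps, endpoint=False)
    mu0 = (np.sin(lat) * np.sin(dec)
           + np.cos(lat) * np.cos(dec) * np.cos(hour_angle))
    values = np.where(mu0 > 0.0, effect_of_mu0(np.clip(mu0, 0.0, 1.0)), 0.0)
    return values.mean()  # uniform sampling over the full day
```

    For an effect proportional to mu0 at the equator at equinox, the daily average comes out to 1/pi of the zenith value, which matches the analytic integral of cos over half a day.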

  20. Xgrid admin guide

    Strauss, Charlie E M [Los Alamos National Laboratory


    Xgrid, with a capital X, is the name of Apple's grid computing system. With a lower-case x, xgrid is the name of the command-line utility that clients can use, among other ways, to submit jobs to a controller. An Xgrid divides into three logical components: Agent, Controller and Client. Client computers submit jobs (a set of tasks) they want run to a Controller computer. The Controller queues the Client jobs and distributes tasks to Agent computers. Agent computers run the tasks and report their output and status back to the Controller, where it is stored until deleted by the Client. The Clients can asynchronously query the Controller about the status of a job and the results. Any OS X computer can play any of these roles. A single Mac can be more than one: it's possible to be Agent, Controller and Client at the same time. There is one Controller per grid. Clients can submit jobs to Controllers of different grids. Agents can work for more than one grid. Xgrid's setup has a pleasantly small palette of choices. The first two decisions to make are the kind of authentication and authorization to use and whether a shared file system is needed. A shared file system that all the agents can access can be very beneficial for many computing problems, but it is not appropriate for every network.
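    The three roles described above can be mimicked with a toy model: a Client submits a job, the Controller queues its tasks, and an Agent runs them and reports results back. This is purely illustrative Python, not Apple's Xgrid API:

```python
from collections import deque

class Controller:
    """Toy model of the Xgrid Controller role: queue client jobs, hand
    tasks to agents, store results until the client collects them."""
    def __init__(self):
        self.queue = deque()
        self.results = {}

    def submit(self, job_id, tasks):      # Client -> Controller
        for t in tasks:
            self.queue.append((job_id, t))

    def next_task(self):                  # Controller -> Agent
        return self.queue.popleft() if self.queue else None

    def report(self, job_id, output):     # Agent -> Controller
        self.results.setdefault(job_id, []).append(output)

    def fetch(self, job_id):              # Client polls asynchronously
        return self.results.get(job_id, [])

ctrl = Controller()
ctrl.submit("job1", [1, 2, 3])
# one "agent" drains the queue, squaring each task's payload
while (task := ctrl.next_task()) is not None:
    job, n = task
    ctrl.report(job, n * n)
```

    The same Python process plays all three roles here, echoing the point that a single Mac can be Agent, Controller and Client at once.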

  1. A Minimal Fragmentation Approach to Real Time Aerosol Mass Spectrometry: A New Tool for Detailed Laboratory Studies of Organic Aerosol Aging

    Campuzano-Jost, P.; Hanna, S.; Simpson, E.; Robb, D.; Blades, M. W.; Hepburn, J. W.; Bertram, A. K.


    The study of the atmospheric distribution and chemical processing of both biogenic and anthropogenic organics is one of the oldest and still most enduring challenges in atmospheric chemistry. The large number and structural complexity of many of the compounds, as well as the high reactivity of many intermediates, make it hard to design analytical tools that are at once sufficiently sensitive and reasonably broad in scope. Despite big advances in techniques to characterize the gaseous phase component, there is still a dearth of instruments capable of doing the same for the organic aerosol component. This is due in part to the types of compounds present in the aerosol phase, which in general lend themselves less to classical analytical methods such as GC/MS, as well as the inherent problems of any aerosol analysis, namely to transfer the aerosol to a suitable phase for analysis without altering it while keeping track, at the same time, of the physical properties of the aerosol. Although impaction methods coupled to conventional analysis techniques have some specific advantages, the most widely used approach is the aerosol mass spectrometer. Unlike their predecessors, current aerosol mass spectrometer designs do a reasonably good job of delivering a representative sample of the aerosol phase to the detector while keeping track of the physical properties of the aerosol. However, the ionization step (either multiphoton absorption or electron impact in most cases) still leads to massive fragmentation of all but the most stable organics, making it very difficult to characterize individual compounds beyond establishing their functional groups (Allan et al. 2003; Su et al. 2004). Single photon near-threshold ionization has been proposed and used recently (Oktem et al. 2004; Nash et al. 2005), but the challenges of producing coherent VUV radiation have led to a high detection threshold and a still significant amount of fragmentation, since these studies

  2. Main: Clone Detail [KOME

    Full Text Available Clone Detail: mapping pseudomolecule data detail. Detail information: mapping to the TIGR japonica Pseudomolecules (kome_mapping_pseudomolecule_data_detail) ...

  3. Detail and survey radioautographs

    Wainwright, Wm.W.


    The much used survey or contact type of radioautograph is indispensable for a study of the gross distribution of radioactive materials. A detail radioautograph is equally indispensable. The detail radioautograph makes possible the determination of plutonium with respect to cells. Outlines of survey and detail techniques are given.

  4. Visual overview, oral detail

    Hertzum, Morten; Simonsen, Jesper


    and with the coordinating nurse, who is the main keeper of the whiteboard. On the basis of observations, we find that coordination is accomplished through a highly intertwined process of technologically mediated visual overview combined with orally communicated details. The oral details serve to clarify and elaborate...

  5. LOCKE Detailed Specification Tables

    Menezo, Lucia G; Gregorio, Jose-Angel


    This document shows the detailed specification of LOCKE coherence protocol for each cache controller, using a table-based technique. This representation provides clear, concise visual information yet includes sufficient detail (e.g., transient states) arguably lacking in the traditional, graphical form of state diagrams.

  6. Detailed Soils 24K

    Kansas Data Access and Support Center — This data set is a digital soil survey and is the most detailed level of soil geographic data developed by the National Cooperative Soil Survey. The information was...

  7. Three Latin Phonological Details

    Olsen, Birgit Anette


    The present paper deals with three minor details of Latin phonology: 1) the development of the initial sequence *u̯l̥-, where it is suggested that an apparent vacillation between ul- and vol-/vul- represents sandhi variants going back to the proto-language, 2) the adjectives amārus ‘bitter' and ...

  8. Detailed Debunking of Denial

    Enting, I. G.; Abraham, J. P.


    The disinformation campaign against climate science has been compared to a guerrilla war whose tactics undermine the traditional checks and balances of science. One comprehensive approach has been to produce archives of generic responses, such as the websites RealClimate and SkepticalScience. We review our experiences with an alternative approach of detailed responses to a small number of high-profile cases. Our particular examples were Professor Ian Plimer and Christopher Monckton, the Third Viscount Monckton of Brenchley, each of whom has been taken seriously by political leaders in our respective countries. We relate our experiences to comparable examples such as John Mashey's analysis of the Wegman report and the formal complaints about Lomborg's "Skeptical Environmentalist" and Durkin's "Great Global Warming Swindle". The two efforts took contrasting forms: an on-line video of a lecture vs an evolving compendium of misrepresentations. Additionally, they differed in emphasis. The analysis of Monckton concentrated on the misrepresentation of the science, while the analysis of Plimer concentrated on departures from accepted scientific practice: fabrication of data, misrepresentation of cited sources and unattributed use of the work of others. Benefits of an evolving compendium were the ability to incorporate contributions from members of the public who had identified additional errors and the scope for addressing new aspects as they came to public attention. 'Detailed debunking' gives non-specialists a reference point for distinguishing non-science when engaging in public debate.

  9. Global detailed gravimetric geoid

    Vincent, S.; Marsh, J. G.


    A global detailed gravimetric geoid has been computed by combining the Goddard Space Flight Center GEM-4 gravity model derived from satellite and surface gravity data and surface 1 x 1-deg mean free-air gravity anomaly data. The accuracy of the geoid is plus or minus 2 meters on continents, 5 to 7 meters in areas where surface gravity data are sparse, and 10 to 15 meters in areas where no surface gravity data are available. Comparisons have been made with the astrogeodetic data provided by Rice (United States), Bomford (Europe), and Mather (Australia). Comparisons have also been carried out with geoid heights derived from satellite solutions for geocentric station coordinates in North America, the Caribbean, Europe and Australia.

  10. Crowdsourcing detailed flood data

    Walliman, Nicholas; Ogden, Ray; Amouzad, Shahrzhad


    Over the last decade the average annual loss across the European Union due to flooding has been 4.5bn Euros, but increasingly intense rainfall, as well as population growth, urbanisation and the rising costs of asset replacements, may see this rise to 23bn Euros a year by 2050. Equally disturbing are the profound social costs to individuals, families and communities, which in addition to loss of lives include: loss of livelihoods, decreased purchasing and production power, relocation and migration, adverse psychosocial effects, and hindrance of economic growth and development. Flood prediction, management and defence strategies rely on the availability of accurate information and flood modelling. Whilst automated data gathering (by measurement and satellite) of the extent of flooding is already advanced, it is least reliable in urban and physically complex geographies, where the need for precise estimation is often most acute. Crowdsourced data on actual flood events is a potentially critical component of this, allowing improved accuracy and identifying the effects of local landscape and topography, where the height of a simple kerb, or a discontinuity in a boundary wall, can have profound importance. Mobile 'app'-based data acquisition using crowdsourcing in critical areas can combine camera records with GPS positional data and time, as well as descriptive data relating to the event. This will automatically produce a dataset, managed in ArcView GIS, with the potential for follow-up calls to get more information through structured scripts for each strand. Through this, local residents can provide highly detailed information that can be reflected in sophisticated flood protection models and be core to framing urban resilience strategies and optimising the effectiveness of investment. This paper will describe this pioneering approach that will develop flood event data in support of systems that will advance existing approaches such as those developed in the UK
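    A single crowdsourced record of the kind described (photo reference, GPS position, timestamp, and free-text description) might be modeled as below. All field names are hypothetical, not the project's actual schema:

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FloodReport:
    """One crowdsourced flood observation: camera record, GPS position,
    time, and descriptive data. Field names are illustrative only."""
    photo_id: str
    lat: float
    lon: float
    observed_at: str            # ISO-8601 timestamp
    description: str
    water_depth_cm: Optional[float] = None  # e.g. estimated against a kerb

report = FloodReport(
    photo_id="IMG_0042",
    lat=51.752,
    lon=-1.258,
    observed_at=datetime(2014, 2, 7, 14, 30, tzinfo=timezone.utc).isoformat(),
    description="Water above kerb height at the reported location",
    water_depth_cm=18.0,
)
record = asdict(report)  # flat dict, ready for a GIS attribute table
```

    Serialising each report to a flat dictionary is one plausible way to feed the observations into a GIS attribute table for the follow-up analysis the abstract describes.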

  11. Detailed IR aperture measurements

    Bruce, Roderik; Garcia Morales, Hector; Giovannozzi, Massimo; Hermes, Pascal Dominik; Mirarchi, Daniele; Quaranta, Elena; Redaelli, Stefano; Rossi, Carlo; Skowronski, Piotr Krzysztof; Wretborn, Sven Joel; CERN. Geneva. ATS Department


    MD 1673 was carried out on 5 October 2016 in order to investigate in more detail the available aperture in the LHC high-luminosity insertions at 6.5 TeV and β∗=40 cm. Previous aperture measurements in 2016 during commissioning had shown that the available aperture is at the edge of protection, and that the aperture bottleneck at β∗=40 cm in certain cases is found in the separation plane instead of in the crossing plane. Furthermore, the bottlenecks were consistently found close to the upstream end of Q3 on the side of the incoming beam, and not in Q2 on the outgoing beam as expected from calculations. Therefore, this MD aimed at measuring IR1 and IR5 separately (at 6.5 TeV and β∗=40 cm, for 185 µrad half crossing angle), to further localize the bottlenecks longitudinally using newly installed BLMs, to investigate the difference in aperture between Q2 and Q3, and to see if any aperture can be gained using special orbit bumps.

  12. CERN in detail

    Laëtitia Pedroso


    Before, you had to go on the TPG website to find a tram-route, use Google Maps to see an aerial photo of CERN, and look for CERN buildings on Now, that's ancient history, with a new Geographical Information System (GIS) Portal set up by the Design Office and Patrimony Service (GS/SEM/DOP).  It's a one-stop-shop for all this information and much more.   A screenshot of the GIS Portal. Over the past few days, you might have noticed the new interface called MAPSearch that pops up when you make a building search using the Building and Roads field on the CERN homepage. This is a simplified version of the new GIS web Portal, a project on which the GS Department's Design Office and Patrimony Service has been working since January 2010. "In today's informatics age, we need to respond ever more quickly to increasing numbers of specific user requests," explains Project Leader Youri Robert. This is more than just a new release of an old tool, it's a completely n...

  13. Geographic Names Information System (GNIS) Admin Features

    Department of Homeland Security — The Geographic Names Information System (GNIS) is the Federal standard for geographic nomenclature. The U.S. Geological Survey developed the GNIS for the U.S. Board...

  14. MarFS-Requirements-Design-Configuration-Admin

    Kettering, Brett Michael [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Grider, Gary Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    This document is organized into sections defined by the requirements for a file system that presents a near-POSIX (Portable Operating System Interface) interface to the user, but whose data is stored in whatever form is most efficient for the type of data being stored. After each requirement is defined, the design for meeting it is explained. Finally, there are sections on configuring and administering this file system. More and more, data dominates the computing world. There is a “sea” of data out there in many different formats that needs to be managed and used. “Mar” means “sea” in Spanish. Thus, this product is dubbed MarFS, a file system for a sea of data.

  15. Tools used for hand deburring

    Gillespie, L.K.


    This guide is designed to help in quick identification of those tools most commonly used to deburr hand size or smaller parts. Photographs and textual descriptions are used to provide rapid yet detailed information. The data presented include the Bendix Kansas City Division coded tool number, tool description, tool crib in which the tool can be found, the maximum and minimum inventory requirements, the cost of each tool, and the number of the illustration that shows the tool.

  16. Computed tomography: the details.

    Doerry, Armin Walter


    Computed Tomography (CT) is a well established technique, particularly in medical imaging, but also applied in Synthetic Aperture Radar (SAR) imaging. Basic CT imaging via back-projection is treated in many texts, but often with insufficient detail to appreciate subtleties such as the role of non-uniform sampling densities. Herein are given some details often neglected in many texts.
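    Basic back-projection as described can be sketched in a few lines. The snippet below also weights each view by its local angular spacing, the non-uniform-sampling subtlety the abstract highlights; it is an illustrative sketch (unfiltered, nearest-neighbour), not the report's implementation:

```python
import numpy as np

def back_project(sinogram, angles):
    """Unfiltered back-projection onto an n-by-n grid.

    `sinogram[i]` holds a parallel-ray projection taken at `angles[i]`
    (radians). Each view is weighted by its local angular spacing, so
    non-uniformly sampled angles are handled consistently.
    """
    n = sinogram.shape[1]
    coords = np.arange(n) - (n - 1) / 2.0
    xx, yy = np.meshgrid(coords, coords)
    weights = np.gradient(angles)        # local angular interval per view
    image = np.zeros((n, n))
    for proj, theta, w in zip(sinogram, angles, weights):
        # detector coordinate of every pixel for this view (nearest neighbour)
        t = xx * np.cos(theta) + yy * np.sin(theta) + (n - 1) / 2.0
        idx = np.clip(np.round(t).astype(int), 0, n - 1)
        image += w * proj[idx]
    return image
```

    With uniform angles the weights reduce to a constant, recovering the textbook sum over views; with clustered angles the weighting prevents densely sampled directions from dominating the reconstruction.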

  17. Phonetic Detail in American English

    Ray Freeze


    In the course of teaching general phonetics and phonological analysis in the past few years, I have found some phonetic detail of which some native speakers, as well as non-native speakers, were unaware. This subtle detail will be the focus of this presentation. Some of this detail many of you will already be aware of because of your experience in learning, teaching, and thinking about English. If anything is new to you, I hope you might enjoy hearing about it even if it turns out not to be useful in your work.

  18. Transformative Dynamics in Detailed Planning

    Quitzau, Maj-Britt; Poulsen, Naja; Gustavsson, Ted;

    that the translation process relies heavily on integration of impositions in the detailed plan, although this has clear limitations, since some sustainable strategies are more difficult to impose than others. It also shows how strategic navigation may represent an alternative translation strategy to promote more...... difficult sustainable strategies that address the project design more directly. In conclusion, the paper argues that strategic navigation represents a stronger mediator of change compared to the detailed plan, but that especially timing issues in the coordination between formal planning and design processes...

  19. Review of Ship Structural Details


    4.3 Knee and Beam Brackets; 4.3.1 Brackets for Girders and Deep Webs; 4.3.2 Brackets Connecting Rolled Sections; 4.4 Tripping ... are shell stringers penetrating deep web frames and longitudinal girders penetrating deep transverses. This is not a common detail. ... Detail Type: STANCHION END

  20. Detail in architecture: Between arts

    Dulencin Juraj


    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building but at the same time it enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as materials and technologies required to implement it. The architectural detail design is also part of students' bachelor's thesis, therefore, the realistic nature of their blueprints can be verified in the production process of its physical counterpart. Based on their own documentation the students choose the most suitable manufacturing process whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail fulfills aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its ...

  1. DAGAL: Detailed Anatomy of Galaxies

    Knapen, Johan H.


    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  2. DAGAL: Detailed Anatomy of Galaxies

    Knapen, Johan H


    The current IAU Symposium is closely connected to the EU-funded network DAGAL (Detailed Anatomy of Galaxies), with the final annual network meeting of DAGAL being at the core of this international symposium. In this short paper, we give an overview of DAGAL, its training activities, and some of the scientific advances that have been made under its umbrella.

  3. On Detailing in Contemporary Architecture

    Kristensen, Claus; Kirkegaard, Poul Henning


    ... tactility can blur the meaning of the architecture and turn it into an empty statement. The present paper will outline detailing in contemporary architecture and discuss the issue with respect to architectural quality. Architectural cases considered sublime pieces of architecture will be presented ...

  4. A Generalized Detailed Balance Relation

    Ruelle, David


    Given a system M in a thermal bath we obtain a generalized detailed balance relation for the ratio r = π_τ(K→J)/π_τ(J→K) of the transition probabilities M: J→K and M: K→J in time τ. We assume an active bath, containing solute molecules in metastable states. These molecules may react with M and the transition J→K occurs through different channels α involving different reactions with the bath. We find that r = Σ_α p^α r^α, where p^α is the probability that channel α occurs, and r^α depends on the amount of heat (more precisely enthalpy) released to the bath in channel α.

  5. A generalized detailed balance relation

    Ruelle, David


    Given a system $M$ in a thermal bath we obtain a generalized detailed balance relation for the ratio $r=\pi_\tau(K\to J)/\pi_\tau(J\to K)$ of the transition probabilities $M:J\to K$ and $M:K\to J$ in time $\tau$. We assume an active bath, containing solute molecules in metastable states. These molecules may react with $M$ and the transition $J\to K$ occurs through different channels $\alpha$ involving different reactions with the bath. We find that $r=\sum_\alpha p^\alpha r^\alpha$, where $p^\alpha$ is the probability that channel $\alpha$ occurs, and $r^\alpha$ depends on the amount of heat (more precisely enthalpy) released to the bath in channel $\alpha$.

  6. Google - Security Testing Tool

    Staykov, Georgi


    Using Google as a security testing tool: basic and advanced search techniques with advanced Google search operators. Examples of obtaining control over security cameras, VoIP systems and web servers, and of collecting valuable information such as credit card details and CVV codes, using only Google.

  7. Tools for charged Higgs bosons

    Staal, Oscar


    We review the status of publicly available software tools applicable to charged Higgs physics. A selection of codes is highlighted in more detail, focusing on new developments that have taken place since the previous charged Higgs workshop in 2008. We conclude that phenomenologists now have the tools ready to face the LHC data. A new web page collecting charged Higgs resources is presented. (orig.)

  8. Simulation tools

    Jenni, F


    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the tool the user is already used to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though they are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as properties of tools, several tools are presented. Starting with simplified models ...

  9. Chatter and machine tools

    Stone, Brian


    Focussing on occurrences of unstable vibrations, or Chatter, in machine tools, this book gives important insights into how to eliminate chatter with associated improvements in product quality, surface finish and tool wear. Covering a wide range of machining processes, including turning, drilling, milling and grinding, the author uses his research expertise and practical knowledge of vibration problems to provide solutions supported by experimental evidence of their effectiveness. In addition, this book contains links to supplementary animation programs that help readers to visualise the ideas detailed in the text. Advancing knowledge in chatter avoidance and suggesting areas for new innovations, Chatter and Machine Tools serves as a handbook for those desiring to achieve significant reductions in noise, longer tool and grinding wheel life and improved product finish.

  10. Monte Carlo methods beyond detailed balance

    Schram, Raoul D.; Barkema, Gerard T.


    Monte Carlo algorithms are nearly always based on the concept of detailed balance and ergodicity. In this paper we focus on algorithms that do not satisfy detailed balance. We introduce a general method for designing non-detailed-balance algorithms, starting from a conventional algorithm satisfying detailed balance.
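    The detailed balance condition that such conventional algorithms satisfy, π_i P_ij = π_j P_ji, can be verified numerically. A minimal sketch (not from the paper; the three-state system and its energies are hypothetical) checks it for a standard Metropolis transition matrix:

    ```python
    import numpy as np

    # Toy three-state system with Boltzmann weights (assumed energies, kT = 1)
    E = np.array([0.0, 1.0, 2.0])
    pi = np.exp(-E)
    pi /= pi.sum()                            # stationary distribution

    n = len(E)
    P = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                # Metropolis rule: propose uniformly among the other states,
                # accept with probability min(1, pi_j / pi_i)
                P[i, j] = (1.0 / (n - 1)) * min(1.0, pi[j] / pi[i])
        P[i, i] = 1.0 - P[i].sum()            # leftover probability: stay put

    # Detailed balance: the probability flux pi_i * P_ij is symmetric in (i, j)
    flux = pi[:, None] * P
    assert np.allclose(flux, flux.T)
    print("detailed balance holds; stationary distribution:", np.round(pi, 3))
    ```

    Non-detailed-balance schemes of the kind the paper studies give up this pairwise flux symmetry while still preserving the same stationary distribution globally.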

  11. General purpose MDE tools

    Juan Manuel Cueva Lovelle


    The MDE paradigm promises to release developers from writing code. The basis of this paradigm consists in working at such a level of abstraction that it becomes easier for analysts to detail the project to be undertaken. Using the model described by analysts, software tools will do the rest of the task, generating software that will comply with the customer's defined requirements. The purpose of this study is to compare the general-purpose tools available right now that make it possible to put the principles of this paradigm into practice, aimed at generating a wide variety of applications composed of interactive multimedia and artificial intelligence components.

  12. Nabokov's Details: Making Sense of Irrational Standards


    Vladimir Nabokov's passion for detail is well-known, central to our very idea of the "Nabokovian." Yet Nabokov's most important claims for detail pose a challenge for the reader who would take them seriously. Startlingly extreme and deliberately counterintuitive -- Nabokov called them his "irrational standards" -- these claims push the very limits of reason and belief. Nabokov's critics have tended to treat his more extravagant claims for detail -- including his assertion that the "capacity t...

  13. Displaying of Details in Subvoxel Accuracy

    蔡文立; 陈天洲; et al.


    Under volume segmentation in voxel space, a lot of details, such as fine and thin objects, are ignored. In order to display these details accurately, this paper develops a methodology for volume segmentation in subvoxel space. In the subvoxel space, most of the "bridges" between adjacent layers are broken down. Based on the subvoxel space, an automatic segmentation algorithm that preserves details is discussed. After segmentation, volume data in subvoxel space are reduced to the original voxel space. Thus, details with a width of only one or several voxels are extracted and displayed.

  14. Tool steels

    Højerslev, C.


    ... resistance against abrasive wear, and secondary carbides (if any) increase the resistance against plastic deformation. Tool steels are alloyed with carbide-forming elements (typically vanadium, tungsten, molybdenum and chromium); furthermore, some steel types contain cobalt. Addition of alloying elements ...

  15. Detailed Analysis of Motor Unit Activity

    Nikolic, Mile; Sørensen, John Aasted; Dahl, Kristian


    A system for decomposition of EMG signals into their constituent motor unit potentials and their firing patterns. The aim of the system is detailed analysis of motor unit variability.

  16. Stiilne detail vanalinnas / Jüri Kuuskemaa

    Kuuskemaa, Jüri, 1942-


    The exhibition "Stylish Detail in the Old Town" at the Rotermann salt storage presents details that have shaped Tallinn's silhouette (weather vanes), streetscape (from lock plates and door knockers to niche figures of saints) and interiors (from window handles to stove tiles, floor tiles, room doors, and stucco and stone reliefs). Curator J. Kuuskemaa, designer M. Agabush.

  17. Understanding brains: details, intuition, and big data.

    Eve Marder


    Understanding how the brain works requires a delicate balance between the appreciation of the importance of a multitude of biological details and the ability to see beyond those details to general principles. As technological innovations vastly increase the amount of data we collect, the importance of intuition into how to analyze and treat these data may, paradoxically, become more important.

  18. Management Tools


    Manugistics, Inc. (formerly AVYX, Inc.) has introduced a new programming language for IBM and IBM compatible computers called TREES-pls. It is a resource management tool originating from the space shuttle, that can be used in such applications as scheduling, resource allocation project control, information management, and artificial intelligence. Manugistics, Inc. was looking for a flexible tool that can be applied to many problems with minimal adaptation. Among the non-government markets are aerospace, other manufacturing, transportation, health care, food and beverage and professional services.

  19. High Performance Tools And Technologies

    Collette, M R; Corey, I R; Johnson, J R


    This goal of this project was to evaluate the capability and limits of current scientific simulation development tools and technologies with specific focus on their suitability for use with the next generation of scientific parallel applications and High Performance Computing (HPC) platforms. The opinions expressed in this document are those of the authors, and reflect the authors' current understanding and functionality of the many tools investigated. As a deliverable for this effort, we are presenting this report describing our findings along with an associated spreadsheet outlining current capabilities and characteristics of leading and emerging tools in the high performance computing arena. This first chapter summarizes our findings (which are detailed in the other chapters) and presents our conclusions, remarks, and anticipations for the future. In the second chapter, we detail how various teams in our local high performance community utilize HPC tools and technologies, and mention some common concerns they have about them. In the third chapter, we review the platforms currently or potentially available to utilize these tools and technologies on to help in software development. Subsequent chapters attempt to provide an exhaustive overview of the available parallel software development tools and technologies, including their strong and weak points and future concerns. We categorize them as debuggers, memory checkers, performance analysis tools, communication libraries, data visualization programs, and other parallel development aides. The last chapter contains our closing information. Included with this paper at the end is a table of the discussed development tools and their operational environment.

  20. Frequency Response Analysis Tool

    Etingov, Pavel V. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kosterev, Dmitry [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Dai, T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)


    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.
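    The frequency response measure behind such a tool is, in essence, the MW change during an event divided by the frequency deviation, reported in MW per 0.1 Hz. A hedged sketch in the spirit of the BAL-003-1 A/B-point calculation (the function name and event values are hypothetical, not taken from the FRAT):

    ```python
    def frequency_response(mw_loss, f_a, f_b):
        """Frequency response in MW per 0.1 Hz: resource loss divided by the
        pre-event (A) to post-event (B) frequency deviation."""
        delta_f = f_a - f_b                 # Hz; positive for an under-frequency event
        return mw_loss / (delta_f / 0.1)    # scale the deviation to 0.1 Hz units

    # Hypothetical event: 1,200 MW generation loss, frequency sags 60.000 -> 59.940 Hz
    frm = frequency_response(1200.0, 60.000, 59.940)
    print(round(frm, 1), "MW per 0.1 Hz")   # prints: 2000.0 MW per 0.1 Hz
    ```

    The actual standard computes the A and B values as time-averaged frequency windows around the event; this sketch only illustrates the final ratio.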

  1. Frequency Response Analysis Tool

    Etingov, Pavel V.; Kosterev, Dmitry; Dai, T.


    Frequency response has received a lot of attention in recent years at the national level, which culminated in the development and approval of North American Electricity Reliability Corporation (NERC) BAL-003-1 Frequency Response and Frequency Bias Setting Reliability Standard. This report is prepared to describe the details of the work conducted by Pacific Northwest National Laboratory (PNNL) in collaboration with the Bonneville Power Administration and Western Electricity Coordinating Council (WECC) Joint Synchronized Information Subcommittee (JSIS) to develop a frequency response analysis tool (FRAT). The document provides the details on the methodology and main features of the FRAT. The tool manages the database of under-frequency events and calculates the frequency response baseline. Frequency response calculations are consistent with frequency response measure (FRM) in NERC BAL-003-1 for an interconnection and balancing authority. The FRAT can use both phasor measurement unit (PMU) data, where available, and supervisory control and data acquisition (SCADA) data. The tool is also capable of automatically generating NERC Frequency Response Survey (FRS) forms required by BAL-003-1 Standard.

  2. Post Entitlement Management Information - Detail Database

    Social Security Administration — Contains data that supports the detailed and aggregate receipt, pending and clearance data, as well as other strategic and tactical MI for many Title II and Title...

  3. Template Assembly for Detailed Urban Reconstruction

    Nan, Liangliang


    We propose a new framework to reconstruct building details by automatically assembling 3D templates on coarse textured building models. In a preprocessing step, we generate an initial coarse model to approximate a point cloud computed using Structure from Motion and Multi View Stereo, and we model a set of 3D templates of facade details. Next, we optimize the initial coarse model to enforce consistency between geometry and appearance (texture images). Then, building details are reconstructed by assembling templates on the textured faces of the coarse model. The 3D templates are automatically chosen and located by our optimization-based template assembly algorithm that balances image matching and structural regularity. In the results, we demonstrate how our framework can enrich the details of coarse models using various data sets.

  4. Online Citation and Reference Management Tools

    Das, Anup-Kumar


    This Unit is on online citation and reference management tools. The tools discussed are Mendeley, CiteULike, Zotero, Google Scholar Library, and EndNote Basic. The features of all the management tools have been discussed with figures, tables, and text boxes. This Unit discusses in detail aspects of different Online Citation and Reference Management Tools. Published in the Open Access for Researchers > Module 4: Research Evaluation Metrics > Unit 4: Online Citation and Reference Management T...

  5. ANT Advanced Neural Tool

    Labrador, I.; Carrasco, R.; Martinez, L.


    This paper describes a practical introduction to the use of Artificial Neural Networks. Artificial neural nets are often used as an alternative to the traditional symbolic manipulation and first-order logic used in Artificial Intelligence, due to the high degree of difficulty of solving problems that cannot be handled by programmers using algorithmic strategies. As a particular case of neural net, a Multilayer Perceptron developed in the C language on the OS-9 real-time operating system is presented. A detailed description of the program structure and practical use is included. Finally, several application examples that have been treated with the tool are presented, along with some suggestions about hardware implementations. (Author) 15 refs.

  6. RSP Tooling Technology



    RSP Tooling™ is a spray-forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high-volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  7. Making detailed predictions makes (some) predictions worse

    Kelly, Theresa F.

    In this paper, we investigate whether making detailed predictions about an event makes other predictions worse. Across 19 experiments, 10,895 participants, and 415,960 predictions about 724 professional sports games, we find that people who made detailed predictions about sporting events (e.g., how many hits each baseball team would get) made worse predictions about more general outcomes (e.g., which team would win). We rule out that this effect is caused by inattention or fatigue, thinking too hard, or a differential reliance on holistic information about the teams. Instead, we find that thinking about game-relevant details before predicting winning teams causes people to give less weight to predictive information, presumably because predicting details makes information that is relatively useless for predicting the winning team more readily accessible in memory and therefore incorporated into forecasts. Furthermore, we show that this differential use of information can be used to predict what kinds of games will and will not be susceptible to the negative effect of making detailed predictions.

  8. Mathematical tools

    Capozziello, Salvatore; Faraoni, Valerio

    In this chapter we discuss certain mathematical tools which are used extensively in the following chapters. Some of these concepts and methods are part of the standard baggage taught in undergraduate and graduate courses, while others enter the tool-box of more advanced researchers. These mathematical methods are very useful in formulating ETGs and in finding analytical solutions.We begin by studying conformal transformations, which allow for different representations of scalar-tensor and f(R) theories of gravity, in addition to being useful in GR. We continue by discussing variational principles in GR, which are the basis for presenting ETGs in the following chapters. We close the chapter with a discussion of Noether symmetries, which are used elsewhere in this book to obtain analytical solutions.

  9. Fatigue-Prone Details in Steel Bridges

    Mohsen Heshmati


    This paper reviews the results of a comprehensive investigation including more than 100 fatigue damage cases reported for steel and composite bridges. The damage cases are categorized according to types of detail. The mechanisms behind fatigue damage in each category are identified and studied. It was found that more than 90% of all reported damage cases are of deformation-induced type and generated by some kind of unintentional or otherwise overlooked interaction between different load-carrying members or systems in the bridge. Poor detailing, with unstiffened gaps and abrupt changes in stiffness at the connections between different members, was also found to contribute to fatigue cracking in many details.

  10. Memory for details with self-referencing.

    Serbun, Sarah J; Shih, Joanne Y; Gutchess, Angela H


    Self-referencing benefits item memory, but little is known about the ways in which referencing the self affects memory for details. Experiment 1 assessed whether the effects of self-referencing operate only at the item, or general, level or whether they also enhance memory for specific visual details of objects. Participants incidentally encoded objects by making judgements in reference to the self, a close other (one's mother), or a familiar other (Bill Clinton). Results indicate that referencing the self or a close other enhances both specific and general memory. Experiments 2 and 3 assessed verbal memory for source in a task that relied on distinguishing between different mental operations (internal sources). The results indicate that self-referencing disproportionately enhances source memory, relative to conditions referencing other people, semantic, or perceptual information. We conclude that self-referencing not only enhances specific memory for both visual and verbal information, but can also disproportionately improve memory for specific internal source details.

  11. Detail in architecture: Between arts & crafts

    Dulencin, Juraj


    Architectural detail represents an important part of architecture. Not only can it be used as an identifier of a specific building but at the same time it enhances the experience of the realized project. Within it lie the signs of a great architect and clues to understanding his or her way of thinking. It is therefore the central topic of a seminar offered to architecture students at the Brno University of Technology. During the course of the semester-long class the students acquaint themselves with atypical architectural details of domestic and international architects by learning to read them, understand them and subsequently draw them by creating architectural blueprints. In other words, by general analysis of a detail the students learn the theoretical thinking of its architect who, depending on the nature of the design, had to incorporate a variety of techniques and crafts. Students apply this analytical part to their own architectural detail design. The methodology of the seminar consists of experiential learning by project management and is complemented by a series of lectures discussing a diversity of details as well as materials and technologies required to implement it. The architectural detail design is also part of students' bachelor's thesis, therefore, the realistic nature of their blueprints can be verified in the production process of its physical counterpart. Based on their own documentation the students choose the most suitable manufacturing process whether it is supplied by a specific technology or a craftsman. Students actively participate in the production and correct their design proposals in real scale with the actual material. A student, as a future architect, stands somewhere between a client and an artisan, materializes his or her idea and adjusts the manufacturing process so that the final detail fulfills aesthetic consistency and is in harmony with its initial concept. One of the very important aspects of the design is its economic cost, an ...

  12. Local address and emergency contact details


    The HR Department would like to remind members of the personnel that they are responsible for ensuring that their personal data concerning local address and preferred emergency contact details remains valid and up-to-date.   Both are easily accessible via the links below: Local address:   Emergency contacts:   Please take a few minutes to check your details and modify if necessary. Thank you in advance. HR Department Head Office

  13. Detailed chemical kinetic oxidation mechanism for a biodiesel surrogate

    Herbinet, O; Pitz, W J; Westbrook, C K


    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of methyl decanoate, a surrogate for biodiesel fuels. This model has been built by following the rules established by Curran et al. for the oxidation of n-heptane and it includes all the reactions known to be pertinent to both low and high temperatures. Computed results have been compared with methyl decanoate experiments in an engine and oxidation of rapeseed oil methyl esters in a jet stirred reactor. An important feature of this mechanism is its ability to reproduce the early formation of carbon dioxide that is unique to biofuels and due to the presence of the ester group in the reactant. The model also predicts ignition delay times and OH profiles very close to observed values in shock tube experiments fueled by n-decane. These model capabilities indicate that large n-alkanes can be good surrogates for large methyl esters and biodiesel fuels to predict overall reactivity, but some kinetic details, including early CO2 production from biodiesel fuels, can be predicted only by a detailed kinetic mechanism for a true methyl ester fuel. The present methyl decanoate mechanism provides a realistic kinetic tool for simulation of biodiesel fuels.

  14. Detailed chemical kinetic oxidation mechanism for a biodiesel surrogate

    Herbinet, O; Pitz, W J; Westbrook, C K


    A detailed chemical kinetic mechanism has been developed and used to study the oxidation of methyl decanoate, a surrogate for biodiesel fuels. This model has been built by following the rules established by Curran et al. for the oxidation of n-heptane and it includes all the reactions known to be pertinent to both low and high temperatures. Computed results have been compared with methyl decanoate experiments in an engine and oxidation of rapeseed oil methyl esters in a jet stirred reactor. An important feature of this mechanism is its ability to reproduce the early formation of carbon dioxide that is unique to biofuels and due to the presence of the ester group in the reactant. The model also predicts ignition delay times and OH profiles very close to observed values in shock tube experiments fueled by n-decane. These model capabilities indicate that large n-alkanes can be good surrogates for large methyl esters and biodiesel fuels to predict overall reactivity, but some kinetic details, including early CO2 production from biodiesel fuels, can be predicted only by a detailed kinetic mechanism for a true methyl ester fuel. The present methyl decanoate mechanism provides a realistic kinetic tool for simulation of biodiesel fuels.
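    Detailed mechanisms of this kind are assembled from elementary reactions whose rate constants follow the Arrhenius form k = A·exp(−Ea/RT). A generic sketch (the pre-exponential factor and activation energy below are hypothetical, not coefficients from the methyl decanoate mechanism):

    ```python
    import math

    R = 8.314  # gas constant, J/(mol·K)

    def arrhenius(A, Ea, T):
        """Elementary-reaction rate constant k = A * exp(-Ea / (R*T))."""
        return A * math.exp(-Ea / (R * T))

    # Hypothetical reaction: A = 1e13 1/s, Ea = 150 kJ/mol
    k_600 = arrhenius(A=1e13, Ea=150e3, T=600.0)    # low temperature
    k_1200 = arrhenius(A=1e13, Ea=150e3, T=1200.0)  # high temperature

    # Reactivity rises steeply with temperature, which is why separate low- and
    # high-temperature reaction classes are needed in a detailed mechanism
    assert k_1200 > k_600
    ```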

  15. Surface Detail Capturing for Realistic Facial Animation

    Pei-Hsuan Tu; I-Chen Lin; Jeng-Sheng Yeh; Rung-Huei Liang; Ming Ouhyoung


    In this paper, a facial animation system is proposed for capturing both geometrical information and illumination changes of surface details, called expression details, from video clips simultaneously, and the captured data can be widely applied to different 2D face images and 3D face models. While tracking the geometric data, we record the expression details by ratio images. For 2D facial animation synthesis, these ratio images are used to generate dynamic textures. Because a ratio image is obtained by dividing the colors of an expressive face by those of a neutral face, pixels with ratio value smaller than one are where a wrinkle or crease appears. Therefore, the gradients of the ratio value at each pixel in the ratio images are regarded as changes of the face surface, and the original normals on the surface can be adjusted according to these gradients. Based on this idea, we can convert the ratio images into a sequence of normal maps and then apply them to animated 3D model rendering. With the expression detail mapping, the resulting facial animations are more life-like and more expressive.
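    The core ratio-image idea is easy to sketch: divide an expressive frame by the neutral frame, flag pixels where the ratio drops below one, and take gradients of the ratio as normal perturbations. A minimal sketch with synthetic grayscale frames (the arrays and the darkened patch are hypothetical stand-ins for real video frames):

    ```python
    import numpy as np

    # Hypothetical grayscale frames as float arrays in [0, 1]
    neutral = np.full((4, 4), 0.8)
    expressive = neutral.copy()
    expressive[1:3, 1:3] = 0.4              # darker 2x2 patch: a crease appears

    eps = 1e-6
    ratio = expressive / np.maximum(neutral, eps)   # the ratio image

    # Pixels with ratio < 1 are where a wrinkle or crease darkens the face
    crease_mask = ratio < 1.0
    print(crease_mask.sum(), "crease pixels")        # prints: 4 crease pixels

    # Gradients of the ratio image approximate perturbations of surface normals,
    # which can be baked into a normal map for 3D rendering
    gy, gx = np.gradient(ratio)
    ```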

  16. New details emerge from the Einstein files

    Overbye, D


    For many years the FBI spied on Einstein. New details of this surveillance are emerging in "The Einstein File: J. Edgar Hoover's Secret War Against the World's Most Famous Scientist," by Fred Jerome, who sued the government with the help of the Public Citizen Litigation Group to obtain a less censored version of the file (1 page).

  17. Detailed Balancing and the Structure of Proton

    Zhang, Y Z


    The proton is treated as an ensemble of Fock states. Using the detailed balancing principle, the ensemble density matrix on the basis of the number of partons is calculated, and some information about intrinsic gluons and intrinsic sea quarks is gained without introducing any parameters.

  18. Constructing Overview + Detail Dendrogram-Matrix Views

    Chen, Jin; MacEachren, Alan M.; Peuquet, Donna J.


    A dendrogram that visualizes a clustering hierarchy is often integrated with a reorderable matrix for pattern identification. The method is widely used in many research fields including biology, geography, statistics, and data mining. However, most dendrograms do not scale up well, particularly with respect to problems of graphical and cognitive information overload. This research proposes a strategy that links an overview dendrogram and a detail-view dendrogram, each integrated with a reorderable matrix. The overview displays only a user-controlled, limited number of nodes that represent the “skeleton” of a hierarchy. The detail view displays the sub-tree represented by a selected meta-node in the overview. The research presented here focuses on constructing a concise overview dendrogram and its coordination with a detail view. The proposed method has the following benefits: dramatic alleviation of information overload, enhanced scalability and data abstraction quality on the dendrogram, and support for data exploration at arbitrary levels of detail. The contributions of the paper include a new metric to measure the “importance” of nodes in a dendrogram; a method to construct the concise overview dendrogram from the dynamically identified important nodes; and a measure for evaluating the data abstraction quality of dendrograms. We evaluate and compare the proposed method to related existing methods, and demonstrate how it can help users find interesting patterns through a case study on county-level U.S. cervical cancer mortality and demographic data. PMID:19834151
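The overview-construction idea can be sketched with a deliberately simple importance proxy. Note that the paper defines its own importance metric; here the leaf count of a subtree merely stands in for it, and the nested-tuple tree encoding is hypothetical:

```python
def leaf_count(tree):
    # A leaf is a string label; internal nodes are tuples of subtrees.
    if isinstance(tree, str):
        return 1
    return sum(leaf_count(c) for c in tree)

def important_nodes(tree, k):
    # Collect all internal nodes and rank them by a simple importance
    # proxy (number of leaves underneath); keep the top k for the overview.
    nodes = []
    def walk(t):
        if isinstance(t, str):
            return
        nodes.append(t)
        for c in t:
            walk(c)
    walk(tree)
    nodes.sort(key=leaf_count, reverse=True)
    return nodes[:k]

tree = (("A", "B"), (("C", "D"), "E"))
overview = important_nodes(tree, 2)   # the root plus its larger subtree
```

A real overview would also need the coordination and layout machinery the paper describes; this only shows the node-selection step.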

  19. Extended Testability Analysis Tool

    Melcher, Kevin; Maul, William A.; Fulton, Christopher


    The Extended Testability Analysis (ETA) Tool is a software application that supports fault management (FM) by performing testability analyses on the fault propagation model of a given system. Fault management includes the prevention of faults through robust design margins and quality assurance methods, or the mitigation of system failures. Fault management requires an understanding of the system design and operation, potential failure mechanisms within the system, and the propagation of those potential failures through the system. The purpose of the ETA Tool software is to process the testability analysis results from a commercial software program called TEAMS Designer in order to provide a detailed set of diagnostic assessment reports. The ETA Tool is a command-line process with several user-selectable report output options. The ETA Tool also extends the COTS testability analysis and enables variation studies with sensor sensitivity impacts on system diagnostics and component isolation using a single testability output. The ETA Tool can also provide extended analyses from a single set of testability output files. The following analysis reports are available to the user: (1) the Detectability Report provides a breakdown of how each tested failure mode was detected, (2) the Test Utilization Report identifies all the failure modes that each test detects, (3) the Failure Mode Isolation Report demonstrates the system's ability to discriminate between failure modes, (4) the Component Isolation Report demonstrates the system's ability to discriminate between failure modes relative to the components containing the failure modes, (5) the Sensor Sensitivity Analysis Report shows the diagnostic impact due to loss of sensor information, and (6) the Effect Mapping Report identifies failure modes that result in specified system-level effects.

  20. Thirty Meter Telescope Detailed Science Case: 2015

    Skidmore, Warren; Fukugawa, Misato; Goswami, Aruna; Hao, Lei; Jewitt, David; Laughlin, Greg; Steidel, Charles; Hickson, Paul; Simard, Luc; Schöck, Matthias; Treu, Tommaso; Cohen, Judith; Anupama, G C; Dickinson, Mark; Harrison, Fiona; Kodama, Tadayuki; Lu, Jessica R; Macintosh, Bruce; Malkan, Matt; Mao, Shude; Narita, Norio; Sekiguchi, Tomohiko; Subramaniam, Annapurni; Tanaka, Masaomi; Tian, Feng; A'Hearn, Michael; Akiyama, Masayuki; Ali, Babar; Aoki, Wako; Bagchi, Manjari; Barth, Aaron; Bhalerao, Varun; Bradac, Marusa; Bullock, James; Burgasser, Adam J; Chapman, Scott; Chary, Ranga-Ram; Chiba, Masashi; Cooray, Asantha; Crossfield, Ian; Currie, Thayne; Das, Mousumi; Dewangan, G C; de Grijs, Richard; Do, Tuan; Dong, Subo; Evslin, Jarah; Fang, Taotao; Fang, Xuan; Fassnacht, Christopher; Fletcher, Leigh; Gaidos, Eric; Gal, Roy; Ghez, Andrea; Giavalisco, Mauro; Grady, Carol A; Greathouse, Thomas; Gogoi, Rupjyoti; Guhathakurta, Puragra; Ho, Luis; Hasan, Priya; Herczeg, Gregory J; Honda, Mitsuhiko; Imanishi, Masa; Inami, Hanae; Iye, Masanori; Kamath, U S; Kane, Stephen; Kashikawa, Nobunari; Kasliwal, Mansi; Kasliwal, Vishal; Kirby, Evan; Konopacky, Quinn M; Lepine, Sebastien; Li, Di; Li, Jianyang; Liu, Junjun; Liu, Michael C; Lopez-Rodriguez, Enrique; Lotz, Jennifer; Lubin, Philip; Macri, Lucas; Maeda, Keiichi; Marchis, Franck; Marois, Christian; Marscher, Alan; Martin, Crystal; Matsuo, Taro; Max, Claire; McConnachie, Alan; McGough, Stacy; Melis, Carl; Meyer, Leo; Mumma, Michael; Muto, Takayuki; Nagao, Tohru; Najita, Joan R; Navarro, Julio; Pierce, Michael; Prochaska, Jason X; Oguri, Masamune; Ojha, Devendra K; Okamoto, Yoshiko K; Orton, Glenn; Otarola, Angel; Ouchi, Masami; Packham, Chris; Padgett, Deborah L; Pandey, Shashi Bhushan; Pilachowski, Catherine; Pontoppidan, Klaus M; Primack, Joel; Puthiyaveettil, Shalima; Ramirez-Ruiz, Enrico; Reddy, Naveen; Rich, Michael; Richter, Matthew J; Schombert, James; Sen, Anjan Ananda; Shi, Jianrong; Sheth, Kartik; Srianand, R; Tan, Jonathan C; Tanaka, Masayuki; Tanner, Angelle; Tominaga, Nozomu; Tytler, David; U, Vivian; Wang, Lingzhi; Wang, Xiaofeng; Wang, Yiping; Wilson, Gillian; Wright, Shelley; Wu, Chao; Wu, Xufeng; Xu, Renxin; Yamada, Toru; Yang, Bin; Zhao, Gongbo; Zhao, Hongsheng


    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. With this capability, TMT's science agenda fills all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the Universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ),...

  1. Detailed Electrochemical Characterisation of Large SOFC Stacks

    Mosbæk, Rasmus Rode; Hjelm, Johan; Barfod, R.


    …application of advanced methods for detailed electrochemical characterisation during operation. An operating stack is subject to steep compositional gradients in the gaseous reactant streams, and significant temperature gradients across each cell and across the stack, which makes it a complex system… Fuel Cell A/S was characterised in detail using electrochemical impedance spectroscopy. An investigation of the optimal geometrical placement of the current probes and voltage probes was carried out in order to minimise measurement errors caused by stray impedances. Unwanted stray impedances… are particularly problematic at high frequencies. Stray impedances may be caused by mutual inductance and stray capacitance in the geometrical set-up and do not describe the fuel cell. Three different stack geometries were investigated by electrochemical impedance spectroscopy. Impedance measurements were carried…

  2. Detailed gravimetric geoid computations in North America

    Marsh, J. G.; Chang, E. S.


    A detailed gravimetric geoid has been computed for the Eastern United States and the Northwestern Atlantic Ocean by combining the Goddard Space Flight Center GEM-8 earth gravity model with the available 15 x 15 arcmin and 1 x 1 deg mean free air surface gravity observations. The short wavelength undulations were computed by applying Stokes' formula to the 15 x 15 arcmin and 1 x 1 deg surface data. The long wavelength undulations were provided by the GEM-8 model. The gravimetric geoid has been compared with Geoceiver derived and astrogeodetically determined geoid heights in the United States and the rms agreement is on the order of 1.5 meters. Excellent agreement in shape has been found between the detailed geoid and geoidal profiles derived from GEOS-III altimeter data in the Northwest Atlantic Ocean.
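The long/short-wavelength combination described above is the standard remove-restore use of Stokes' integral; in conventional notation (a sketch, not necessarily the authors' exact formulation):

```latex
N(\varphi,\lambda) \;=\; N_{\mathrm{GEM}}(\varphi,\lambda)
\;+\; \frac{R}{4\pi\gamma}\iint_{\sigma} \Delta g_{\mathrm{res}}\, S(\psi)\, d\sigma
```

where \(N_{\mathrm{GEM}}\) is the long-wavelength undulation from the GEM-8 model, \(\Delta g_{\mathrm{res}}\) the mean free-air anomalies with the GEM-8 contribution removed, \(R\) the mean Earth radius, \(\gamma\) normal gravity, and \(S(\psi)\) the Stokes function of spherical distance \(\psi\).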

  3. Detailed spectral analysis of decellularized skin implants

    Timchenko, E. V.; Timchenko, P. E.; Volova, L. T.; Dolgushkin, D. A.; Shalkovsky, P. Y.; Pershutkina, S. V.


    The results of a detailed analysis of donor skin implants using the Raman spectroscopy method are presented. The Fourier-deconvolution method was used to separate overlapping spectral lines and to improve their informativeness. Based on the processed spectra, coefficients were introduced that represent changes in the relative concentrations of implant components, which determine the quality of the implants. It was established that the Raman spectroscopy method can be used in the assessment of skin implants.

  4. Structural concepts and details for seismic design


    This manual discusses building and building component behavior during earthquakes, and provides suggested details for seismic resistance which have been shown by experience to provide adequate performance during earthquakes. Special design and construction practices are also described which, although they might be common in some high-seismic regions, may not be common in low and moderate seismic-hazard regions of the United States. Special attention is given to describing the level of detailing appropriate for each seismic region. The UBC seismic criteria for all seismic zones are carefully examined, and many examples of connection details are given. The general scope of discussion is limited to materials and construction types common to Department of Energy (DOE) sites. Although the manual is primarily written for professional engineers engaged in performing seismic-resistant design for DOE facilities, the first two chapters, plus the introductory sections of succeeding chapters, contain descriptions which are also directed toward project engineers who authorize, review, or supervise the design and construction of DOE facilities. 88 refs., 188 figs.

  5. Detailed weather data generator for building simulations

    Adelard, L; Garde, F; Gatina, J -C


    Thermal building simulation software needs meteorological files for thermal comfort and energy evaluation studies. Few tools can make significant meteorological data available, such as generated typical years, representative days, or artificial meteorological databases. This paper presents a new software package, RUNEOLE, used to provide weather data for building applications with a method suited to all kinds of climates. RUNEOLE associates three modules for the description, modelling and generation of weather data. The statistical description of an existing meteorological database makes typical representative days available and leads to the creation of model libraries. The generation module produces sequences that do not exist in the database. The software aims to be usable by researchers and designers through interactivity, ease of use and easy communication. The conceptual basis of this tool is presented, and we propose two examples of applications in building physics for tropical hu...
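As a minimal sketch of generating "non-existing sequences", a first-order autoregressive model can synthesize plausible daily temperatures. RUNEOLE's actual models are not documented here, and all parameters below are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

# AR(1) sketch of synthetic daily mean temperature (hypothetical parameters
# for a warm climate): mean level, persistence, and day-to-day noise.
mu, phi, sigma, ndays = 24.0, 0.8, 1.5, 365

T = np.empty(ndays)
T[0] = mu
for t in range(1, ndays):
    # tomorrow = mean + persistence * today's anomaly + random shock
    T[t] = mu + phi * (T[t - 1] - mu) + sigma * rng.standard_normal()
```

A real generator would fit `mu`, `phi` and `sigma` (and cross-correlations with humidity, radiation, wind) from the measured database described in the abstract.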

  6. Molecular tools and bumble bees: revealing hidden details of ecology and evolution in a model system

    Bumble bees are a longstanding model system for studies on behavior, ecology, and evolution, due to their well-studied social lifestyle, invaluable roles as both wild and managed pollinators, and their ubiquity and diversity across temperate ecosystems. Yet despite their importance, many aspects of ...

  7. A Clinically Useful Tool to Determine an Effective Snellen Fraction: Details


    …of arc is not very far from reality. Prior to the invention of the telescope, astronomers such as Tycho Brahe (1546–1601) (25) and Johannes… and Dual Tasking: The Influence of Refractive Blur. Optom Vis Sci 2003, 44, 2885–2891. 25. Manos, H. Tycho Brahe's Stjerneborg. The Physics…

  8. Differential scanning calorimetry: An invaluable tool for a detailed thermodynamic characterization of macromolecules and their interactions

    Michael H Chiu


    Differential scanning calorimetry (DSC) is a highly sensitive technique for studying the thermotropic properties of many different biological macromolecules and extracts. Since its early development, DSC has been applied in the pharmaceutical field to excipient studies and DNA drugs. More recently, attention has turned to lipid-based drug delivery systems and drug interactions with biomimetic membranes. Highly reproducible phase transitions have been used to determine values such as the type of binding interaction, purity, stability, and release from a drug delivery mechanism. This review focuses on the use of DSC for biochemical and pharmaceutical applications.
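As a toy illustration of the kind of quantity DSC yields, the calorimetric enthalpy is the area under the baseline-corrected excess heat-capacity peak; the Gaussian peak shape and all parameter values below are invented:

```python
import numpy as np

# Synthetic excess heat-capacity curve for a two-state transition;
# the midpoint Tm, enthalpy dH, and peak width are illustration values.
T = np.linspace(300.0, 360.0, 601)               # temperature, K
Tm, dH, width = 330.0, 400.0, 4.0                # K, kJ/mol, K
cp_excess = dH / (width * np.sqrt(2.0 * np.pi)) * np.exp(-0.5 * ((T - Tm) / width) ** 2)

# Calorimetric enthalpy = area under the baseline-corrected peak
# (trapezoidal integration over temperature).
dH_cal = float(np.sum(0.5 * (cp_excess[:-1] + cp_excess[1:]) * np.diff(T)))
Tm_obs = float(T[np.argmax(cp_excess)])          # observed transition midpoint
```

In practice the measured curve must first be baseline-corrected and normalized per mole before the integration recovers the enthalpy.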

  9. Development of a Procedure to Apply Detailed Chemical Kinetic Mechanisms to CFD Simulations as Post Processing

    Skjøth-Rasmussen, Martin Skov; Glarborg, Peter; Jensen, Anker;


    It is desired to make detailed chemical kinetic mechanisms applicable to the complex geometries of practical combustion devices simulated with computational fluid dynamics tools. This work presents a novel general approach to combining computational fluid dynamics and a detailed chemical kinetic mechanism. It involves post-processing of data extracted from computational fluid dynamics simulations. Application of this approach successfully describes combustion chemistry in a standard swirl burner, the so-called Harwell furnace. Nevertheless, it needs validation against more complex combustion models…
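The post-processing idea can be caricatured in a few lines: pull temperatures and residence times along a streamline out of a (here, invented) CFD solution, then integrate chemistry over them after the flow solve. A one-step Arrhenius rate stands in for a detailed mechanism:

```python
import numpy as np

# Hypothetical streamline data extracted from a converged CFD solution:
# cell temperatures [K] and residence time per cell [s] along the path.
T_cells  = np.array([900.0, 1100.0, 1400.0, 1600.0, 1500.0])
dt_cells = np.array([2e-3, 2e-3, 2e-3, 2e-3, 2e-3])

# Toy one-step Arrhenius model standing in for a detailed mechanism
A, Ea, R = 1.0e8, 1.5e5, 8.314        # 1/s, J/mol, J/(mol K)

Y = 1.0                                # normalized fuel mass fraction
for T, dt in zip(T_cells, dt_cells):
    k = A * np.exp(-Ea / (R * T))      # temperature-dependent rate constant
    Y *= np.exp(-k * dt)               # exact first-order decay over the cell
```

A real post-processor would integrate hundreds of species with a stiff ODE solver per cell, which is exactly why decoupling it from the CFD solve pays off.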

  10. Broken detailed balance reveals stress heterogeneity in active matter

    Gladrow, J; MacKintosh, F C; Schmidt, C F; Broedersz, C P


    Myosin motor proteins drive vigorous steady-state fluctuations in the actin cytoskeleton of cells. Endogenous embedded semiflexible filaments such as microtubules, or added filaments such as single-walled carbon nanotubes are used as novel tools to non-invasively track equilibrium and non-equilibrium fluctuations in such biopolymer networks. Here we analytically calculate shape fluctuations of semiflexible probe filaments in a viscoelastic environment, driven out of equilibrium by motor activity. Transverse bending fluctuations of the probe filaments can be decomposed into dynamic normal modes. We find that these modes no longer evolve independently under non-equilibrium driving. This effective mode coupling results in non-zero circulatory currents in a conformational phase space, reflecting a violation of detailed balance. We present predictions for the characteristic frequencies associated with these currents and investigate how the temporal signatures of motor activity determine mode correlations, which we...
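One way to make the "circulatory currents" criterion concrete is to estimate the mean enclosed-area rate of a two-mode trajectory. This synthetic example is not the authors' analysis; the signals and all parameters are invented:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two normal-mode amplitudes: a rotating (hence non-equilibrium) signal
# plus measurement noise. An equilibrium trajectory would show no net rotation.
t = np.linspace(0.0, 50.0, 5001)
a1 = np.cos(t) + 0.1 * rng.standard_normal(t.size)
a2 = np.sin(t) + 0.1 * rng.standard_normal(t.size)

# Mean rate at which the trajectory encloses area in the (a1, a2) plane;
# a value significantly different from zero indicates circulating currents,
# i.e. a violation of detailed balance.
area_rate = float(np.mean(a1[:-1] * np.diff(a2) - a2[:-1] * np.diff(a1))
                  / np.mean(np.diff(t)))
```

For the noiseless unit circle this estimator returns exactly the angular frequency; here it stays close to 1 despite the added noise.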

  11. Detailed electromagnetic simulation for the structural color of butterfly wings.

    Lee, R Todd; Smith, Glenn S


    Many species of butterflies exhibit interesting optical phenomena due to structural color. The physical reason for this color is subwavelength features on the surface of a single scale. The exposed surface of a scale is covered with a ridge structure. The fully three-dimensional, periodic, finite-difference time-domain method is used to create a detailed electromagnetic model of a generic ridge. A novel method for presenting the three-dimensional observed color pattern is developed. Using these tools, the change in color that is a result of varying individual features of the scale is explored. Computational models are developed that are similar to three butterflies: Morpho rhetenor, Troides magellanus, and Ancyluris meliboeus.
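The paper's fully 3-D periodic FDTD model is far beyond a snippet, but the core Yee update it rests on can be sketched in 1-D; the grid size, Courant number, and Gaussian source below are arbitrary illustration choices:

```python
import numpy as np

# Minimal 1-D FDTD (Yee scheme) in vacuum: the same leapfrog E/H update
# that, in 3-D periodic form, models scattering from the ridge structure.
n, steps = 200, 150
Ez = np.zeros(n)          # electric field on integer grid points
Hy = np.zeros(n - 1)      # magnetic field on half-integer points
c = 0.5                   # Courant number dt*c/dx; stable for <= 1 in 1-D

for t in range(steps):
    Hy += c * np.diff(Ez)                            # update H from curl E
    Ez[1:-1] += c * np.diff(Hy)                      # update E from curl H
    Ez[n // 2] += np.exp(-((t - 30) / 10.0) ** 2)    # soft Gaussian source

# Two pulses now travel outward from the source toward the grid ends.
```

The 3-D version replaces the two 1-D updates with the six coupled Yee updates, adds periodic boundaries for the ridge lattice, and material dispersion for the chitin.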

  12. Detailed modeling of mountain wave PSCs

    S. Fueglistaler


    Polar stratospheric clouds (PSCs) play a key role in polar ozone depletion. In the Arctic, PSCs can occur on the mesoscale due to orographically induced gravity waves. Here we present a detailed study of a mountain wave PSC event on 25-27 January 2000 over Scandinavia. The mountain wave PSCs were intensively observed by in-situ and remote-sensing techniques during the second phase of the SOLVE/THESEO-2000 Arctic campaign. We use these excellent data of PSC observations on 3 successive days to analyze the PSCs and to perform a detailed comparison with modeled clouds. We simulated the 3-dimensional PSC structure on all 3 days with a mesoscale numerical weather prediction (NWP) model and a microphysical box model (using the best available nucleation rates for ice and nitric acid trihydrate particles). We show that the combined mesoscale/microphysical model is capable of reproducing the PSC measurements within the uncertainty of data interpretation with respect to spatial dimensions, temporal development and microphysical properties, without manipulating temperatures or using other tuning parameters. In contrast, microphysical modeling based upon coarser-scale global NWP data, e.g. current ECMWF analysis data, cannot reproduce observations, in particular the occurrence of ice and nitric acid trihydrate clouds. Combined mesoscale/microphysical modeling may be used for detailed a posteriori PSC analysis and for future Arctic campaign flight and mission planning. The fact that remote sensing alone cannot further constrain model results, due to uncertainties in the interpretation of measurements, underlines the need for synchronous in-situ PSC observations in campaigns.

  13. Detailed measurement on a HESCO diffuser

    Jensen, Rasmus Lund; Holm, Dorte; Nielsen, Peter V.


    This paper focuses on measuring the inlet velocity from a HESCO diffuser used in the IEA Annex 20 work as a function of the volume flow it provides. The aim of the present work is to establish a relation between the inlet velocity, the effective area and the airflow. This is important because the inlet velocity is a very important boundary condition both in CFD calculations and in general flow measurements. If only the volume flow and the geometrical area are used, a relatively large error in the inlet velocity may result. From the detailed measurements it was possible to establish an expression…
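The size of the error mentioned above is easy to illustrate with invented numbers (the paper's measured values are not reproduced here):

```python
# Hypothetical diffuser numbers, for illustration only
q_v = 0.0315                  # volume flow, m^3/s
a_geom = 0.0355               # geometrical opening area, m^2
a_eff = 0.55 * a_geom         # effective (vena contracta) area fraction, assumed

u0_naive = q_v / a_geom       # inlet velocity from the geometrical area
u0 = q_v / a_eff              # inlet velocity from the effective area

# Relative underestimation of the inlet velocity if a_geom is used
error = (u0 - u0_naive) / u0
```

With an assumed effective-area fraction of 0.55, the geometrical-area estimate underpredicts the inlet velocity by 45%, which is exactly the kind of boundary-condition error the abstract warns about.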

  14. A detailed phylogeny for the Methanomicrobiales

    Rouviere, P.; Mandelco, L.; Winker, S.; Woese, C. R.


    The small subunit rRNA sequence of twenty archaea, members of the Methanomicrobiales, permits a detailed phylogenetic tree to be inferred for the group. The tree confirms earlier studies, based on far fewer sequences, in showing the group to be divided into two major clusters, temporarily designated the "methanosarcina" group and the "methanogenium" group. The tree also defines phylogenetic relationships within these two groups, which in some cases do not agree with the phylogenetic relationships implied by current taxonomic names--a problem most acute for the genus Methanogenium and its relatives. The present phylogenetic characterization provides the basis for a consistent taxonomic restructuring of this major methanogenic taxon.
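Distance-based tree inference of the kind used on rRNA data can be sketched with a toy UPGMA clustering; the labels and distance matrix below are invented, and real phylogenetic analyses use more sophisticated methods:

```python
import numpy as np

# Toy pairwise-distance matrix between four hypothetical taxa
labels = ["A", "B", "C", "D"]
D0 = np.array([[0., 2., 6., 6.],
               [2., 0., 6., 6.],
               [6., 6., 0., 4.],
               [6., 6., 4., 0.]])

def upgma(D, labels):
    # Repeatedly merge the closest pair of clusters, replacing them with a
    # single cluster at size-weighted average distance to all the others.
    D = D.astype(float).copy()
    clusters = [(lab, 1) for lab in labels]       # (newick-ish name, size)
    while len(clusters) > 1:
        n = len(clusters)
        i, j = min(((a, b) for a in range(n) for b in range(a + 1, n)),
                   key=lambda p: D[p[0], p[1]])
        (na, sa), (nb, sb) = clusters[i], clusters[j]
        merged = (f"({na},{nb})", sa + sb)
        row = [(sa * D[i, k] + sb * D[j, k]) / (sa + sb)
               for k in range(n) if k not in (i, j)]
        keep = [k for k in range(n) if k not in (i, j)]
        D = np.pad(D[np.ix_(keep, keep)], ((0, 1), (0, 1)))
        D[-1, :-1] = D[:-1, -1] = row
        clusters = [clusters[k] for k in keep] + [merged]
    return clusters[0][0]

tree = upgma(D0, labels)     # groups A with B and C with D first
```

UPGMA assumes a molecular clock; the paper's tree inference from aligned rRNA sequences would relax that and use distance corrections, but the agglomeration logic is the same shape.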

  15. Exploration of networks using overview+detail with constraint-based cooperative layout.

    Dwyer, Tim; Marriott, Kim; Schreiber, Falk; Stuckey, Peter; Woodward, Michael; Wybrow, Michael


    A standard approach to large network visualization is to provide an overview of the network and a detailed view of a small component of the graph centred around a focal node. The user explores the network by changing the focal node in the detailed view or by changing the level of detail of a node or cluster. For scalability, fast force-based layout algorithms are used for the overview and the detailed view. However, using the same layout algorithm in both views is problematic since layout for the detailed view has different requirements from the overview. Here we present a model in which constrained graph layout algorithms are used for layout in the detailed view. This means the detailed view has high-quality layout including sophisticated edge routing and is customisable by the user, who can add placement constraints on the layout. Scalability is still ensured since the slower layout techniques are only applied to the small subgraph shown in the detailed view. The main technical innovations are techniques to ensure that the overview and detailed view remain synchronized, and modifications of constrained graph layout algorithms to support smooth, stable layout. The key innovations supporting stability are new dynamic graph layout algorithms that preserve the topology or structure of the network when the user changes the focus node or the level of detail by in situ semantic zooming. We have built a prototype tool and demonstrate its use in two application domains, UML class diagrams and biological networks.

  16. Thirty Meter Telescope Detailed Science Case: 2015

    Skidmore, Warren; TMT International Science Development Teams; Science Advisory Committee, TMT


    The TMT Detailed Science Case describes the transformational science that the Thirty Meter Telescope will enable. Planned to begin science operations in 2024, TMT will open up opportunities for revolutionary discoveries in essentially every field of astronomy, astrophysics and cosmology, seeing much fainter objects much more clearly than existing telescopes. With this capability, TMT's science agenda fills all of space and time, from nearby comets and asteroids, to exoplanets, to the most distant galaxies, and all the way back to the very first sources of light in the universe. More than 150 astronomers from within the TMT partnership and beyond offered input in compiling the new 2015 Detailed Science Case. The contributing astronomers represent the entire TMT partnership, including the California Institute of Technology (Caltech), the Indian Institute of Astrophysics (IIA), the National Astronomical Observatories of the Chinese Academy of Sciences (NAOC), the National Astronomical Observatory of Japan (NAOJ), the University of California, the Association of Canadian Universities for Research in Astronomy (ACURA) and US associate partner, the Association of Universities for Research in Astronomy (AURA). Cover image: artist's rendition of the TMT International Observatory on Mauna Kea opening in the late evening before beginning operations.

  17. Bolivia-Brazil gas line route detailed


    This paper reports that the state oil companies of Brazil and Bolivia have signed an agreement outlining the route for a 2,270 km pipeline system to deliver natural gas from Bolivian fields to Southeast Brazil. The two sides currently are negotiating details about construction costs as well as contract volumes and prices. Capacity is projected at 283-565 MMcfd. No official details are available, but Roberto Y. Hukai, a director of the Sao Paulo engineering company Jaako Poyry/Technoplan, estimates the transportation cost of the Bolivian gas at 90 cents/MMBTU. That would be competitive with the price of gas delivered to the Sao Paulo gas utility Comgas, he said. Brazil's Petroleos Brasileiro SA estimates construction of the pipeline on the Brazilian side alone will cost $1.2-1.4 billion. Bolivia's Yacimientos Petroliferos Fiscales Bolivianos (YPFB) is negotiating with private domestic and foreign investors for construction of the Bolivian portion of the project.

  18. Mathematical tools for physicists


    Mathematical Tools for Physicists is a unique collection of 18 review articles, each written by a renowned expert in its field. Their professional style will benefit advanced students as well as working scientists: the former will find a comprehensive introduction, while the latter can use it as a quick reference. The contributions range from fundamental methods to the latest applications, including algebraic, analytic and geometric methods; symmetries and conservation laws; mathematical modelling; and quantum computation. Great attention was paid to ensuring fast access to the information, and each carefully reviewed article features an abstract, a detailed table of contents, continuous cross-referencing, references to the most relevant publications in the field, and suggestions for further reading, both introductory and highly specialized. In addition, a comprehensive index provides easy access to the enormous number of key words beyond the headlines.

  19. Picornavirus uncoating intermediate captured in atomic detail

    Ren, Jingshan; Wang, Xiangxi; Hu, Zhongyu; Gao, Qiang; Sun, Yao; Li, Xuemei; Porta, Claudine; Walter, Thomas S.; Gilbert, Robert J.; Zhao, Yuguang; Axford, Danny; Williams, Mark; McAuley, Katherine; Rowlands, David J.; Yin, Weidong; Wang, Junzhi; Stuart, David I.; Rao, Zihe; Fry, Elizabeth E.


    It remains largely mysterious how the genomes of non-enveloped eukaryotic viruses are transferred across a membrane into the host cell. Picornaviruses are simple models for such viruses, and initiate this uncoating process through particle expansion, which reveals channels through which internal capsid proteins and the viral genome presumably exit the particle, although this has not been clearly seen until now. Here we present the atomic structure of an uncoating intermediate for the major human picornavirus pathogen CAV16, which reveals VP1 partly extruded from the capsid, poised to embed in the host membrane. Together with previous low-resolution results, we are able to propose a detailed hypothesis for the ordered egress of the internal proteins, using two distinct sets of channels through the capsid, and suggest a structural link to the condensed RNA within the particle, which may be involved in triggering RNA release. PMID:23728514

  20. Detailed Chromospheric Activity Nature of KIC 9641031

    Yoldaş, Ezgi


    This study concerns the eclipsing binary system KIC 9641031, which has a chromospherically active component. There are three types of variation in the light curves obtained from the Kepler Mission Database: geometrical variations due to eclipses, sinusoidal variations due to rotational modulation, and flares. Taking into account the results obtained from KIC 9641031's observations in the Kepler Mission Database, we present and discuss the details of its chromospheric activity. The sinusoidal light variations due to rotational modulation and the flare events were modelled separately. 92 different data subsets, separated using the analytic models described in the literature, were modelled separately to obtain the cool-spot configuration. It is seen that just one component of the system is a chromospherically active star. On this component, there are two active regions separated by about 180 deg in longitude between the latitudes of +50 deg and +100 deg, whose locations and forms are rapidly cha...

  1. Most Detailed Image of the Crab Nebula


    This new Hubble image -- one among the largest ever produced with the Earth-orbiting observatory -- shows the most detailed view so far of the entire Crab Nebula. The Crab is arguably the single most interesting object, as well as one of the most studied, in all of astronomy. The image is the largest ever taken with Hubble's WFPC2 workhorse camera. The Crab Nebula is one of the most intricately structured and highly dynamical objects ever observed. The new Hubble image of the Crab was assembled from 24 individual exposures taken with the NASA/ESA Hubble Space Telescope and is the highest-resolution image of the entire Crab Nebula ever made.

  2. A meaningful expansion around detailed balance

    Colangeli, Matteo; Wynants, Bram


    We consider Markovian dynamics modeling open mesoscopic systems which are driven away from detailed balance by a nonconservative force. A systematic expansion is obtained of the stationary distribution around an equilibrium reference, in orders of the nonequilibrium forcing. The first order around equilibrium has been known since the work of McLennan (1959), and involves the transient irreversible entropy flux. The expansion generalizes the McLennan formula to higher orders, complementing the entropy flux with the dynamical activity. The latter is more kinetic than thermodynamic and is a possible realization of Landauer's insight (1975) that, for nonequilibrium, the relative occupation of states also depends on the noise along possible escape routes. In that way nonlinear response around equilibrium can be meaningfully discussed in terms of two main quantities only, the entropy flux and the dynamical activity. The expansion makes mathematical sense as shown in the simplest cases from exponential ergodicity.

  3. Detailed gravimetric geoid for the United States.

    Strange, W. E.; Vincent, S. F.; Berry, R. H.; Marsh, J. G.


    A detailed gravimetric geoid was computed for the United States using a combination of satellite-derived spherical harmonic coefficients and 1 by 1 deg mean gravity values from surface gravimetry. Comparisons of this geoid with astrogeodetic geoid data indicate that a precision of plus or minus 2 meters has been obtained. Translations only were used to convert the NAD astrogeodetic geoid heights to geocentric astrogeodetic heights. On the basis of the agreement between the geocentric astrogeodetic geoid heights and the gravimetric geoid heights, no evidence is found for rotation in the North American datum. The value of the zero-order undulation can vary by 10 to 20 meters, depending on which investigator's station positions are used to establish it.

  4. Detailed Design of Intelligent Object Framework

    Sasa Savic and Hao Shi


    The design and implementation of the Intelligent Object Framework (IOF) aims to unite communication and device management through a platform-independent management protocol in conjunction with a management application. The Core Framework is developed using Microsoft Visual Studio, Microsoft's .NET Framework and Microsoft's Windows Mobile SDK. The Secondary Intelligent Object is developed using the Tibbo Integrated Development Environment (TIDE) and the T-BASIC programming language, and is loaded on an EM1026 Embedded Device Platform running the Tibbo Operating System (TiOS). The backend database is based on Microsoft's SQL Server. In this paper, protocols associated with Smart Living are first reviewed. The system architecture and the intelligent object management studio are presented. Then the device application design and database design are detailed. Finally, conclusions are drawn and future work is addressed.

  5. Detailed Chemical Abundances of Extragalactic Globular Clusters

    Bernstein, R A


    We outline a method to measure the detailed chemical composition of extragalactic (unresolved) globular clusters (GCs) from echelle spectra of their integrated light. Our goal is to use this method to measure abundance patterns of GCs in distant spiral and elliptical galaxies to constrain their formation histories. To develop this technique we have obtained a "training set" of integrated-light spectra of resolved GCs in the Milky Way and LMC by scanning across the clusters during exposures. Our training set also includes spectra of individual stars in those GCs, from which abundances can be obtained in the normal way to provide a check on our integrated-light results. We present here the preliminary integrated-light analysis of one GC in our training set, NGC 104 (47 Tuc), and outline some of the techniques utilized and problems encountered in that analysis.

  6. Unstable total hip arthroplasty: detailed overview.

    Berry, D J


    Hip dislocation is one of the most common complications of THA. Good preoperative planning, good postoperative patient education, accurate intraoperative component positioning, rigorous intraoperative testing of hip stability, and good repair of soft tissues during closure all help prevent dislocation. Early postoperative dislocations and first or second dislocations usually are treated with closed reduction and a hip guide brace or hip spica cast, but when dislocation becomes recurrent, surgical treatment usually is needed. When possible, surgical treatment is based on identifying and treating a specific problem leading to the dislocation, such as implant malposition, inadequate soft-tissue tension, or impingement. In selected circumstances, constrained implants or bipolar or tripolar implants provide powerful tools to restore hip stability.

  7. Downhole tool with replaceable tool sleeve sections

    Case, W. A.


    A downhole tool for insertion in a drill stem includes elongated cylindrical half sleeve tool sections adapted to be non-rotatably supported on an elongated cylindrical body. The tool sections are mountable on and removable from the body without disconnecting either end of the tool from a drill stem. The half sleeve tool sections are provided with tapered axially extending flanges on their opposite ends which fit in corresponding tapered recesses formed on the tool body and the tool sections are retained on the body by a locknut threadedly engaged with the body and engageable with an axially movable retaining collar. The tool sections may be drivably engaged with axial keys formed on the body or the tool sections may be formed with flat surfaces on the sleeve inner sides cooperable with complementary flat surfaces formed on a reduced diameter portion of the body around which the tool sections are mounted.

  8. Sheet Bending using Soft Tools

    Sinke, J.


    Sheet bending is usually performed by air bending and V-die bending processes. Both processes apply rigid tools. These solid tools facilitate the generation of software for the numerical control of those processes. When the lower rigid die is replaced with a soft or rubber tool, the numerical control becomes much more difficult, since the soft tool deforms too. Compared to other bending processes the rubber-backed bending process has some distinct advantages, like large radius-to-thickness ratios, applicability to materials with topcoats, well-defined radii, and the feasibility of forming details (ridges, beads). These advantages may give the process exclusive benefits over conventional bending processes, not only for industries related to mechanical engineering and sheet metal forming, but also for other disciplines like Architecture and Industrial Design. The largest disadvantage is that the soft (rubber) tool also deforms. Although the tool deformation is elastic and recovers after each process cycle, the applied force during bending is related to the deformation of the metal sheet and the deformation of the rubber. The deformation of the rubber interacts with the process but also with sheet parameters. This makes the numerical control of the process much more complicated. This paper presents a model for the bending of sheet materials using a rubber lower die. This model can be implemented in software in order to control the bending process numerically. The model itself is based on numerical and experimental research. In this research a number of variables related to the tooling and the material have been evaluated. The numerical part of the research was used to investigate the influence of the features of the soft lower tool, like the hardness and dimensions, and the influence of the sheet thickness, which also interacts with the soft tool deformation. The experimental research was focused on the relation between the machine control parameters and the most
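The interaction described above, where the press force is shared between deformation of the sheet and deformation of the rubber, can be caricatured with a linear series-compliance toy model; the linearity and the stiffness values are illustrative assumptions, not the paper's model:

```python
# Toy illustration of why the soft tool complicates control: the press force
# acts on sheet and rubber "in series", so total punch travel depends on both.
# Linear stiffnesses (N/m) are a gross simplification of the real behaviour.
def punch_travel(force, k_sheet, k_rubber):
    """Total punch travel = sheet deformation + rubber deformation."""
    return force / k_sheet + force / k_rubber

# With a (nearly) rigid die all travel goes into the sheet; a soft die
# absorbs part of it, so the numerical control must compensate.
rigid = punch_travel(1000.0, k_sheet=2.0e5, k_rubber=1.0e12)
soft = punch_travel(1000.0, k_sheet=2.0e5, k_rubber=5.0e5)
```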


    Tukker, Arnold; de Koning, Arjan; Wood, Richard; Hawkins, Troy; Lutter, Stephan; Acosta, Jose; Rueda Cantuche, Jose M.; Bouwmeester, Maaike; Oosterhaven, Jan; Drosdowski, Thomas; Kuenen, Jeroen


    EXIOPOL (A New Environmental Accounting Framework Using Externality Data and Input-Output Tools for Policy Analysis) was a European Union (EU)-funded project creating a detailed, global, multiregional environmentally extended Supply and Use table (MR EE SUT) of 43 countries, 129 sectors, 80 resources

  10. Preselection Of Diamond Single-Point Tools

    Decker, D. L.; Hurt, H. H.; Dancy, J. H.; Fountain, C. W.


    Diamond single-point tools of the very highest quality are required for precision machining of optical surfaces. However, a great variation in edge quality and tool life is observed in practice. The differences between poor and excellent tools are subtle and not easily detectable without verification by actual machining. A companion paper concerning the tribologic aspects of tool-edge quality and life discusses the possible mechanisms of tool degradation. This paper provides a discussion of practical methods for preselecting tools for high performance without resorting to machining use. Edge quality as observed by optical microscopy is not sufficient. Scanning electron microscopy and a recently developed two-stage replication process for the cutting edge and subsequent examination in transmission electron microscopy can yield the necessary resolution (<< 100 Å). In addition to characterization by high resolution microscopy, tool crystallographic orientation and perfection are also crucial. X-ray diffraction characterization is described in detail.


    Hüseyin Metin ERTUNÇ


    Full Text Available In this study, wear mechanisms on cutting tools, especially drill bits, during cutting operations have been investigated. As full automation in industry has gained substantial importance, tool wear condition monitoring during cutting operations has been the subject of many investigations. Tool condition monitoring is crucial in order to change the tool before breakage, because tool breakage can cause considerable economic damage to both the machine tool and the workpiece. In this paper, studies in the literature on the monitoring of drill bit wear are introduced, and the direct/indirect techniques used, as well as sensor fusion techniques, are summarized. The methods proposed to determine tool wear evolution by processing the collected sensor signals are presented, with references given for detailed information.
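As a minimal illustration of the indirect-monitoring idea, time-domain features can be extracted from a sensor record and tracked over consecutive holes; the feature choice here is ours, not the paper's:

```python
import numpy as np

def wear_features(signal):
    """Extract simple indirect-monitoring features from one sensor record
    (e.g. spindle current or vibration). Thresholding such features over
    consecutive holes is a common way to flag progressive drill wear;
    this feature set is a minimal illustrative choice."""
    signal = np.asarray(signal, dtype=float)
    rms = np.sqrt(np.mean(signal ** 2))   # overall signal energy
    peak = np.max(np.abs(signal))         # largest excursion
    crest_factor = peak / rms             # sensitive to impacts/chipping
    return {"rms": rms, "peak": peak, "crest_factor": crest_factor}

# A worn tool typically drives the RMS of the feed-force signal upward;
# here a clean sinusoid stands in for a sensor record.
fresh = wear_features(np.sin(np.linspace(0, 100, 1000)))
```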

  12. Level of detail technique for plant models

    Xiaopeng ZHANG; Qingqiong DENG; Marc JAEGER


    Realistic modelling and interactive rendering of forestry and landscape is a challenge in computer graphics and virtual reality. Recent developments in plant growth modelling and simulation lead to plant models faithful to botanical structure and development, representing not only the complex architecture of a real plant but also its functioning in interaction with its environment. The complex geometry and material of a large group of plants is a big burden even for high-performance computers, and they often overwhelm the numerical calculation power and graphic rendering power. Thus, software techniques are often developed to accelerate the rendering speed of a group of plants. In this paper, we focus on plant organs, i.e. leaves, flowers, fruits and inter-nodes. Our approach is a simplification process of all sparse organs at the same time, i.e. Level of Detail (LOD), and multi-resolution models for plants. We explain here the principle and construction of plant simplification. They are used to construct LOD and multi-resolution models of sparse organs and branches of big trees. These approaches take benefit from basic knowledge of plant architecture, clustering tree organs according to biological structures. We illustrate the potential of our approach on several big virtual plants for geometrical compression or LOD model definition. Finally, we prove the efficiency of the proposed LOD models for realistic rendering with a virtual scene composed of 184 mature trees.
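At render time, LOD selection of this kind typically reduces to a distance-based lookup per plant; the thresholds and vertex budgets below are invented for illustration:

```python
import math

# Hypothetical LOD table: (max viewing distance in metres, vertex budget).
# Real plant LOD systems derive their levels from the botanical structure;
# these thresholds are made-up values for illustration.
LOD_LEVELS = [(20.0, 50_000), (60.0, 10_000), (150.0, 2_000), (math.inf, 300)]

def select_lod(camera, plant):
    """Pick a geometric level of detail from the camera-to-plant distance."""
    dist = math.dist(camera, plant)  # Euclidean distance (Python 3.8+)
    for max_dist, vertex_budget in LOD_LEVELS:
        if dist <= max_dist:
            return vertex_budget

# A nearby tree gets the full model, a distant one a coarse impostor
near = select_lod((0, 0, 0), (5, 0, 0))
far = select_lod((0, 0, 0), (500, 0, 0))
```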

  13. Detailed modelling of the 21-cm Forest

    Semelin, Benoit


    The 21-cm forest is a promising probe of the Epoch of Reionization. The local state of the intergalactic medium (IGM) is encoded in the spectrum of a background source (radio-loud quasars or gamma ray burst afterglow) by absorption at the local 21-cm wavelength, resulting in a continuous and fluctuating absorption level. Small-scale structures (filaments and minihaloes) in the IGM are responsible for the strongest absorption features. The absorption can also be modulated on large scales by inhomogeneous heating and Wouthuysen-Field coupling. We present the results from a simulation that attempts to preserve the cosmological environment while resolving some of the small-scale structures (a few kpc resolution in a 50 Mpc/h box). The simulation couples the dynamics and the ionizing radiative transfer and includes X-ray and Lyman lines radiative transfer for a detailed physical modelling. As a result we find that soft X-ray self-shielding, Lyman-alpha self-shielding and shock heating all have an impact on the pre...

  14. Detailed Aerosol Characterization using Polarimetric Measurements

    Hasekamp, Otto; di Noia, Antonio; Stap, Arjen; Rietjens, Jeroen; Smit, Martijn; van Harten, Gerard; Snik, Frans


    Anthropogenic aerosols are believed to cause the second most important anthropogenic forcing of climate change after greenhouse gases. In contrast to the climate effect of greenhouse gases, which is understood relatively well, the negative forcing (cooling effect) caused by aerosols represents the largest reported uncertainty in the most recent assessment of the Intergovernmental Panel on Climate Change (IPCC). To reduce the large uncertainty on the aerosol effects on cloud formation and climate, accurate satellite measurements of aerosol optical properties (optical thickness, single scattering albedo, phase function) and microphysical properties (size distribution, refractive index, shape) are essential. There is growing consensus in the aerosol remote sensing community that multi-angle measurements of intensity and polarization are essential to unambiguously determine all relevant aerosol properties. This presentation addresses the different aspects of polarimetric remote sensing of atmospheric aerosols, including retrieval algorithm development, validation, and data needs for climate and air quality applications. In recent years, retrieval algorithms that make full use of the capabilities of polarimetric measurements have been developed at SRON Netherlands Institute for Space Research. We will show results of detailed aerosol properties from ground-based (groundSPEX), airborne (NASA Research Scanning Polarimeter), and satellite (POLDER) measurements. We will also discuss observational needs for future instrumentation in order to improve our understanding of the role of aerosols in climate change and air quality.

  15. Details of tetrahedral anisotropic mesh adaptation

    Jensen, Kristian Ejlebjerg; Gorman, Gerard


    We have implemented tetrahedral anisotropic mesh adaptation using the local operations of coarsening, swapping, refinement and smoothing in MATLAB without the use of any for-loops, i.e. the script is fully vectorised. In the process of doing so, we have made three observations related to details of the implementation: 1. restricting refinement to a single edge split per element not only simplifies the code, it also improves mesh quality, 2. face-to-edge swapping is unnecessary, and 3. optimising for the Vassilevski functional tends to give a slightly higher value for the mean condition number functional than optimising for the condition number functional directly. These observations have been made for a uniform and a radial shock metric field, both starting from a structured mesh in a cube. Finally, we compare two coarsening techniques and demonstrate the importance of applying smoothing in the mesh adaptation loop. The results pertain to a unit cube geometry, but we also show the effect of corners and edges by applying the implementation in a spherical geometry.
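The fully vectorised style the authors describe in MATLAB can be illustrated in NumPy by computing all unique edge lengths of a tetrahedral mesh without loops; the single-tetrahedron mesh below is a toy example, not the authors' code:

```python
import numpy as np

# Loop-free edge-length computation for a tetrahedral mesh:
# vertices is (nv, 3), tets is (nt, 4). Here: one reference tetrahedron.
vertices = np.array([[0.0, 0.0, 0.0],
                     [1.0, 0.0, 0.0],
                     [0.0, 1.0, 0.0],
                     [0.0, 0.0, 1.0]])
tets = np.array([[0, 1, 2, 3]])

# The 6 local edges of a tetrahedron as vertex-index pairs
LOCAL_EDGES = np.array([[0, 1], [0, 2], [0, 3], [1, 2], [1, 3], [2, 3]])

edges = tets[:, LOCAL_EDGES].reshape(-1, 2)        # (nt*6, 2) vertex pairs
edges = np.unique(np.sort(edges, axis=1), axis=0)  # deduplicate shared edges
vec = vertices[edges[:, 1]] - vertices[edges[:, 0]]
lengths = np.linalg.norm(vec, axis=1)              # one length per unique edge
```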

  16. Optoelectronic pH Meter: Further Details

    Jeevarajan, Antony S.; Anderson, Mejody M.; Macatangay, Ariel V.


    A collection of documents provides further detailed information about an optoelectronic instrument that measures the pH of an aqueous cell-culture medium to within 0.1 unit in the range from 6.5 to 7.5. The instrument at an earlier stage of development was reported in Optoelectronic Instrument Monitors pH in a Culture Medium (MSC-23107), NASA Tech Briefs, Vol. 28, No. 9 (September 2004), page 4a. To recapitulate: The instrument includes a quartz cuvette through which the medium flows as it is circulated through a bioreactor. The medium contains some phenol red, which is an organic pH-indicator dye. The cuvette sits between a light source and a photodetector. [The light source in the earlier version comprised red (625 nm) and green (558 nm) light-emitting diodes (LEDs); the light source in the present version comprises a single green- (560 nm)-or-red (623 nm) LED.] The red and green are repeatedly flashed in alternation. The responses of the photodiode to the green and red are processed electronically to obtain the ratio between the amounts of green and red light transmitted through the medium. The optical absorbance of the phenol red in the green light varies as a known function of pH. Hence, the pH of the medium can be calculated from the aforesaid ratio.
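The final calculation step, recovering pH from the green/red transmittance ratio, amounts to inverting a calibration curve; the calibration numbers below are invented for illustration, not the instrument's:

```python
import numpy as np

# Hypothetical calibration: green/red transmittance ratio at known pH values.
# In the real instrument the curve follows the optical absorbance of phenol
# red in green light; these numbers are illustrative only.
CAL_PH = np.array([6.5, 6.75, 7.0, 7.25, 7.5])
CAL_RATIO = np.array([0.80, 0.68, 0.55, 0.44, 0.36])  # decreasing with pH

def ph_from_ratio(green_over_red):
    """Invert the calibration curve by linear interpolation.

    np.interp requires increasing x values, so interpolate on the
    reversed arrays.
    """
    return float(np.interp(green_over_red, CAL_RATIO[::-1], CAL_PH[::-1]))

ph = ph_from_ratio(0.55)  # exactly the pH 7.0 calibration point
```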

  17. Software reference for SaTool - a Tool for Structural Analysis of Automated Systems

    Lorentzen, Torsten; Blanke, Mogens


    This software reference details the functions of SaTool – a tool for structural analysis of technical systems. SaTool is intended used as part of an industrial systems design cycle. Structural analysis is a graph-based technique where principal relations between variables express the system...... of the graph. SaTool makes analysis of the structure graph to provide knowledge about fundamental properties of the system in normal and faulty conditions. Salient features of SaTool include rapid analysis of possibility to diagnose faults and ability to make autonomous recovery should faults occur....

  18. A Detailed Modeling Study of Propane Oxidation

    Westbrook, C K; Jayaweera, T M; Pitz, W J; Curran, H J


    A detailed chemical kinetic mechanism has been used to simulate ignition delay times recorded by a number of experimental shock tube studies over the temperature range 900 {le} T {le} 1800 K, in the pressure range 0.75-40 atm and in the equivalence ratio range 0.5 {le} {phi} {le} 2.0. Flame speed measurements at 1 atm in the equivalence ratio range 0.4 {le} {phi} {le} 1.8 have also been simulated. Both of these data sets, particularly those recorded at high pressure, are of particular importance in validating a kinetic mechanism, as internal combustion engines operate at elevated pressures and temperatures and rates of fuel oxidation are critical to efficient system operation. Experiments in which reactant, intermediate and product species were quantitatively recorded, versus temperature in a jet-stirred reactor (JSR) and versus time in a flow reactor, are also simulated. These data provide a stringent test of the kinetic mechanism, as it must reproduce accurate quantitative profiles for all reactant, intermediate and product species. The JSR experiments were performed in the temperature range 1000-1110 K, in the equivalence ratio range 0.5 {le} {phi} {le} 4.0, at a pressure of 5 atm. These experiments are complemented by those carried out in a flow reactor in the temperature range 660-820 K, at 10 atm and at an equivalence ratio of 0.4. In addition, burner-stabilized flames were simulated, where chemical species profiles were measured at atmospheric pressure for two propane-air flat flames. Overall, reasonably good agreement is observed between the model simulations and the experimental results.
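The temperature dependence underlying ignition-delay data of this kind is Arrhenius-like; a toy correlation, with placeholder parameters rather than fitted propane values, illustrates the steep trend across the 900-1800 K range:

```python
import math

R = 8.314  # universal gas constant, J/(mol K)

def ignition_delay(T, A=1.0e-11, Ea=1.6e5):
    """Empirical Arrhenius-type correlation tau = A * exp(Ea / (R*T)).

    A (s) and Ea (J/mol) are illustrative placeholder values, not fitted
    propane parameters; real shock-tube correlations also carry pressure
    and equivalence-ratio factors, omitted here.
    """
    return A * math.exp(Ea / (R * T))

# Delay shortens by orders of magnitude as temperature rises
tau_low = ignition_delay(900.0)
tau_high = ignition_delay(1800.0)
```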

  19. Design package for fuel retrieval system fuel handling tool modification



    This is a design package that contains the details for a modification to a tool used for moving fuel elements during loading of MCO Fuel Baskets for the Fuel Retrieval System. The tool is called the fuel handling tool (or stinger). This document contains requirements, development design information, tests, and test reports.


  1. CoC GIS Tools (GIS Tool)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  2. Detailed transcriptome atlas of the pancreatic beta cell

    Eizirik Decio L


    Full Text Available Abstract Background Gene expression patterns provide a detailed view of cellular functions. Comparison of profiles in disease vs normal conditions provides insights into the processes underlying disease progression. However, availability and integration of public gene expression datasets remains a major challenge. The aim of the present study was to explore the transcriptome of pancreatic islets and, based on this information, to prepare a comprehensive and open-access inventory of insulin-producing beta cell gene expression, the Beta Cell Gene Atlas (BCGA). Methods We performed Massively Parallel Signature Sequencing (MPSS) analysis of human pancreatic islet samples and microarray analyses of purified rat beta cells, alpha cells and INS-1 cells, and compared the information with available array data in the literature. Results MPSS analysis detected around 7600 mRNA transcripts, of which around a third were of low abundance. We identified 2000 and 1400 transcripts that are enriched/depleted in beta cells compared to alpha cells and INS-1 cells, respectively. Microarray analysis identified around 200 transcription factors that are differentially expressed in either beta or alpha cells. We reanalyzed publicly available gene expression data and integrated these results with the new data from this study to build the BCGA. The BCGA contains basal (untreated conditions) gene expression level estimates in beta cells as well as in different cell types in human, rat and mouse pancreas. Hierarchical clustering of expression profile estimates groups cell types by species, while beta cells cluster together. Conclusion Our gene atlas is a valuable source for detailed information on the gene expression distribution in beta cells and pancreatic islets along with insulin-producing cell lines. The BCGA tool, as well as the data and code used to generate the Atlas, are available at the T1Dbase website (
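The enriched/depleted comparison between cell types can be sketched with a simple fold-change rule; real atlas pipelines add statistical testing, and the expression values and threshold below are illustrative:

```python
import math

def enriched(beta_expr, other_expr, fold=2.0, pseudo=1.0):
    """Flag transcripts enriched in beta cells versus another cell type by a
    fold-change rule. The pseudocount avoids division by zero for transcripts
    absent in one cell type."""
    hits = []
    for gene, b in beta_expr.items():
        o = other_expr.get(gene, 0.0)
        log2fc = math.log2((b + pseudo) / (o + pseudo))
        if log2fc >= math.log2(fold):
            hits.append(gene)
    return hits

# Hypothetical expression estimates (arbitrary units)
beta = {"INS": 500.0, "GCG": 2.0, "ACTB": 100.0}
alpha = {"INS": 5.0, "GCG": 400.0, "ACTB": 110.0}
beta_enriched = enriched(beta, alpha)  # only INS passes the threshold
```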

  3. Detailed signal model of coherent wind measurement lidar

    Ma, Yuechao; Li, Sining; Lu, Wei


    Lidar is short for light detection and ranging, a tool for measuring useful information about the atmosphere. In recent years, more and more attention has been paid to wind measurement by lidar, because accurate wind information is valuable not only for weather forecasting but also for guaranteeing the safety of aircraft. In this paper, a more detailed signal model for a wind measurement lidar is proposed. It includes the laser transmitting part, which describes the spectral broadening, the laser attenuation in the atmosphere, the backscattered signal and the detected signal. A Voigt profile is used to describe the broadening of the transmitted laser spectrum, the most common situation, being the convolution of different broadening line shapes. The laser attenuation includes scattering and absorption. We use a Rayleigh scattering model and the partially-Correlated quadratic-Velocity-Dependent Hard-Collision (pCqSDHC) model to describe molecular scattering and absorption. When calculating particle scattering and absorption, a Gaussian particle model is used to describe the particle shapes. Because of the Doppler effect between the laser and the atmosphere, the wind velocity can be calculated from the backscattered signal. A two-parameter Weibull distribution is then used to describe the wind field for use in future work. Together, these components determine the signal model of the coherent wind measurement lidar, and some simulations are given in MATLAB. This signal model describes the system more accurately and in more detail, so that the following work will be easier and more efficient.
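The Voigt profile mentioned above, the convolution of a Gaussian (e.g. Doppler) and a Lorentzian (e.g. collisional) line shape, is commonly evaluated through the Faddeeva function; this is a standard sketch using SciPy, not the paper's code:

```python
import numpy as np
from scipy.special import wofz  # Faddeeva function w(z)

def voigt(x, sigma, gamma):
    """Voigt profile: convolution of a Gaussian (std sigma) with a
    Lorentzian (half-width gamma), via the real part of the Faddeeva
    function. Normalized to unit area."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return np.real(wofz(z)) / (sigma * np.sqrt(2.0 * np.pi))

# The profile peaks at line centre and integrates to ~1
x = np.linspace(-50.0, 50.0, 20001)
v = voigt(x, sigma=1.0, gamma=0.5)
area = v.sum() * (x[1] - x[0])  # simple Riemann-sum check of normalization
```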

  4. Human Factors Considerations in New Nuclear Power Plants: Detailed Analysis.

    O'Hara, J.; Higgins, J.; Brown, W.; Fink, R.


    This Nuclear Regulatory Commission (NRC) sponsored study has identified human-performance issues in new and advanced nuclear power plants. To identify the issues, current industry developments and trends were evaluated in the areas of reactor technology, instrumentation and control technology, human-system integration technology, and human factors engineering (HFE) methods and tools. The issues were organized into seven high-level HFE topic areas: Role of Personnel and Automation, Staffing and Training, Normal Operations Management, Disturbance and Emergency Management, Maintenance and Change Management, Plant Design and Construction, and HFE Methods and Tools. The issues were then prioritized into four categories using a 'Phenomena Identification and Ranking Table' methodology based on evaluations provided by 14 independent subject matter experts. The subject matter experts were knowledgeable in a variety of disciplines. Vendors, utilities, research organizations and regulators all participated. Twenty issues were categorized into the top priority category. This Brookhaven National Laboratory (BNL) technical report provides the detailed methodology, issue analysis, and results. A summary of the results of this study can be found in NUREG/CR-6947. The research performed for this project has identified a large number of human-performance issues for new control stations and new nuclear power plant designs. The information gathered in this project can serve as input to the development of a long-term strategy and plan for addressing human performance in these areas through regulatory research. Addressing human-performance issues will provide the technical basis from which regulatory review guidance can be developed to meet these challenges. The availability of this review guidance will help set clear expectations for how the NRC staff will evaluate new designs, reduce regulatory uncertainty, and provide a well-defined path to new nuclear power plant

  5. Detailed free span assessment for Mexilhao flow lines

    Pereira, Antonio; Franco, Luciano; Eigbe, Uwa; BomfimSilva, Carlos [INTECSEA, Houston, TX (United States); Escudero, Carlos [PETROBRAS S.A., Rio de Janeiro, RJ (Brazil)


    design life. This paper presents the FE methodology and associated tools to perform the detailed free span assessment of the Mexilhao flow lines, considering the design information, the post-lay survey data and the as-built reports after span correction in order to accurately account for the multi-spans and multimode effects in the span assessment procedure. (author)

  6. Detailed Burnup Calculations for Research Reactors

    Leszczynski, F. [Centro Atomico Bariloche (CNEA), 8400 S. C. de Bariloche (Argentina)


    A general method (RRMCQ) has been developed by introducing a microscopic burnup scheme which uses the Monte Carlo calculated spatial power distribution of a research reactor core and a depletion code for burnup calculations as a basis for solving the nuclide material balance equations for each spatial region into which the system is divided. Continuous-energy cross-section libraries and the full 3D geometry of the system are input to the calculations. The resulting predictions for the system at successive burnup time steps are thus based on a calculation route where both geometry and cross-sections are accurately represented, without geometry simplifications and with continuous energy data. The main advantage of this method over the classical deterministic methods currently used is that the RRMCQ system is a direct 3D method without the limitations and errors introduced by the homogenization of geometry and condensation of energy in deterministic methods. The Monte Carlo and burnup codes adopted until now are the widely used MCNP5 and ORIGEN2 codes, but other codes can also be used. Using this method requires a well-known set of nuclear data for the isotopes involved in burnup chains, including burnable poisons, fission products and actinides. To fix the data to be included in this set, a study of the present status of nuclear data was performed as part of the development of the RRMCQ method. This study begins with a review of the available cross-section data for isotopes involved in burnup chains for research nuclear reactors. The main data needs for burnup calculations are neutron cross-sections, decay constants, branching ratios, fission energies and yields. The present work includes results of selected experimental benchmarks and conclusions about the sensitivity of different sets of cross-section data for burnup calculations, using some of the main available evaluated nuclear data files.
Basically, the RRMCQ detailed burnup method includes four

  7. Detailed surface reaction mechanism in a three-way catalyst.

    Chatterjee, D; Deutschmann, O; Warnatz, J


    Monolithic three-way catalysts are applied to reduce the emissions of combustion engines. The design of such a catalytic converter is a complex process involving the optimization of different physical and chemical parameters (in the simplest case, e.g., length, cell density or metal coverage of the catalyst). Numerical simulation can be used as an effective tool for the investigation of the catalytic properties of a catalytic converter and for the prediction of the performance of the catalyst. To attain this goal, a two-dimensional flow-field description is coupled with a detailed surface reaction model (gas-phase reactions can be neglected in three-way catalysts). This surface reaction mechanism (with C3H6 taken as representative of unburnt hydrocarbons) was developed using sub-mechanisms recently developed for hydrogen, carbon monoxide and methane oxidation, literature values for C3H6 oxidation, and estimates for the remaining unknown reactions. Results of the simulation of a monolithic single channel are used to validate the surface reaction mechanism. The performance of the catalyst was simulated under lean, nearly stoichiometric and rich conditions. For these characteristic conditions, the oxidation of propene and carbon monoxide and the reduction of NO on a typical Pt/Rh coated three-way catalyst were simulated as a function of temperature. The numerically predicted conversion data are compared with experimentally measured data. The simulation further reveals the coupling between chemical reactions and transport processes within the monolithic channel.

  8. Cutting costs through detailed probabilistic fire risk analysis

    Oliveira, Luiz; Huser, Asmund; Vianna, Savio [Det Norske Veritas PRINCIPIA, Rio de Janeiro, RJ (Brazil)


    A new procedure for calculation of fire risks to offshore installations has been developed. The purposes of the procedure are to calculate the escalation and impairment frequencies to be applied in quantitative risk analyses, to optimize Passive Fire Protection (PFP) arrangement, and to optimize other fire mitigation means. The novelties of the procedure are that it uses state of the art Computational Fluid Dynamics (CFD) models to simulate fires and radiation, as well as the use of a probabilistic approach to decide the dimensioning fire loads. A CFD model of an actual platform was used to investigate the dynamic properties of a large set of jet fires, resulting in detailed knowledge of the important parameters that decide the severity of offshore fires. These results are applied to design the procedure. Potential increase in safety is further obtained for those conditions where simplified tools may have failed to predict abnormal heat loads due to geometrical effects. Using a field example it is indicated that the probabilistic approach can give significant reductions in PFP coverage with corresponding cost savings, still keeping the risk at acceptable level. (author)
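The probabilistic step, choosing a design fire load from scenario frequencies rather than from the single worst case, can be sketched as an exceedance-frequency calculation; the scenario data and acceptance criterion below are invented for illustration:

```python
# Each CFD jet-fire scenario contributes (annual frequency, peak heat load).
# These values are made up; a real study derives them from leak statistics
# and CFD fire/radiation simulations.
scenarios = [
    (1.0e-3, 50.0),   # (per year, kW/m2)
    (4.0e-4, 150.0),
    (1.0e-4, 250.0),
    (2.0e-5, 350.0),
]

def exceedance_frequency(design_load):
    """Annual frequency of fires whose heat load exceeds the design load."""
    return sum(f for f, load in scenarios if load > design_load)

CRITERION = 1.0e-4  # example acceptance criterion, per year

# Smallest tabulated design load whose residual exceedance frequency is
# acceptable; PFP is then dimensioned for this load, not the worst case.
loads = sorted(load for _, load in scenarios)
design = next(l for l in loads if exceedance_frequency(l) <= CRITERION)
```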

  9. Tool Changer For Robot

    Voellmer, George M.


    Mechanism enables robot to change tools on end of arm. Actuated by motion of robot: requires no additional electrical or pneumatic energy to make or break connection between tool and wrist at end of arm. Includes three basic subassemblies: wrist interface plate attached to robot arm at wrist, tool interface plate attached to tool, and holster. Separate tool interface plate and holster provided for each tool robot uses.

  10. Route Availability Planning Tool -

    Department of Transportation — The Route Availability Planning Tool (RAPT) is a weather-assimilated decision support tool (DST) that supports the development and execution of departure management...

  11. Rapid Tooling Technique Based on Stereolithograph Prototype

    丁浩; 狄平; 顾伟生; 朱世根


    Rapid tooling technique based on the stereolithograph prototype is investigated. The epoxy tooling technological process is elucidated, and the epoxy resin formulation for easy casting, the curing process, and release agents are analyzed in detail. A transitional plaster model is also proposed. A mold for encapsulating mutual inductors in epoxy and a mold for injection-moulding plastic soapboxes were made with this technique. The tooling requires very little time and cost, since the process only needs to achieve a good replica of the prototype. It is beneficial for trial and small-batch production.

  12. Surgical tools and medical devices

    Jackson, Mark


    This new edition presents information and knowledge on the field of biomedical devices and surgical tools. The authors look at the interactions between nanotechnology, nanomaterials, design, modeling, and tools for surgical and dental applications, as well as how nanostructured surfaces can be created for the purposes of improving cell adhesion between medical devices and the human body. Each original chapter is revised in this second edition and describes developments in coatings for heart valves, stents, hip and knee joints, cardiovascular devices, orthodontic applications, and regenerative materials such as bone substitutes. There are also 8 new chapters that address: Microvascular anastomoses Inhaler devices used for pulmonary delivery of medical aerosols Surface modification of interference screws Biomechanics of the mandible (a detailed case study) Safety and medical devices The synthesis of nanostructured material Delivery of anticancer molecules using carbon nanotubes Nano and micro coatings for medic...

  13. Detailed Modeling of Distillation Technologies for Closed-Loop Water Recovery Systems

    Allada, Rama Kumar; Lange, Kevin E.; Anderson, Molly S.


    Detailed chemical process simulations are a useful tool in designing and optimizing complex systems and architectures for human life support. Dynamic and steady-state models of these systems help contrast the interactions of various operating parameters and hardware designs, which become extremely useful in trade-study analyses. NASA?s Exploration Life Support technology development project recently made use of such models to compliment a series of tests on different waste water distillation systems. This paper presents efforts to develop chemical process simulations for three technologies: the Cascade Distillation System (CDS), the Vapor Compression Distillation (VCD) system and the Wiped-Film Rotating Disk (WFRD) using the Aspen Custom Modeler and Aspen Plus process simulation tools. The paper discusses system design, modeling details, and modeling results for each technology and presents some comparisons between the model results and recent test data. Following these initial comparisons, some general conclusions and forward work are discussed.

  14. - the research and service platform


    DETAIL Archive and Download Centre Every year, DETAIL magazine publishes more than 120 outstanding buildings from all over the world, along with interviews, critiques and articles written by authors from different disciplines.

  15. 14 CFR 23.685 - Control system details.


    ... 14 Aeronautics and Space 1 2010-01-01 2010-01-01 false Control system details. 23.685 Section 23... Control Systems § 23.685 Control system details. (a) Each detail of each control system must be designed... cables or tubes against other parts. (d) Each element of the flight control system must have...

  16. 46 CFR 70.25-1 - Electrical engineering details.


    ... 46 Shipping 3 2010-10-01 2010-10-01 false Electrical engineering details. 70.25-1 Section 70.25-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) PASSENGER VESSELS GENERAL PROVISIONS General Electrical Engineering Requirements § 70.25-1 Electrical engineering details. All electrical engineering details and installations shall...

  17. 46 CFR 188.25-1 - Electrical engineering details.


    ... 46 Shipping 7 2010-10-01 2010-10-01 false Electrical engineering details. 188.25-1 Section 188.25-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) OCEANOGRAPHIC RESEARCH VESSELS GENERAL PROVISIONS General Electrical Engineering Requirements § 188.25-1 Electrical engineering details. (a) The electrical engineering details...

  18. 46 CFR 90.25-1 - Electrical engineering details.


    ... 46 Shipping 4 2010-10-01 2010-10-01 false Electrical engineering details. 90.25-1 Section 90.25-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) CARGO AND MISCELLANEOUS VESSELS GENERAL PROVISIONS General Electrical Engineering Requirements § 90.25-1 Electrical engineering details. (a) All electrical engineering details...

  19. 46 CFR 188.20-1 - Marine engineering details.


    ... PROVISIONS General Marine Engineering Requirements § 188.20-1 Marine engineering details. (a) The marine engineering details shall be in accordance with Subchapter F (Marine Engineering) of this chapter. ... 46 Shipping 7 2010-10-01 2010-10-01 false Marine engineering details. 188.20-1 Section...

  20. 46 CFR 24.20-1 - Marine engineering details.


    ... Engineering Requirements § 24.20-1 Marine engineering details. (a) All marine engineering details relative to... 46 Shipping 1 2010-10-01 2010-10-01 false Marine engineering details. 24.20-1 Section 24.20-1... 40 feet in length will be found in subchapter F (Marine Engineering) of this chapter....

  1. Detailed protein sequence alignment based on Spectral Similarity Score (SSS

    Thomas Dina


    Full Text Available Abstract Background The chemical property and biological function of a protein is a direct consequence of its primary structure. Several algorithms have been developed which determine alignment and similarity of primary protein sequences. However, character based similarity cannot provide insight into the structural aspects of a protein. We present a method based on spectral similarity to compare subsequences of amino acids that behave similarly but are not aligned well by considering amino acids as mere characters. This approach finds a similarity score between sequences based on any given attribute, like hydrophobicity of amino acids, on the basis of spectral information after partial conversion to the frequency domain. Results Distance matrices of various branches of the human kinome, that is the full complement of human kinases, were developed that matched the phylogenetic tree of the human kinome establishing the efficacy of the global alignment of the algorithm. PKCd and PKCe kinases share close biological properties and structural similarities but do not give high scores with character based alignments. Detailed comparison established close similarities between subsequences that do not have any significant character identity. We compared their known 3D structures to establish that the algorithm is able to pick subsequences that are not considered similar by character based matching algorithms but share structural similarities. Similarly many subsequences with low character identity were picked between xyna-theau and xyna-clotm F/10 xylanases. Comparison of 3D structures of the subsequences confirmed the claim of similarity in structure. Conclusion An algorithm is developed which is inspired by successful application of spectral similarity applied to music sequences. The method captures subsequences that do not align by traditional character based alignment tools but give rise to similar secondary and tertiary structures. The Spectral

  2. D5.2 Numerical tools

    Møhlenberg, Flemming; Christensen, Erik Damgaard


    . The planning and design of MUPS in MERMAID has therefore not only involved standard engineering methods, but also advanced numerical tools, that can enable a detailed understanding of the environment and the interactions between the MUP and the surrounding water environments. The intention of this report...

  3. Sclerochronology: a tool for interpreting past environments

    Hudson, J. Harold; Shinn, Eugene A.; Halley, Robert B.; Lidz, Barbara


    X-radiographs of stony coral slabs reveal two types of annual density bands. Detailed studies of these bands in relation to known variations in air temperatures indicate that sclerochronology is a valid tool for documenting time sequences and changing environmental conditions on a coral reef.

  4. Save Energy Now Assessments Results 2008 Detailed Report

    Wright, Anthony L [ORNL; Martin, Michaela A [ORNL; Nimbalkar, Sachin U [ORNL; Quinn, James [U.S. Department of Energy; Glatt, Ms. Sandy [DOE Industrial Technologies Program; Orthwein, Mr. Bill [U.S. Department of Energy


    In October 2005, U.S. Department of Energy Secretary Bodman launched his Easy Ways to Save Energy campaign with a promise to provide energy assessments to 200 of the largest U.S. manufacturing plants. DOE's Industrial Technologies Program (ITP) responded to the Secretary's campaign with its Save Energy Now initiative, featuring a new and highly cost-effective form of energy savings assessment. The approach for these assessments drew heavily on the existing resources of ITP's technology delivery component. Over the years, ITP Technology Delivery has worked with industry partners to assemble a suite of respected software tools, proven assessment protocols, training curricula, certified energy experts, and strong partnerships for deployment. The Save Energy Now assessments conducted in calendar year 2006 focused on natural gas savings and targeted many of the nation's largest manufacturing plants - those that consume at least 1 TBtu of energy annually. The 2006 Save Energy Now assessments focused primarily on assessments of steam and process heating systems, which account for an estimated 74% of all natural gas use by U.S. manufacturing plants. Because of the success of the Save Energy Now assessments conducted in 2006 and 2007, the program was expanded and enhanced in two major ways in 2008: (1) a new goal was set to perform at least 260 assessments; and (2) the assessment focus was expanded to include pumping, compressed air, and fan systems in addition to steam and process heating. DOE ITP also has developed software tools to assess energy efficiency improvement opportunities in pumping, compressed air, and fan systems. The Save Energy Now assessments integrate a strong training component designed to teach industrial plant personnel how to use DOE's opportunity assessment software tools. This approach has the advantages of promoting strong buy-in of plant personnel for the assessment and its outcomes and preparing them better to

  5. Useful design tools?

    Jensen, Jesper Ole


    Tools for design management are on the agenda in building projects in order to set targets, to choose and prioritise between alternative environmental solutions, to involve stakeholders and to document, evaluate and benchmark. Different types of tools are available, but what can we learn from...... the use or lack of use of current tools in the development of future design tools for sustainable buildings? Why are some used while others are not? Who is using them? The paper deals with design management, with special focus on sustainable building in Denmark, and the challenge of turning the generally...... vague and contested concept of sustainability into concrete concepts and building projects. It describes a typology of tools: process tools, impact assessment tools, multi-criteria tools and tools for monitoring. It includes a Danish paradigmatic case study of stakeholder participation in the planning...

  6. LensTools: Weak Lensing computing tools

    Petri, A.


    LensTools implements a wide range of routines frequently used in Weak Gravitational Lensing, including tools for image analysis, statistical processing and numerical theory predictions. The package offers many useful features, including complete flexibility and easy customization of input/output formats; efficient measurements of power spectrum, PDF, Minkowski functionals and peak counts of convergence maps; survey masks; artificial noise generation engines; easy to compute parameter statistical inferences; ray tracing simulations; and many others. It requires standard numpy and scipy, and depending on tools used, may require Astropy (ascl:1304.002), emcee (ascl:1303.002), matplotlib, and mpi4py.

  7. Revolutions in Neuroscience: Tool Development

    John eBickle


    Full Text Available Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this paper is that, concerning actual revolutions in neuroscience over the past sixty years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of higher cognitive functions; and the more recent impact of optogenetics (and DREADDs. The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of Tool Development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions.

  8. Shot Planning and Analysis Tools

    Casey, A; Beeler, R; Conder, A; Fallejo, R; Flegel, M; Hutton, M; Jancaitis, K; Lakamsani, V; Potter, D; Reisdorf, S; Tappero, J; Whitman, P; Carr, W; Liao, Z


    Shot planning and analysis tools (SPLAT) integrate components necessary to help achieve a high over-all operational efficiency of the National Ignition Facility (NIF) by combining near and long-term shot planning, final optics demand and supply loops, target diagnostics planning, and target fabrication requirements. Currently, the SPLAT project is comprised of two primary tool suites for shot planning and optics demand. The shot planning component provides a web-based interface to selecting and building a sequence of proposed shots for the NIF. These shot sequences, or 'lanes' as they are referred to by shot planners, provide for planning both near-term shots in the Facility and long-term 'campaigns' in the months and years to come. The shot planning capabilities integrate with the Configuration Management Tool (CMT) for experiment details and the NIF calendar for availability. Future enhancements will additionally integrate with target diagnostics planning and target fabrication requirements tools. The optics demand component is built upon predictive modelling of maintenance requirements on the final optics as a result of the proposed shots assembled during shot planning. The predictive models integrate energetics from a Laser Performance Operations Model (LPOM), the status of the deployed optics as provided by the online Final Optics Inspection system, and physics-based mathematical 'rules' that predict optic flaw growth and new flaw initiations. These models are then run on an analytical cluster comprised of forty-eight Linux-based compute nodes. Results from the predictive models are used to produce decision-support reports in the areas of optics inspection planning, optics maintenance exchanges, and optics beam blocker placement advisories. Over time, the SPLAT project will evolve to provide a variety of decision-support and operation optimization tools.

  9. Personal Wellness Tools

    ... Public Service Announcements Partnering with DBSA Personal Wellness Tools The Merriam-Webster dictionary gives several definitions for ... home to a wealth of customizable, personal wellness tools to help you live a full, healthy, and ...

  10. Exploiting available Internet tools for multimedia applications

    Scott, Andrew C.


    The rapidly increasing number of tools available on the internet is changing the way people view software systems. People are now used to downloading plug in helper tools in order to decode and display different types of media within web browsers. The ease with which this can now be done is a far cry from the days, quite recently, when data had to be manually processed by a number of obviously independent software packages. Using the tools available to simply decode and display new data formats in only one way in which such software can be used, one could even consider a web browser as just another tool. Compete new applications could be constructed by selecting a suitable range of tools and supplying minimal glue software. This paper describes, as an example of this approach, a collaborative application supporting synchronous audio-visual communication and collaborative web browsing. The system develop is designed to make use of a wide range of freely available tools with no modification of existing web servers or clients. Alternative implementation strategies are discussed, followed by a detailed description of the approach chosen for this implementation. A technique allowing small to medium sized groups World Wide Web users to be tracked and their location to be presented to people with similar interests is then explained, followed by details of a mechanism allowing the information gained about such groups to be shared among arbitrary of similar groups.




    This document reports the results of a study of cost tools to support the analysis of Operations Other Than War (OOTW). It recommends the continued development of the Department of Defense (DoD) Contingency Operational Support Tool (COST) as the basic cost analysis tool for 00TWS. It also recommends modifications to be included in future versions of COST and the development of an 00TW mission planning tool to supply valid input for costing.

  12. Software engineering tools.

    Wear, L L; Pinkert, J R


    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development.

  13. Pro Tools HD

    Camou, Edouard


    An easy-to-follow guide for using Pro Tools HD 11 effectively.This book is ideal for anyone who already uses ProTools and wants to learn more, or is new to Pro Tools HD and wants to use it effectively in their own audio workstations.

  14. Research on the Hotel Image Based on the Detail Service

    Li, Ban; Shenghua, Zheng; He, Yi

    Detail service management, initially developed as marketing programs to enhance customer loyalty, has now become an important part of customer relation strategy. This paper analyzes the critical factors of detail service and its influence on the hotel image. We establish the theoretical model of influencing factors on hotel image and propose corresponding hypotheses. We use applying statistical method to test and verify the above-mentioned hypotheses. This paper provides a foundation for further study of detail service design and planning issues.

  15. Combustion instability detection using the wavelet detail of pressure fluctuations

    Junjie JI; Yonghao LUO


    A combustion instability detection method that uses the wavelet detail of combustion pressure fluctuations is put forward. To confirm this method, combustion pressure fluctuations in a stoker boiler are recorded at stable and unstable combustion with a pressure transducer. Daubechies one-order wavelet is chosen to obtain the wavelet details for comparison. It shows that the wavelet approximation indicates the general pressure change in the furnace, and the wavelet detail magnitude is consistent with the intensity of turbulence and combustion noise. The magnitude of the wavelet detail is nearly constant when the combustion is stable, however, it will fluctuate much when the combustion is unstable.

  16. EPA Enforcement and Compliance History Online: Water Effluent Charts Details

    U.S. Environmental Protection Agency — Detailed Discharge Monitoring Report (DMR) data supporting effluent charts for one Clean Water Act discharge permit. Includes effluent parameters, amounts discharged...

  17. Scheme Program Documentation Tools

    Nørmark, Kurt


    This paper describes and discusses two different Scheme documentation tools. The first is SchemeDoc, which is intended for documentation of the interfaces of Scheme libraries (APIs). The second is the Scheme Elucidator, which is for internal documentation of Scheme programs. Although the tools...... are separate and intended for different documentation purposes they are related to each other in several ways. Both tools are based on XML languages for tool setup and for documentation authoring. In addition, both tools rely on the LAML framework which---in a systematic way---makes an XML language available...

  18. Machine tool structures

    Koenigsberger, F


    Machine Tool Structures, Volume 1 deals with fundamental theories and calculation methods for machine tool structures. Experimental investigations into stiffness are discussed, along with the application of the results to the design of machine tool structures. Topics covered range from static and dynamic stiffness to chatter in metal cutting, stability in machine tools, and deformations of machine tool structures. This volume is divided into three sections and opens with a discussion on stiffness specifications and the effect of stiffness on the behavior of the machine under forced vibration c

  19. Tools and Behavioral Abstraction: A Direction for Software Engineering

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  20. Simplified Analysis Tool for Ship-Ship Collision

    Yamada, Yasuhira; Pedersen, Preben Terndrup


    to the collision scenario thatwhere a VLCC in ballast condition collides perpendicularly with the mid part of another D/H VLCC in fully loaded condition. The results obtained from the present tool are compared with those obtained by large scale FEA, and fairy good agreements are achieved. The applicability......, limitation and future enhancement of the present tool are discussed in detail....

  1. Risk assessment meta tool LDRD final report.

    Bouchard, Ann Marie; Osbourn, Gordon Cecil


    The goal of this project was to develop a risk analysis meta tool--a tool that enables security analysts both to combine and analyze data from multiple other risk assessment tools on demand. Our approach was based on the innovative self-assembling software technology under development by the project team. This technology provides a mechanism for the user to specify his intentions at a very high level (e.g., equations or English-like text), and then the code self-assembles itself, taking care of the implementation details. The first version of the meta tool focused specifically in importing and analyzing data from Joint Conflict and Tactical Simulation (JCATS) force-on-force simulation. We discuss the problem, our approach, technical risk, and accomplishments on this project, and outline next steps to be addressed with follow-on funding.

  2. A tool for searching in information systems under uncertainty

    Walek, Bogdan; Farana, Radim


    This article deals with a design of a tool for searching in information systems under uncertainty. During the search, user data often works with uncertainty, which may lead to a lack of the desired result or to find a large number of results that the user must evaluate. The main goal of the proposed tool is to process vague information and find relevant data. The article describes in detail various steps of the proposed tool.

  3. Commentary: The Perils of Seduction: Distracting Details or Incomprehensible Abstractions?

    Goetz, Ernest T.; Sadoski, Mark


    Reviews studies that have explicitly investigated the "seductive detail" effect (in which a reader's attention is diverted toward the interesting but unimportant seductive details and away from the uninteresting but important main ideas). Concludes that these studies do not provide convincing evidence for the existence of the effect. Argues that…

  4. 44 CFR 5.27 - Deletion of identifying details.


    ... 44 Emergency Management and Assistance 1 2010-10-01 2010-10-01 false Deletion of identifying details. 5.27 Section 5.27 Emergency Management and Assistance FEDERAL EMERGENCY MANAGEMENT AGENCY..., FEMA may delete identifying details when making available or publishing an opinion, statement of...

  5. Processing and Recall of Seductive Details in Scientific Text

    Lehman, Stephen; Schraw, Gregory; McCrudden, Matthew T.; Hartley, Kendall


    This study examined how seductive details affect on-line processing of a technical, scientific text. In Experiment 1, each sentence from the experimental text was rated for interest and importance. Participants rated seductive details as being more interesting but less important than main ideas. In Experiment 2, we examined the effect of seductive…

  6. 46 CFR 70.20-1 - Marine engineering details.


    ... General Marine Engineering Requirements § 70.20-1 Marine engineering details. All marine engineering... 46 Shipping 3 2010-10-01 2010-10-01 false Marine engineering details. 70.20-1 Section 70.20-1... subchapter F (Marine Engineering) of this chapter....

  7. Space Telecommunications Radio System (STRS) Architecture, Tutorial Part 2 - Detailed

    Handler, Louis


    The STRS architecture detail presentation presents each requirement in the STRS Architecture Standard with some examples and supporting information. The purpose is to give a platform provider, application provider, or application integrator a better, more detailed understanding of the STRS Architecture Standard and its use.

  8. Lunar hand tools

    Bentz, Karl F.; Coleman, Robert D.; Dubnik, Kathy; Marshall, William S.; Mcentee, Amy; Na, Sae H.; Patton, Scott G.; West, Michael C.


    Tools useful for operations and maintenance tasks on the lunar surface were determined and designed. Primary constraints are the lunar environment, the astronaut's space suit and the strength limits of the astronaut on the moon. A multipurpose rotary motion tool and a collapsible tool carrier were designed. For the rotary tool, a brushless motor and controls were specified, a material for the housing was chosen, bearings and lubrication were recommended and a planetary reduction gear attachment was designed. The tool carrier was designed primarily for ease of access to the tools and fasteners. A material was selected and structural analysis was performed on the carrier. Recommendations were made about the limitations of human performance and about possible attachments to the torque driver.

  9. Integrated piezoelectric actuators in deep drawing tools

    Neugebauer, R.; Mainda, P.; Drossel, W.-G.; Kerschner, M.; Wolf, K.


    The production of car body panels are defective in succession of process fluctuations. Thus the produced car body panel can be precise or damaged. To reduce the error rate, an intelligent deep drawing tool was developed at the Fraunhofer Institute for Machine Tools and Forming Technology IWU in cooperation with Audi and Volkswagen. Mechatronic components in a closed-loop control is the main differentiating factor between an intelligent and a conventional deep drawing tool. In correlation with sensors for process monitoring, the intelligent tool consists of piezoelectric actuators to actuate the deep drawing process. By enabling the usage of sensors and actuators at the die, the forming tool transform to a smart structure. The interface between sensors and actuators will be realized with a closed-loop control. The content of this research will present the experimental results with the piezoelectric actuator. For the analysis a production-oriented forming tool with all automotive requirements were used. The disposed actuators are monolithic multilayer actuators of the piezo injector system. In order to achieve required force, the actuators are combined in a cluster. The cluster is redundant and economical. In addition to the detailed assembly structures, this research will highlight intensive analysis with the intelligent deep drawing tool.

  10. Open Health Tools: Tooling for Interoperable Healthcare

    Skip McGaughey


    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will ?raise the interoperability bar? as a result of having tools that just work. To achieve these lofty goals, careful consideration must be made to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT?s impact to these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase ?code is king? underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  11. SBAT. A stochastic BPMN analysis tool

    Herbert, Luke Thomas; Hansen, Zaza Nadja Lee; Jacobsen, Peter


    This paper presents SBAT, a tool framework for the modelling and analysis of complex business workflows. SBAT is applied to analyse an example from the Danish baked goods industry. Based upon the Business Process Modelling and Notation (BPMN) language for business process modelling, we describe...... a formalised variant of this language extended to support the addition of intention preserving stochastic branching and parameterised reward annotations. Building on previous work, we detail the design of SBAT, a software tool which allows for the analysis of BPMN models. Within SBAT, properties of interest...

  12. Authoring tool evaluation

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.


    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  13. Population Density Modeling Tool


    194 POPULATION DENSITY MODELING TOOL by Davy Andrew Michael Knott David Burke 26 June 2012 Distribution...MARYLAND NAWCADPAX/TR-2012/194 26 June 2012 POPULATION DENSITY MODELING TOOL by Davy Andrew Michael Knott David Burke...Density Modeling Tool 5b. GRANT NUMBER 5c. PROGRAM ELEMENT NUMBER 6. AUTHOR(S) Davy Andrew Michael Knott David Burke 5d. PROJECT NUMBER

  14. CMS offline web tools

    Metson, S; Bockelman, B; Dziedziniewicz, K; Egeland, R; Elmer, P; Eulisse, G; Evans, D; Fanfani, A; Feichtinger, D; Kavka, C; Kuznetsov, V; Van Lingen, F; Newbold, D; Tuura, L; Wakefield, S


    We describe a relatively new effort within CMS to converge on a set of web based tools, using state of the art industry techniques, to engage with the CMS offline computing system. CMS collaborators require tools to monitor various components of the computing system and interact with the system itself. The current state of the various CMS web tools is described along side current planned developments.

  15. Recovering and Preventing Loss of Detailed Memory: Differential Rates of Forgetting for Detail Types in Episodic Memory

    Sekeres, Melanie J.; Bonasia, Kyra; St-Laurent, Marie; Pishdadian, Sara; Winocur, Gordon; Grady, Cheryl; Moscovitch, Morris


    Episodic memories undergo qualitative changes with time, but little is known about how different aspects of memory are affected. Different types of information in a memory, such as perceptual detail, and central themes, may be lost at different rates. In patients with medial temporal lobe damage, memory for perceptual details is severely impaired,…

  16. Investigating the Interaction of Graphic Organizers and Seductive Details: Can a Graphic Organizer Mitigate the Seductive-Details Effect?

    Rowland-Bryant, Emily; Skinner, Christopher H.; Skinner, Amy L.; Saudargas, Richard; Robinson, Daniel H.; Kirk, Emily R.


    The interaction between seductive details (SD) and a graphic organizer (GO) was investigated. Undergraduate students (n = 207) read a target-material passage about Freud's psychosexual stages. Depending on condition, the participants also read a biographical paragraph (SD-only), viewed a graphic organizer that linked the seductive details to the…

  17. Qlikview Audit Tool (QLIKVIEW) -

    Department of Transportation — This tool supports the cyclical financial audit process. Qlikview supports large volumes of financial transaction data that can be mined, summarized and presented to...

  18. Instant Spring Tool Suite

    Chiang, Geoff


    Filled with practical, step-by-step instructions and clear explanations for the most important and useful tasks. A tutorial guide that walks you through how to use the features of Spring Tool Suite, using well-defined sections for the different parts of Spring. Instant Spring Tool Suite is for novice to intermediate Java developers looking to get a head start in enterprise application development using Spring Tool Suite and the Spring framework. If you are looking for a guide for effective application development using Spring Tool Suite, then this book is for you.

  19. Agreement Workflow Tool (AWT)

    Social Security Administration — The Agreement Workflow Tool (AWT) is a role-based Intranet application used for processing SSA's Reimbursable Agreements according to SSA's standards. AWT provides...

  20. Java Power Tools

    Smart, John


    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  1. Detailed black hole state counting in loop quantum gravity

    Agullo, Ivan; Barbero G., J. Fernando; Borja, Enrique F.; Diaz-Polo, Jacobo; Villaseñor, Eduardo J. S.


    We give a complete and detailed description of the computation of black hole entropy in loop quantum gravity by employing the most recently introduced number-theoretic and combinatorial methods. The use of these techniques allows us to perform a detailed analysis of the precise structure of the entropy spectrum for small black holes, showing some relevant features that were not discernible in previous computations. The ability to manipulate and understand the spectrum up to the level of detail that we describe in the paper is a crucial step toward obtaining the behavior of entropy in the asymptotic (large horizon area) regime.

  2. Detailed black hole state counting in loop quantum gravity

    Agullo, Ivan; Borja, Enrique F; Diaz-Polo, Jacobo; Villaseñor, Eduardo J S; 10.1103/PhysRevD.82.084029


    We give a complete and detailed description of the computation of black hole entropy in loop quantum gravity by employing the most recently introduced number-theoretic and combinatorial methods. The use of these techniques allows us to perform a detailed analysis of the precise structure of the entropy spectrum for small black holes, showing some relevant features that were not discernible in previous computations. The ability to manipulate and understand the spectrum up to the level of detail that we describe in the paper is a crucial step towards obtaining the behavior of entropy in the asymptotic (large horizon area) regime.

  3. A detailed gravimetric geoid from North America to Eurasia

    Vincent, S. F.; Strange, W. E.; Marsh, J. G.


    A detailed gravimetric geoid of the United States, North Atlantic, and Eurasia, computed from a combination of satellite-derived and surface gravity data, is presented. The precision of this detailed geoid is ±2 to ±3 m in the continents but may be in the range of 5 to 7 m in areas where data are sparse. Comparisons of the detailed gravimetric geoid with the results of Rapp, Fischer, and Rice for the United States, Bomford in Europe, and Heiskanen and Fischer in India are presented. Comparisons are also presented with geoid heights from satellite solutions for geocentric station coordinates in North America, the Caribbean, and Europe.

  4. Autonomous underwater riser inspection tool

    Camerini, Claudio; Marnet, Robson [Petrobras SA, (Brazil); Freitas, Miguel; Von der Weid, Jean Pierre [CPTI/PUC-Rio, Rio de Janeiro, (Brazil); Artigas Lander, Ricardo [EngeMOVI, Curitiba, (Brazil)


    The detection of damage on the riser is a serious concern for pipeline companies. Visual examinations by remotely operated vehicle (ROV) are presently carried out to detect the defects, but this process has limitations and is expensive. This paper presents the development of a new tool to ensure autonomous underwater riser inspection (AURI) that uses the riser itself for guidance. The AURI, which is autonomous in terms of control and power supply, is equipped with several cameras that perform a complete visual inspection of the riser with 100% coverage of the external surface of the riser. The paper presents the detailed characteristics of the first AURI prototype, describes its launching procedure and provides the preliminary test results from pool testing. The results showed that the AURI is a viable system for autonomous riser inspection. Offshore tests on riser pipelines are scheduled to be performed shortly.

  5. General Mission Analysis Tool (GMAT)

    Hughes, Steven P. (Compiler)


    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT) to the critical design phase of NASA missions. The demonstration covers GMAT basics, then presents a detailed example of GMAT's application to the Transiting Exoplanet Survey Satellite (TESS) mission. Other examples include OSIRIS-REx. This talk is a combination of existing presentations: a GMAT basics overview, and technical presentations from the TESS and OSIRIS-REx projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The OSIRIS-REx slides are from a previous conference presentation. The TESS slides are a streamlined version of the CDR package provided by the project, with SBU and ITAR data removed by the TESS project.

  6. Modeling of Nitrogen in River Water Using a Detailed and a Simplified Model

    Mona Radwan


    To model catchment surface water quantity and quality, different model types are available. They vary from detailed physically based models to simplified conceptual and empirical models. The most appropriate model type for a certain application depends on the project objectives and the data availability. The detailed models are very useful for short-term simulations of representative events, but they cannot be used for long-term statistical information or as a management tool. For those purposes, more simplified (conceptual or meta-) models must be used. In this study, nitrogen dynamics are modeled in a river in Flanders. Nitrogen sources from agricultural leaching and domestic point sources are considered. Based on this input, concentrations of ammonium (NH4-N) and nitrate (NO3-N) in the river water are modeled in MIKE 11, taking into consideration advection and dispersion and the most important biological and chemical processes. Model calibration was done on the basis of available measured water quality data. To this detailed model, a more simplified model was calibrated with the objective of more easily yielding long-term simulation results that can be used in a statistical analysis. The results show that the conceptual simplified model is 1800 times faster than the MIKE 11 model, while the two models have almost the same accuracy. The detailed models are recommended for short-term simulations, provided that there are enough data for model input and model parameters. The conceptual simplified model is recommended for long-term simulations.
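
    As an illustration of what such a conceptual simplification can look like, the sketch below treats a river reach as a single well-mixed reservoir with first-order nitrification and denitrification. This is not the model used in the study; the structure, the rate constants, and the `conceptual_n_model` name are all assumptions made for illustration.

```python
import numpy as np

def conceptual_n_model(nh4_in, no3_in, q, v, k_nit=0.2, k_den=0.05, dt=1.0):
    """Hypothetical conceptual river-nitrogen model (illustrative only).

    The reach is one well-mixed reservoir:
    - q / v is the dilution (flushing) rate of the reach [1/day],
    - first-order nitrification converts NH4-N to NO3-N (k_nit, 1/day),
    - first-order denitrification removes NO3-N (k_den, 1/day).
    Inputs are daily inflow concentrations; returns simulated NH4/NO3 series.
    """
    nh4, no3 = 0.0, 0.0
    out_nh4, out_no3 = [], []
    flush = q / v
    for c_nh4, c_no3 in zip(nh4_in, no3_in):
        # explicit Euler step for the two mass balances
        d_nh4 = flush * (c_nh4 - nh4) - k_nit * nh4
        d_no3 = flush * (c_no3 - no3) + k_nit * nh4 - k_den * no3
        nh4 += dt * d_nh4
        no3 += dt * d_no3
        out_nh4.append(nh4)
        out_no3.append(no3)
    return np.array(out_nh4), np.array(out_no3)
```

    A model of this form tracks only two state variables per reach, which is why conceptual models can run orders of magnitude faster than a full advection-dispersion solver such as MIKE 11.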

  7. Transient Studies in Large Offshore Wind Farms, Employing Detailed Circuit Breaker Representation

    Glasdam, Jakob Bærholm; Bak, Claus Leth; Hjerrild, Jesper


    … in order to ensure reliable switching operations. Transient measurement results in an OWF are compared with simulation results in PSCAD EMTDC and DigSILENT Power Factory. A user-defined model of the vacuum circuit breaker (VCB) is included in both tools, capable of simulating multiple prestrikes during the closing operation. An analysis of the switching transients that might occur in OWFs will be made on the basis of the validated model, and the importance of the inclusion of a sufficiently accurate representation of the VCB in the simulation tool will be described. The inclusion of the VCB model in PSCAD greatly improves the simulation results, whereas little improvement is found in DigSILENT. Based on the transient study it is found that the simulated SOV can be up to 60% higher at the sending end when using the detailed VCB representation compared to the built-in switch, which emphasises the need for a detailed VCB model.

  8. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering


    This research project, conducted during the summer and fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment, to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make unbiased information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on other vendors' IT equipment. However, the research found two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple three-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with the objectives of proving the concept, ascertaining the extent of energy and computational assessment, and assessing ease of installation and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors: JouleX (expected to be completed in 2012) and Sentilla.

  9. Results of Detailed Hydrologic Characterization Tests - Fiscal Year 1999

    Spane, Frank A.; Thorne, Paul D.; Newcomer, Darrell R.


    This report provides the results of detailed hydrologic characterization tests conducted within newly constructed Hanford Site wells during FY 1999. Detailed characterization tests performed during FY 1999 included: groundwater flow characterization, barometric response evaluation, slug tests, single-well tracer tests, constant-rate pumping tests, and in-well vertical flow tests. Hydraulic property estimates obtained from the detailed hydrologic tests include: transmissivity, hydraulic conductivity, specific yield, effective porosity, in-well lateral flow velocity, aquifer flow velocity, vertical distribution of hydraulic conductivity (within the well-screen section) and in-well vertical flow velocity. In addition, local groundwater flow characteristics (i.e., hydraulic gradient and flow direction) were determined for four sites where detailed well testing was performed.

  10. Results of Detailed Hydrologic Characterization Tests - Fiscal Year 2000

    Spane, Frank A.; Thorne, Paul D.; Newcomer, Darrell R.


    This report provides the results of detailed hydrologic characterization tests conducted within eleven Hanford Site wells during fiscal year 2000. Detailed characterization tests performed included groundwater-flow characterization; barometric response evaluation; slug tests; single-well tracer tests; constant-rate pumping tests; and in-well, vertical flow tests. Hydraulic property estimates obtained from the detailed hydrologic tests include transmissivity; hydraulic conductivity; specific yield; effective porosity; in-well, lateral flow velocity; aquifer-flow velocity; vertical distribution of hydraulic conductivity (within the well-screen section); and in-well, vertical flow velocity. In addition, local groundwater-flow characteristics (i.e., hydraulic gradient and flow direction) were determined for four sites where detailed well testing was performed.

  11. 42 CFR 401.118 - Deletion of identifying details.


    ... Deletion of identifying details. When CMS publishes or otherwise makes available an opinion or order, statement of policy, or other record which relates to a private party or parties, the name or names or...

  12. Analysis of Common Fatigue Details in Steel Truss Structures

    张玉玲; 潘际炎; 潘际銮


    Generally, the number of fatigue cycles, the range of the repeated stresses, and the type of the structural details are the key factors affecting fatigue in large-scale welded structures. Seven types of structure details were tested using a 2000-kN hydraulic-pressure-servo fatigue machine to imitate fatigue behavior in modern steel-truss-structures fabricated using thicker welded steel plates and integral joint technology. The details included longitudinal edge welds, welded attachment affecting detail, integral joint, and weld repairs on plate edges. The fatigue damage locations show that the stress (normal or shear), the shape, and the location of the weld start and end points are three major factors reducing the fatigue strength. The test results can be used for similar large structures.

  13. CDC WONDER: Detailed Mortality - Underlying Cause of Death

    U.S. Department of Health & Human Services — The Detailed Mortality - Underlying Cause of Death data on CDC WONDER are county-level national mortality and population data spanning the years 1999-2009. Data are...

  14. A detailed discussion of superfield supergravity prepotential perturbations

    Ovalle, J.


    This paper presents a detailed discussion of supergravity perturbations around the flat five-dimensional superspace required for manifest superspace formulations of the supergravity side of the AdS5/CFT4 correspondence.

  15. Maailma suurim tool [The world's largest chair]


    AS Tartu näitused, the Tartu Art School and the magazine 'Diivan' are organising the exhibition 'Tool 2000' ('Chair 2000') in pavilion I of the Tartu exhibition centre on 9-11 March. 2000 chairs will be exhibited, from which a TOP 12 will be chosen. The world's largest chair is to be erected on the grounds of the exhibition centre. At the same time, the twin fairs 'Sisustus 2000' ('Furnishings 2000') and 'Büroo 2000' ('Office 2000') will run in pavilion II.

  16. Study of Tools Interoperability

    Krilavičius, T.


    Interoperability of tools usually refers to a combination of methods and techniques that address the problem of making a collection of tools to work together. In this study we survey different notions that are used in this context: interoperability, interaction and integration. We point out relation

  17. WATERS Expert Query Tool

    U.S. Environmental Protection Agency — The Expert Query Tool is a web-based reporting tool using the EPA's WATERS database. There are just three steps to using Expert Query: 1. View Selection – Choose what...

  18. Coring Sample Acquisition Tool

    Haddad, Nicolas E.; Murray, Saben D.; Walkemeyer, Phillip E.; Badescu, Mircea; Sherrit, Stewart; Bao, Xiaoqi; Kriechbaum, Kristopher L.; Richardson, Megan; Klein, Kerry J.


    A sample acquisition tool (SAT) has been developed that can be used autonomously to sample drill and capture rock cores. The tool is designed to accommodate core transfer using a sample tube to the IMSAH (integrated Mars sample acquisition and handling) SHEC (sample handling, encapsulation, and containerization) without ever touching the pristine core sample in the transfer process.

  19. A New Video Coding Method Based on Improving Detail Regions


    The Moving Pictures Expert Group (MPEG) and H.263 standard coding methods are widely used in video compression. However, for applications such as conference telephony and videophone, the visual quality of detail regions such as the eyes and mouth at the decoder does not satisfy viewers. A new coding method based on improving detail regions is presented in this paper. Experimental results show that this method can improve the visual quality at the decoder.

  20. Parametric programming of CNC machine tools

    Gołębski Rafał


    The article presents the possibilities of parametric programming of CNC machine tools for the SINUMERIK 840D sl control system. The kinds and types of variable definitions for this control system are described. Parametric programming possibilities are shown using the example of the longitudinal cutting cycle. The program's code and its implementation in the control system are described in detail. The principle of parametric programming in a high-level language is also explained.
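
    To illustrate the kind of arithmetic a parametric longitudinal cutting cycle performs with its variables, here is a sketch in Python rather than SINUMERIK code; the `longitudinal_passes` name and its parameters are invented for illustration. It splits the stock into equal passes no deeper than a given maximum depth of cut, as an R-parameter program would.

```python
import math

def longitudinal_passes(start_dia, target_dia, max_doc):
    """Plan the passes of a longitudinal (turning) cutting cycle.

    Stock on the radius is divided into the fewest equal passes whose
    depth of cut does not exceed max_doc; returns the diameter reached
    after each pass. All units are consistent (e.g. mm).
    """
    stock = (start_dia - target_dia) / 2.0   # stock per side (on the radius)
    n_passes = math.ceil(stock / max_doc)    # fewest passes within max_doc
    doc = stock / n_passes                   # equalised depth of cut
    return [start_dia - 2.0 * doc * (i + 1) for i in range(n_passes)]
```

    Equalising the depth of cut across passes, instead of taking full-depth passes and a thin final one, keeps cutting forces uniform; the same computation is what the cycle's variables encode on the control.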

  1. Detail Enhancement for Infrared Images Based on Propagated Image Filter

    Yishu Peng


    To display high-dynamic-range images acquired by thermal camera systems, 14-bit raw infrared data must be mapped into 8-bit gray values. This paper presents a new method for detail enhancement of infrared images that displays the image with relatively satisfying contrast and brightness, rich detail information, and no artifacts caused by the image processing. We first adopt a propagated image filter to smooth the input image and separate it into a base layer and a detail layer. Then, we refine the base layer by using modified histogram projection for compression. Meanwhile, the adaptive weights derived from the layer decomposition processing are used as a strict gain control for the detail layer. The final display result is obtained by recombining the two modified layers. Experimental results on both cooled and uncooled infrared data verify that the proposed method outperforms methods based on log-power histogram modification and bilateral filter-based detail enhancement in both detail enhancement and visual effect.
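
    The base/detail decomposition pipeline described above can be sketched as follows. This is an assumption-laden illustration, not the paper's algorithm: a simple box filter stands in for the propagated image filter, a linear rescaling stands in for modified histogram projection, and a constant gain replaces the adaptive weights.

```python
import numpy as np

def enhance_detail(raw14, base_bits=8, detail_gain=2.5, k=7):
    """Sketch of base/detail decomposition for displaying 14-bit IR data.

    raw14: 2-D array of 14-bit values; returns an 8-bit display image.
    The gain and kernel size are illustrative, not tuned values.
    """
    img = raw14.astype(np.float64)
    # 1. Smooth with a k x k box filter (an edge-aware filter in the paper)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    base = np.zeros_like(img)
    for dy in range(k):
        for dx in range(k):
            base += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    base /= k * k
    detail = img - base                       # 2. detail layer = residual
    # 3. Compress the base layer into the 8-bit display range
    lo, hi = base.min(), base.max()
    base8 = (base - lo) / max(hi - lo, 1e-9) * (2 ** base_bits - 1)
    # 4. Boost the (rescaled) detail layer and recombine
    out = base8 + detail_gain * detail / 2 ** (14 - base_bits)
    return np.clip(out, 0, 2 ** base_bits - 1).astype(np.uint8)
```

    The key property is that the large dynamic range lives almost entirely in the base layer, so it can be compressed aggressively while the detail layer is amplified independently.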

  2. Language Management Tools

    Sanden, Guro Refsum

    This paper offers a review of existing literature on the topic of language management tools – the means by which language is managed – in multilingual organisations. By drawing on a combination of sociolinguistics and international business and management studies, a new taxonomy of language management tools is proposed, differentiating between three categories of tools. Firstly, corporate policies are the deliberate control of issues pertaining to language and communication developed at the managerial level of a firm. Secondly, corporate measures are the planned activities the firm's leadership may deploy in order to address the language needs of the organisation. Finally, front-line practices refer to the use of informal, emergent language management tools available to staff members. The language management tools taxonomy provides a framework for operationalising the management of language...

  3. Software Tool Issues

    Hennell, Michael

    This chapter relies on experience with tool development gained over the last thirty years. It shows that there are a large number of techniques that contribute to any successful project, and that formality is always the key: a modern software test tool is based on a firm mathematical foundation. After a brief introduction, Section 2 recalls and extends the terminology of Chapter 1. Section 3 discusses the design of different sorts of static and dynamic analysis tools. Nine important issues to be taken into consideration when evaluating such tools are presented in Section 4. Section 5 investigates the interplay between testing and proof. In Section 6, we call for developers to take their own medicine and verify their tools. Finally, we conclude in Section 7 with a summary of our main messages, emphasising the important role of testing.

  4. OOTW Force Design Tools

    Bell, R.E.; Hartley, D.S.III; Packard, S.L.


    This report documents refined requirements for tools to aid the process of force design in Operations Other Than War (OOTWs). It recommends actions for the creation of one tool and work on other tools relating to mission planning. It also identifies the governmental agencies and commands with interests in each tool, from whom should come the user advisory groups overseeing the respective tool development activities. The understanding of OOTWs and their analytical support requirements has matured to the point where action can be taken in three areas: force design, collaborative analysis, and impact analysis. While the nature of the action and the length of time before complete results can be expected depends on the area, in each case the action should begin immediately. Force design for OOTWs is not a technically difficult process. Like force design for combat operations, it is a process of matching the capabilities of forces against the specified and implied tasks of the operation, considering the constraints of logistics, transport and force availabilities. However, there is a critical difference that restricts the usefulness of combat force design tools for OOTWs: the combat tools are built to infer non-combat capability requirements from combat capability requirements and cannot reverse the direction of the inference, as is required for OOTWs. Recently, OOTWs have played a larger role in force assessment, system effectiveness and tradeoff analysis, and concept and doctrine development and analysis. In the first Quadrennial Defense Review (QDR), each of the Services created its own OOTW force design tool. Unfortunately, the tools address different parts of the problem and do not coordinate the use of competing capabilities. These tools satisfied the immediate requirements of the QDR, but do not provide a long-term cost-effective solution.

  5. Older adults report moderately more detailed autobiographical memories

    Robert S Gardner


    Autobiographical memory (AM) is an essential component of the human mind. Although the amount and types of subjective detail (content) that compose AMs constitute important dimensions of recall, age-related changes in memory content are not well characterized. Previously, we introduced the Cue-Recalled Autobiographical Memory test (CRAM), an instrument that collects subjective reports of AM content, and applied it to college-aged subjects. CRAM elicits AMs using naturalistic word-cues. Subsequently, subjects date each cued AM to a life period and count the number of remembered details from specified categories (features), e.g., temporal detail, spatial detail, persons, objects, and emotions. The current work applies CRAM to a broad range of individuals (18-78 years old) to quantify the effects of age on AM content. Subject age showed a moderately positive effect on AM content: older compared with younger adults reported ~16% more details (~25 vs. ~21) in typical AMs. This age-related increase in memory content was similarly observed for remote and recent AMs, although content declined with the age of the event among all subjects. In general, the distribution of details across features was largely consistent among younger and older adults. However, certain types of details, i.e., those related to objects and sequences of events, contributed more to the age effect on content. Altogether, this work identifies a moderate, age-related, feature-specific alteration in the way life events are subjectively recalled, within an otherwise stable retrieval profile.

  6. Computer Tools for Construction, Modification and Analysis of Petri Nets

    Jensen, Kurt


    The practical use of Petri nets is — just as for any other description technique — very dependent on the existence of adequate computer tools, which may assist the user to cope with the many details of a large description. For Petri nets there is a need for tools supporting construction, modification and analysis of nets. This paper describes some of the requirements which these tools must fulfil in order to support the user in a natural and effective way. Finally, some references are given to papers which describe examples of existing Petri net tools.

  7. Machine Tool Software


    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  8. Benchmarking expert system tools

    Riley, Gary


    As part of its evaluation of new technologies, the Artificial Intelligence Section of the Mission Planning and Analysis Div. at NASA-Johnson has made timing tests of several expert system building tools. Among the production systems tested were Automated Reasoning Tool, several versions of OPS5, and CLIPS (C Language Integrated Production System), an expert system builder developed by the AI section. Also included in the test were a Zetalisp version of the benchmark along with four versions of the benchmark written in Knowledge Engineering Environment, an object oriented, frame based expert system tool. The benchmarks used for testing are studied.

  9. Biomedical images texture detail denoising based on PDE

    Chen, Guan-nan; Pan, Jian-ji; Li, Chao; Chen, Rong; Lin, Ju-qiang; Yan, Kun-tao; Huang, Zu-fang


    Biomedical image denoising methods based on partial differential equations (PDEs) are well known for their good processing results. General PDE-based denoising methods can remove the noise of images with gentle changes and preserve more structural detail at edges, but are less effective in denoising biomedical images with many texture details. This paper attempts an overview of biomedical image texture-detail denoising based on PDEs. Three kinds of important image denoising schemes are introduced: the first is denoising based on the adaptive parameter estimation total variation model, which denoises images based on region energy distribution; the second uses the G norm on the perception scale, which provides a more intuitive understanding of this norm; the third is multi-scale denoising decomposition. The methods involved can preserve more structure in biomedical image texture detail. Furthermore, this paper demonstrates the applications of these three methods. In the end, the future trend of PDE-based biomedical image texture-detail denoising is pointed out.
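
    As a minimal example of the family of PDE-based denoising, the classic Perona-Malik anisotropic diffusion scheme below damps smoothing across strong gradients, which is the general mechanism by which PDE methods preserve edges and texture detail. It is an illustration of the approach, not one of the three schemes surveyed in the paper; the parameter values are arbitrary.

```python
import numpy as np

def perona_malik(img, n_iter=20, kappa=15.0, lam=0.2):
    """Perona-Malik anisotropic diffusion (illustrative parameters).

    Each iteration diffuses intensity toward the four neighbours, with a
    conduction coefficient g = exp(-(|grad u| / kappa)^2) that shuts the
    diffusion down across strong gradients, so edges survive smoothing.
    lam <= 0.25 keeps the explicit scheme stable.
    """
    u = img.astype(np.float64).copy()
    g = lambda d: np.exp(-(d / kappa) ** 2)
    for _ in range(n_iter):
        # nearest-neighbour differences (Neumann boundary via edge padding)
        p = np.pad(u, 1, mode="edge")
        dn = p[:-2, 1:-1] - u
        ds = p[2:, 1:-1] - u
        de = p[1:-1, 2:] - u
        dw = p[1:-1, :-2] - u
        u += lam * (g(dn) * dn + g(ds) * ds + g(de) * de + g(dw) * dw)
    return u
```

    The total variation and multi-scale schemes mentioned in the abstract replace this conduction function with different regularisers, but share the same edge-preserving diffusion idea.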

  10. Detailed velocity ratio mapping during the aftershock sequence as a tool to monitor the fluid activity within the fault plane

    Bachura, Martin; Fischer, Tomáš


    The rheological properties of Earth materials are expressed by their seismic velocities and VP/VS ratio, which is easily obtained by the Wadati method. Its double-difference version based on cross-correlated waveforms enables focusing on very local structures and allows tracking, monitoring and analysing the fluid activity along faults. We applied the method to three 2014 mainshock-aftershock sequences in the West Bohemia/Vogtland (Czech Republic) earthquake swarm area and found pronounced VP/VS variations in time and space for different clusters of events located on a steeply dipping fault zone at depths ranging from 7 to 11 km. Each cluster reflects the spatial distribution of earthquakes along the fault plane but also the temporal evolution of the activity. Low values of the VP/VS ratio, down to 1.59 ± 0.02, were identified in the deeper part of the fault zone, whereas higher values, up to 1.73 ± 0.01, were estimated for clusters located on a shallower segment of the fault. Temporally, the low VP/VS values are associated with the early aftershocks, while the higher VP/VS ratios are related only to later aftershocks. We interpret this behaviour as a result of saturation of the focal zone by compressible fluids: in the beginning, the mainshock and early aftershocks, driven by over-pressured fluids, increased the porosity by opening fluid pathways. This process was associated with a decrease of the velocity ratio. In later stages the pressure and porosity decreased and the velocity ratio recovered to levels of 1.73, typical for a Poissonian medium and the Earth's crust.
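
    The Wadati method mentioned above can be sketched in a few lines: for a common origin time t0, the S-P time ts - tp is linear in the P arrival time with slope VP/VS - 1, so the ratio follows from a least-squares line without knowing t0 or the velocity structure. The snippet shows the classical single-event version; the paper's double-difference, cross-correlation-based variant builds on the same relation.

```python
import numpy as np

def wadati_vpvs(tp, ts):
    """Classical Wadati estimate of VP/VS from P and S arrival times.

    tp, ts: arrival times (same event, several stations). Since
    ts - tp = (VP/VS - 1) * (tp - t0), fitting a line through
    (tp, ts - tp) gives VP/VS as 1 + slope.
    """
    tp = np.asarray(tp, dtype=float)
    ts = np.asarray(ts, dtype=float)
    slope, _ = np.polyfit(tp, ts - tp, 1)  # [slope, intercept]
    return 1.0 + slope
```

    In practice the scatter of the points around the fitted line also gives the uncertainty of the ratio, which is how interval estimates such as 1.59 ± 0.02 are obtained.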

  11. A detailed and unified treatment of spin-orbit systems using tools distilled from the theory of bundles

    Heinemann, Klaus; Ellison, James A; Vogt, Mathias


    We return to our earlier study of invariant spin fields and spin tunes for polarized beams in storage rings but, in contrast to the continuous-time treatment there, we now employ a discrete-time formalism, beginning with the Poincaré maps of the continuous-time formalism. We then substantially extend our toolset and generalize the notions of invariant spin field and invariant frame field. We revisit some old theorems and prove several theorems believed to be new. In particular we study two transformation rules, one of them known and the other new, where the former turns out to be an SO(3)-gauge transformation rule. We then apply the theory to the dynamics of spin-1/2 and spin-1 particle bunches and their density matrix functions, describing semiclassically the particle-spin content of bunches. Our approach thus unifies the spin-vector dynamics from the T-BMT equation with the spin-tensor dynamics and other dynamics. This unifying aspect of our approach relates the examples elega...

  12. A detailed and unified treatment of spin-orbit systems using tools distilled from the theory of bundles

    Heinemann, K.; Ellison, J.A. [New Mexico Univ., Albuquerque, NM (United States). Dept. of Mathematics and Statistics; Barber, D.P.; Vogt, M. [Deutsches Elektronen-Synchrotron (DESY), Hamburg (Germany)


    We return to our study (2001) of invariant spin fields and spin tunes for polarized beams in storage rings but in contrast to the continuous-time treatment in this study, we now employ a discrete-time formalism, beginning with the Poincare maps of the continuous time formalism. We then substantially extend our toolset and generalize the notions of invariant spin field and invariant frame field. We revisit some old theorems and prove several theorems believed to be new. In particular we study two transformation rules, one of them known and the other new, where the former turns out to be an SO(3)-gauge transformation rule. We then apply the theory to the dynamics of spin-1/2 and spin-1 particle bunches and their density matrix functions, describing semiclassically the particle-spin content of bunches. Our approach thus unifies the spin-vector dynamics from the T-BMT equation with the spin-tensor dynamics and other dynamics. This unifying aspect of our approach relates the examples elegantly and uncovers relations between the various underlying dynamical systems in a transparent way. The particle motion is integrable but we now allow for nonlinear particle motion on each torus. Since this work is inspired by notions from the theory of bundles, we also provide insight into the underlying bundle-theoretic aspects of the well-established concepts of invariant spin field, spin tune and invariant frame field. Thus the group theoretical notion is exhibited. Since we neglect, as is usual, the Stern-Gerlach force, the underlying principal bundle is of product form so that we can present the theory in a fashion which does not use bundle theory. Nevertheless we occasionally mention the bundle-theoretic meaning of our concepts and we also mention the similarities with the geometrical approach to Yang-Mills Theory.

  13. Chapter 8: Planning Tools to Simulate and Optimize Neighborhood Energy Systems

    Zhivov, Alexander Michael; Case, Michael Patrick; Jank, Reinhard; Eicker, Ursula; Booth, Samuel


This section introduces different energy modeling tools available in Europe and the USA for the community energy master planning process, ranging from strategic Urban Energy Planning to more detailed Local Energy Planning. Two modeling tools used for Energy Master Planning of primarily residential communities, the 3D city model with CityGML and the Net Zero Planner tool developed for US Department of Defense installations, are described in more detail.

  14. A real-time tool positioning sensor for machine-tools.

    Ruiz, Antonio Ramon Jimenez; Rosas, Jorge Guevara; Granja, Fernando Seco; Honorato, Jose Carlos Prieto; Taboada, Jose Juan Esteve; Serrano, Vicente Mico; Jimenez, Teresa Molina


In machining, natural oscillations and elastic, gravitational or temperature deformations are still a problem for guaranteeing the quality of fabricated parts. In this paper we present an optical measurement system designed to track and localize in 3D a reference retro-reflector close to the machine tool's drill. The complete system and its components are described in detail. Several tests, some static (including impacts and rotations) and others dynamic (executing linear and circular trajectories), were performed on two different machine tools. For the first time, a laser tracking system has been integrated into the position control loop of a machine tool. Results indicate that oscillations and deformations close to the tool can be estimated with micrometric resolution and a bandwidth from 0 to more than 100 Hz. This sensor therefore opens the possibility of on-line compensation of oscillations and deformations.

  16. Study on Hierarchical Structure of Detailed Control Planning


Using case studies, this paper analyzes the characteristics of detailed control planning and its hierarchical controls, the form and composition of plan content, and methodological innovations. It then suggests improvements to the planning structure oriented toward adaptability, fairness, centrality, and scientific principles with regard to the content, methods, and results of the planning. Regarding the hierarchical control system, the paper suggests that the detailed control plan should be composed of "block planning" and "plot planning". It is believed that a combination of block and plot planning will solve the problem of joining long-term and short-term planning and make it possible to address the need for adjustment and revision of the detailed control plan.

  17. Generic Reliability-Based Inspection Planning for Fatigue Sensitive Details

    Sørensen, John Dalsgaard; Straub, Daniel; Faber, Michael Havbro


The generic approach for planning of in-service NDT inspections is extended to cover the case where the fatigue load is modified during the design lifetime of the structure. Generic reliability-based inspection planning has been developed as a practical approach to perform inspection planning of fatigue sensitive details in fixed offshore steel jacket platforms and FPSO ship structures. Inspection and maintenance activities are planned such that code-based requirements to the safety of personnel and environment for the considered structure are fulfilled and, at the same time, such that the overall expected costs for design, inspections, repairs and failures are minimized. The method is based on the assumption of "no-finds" of cracks during inspections. Each fatigue sensitive detail is categorized according to its type of detail (SN curves), FDF values, RSR values, inspection, repair and failure...

  18. Testing for detailed balance in a financial market

    Fiebig, H. R.; Musgrove, D. P.


We test a historical price-time series in a financial market (the NASDAQ 100 index) for a statistical property known as detailed balance. The presence of detailed balance would imply that the market can be modeled by a stochastic process based on a Markov chain, thus leading to equilibrium. In economic terms, a positive outcome of the test would support the efficient market hypothesis, a cornerstone of neo-classical economic theory. In contrast to the usage in prevalent economic theory, the term equilibrium here is tied to the returns rather than the price-time series. The test is based on an action functional S constructed from the elements of the detailed balance condition and the historical data set; S is then analyzed by means of simulated annealing. Checks are performed to verify the validity of the analysis method. We discuss the outcome of this analysis.
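The detailed balance condition itself is simple to state for a discretized return series: if pi_i is the stationary probability of state i and P_ij the transition probability, detailed balance requires pi_i P_ij = pi_j P_ji for every pair of states. As a rough illustration only (this is a simplified Markov-chain check, not the authors' action-functional and simulated-annealing method; the function name and binning scheme are assumptions for the example), one might estimate the violation directly:

```python
import numpy as np

def detailed_balance_residual(series, n_bins=5):
    """Estimate how far a discretized return series is from detailed balance.

    Log-returns are binned into n_bins equal-probability states, a
    first-order transition matrix P and occupation frequencies pi are
    estimated, and the residual max_ij |pi_i P_ij - pi_j P_ji| is
    returned. Exact detailed balance makes every flow pi_i P_ij symmetric.
    """
    returns = np.diff(np.log(series))
    # Interior quantile edges give equal-probability bins (states 0..n_bins-1).
    edges = np.quantile(returns, np.linspace(0, 1, n_bins + 1)[1:-1])
    states = np.digitize(returns, edges)
    # Count state-to-state transitions and normalize rows to get P.
    counts = np.zeros((n_bins, n_bins))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    pi = counts.sum(axis=1) / counts.sum()  # empirical occupation frequencies
    flow = pi[:, None] * P                  # flow[i, j] = pi_i * P_ij
    return np.abs(flow - flow.T).max()

# Usage on a synthetic geometric random walk; i.i.d. returns satisfy
# detailed balance up to sampling noise, so the residual should be small.
rng = np.random.default_rng(0)
prices = np.exp(np.cumsum(rng.normal(0.0, 0.01, 10000)))
print(detailed_balance_residual(prices))
```

A persistently large residual on real index returns, relative to shuffled or simulated surrogates, would point toward a violation of detailed balance.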

  19. A detailed gravimetric geoid of North America, Eurasia, and Australia

    Vincent, S.; Strange, W. E.


A detailed gravimetric geoid of North America, the North Atlantic, Eurasia, and Australia, computed from a combination of satellite-derived and surface 1° x 1° gravity data, is presented. Using a consistent set of parameters, this geoid is referenced to an absolute datum. The precision of this detailed geoid is ±2 meters in the continents but may be in the range of 5 to 7 meters in those areas where data was sparse. Comparisons of the detailed gravimetric geoid with results of Rice for the United States, Bomford and Fischer in Eurasia, and Mather in Australia are presented. Comparisons are also presented with geoid heights from satellite solutions for geocentric station coordinates in North America, the Caribbean, Europe, and Australia.

  20. Smart Growth Tools

    This page describes a variety of tools useful to federal, state, tribal, regional, and local government staff and elected officials; community leaders; developers; and others interested in smart growth development.

  1. Neighborhood Mapping Tool

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  2. TENCompetence tool demonstration

    Kluijfhout, Eric


    Kluijfhout, E. (2009). TENCompetence tool demonstration. Presented at Zorgacademie Parkstad (Health Academy Parkstad), Limburg Leisure Academy, Life Long Learning Limburg and a number of regional educational institutions. May, 18, 2009, Heerlen, The Netherlands: Open University of the Netherlands, T

  3. Tools and their uses


    Teaches names, general uses, and correct operation of all basic hand and power tools, fasteners, and measuring devices you are likely to need. Also, grinding, metal cutting, soldering, and more. 329 illustrations.

  4. NWRS Survey Prioritization Tool

    US Fish and Wildlife Service, Department of the Interior — A SMART Tool and User's Guide for aiding NWRS Station staff when prioritizing their surveys for an Inventory and Monitoring Plan. This guide describes a process and...

  5. Smart tool holder

    Day, Robert Dean; Foreman, Larry R.; Hatch, Douglas J.; Meadows, Mark S.


An apparatus is provided for machining surfaces to accuracies within the nanometer range, using electrical current flow through the contact between the cutting tool and the workpiece as a feedback signal to control depth of cut.

  6. Game development tool essentials

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo


    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  7. Mapping Medicare Disparities Tool

    U.S. Department of Health & Human Services — The CMS Office of Minority Health has designed an interactive map, the Mapping Medicare Disparities Tool, to identify areas of disparities between subgroups of...

  8. ATO Resource Tool -

Department of Transportation — Cru-X/ART is a shift management tool designed for use by operational employees in Air Traffic Facilities. Cru-X/ART is used for shift scheduling, shift sign in/out,...

  9. Chemical Data Access Tool

    U.S. Environmental Protection Agency — This tool is intended to aid individuals interested in learning more about chemicals that are manufactured or imported into the United States. Health and safety...

  10. Recovery Action Mapping Tool

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  11. Cash Reconciliation Tool

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  12. Friction stir welding tool

Tolle, Charles R.; Clark, Denis E.; Barnes, Timothy A.


    A friction stir welding tool is described and which includes a shank portion; a shoulder portion which is releasably engageable with the shank portion; and a pin which is releasably engageable with the shoulder portion.

  13. Autism Teaching Tool


CERN pattern recognition technologies transferred to a learning tool for autistic children. The state-of-the-art pattern recognition technology developed at CERN for High Energy Physics is transferred to the Computer Vision domain and is used to develop a new

  14. Manual bamboo cutting tool.

    Bezerra, Mariana Pereira; Correia, Walter Franklin Marques; da Costa Campos, Fabio Ferreira


The paper presents the development of a cutting tool guide, specifically for the harvest of bamboo. The development was based on precepts of eco-design and ergonomics, prioritizing the physical health of the operator and the preservation of the environment, as well as meeting the specific requirements of bamboo. The main goal is to spread the use of bamboo as a material for construction, handicrafts, and other uses, by means of a handy, easily assembled tool made from available materials.

  15. Stochastic tools in turbulence

Lumley, John L.


Stochastic Tools in Turbulence discusses the available mathematical tools for describing stochastic vector fields in order to solve problems related to these fields. The book deals with the needs of turbulence in relation to stochastic vector fields, particularly three-dimensional aspects, linear problems, and stochastic model building. The text describes probability distributions and densities, including Lebesgue integration, conditional probabilities, conditional expectations, statistical independence, and lack of correlation. The book also explains the significance of the moments, the properties of the

  16. Simulation of flame-vortex interaction using detailed and reduced

    Hilka, M. [Gaz de France (GDF), 75 - Paris (France); Veynante, D. [Ecole Centrale de Paris, Laboratoire EM2C. CNRS, 92 - Chatenay-Malabry (France); Baum, M. [CERFACS (France); Poinsot, T.J. [Centre National de la Recherche Scientifique (CNRS), 45 - Orleans-la-Source (France). Institut de Mecanique des Fluides de Toulouse


The interaction between a pair of counter-rotating vortices and a lean premixed CH4/O2/N2 flame (Φ = 0.55) has been studied by direct numerical simulations using detailed and reduced chemical reaction schemes. Results from the complex chemistry simulation are discussed with respect to earlier experiments, and differences between the simulations using detailed and reduced chemistry are investigated. Transient evolutions of the flame surface and the total heat release rate are compared, and modifications in the evolution of the local flame structure are displayed. (authors) 22 refs.

  17. Detailed field test of yaw-based wake steering

    Fleming, P.; Churchfield, M.; Scholbrock, A.


This paper describes a detailed field-test campaign to investigate yaw-based wake steering. In yaw-based wake steering, an upstream turbine intentionally misaligns its yaw with respect to the inflow to deflect its wake away from a downstream turbine, with the goal of increasing total power production. In the first phase, a nacelle-mounted scanning lidar was used to verify wake deflection of a misaligned turbine and calibrate wake deflection models. In the second phase, these models were used within a yaw controller to achieve a desired wake deflection. This paper details the experimental...

  18. Tile-based Level of Detail for the Parallel Age

    Niski, K; Cohen, J D


Today's PCs incorporate multiple CPUs and GPUs and are easily arranged in clusters for high-performance, interactive graphics. We present an approach to parallelizing rendering with level of detail, based on hierarchical, screen-space tiles. Adapt tiles, render tiles, and machine tiles are associated with CPUs, GPUs, and PCs, respectively, to efficiently parallelize the workload with good resource utilization. Adaptive tile sizes provide load balancing, while our level of detail system allows total and independent management of the load on CPUs and GPUs. We demonstrate our approach on parallel configurations consisting of both single PCs and a cluster of PCs.

  19. Benefits of detailed models of muscle activation and mechanics

    Lehman, S. L.; Stark, L.


    Recent biophysical and physiological studies identified some of the detailed mechanisms involved in excitation-contraction coupling, muscle contraction, and deactivation. Mathematical models incorporating these mechanisms allow independent estimates of key parameters, direct interplay between basic muscle research and the study of motor control, and realistic model behaviors, some of which are not accessible to previous, simpler, models. The existence of previously unmodeled behaviors has important implications for strategies of motor control and identification of neural signals. New developments in the analysis of differential equations make the more detailed models feasible for simulation in realistic experimental situations.

  20. Comparison of emerging diagnostic tools for large commercial HVAC systems

    Friedman, Hannah; Piette, Mary Ann


    Diagnostic software tools for large commercial buildings are being developed to help detect and diagnose energy and other performance problems with building operations. These software applications utilize energy management control system (EMCS) trend log data. Due to the recent development of diagnostic tools, there has been little detailed comparison among the tools and a limited awareness of tool capabilities by potential users. Today, these diagnostic tools focus mainly on air handlers, but the opportunity exists for broadening the scope of the tools to include all major parts of heating, cooling, and ventilation systems in more detail. This paper compares several tools in the following areas: (1) Scope, intent, and background; (2) Data acquisition, pre-processing, and management; (3) Problems detected; (4) Raw data visualization; (5) Manual and automated diagnostic methods and (6) Level of automation. This comparison is intended to provide practitioners and researchers with a picture of the current state of diagnostic tools. There is tremendous potential for these tools to help improve commercial building energy and non-energy performance.

  1. Surface analysis of stone and bone tools

    Stemp, W. James; Watson, Adam S.; Evans, Adrian A.


    Microwear (use-wear) analysis is a powerful method for identifying tool use that archaeologists and anthropologists employ to determine the activities undertaken by both humans and their hominin ancestors. Knowledge of tool use allows for more accurate and detailed reconstructions of past behavior, particularly in relation to subsistence practices, economic activities, conflict and ritual. It can also be used to document changes in these activities over time, in different locations, and by different members of society, in terms of gender and status, for example. Both stone and bone tools have been analyzed using a variety of techniques that focus on the observation, documentation and interpretation of wear traces. Traditionally, microwear analysis relied on the qualitative assessment of wear features using microscopes and often included comparisons between replicated tools used experimentally and the recovered artifacts, as well as functional analogies dependent upon modern implements and those used by indigenous peoples from various places around the world. Determination of tool use has also relied on the recovery and analysis of both organic and inorganic residues of past worked materials that survived in and on artifact surfaces. To determine tool use and better understand the mechanics of wear formation, particularly on stone and bone, archaeologists and anthropologists have increasingly turned to surface metrology and tribology to assist them in their research. This paper provides a history of the development of traditional microwear analysis in archaeology and anthropology and also explores the introduction and adoption of more modern methods and technologies for documenting and identifying wear on stone and bone tools, specifically those developed for the engineering sciences to study surface structures on micro- and nanoscales. 
The current state of microwear analysis is discussed as are the future directions in the study of microwear on stone and bone tools.

  2. Reliability assessment of welded steel details in bridges using inspection

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, T.


...of the membrane stresses are estimated using a generic bridge structure and traffic measurements. The optimal reliability level for a welded detail in a bridge subjected to fatigue is estimated by cost-benefit analysis, taking into account the risk to human lives through the Life Quality Index. Since the optimal...

  3. The Hybrid Motor Prototype: Design Details and Demonstration Results


S. Ueha and Y. Tomikawa [3] have published some interesting details of the performance and life of ultrasonic motors with different frictional ... published as a technical report of the Institute for Systems Research, University of Maryland at College Park. [3] S. Ueha and Y. Tomikawa, Ultrasonic Motors: Theory and Applications. Clarendon Press, Oxford, 1993.

  4. Video-based facial animation with detailed appearance texture


Facial shape transformation described by facial animation parameters (FAPs) involves the dynamic movement or deformation of eyes, brows, mouth, and lips, while detailed facial appearance concerns facial textures such as creases, wrinkles, etc. Video-based facial animation exhibits not only facial shape transformation but also detailed appearance updates. In this paper, a novel algorithm for effectively extracting FAPs from video is proposed. Our system adopts the ICA-enforced direct appearance model (DAM) to track faces from video sequences; then, FAPs are extracted from every frame of the video based on an extended model of Wincandidate 3.1. Facial appearance details are transformed from each frame by mapping an expression ratio image to the original image. We adopt wavelets to synthesize expressive details by combining the low-frequency signals of the original face and the high-frequency signals of the expressive face from each frame of the video. Experimental results show that our proposed algorithm is suitable for reproducing realistic, expressive facial animations.

  5. Syllabus Detail and Students' Perceptions of Teacher Effectiveness

    Saville, Bryan K.; Zinn, Tracy E.; Brown, Allison R.; Marchuk, Kimberly A.


    Although syllabi provide students with important course information, they can also affect perceptions of teaching effectiveness. To test this idea, we distributed 2 versions of a hypothetical course syllabus, a brief version and a detailed version, and asked students to rate the teacher of the course on qualities associated with master teaching.…

  6. MIV project: Simulator detailed design and integration for the EUROSIM

    Thuesen, Gøsta; Parisch, Manlio; Jørgensen, John Leif;


Under the ESA contract #11453/95/NL/JG(SC), aimed at assessing the feasibility of rendez-vous and docking of unmanned spacecraft, a reference mission scenario was defined. This report describes the detailed code developed for the contract, the code module interface and the interface to the EURO...

  7. Baca geothermal demonstration project. Power plant detail design document


    This Baca Geothermal Demonstration Power Plant document presents the design criteria and detail design for power plant equipment and systems, as well as discussing the rationale used to arrive at the design. Where applicable, results of in-house evaluations of alternatives are presented.

  8. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry


    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistributio...

  9. Detailed bathymetric surveys in the central Indian Basin

    Kodagali, V.N.; KameshRaju, K.A.; Ramprasad, T.; George, P.; Jaisankar, S.

    Over 420,000 line kilometers of echo-sounding data was collected in the Central Indian Basin. This data was digitized, merged with navigation data and a detailed bathymetric map of the Basin was prepared. The Basin can be broadly classified...

  10. Flavor Asymmetry of Nucleon Sea from Detailed Balance

    Zhang, Y J; Yang, L M; Zhang, Yong-Jun; Ma, Bo-Qiang; Yang, Li-Ming


In this study, the proton is taken as an ensemble of quark-gluon Fock states. Using the principle of detailed balance, we find $\bar{d}-\bar{u} \approx 0.124$, which is in surprising agreement with the experimental observation.

  11. Pharmaceutical crystallography: is there a devil in the details?

    Bond, A. D.


Modern instruments for small-molecule crystallography continue to become more sophisticated and more automated. This technical progress provides a basis for frontier research in chemical and pharmaceutical crystallography, but it also encourages analytical crystallographers to become more... ...are presented for pharmaceutical compounds, and the potential importance of the "details" in pharmaceutical crystallography is discussed...

  12. 5 CFR 2635.104 - Applicability to employees on detail.


5 CFR 2635.104 — Applicability to employees on detail. Administrative Personnel; Office of Government Ethics; Standards of Ethical Conduct for Employees of the Executive Branch; General Provisions § ...

  13. Spectangular - Spectral Disentangling For Detailed Chemical Analysis Of Binaries

    Sablowski, Daniel


Disentangling of spectra helps to improve the orbit parameters and allows detailed chemical analysis. Spectangular is a GUI program written in C++ for spectral disentangling of spectra of SB1 and SB2 systems. It is based on singular value decomposition in the wavelength space and is coupled to an orbital solution. The results are the component spectra and the orbital parameters.

  14. Details of the battle to control Campeche Bay spill


Details of the battle to control the Campeche Bay spill from Petroleos Mexicanos' well at Ixtoc 1 are given, including the poor performance of "Operation Sombrero" and air and surface monitoring of spill transport, particularly by the US Coast Guard.

  15. 18 CFR 2.80 - Detailed environmental statement.


18 CFR 2.80 — Detailed environmental statement. Conservation of Power and Water Resources; Federal Energy Regulatory Commission, Department of Energy; General Rules; General Policy and Interpretations; Statement of General Policy...

  16. Educational Outreach to Opioid Prescribers: The Case for Academic Detailing.

    Trotter Davis, Margot; Bateman, Brian; Avorn, Jerry


Nonmedical use of opioid medications constitutes a serious health threat, as rates of addiction, overdoses, and deaths have risen in recent years. Increasingly, inappropriate and excessively liberal prescribing of opioids by physicians is understood to be a central part of the crisis. Public health officials, hospital systems, and legislators are developing programs and regulations to address the problem in sustained and systematic ways that both ensure effective treatment of pain and impose appropriate limits on the availability of opioids. Three approaches have gained prominence as means of avoiding excessive and inappropriate prescribing: providing financial incentives to physicians to change their clinical decisions through pay-for-performance contracts, monitoring patient medications through Prescription Drug Monitoring Programs, and educational outreach to physicians. A promising approach to educational outreach to physicians is an intervention known as "academic detailing." It was developed in the 1980s to provide one-on-one educational outreach to physicians using methods similar to those of the pharmaceutical industry, which sends "detailers" to market its products to physician practices. Core to academic detailing, however, is the idea that medical decisions should be based on evidence-based information, including managing conditions with updated assessment measures and behavioral and nonpharmacological interventions. With the pharmaceutical industry spending billions of dollars to advertise its products, individual practitioners can have difficulty gathering unbiased information, especially as the number of approved medications grows each year. Academic detailing has successfully affected the management of health conditions such as atrial fibrillation and chronic obstructive pulmonary disease, and it has recently targeted physicians who prescribe opioids.
This article discusses the approach as a potentially effective preventative intervention to address the

  17. Tool Gear Documentation

    May, J; Gyllenhaal, J


Tool Gear is designed to allow tool developers to insert instrumentation code into target programs using the DPCL library. This code can gather data and send it back to the Client for display or analysis. Tools can use the Tool Gear client without using the DPCL Collector. Any collector using the right protocols can send data to the Client for display and analysis. However, this document will focus on how to gather data with the DPCL Collector. There are three parts to the task of using Tool Gear to gather data through DPCL: (1) Write the instrumentation code that will be loaded and run in the target program. The code should be in the form of one or more functions, which can pass data structures back to the Client by way of DPCL. The collection of functions is compiled into a library, as described in this report. (2) Write the code that tells the DPCL Collector about the instrumentation and how to forward data back to the Client. (3) Extend the client to accept data from the Collector and display it in a useful way. The rest of this report describes how to carry out each of these steps.


    Vladimíra Schindlerová


Full Text Available Cold bulk forming is a technology commonly used in many industrial enterprises. Even though high demands are nowadays placed on labour productivity, quality and production costs, findings from practice suggest that insufficient attention is paid to the issue of tool management. The theoretical background and knowledge in this area are also not treated in a sufficiently detailed and comprehensive way. This paper deals with the prediction of wear on the surfaces of forming tools and their subsequent renewal. The research on selected materials focused on the course of straining at the contact between the blank and the tool in the process of cold bulk forming. The experiments were based on a simple performance of the conventional upsetting test. On the basis of an analysis of the results, the mechanism of tool wear by abrasion was determined, and its impact on the service life of the tool, as well as the possibility of influencing the quality of the final parts, was evaluated.

  19. Climate Change and Water Tools

EPA tools and workbooks guide users in mitigating and adapting to climate change impacts. Some tools help manage risks; others visualize climate projections on maps. Also included are comprehensive toolkits hosted by other federal agencies.

  20. C-TOOL

    Taghizadeh-Toosi, Arezoo; Christensen, Bent Tolstrup; Hutchings, Nicholas John


Soil organic carbon (SOC) is a significant component of the global carbon (C) cycle. Changes in SOC storage affect atmospheric CO2 concentrations on decadal to centennial timescales. The C-TOOL model was developed to simulate farm- and regional-scale effects of management on medium- to long-term SOC storage in the profile of well-drained agricultural mineral soils. C-TOOL uses three SOC pools for both the topsoil (0–25 cm) and the subsoil (25–100 cm), and applies temperature-dependent first order kinetics to regulate C turnover. C-TOOL also enables the simulation of 14C turnover. The simple model structure facilitates calibration and requires few inputs (mean monthly air temperature, soil clay content, soil C/N ratio and C in organic inputs). The model was parameterised using data from 19 treatments drawn from seven long-term field experiments in the United Kingdom, Sweden and Denmark...
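Temperature-dependent first-order kinetics of the kind described above can be sketched in a few lines. The snippet below is an illustrative sketch only, not the published C-TOOL parameterization: the Q10-style temperature response, the rate constant, and the function name are assumptions for the example.

```python
import math

def decay_step(carbon, k_ref, temp_c, dt_years=1.0 / 12.0):
    """Advance one C pool by one monthly step of first-order decay.

    carbon : current pool size (e.g. t C/ha)
    k_ref  : decomposition rate constant at 10 degC, per year (illustrative)
    temp_c : mean monthly air temperature in degC
    A simple Q10 temperature response is assumed here; the actual C-TOOL
    rate modifier differs.
    """
    q10 = 2.0                                    # assumed Q10 value
    k = k_ref * q10 ** ((temp_c - 10.0) / 10.0)  # rate at this temperature
    return carbon * math.exp(-k * dt_years)      # exact first-order solution

# Usage: one year of monthly steps for a topsoil pool of 40 t C/ha,
# with a made-up monthly temperature series.
pool = 40.0
for t in [1, 2, 5, 9, 13, 16, 18, 17, 14, 9, 5, 2]:
    pool = decay_step(pool, k_ref=0.05, temp_c=t)
print(pool)
```

Decomposition speeds up in warm months and slows in cold ones, so annual C loss depends on the monthly temperature profile, not just the annual mean, which is one reason the model takes mean monthly air temperature as an input.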

  1. Cataract Surgery Tool


    The NASA-McGannon cataract surgery tool is a tiny cutter-pump which liquefies and pumps the cataract lens material from the eye. Inserted through a small incision in the cornea, the tool can be used on the hardest cataract lens. The cutter is driven by a turbine which operates at about 200,000 revolutions per minute. Incorporated in the mechanism are two passages for saline solutions, one to maintain constant pressure within the eye, the other for removal of the fragmented lens material and fluids. Three years of effort have produced a design, now being clinically evaluated, with excellent potential for improved cataract surgery. The use of this tool is expected to reduce the patient's hospital stay and recovery period significantly.

  2. New Conceptual Design Tools

    Pugnale, Alberto; Holst, Malene; Kirkegaard, Poul Henning


    This paper discusses recent approaches to the increasingly frequent use of computer tools as supports for the conceptual design phase of the architectural project. The present state of the art in software as a conceptual design tool can be summarized as two parallel tendencies. On the one hand, the main software houses are trying to introduce powerful, effective and user-friendly applications to the world of building designers, increasingly able to fit their specific requirements; on the other hand, some groups of expert users with basic programming knowledge deal with the problem of software as a conceptual design tool by means of 'scripting', in other words by self-developing code able to solve specific and well-defined design problems. Starting with a brief historical recall and a discussion of relevant research and practical experience, this paper investigates...

  3. Lessons Learned from Creating a Course Advising Tool

    Mattei, Nicholas; Guerin, Joshua T; Goldsmith, Judy; Mazur, Joan M


    We detail some lessons learned while designing and testing a course selection tool for undergraduates at a large state university. Between 2009 and 2011 we conducted two surveys of over 500 students in multiple majors and colleges. These surveys asked students detailed questions about their preferences concerning course selection, advising, and career paths. We present data from this study which may be helpful for faculty and staff who advise undergraduate students. We find that advising software tools can help both students and human advisors with rote requirement checking and basic course planning, but nothing can replace an in-person advising session.

  4. An Investigation of Placement and Type of Seductive Details: The Primacy Effect of Seductive Details on Text Recall

    Rowland, Emily; Skinner, Christopher H.; Davis-Richards, Kai; Saudargas, Richard; Robinson, Daniel H.


    Seductive details are interesting but sometimes irrelevant additions to the target material presented in texts and lectures. In the current study, 388 undergraduate students read six paragraphs describing Sigmund Freud's psychosexual stages (i.e., the target material). Participants in four groups also read one of two biographical paragraphs. The biographical…

  5. A Chair Named Sacco


    The Zanotta beanbag, i.e. the chair "Sacco", designed in 1968 by P. Gatti, C. Paolini and F. Teodoro, has turned thirty. "Sacco" is a bag filled with polystyrene granules. Zanotta's inflatable chair "Blow" (1967, Scholari, D'Urbino, Lomazzi, De Pas) also attracted attention. E. Lucie-Smith on them. An exhibition at the Düsseldorf Kunstmuseum, "Legends and Symbols of 1968", is dedicated to the year 1968, displaying nearly 500 objects and several reconstructed interiors.

  6. Service Provider DevOps network capabilities and tools

    Steinert, Rebecca; John, Wolfgang; Sköldström, Pontus; Pechenot, Bertrand; Gulyás, András; Pelle, István; Lévai, Tamás; Németh, Felicián; Kim, Juhoon; Meirosu, Catalin; Cai, Xuejun; Fu, Chunyan; Pentikousis, Kostas; Sharma, Sachin; Papafili, Ioanna


    This report provides an understanding of how the UNIFY Service Provider (SP)-DevOps concept can be applied and integrated with a combined cloud and transport network NFV architecture. Specifically, the report contains technical descriptions of a set of novel SP-DevOps tools and support functions that facilitate observability, troubleshooting, verification, and VNF development processes. The tools and support functions are described in detail together with their architectural mapping, giving a...

  7. Exploring Architectural Details Through a Wearable Egocentric Vision Device

    Stefano Alletto


    Augmented user experiences in the cultural heritage domain are in increasing demand from the new digital-native tourists of the 21st century. In this paper, we propose a novel solution that aims at assisting the visitor during an outdoor tour of a cultural site using the unique first-person perspective of wearable cameras. In particular, the approach exploits computer vision techniques to retrieve details of the site, proposing a robust descriptor based on the covariance of local features. Using a lightweight wearable board, the solution can localize the user with respect to the 3D point cloud of the historical landmark and provide information about the details at which he or she is currently looking. Experimental results validate the method both in terms of accuracy and computational effort. Furthermore, user evaluation based on real-world experiments shows that the proposal is deemed effective in enriching a cultural experience.

  8. Memory for contextual details: effects of emotion and aging.

    Kensinger, Elizabeth A; Piguet, Olivier; Krendl, Anne C; Corkin, Suzanne


    When individuals are confronted with a complex visual scene that includes some emotional element, memory for the emotional component often is enhanced, whereas memory for peripheral (nonemotional) details is reduced. The present study examined the effects of age and encoding instructions on this effect. With incidental encoding instructions, young and older adults showed this pattern of results, indicating that both groups focused attention on the emotional aspects of the scene. With intentional encoding instructions, young adults no longer showed the effect: They were just as likely to remember peripheral details of negative images as of neutral images. The older adults, in contrast, did not overcome the attentional bias: They continued to show reduced memory for the peripheral elements of the emotional compared with the neutral scenes, even with the intentional encoding instructions.

  9. Samnett: the EMPS model with power flow constraints: implementation details

    Helseth, Arild; Warland, Geir; Mo, Birger; Fosso, Olav B.


    This report describes the development and implementation of Samnett. Samnett is a new prototype for solving the coupled market and transmission network problem. The prototype is based on the EMPS model (Samkjoeringsmodellen). Results from the market model are distributed to a detailed transmission network model, where a DC power flow detects whether there are overloads on monitored lines or interconnections. In case of overloads, power flow constraints are generated and added to the market problem. This report is an updated version of TR A6891, "Implementing Network Constraints in the EMPS model". It further elaborates on theoretical and implementation details in Samnett, but does not contain the case studies and file descriptions presented in TR A6891.
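    The overload check that a DC power flow performs can be sketched on a toy network. The three buses, susceptances, injections and limits below are invented for the illustration; the actual Samnett network model is far more detailed.

```python
# Illustrative DC power flow overload check on a made-up 3-bus network.
import numpy as np

# lines: (from_bus, to_bus, susceptance, flow limit in MW) -- all hypothetical
lines = [(0, 1, 10.0, 60.0), (1, 2, 10.0, 60.0), (0, 2, 10.0, 50.0)]
injections = np.array([100.0, -30.0, -70.0])   # MW per bus, sums to zero

# Build the bus susceptance matrix B.
n = 3
B = np.zeros((n, n))
for i, j, b, _ in lines:
    B[i, i] += b; B[j, j] += b
    B[i, j] -= b; B[j, i] -= b

# Fix bus 0 as the slack (angle 0) and solve the reduced linear system.
theta = np.zeros(n)
theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])

# Line flows follow from angle differences; flag violated limits.
for i, j, b, limit in lines:
    flow = b * (theta[i] - theta[j])
    if abs(flow) > limit:
        print(f"line {i}-{j} overloaded: {flow:.1f} MW > {limit:.0f} MW")
```

    In the coupled scheme described above, each flagged line would generate a constraint that is fed back into the market problem.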

  10. An alternative measure of solar activity from detailed sunspot datasets

    Muraközy, Judit; Ludmány, András


    The sunspot number is analyzed by using detailed sunspot data, including aspects of observability, sunspot sizes, and proper identification of sunspot groups as discrete entities of solar activity. The tests show that besides the subjective factors there are also objective causes of the ambiguities in the series of sunspot numbers. To introduce an alternative activity measure, the physical meaning of the sunspot number has to be reconsidered. It contains two components whose numbers are governed by different physical mechanisms; this is one source of the ambiguity. This article suggests an activity index that measures the amount of emerged magnetic flux. The only long-term proxy measure is the detailed sunspot area dataset with proper calibration to the magnetic flux amount. The Debrecen sunspot databases provide an appropriate source for the establishment of the suggested activity index.

  11. Detailed Performance of the Outer Tracker at LHCb

    Tuning, N


    The LHCb Outer Tracker is a gaseous detector covering an area of 5×6 m² with 12 double layers of straw tubes. Based on data of the first LHC running period from 2010 to 2012, the performance in terms of single-hit resolution and efficiency is presented. Details on the ionization length and subtle effects regarding signal reflections and the subsequent time-walk correction are given. The efficiency to detect a hit in the central half of the straw is estimated to be 99.2%, and the position resolution is determined to be approximately 200 μm, depending on the detailed implementation of the internal alignment of individual detector modules. The Outer Tracker received a dose in the hottest region corresponding to 0.12 C/cm, and no signs of gain deterioration or other ageing effects are observed.

  12. Detailed field test of yaw-based wake steering

    Fleming, P.; Churchfield, M.; Scholbrock, A.; Clifton, A.; Schreck, S.; Johnson, K.; Wright, A.; Gebraad, P.; Annoni, J.; Naughton, B.; Berg, J.; Herges, T.; White, J.; Mikkelsen, T.; Sjöholm, M.; Angelou, N.


    This paper describes a detailed field-test campaign to investigate yaw-based wake steering. In yaw-based wake steering, an upstream turbine intentionally misaligns its yaw with respect to the inflow to deflect its wake away from a downstream turbine, with the goal of increasing total power production. In the first phase, a nacelle-mounted scanning lidar was used to verify wake deflection of a misaligned turbine and calibrate wake deflection models. In the second phase, these models were used within a yaw controller to achieve a desired wake deflection. This paper details the experimental design and setup. All data collected as part of this field experiment will be archived and made available to the public via the U.S. Department of Energy's Atmosphere to Electrons Data Archive and Portal.

  13. Detailed assessment of homology detection using different substitution matrices

    LI Jing; WANG Wei


    Homology detection plays a key role in bioinformatics, and the substitution matrix is one of the most important components in homology detection. Thus, besides improving alignment algorithms, another effective way to enhance the accuracy of homology detection is to use proper substitution matrices or even construct new matrices. A study of the features of various matrices and a comparison of the performance of different matrices in homology detection enable us to choose the most suitable or optimal matrix for specific applications. In this paper, taking the BLOSUM matrices as an example, some detailed features of matrices in homology detection are studied by calculating the distributions of the numbers of recognized proteins over different sequence identities and sequence lengths. Our results clearly show that different matrices have different preferences and abilities for recognizing remote homologous proteins. Furthermore, the detailed features of the various matrices can be used to improve the accuracy of homology detection.
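    The basic role a substitution matrix plays in scoring can be sketched in a few lines. The two-residue matrix below is invented for the example; real homology detection uses full 20×20 BLOSUM or PAM tables and gapped alignment.

```python
# How a substitution matrix scores an ungapped alignment (toy example).
# TOY_MATRIX is made up for illustration; it is not a real BLOSUM table.
TOY_MATRIX = {
    ("A", "A"): 4, ("G", "G"): 6,
    ("A", "G"): 0, ("G", "A"): 0,
}

def alignment_score(seq1, seq2, matrix):
    """Sum the per-position substitution scores of two equal-length sequences."""
    assert len(seq1) == len(seq2)
    return sum(matrix[(a, b)] for a, b in zip(seq1, seq2))

print(alignment_score("AAGG", "AGGG", TOY_MATRIX))   # prints 16
```

    Swapping in a different matrix changes which sequence pairs score above a detection threshold, which is exactly the matrix dependence the paper studies.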


    Fan, Jianhua; Andersen, Elsa; Furbo, Simon


    The charging behaviour of smart solar tanks for solar combisystems for one-family houses is investigated with detailed Computational Fluid Dynamics (CFD) modelling and Particle Image Velocimetry (PIV) measurements. The smart solar tank can be charged with a variable auxiliary volume fitted to the expected future energy demand. Therefore the heat loss from the tank is decreased and the thermal performance of the solar heating system is increased compared to a traditional system with a fixed auxiliary volume. The solar tank can be charged either by an electric heating element situated in the tank or by an electric heating element in a side-arm mounted on the side of the tank. Detailed CFD models of the smart tanks are built with different mesh densities in the tank and in the side-arm. The thermal conditions of the tank during charging are calculated with the CFD models. The fluid flow and temperature...



  16. Detailed thermal-hydraulic computation into a containment building

    Caruso, A.; Flour, I.; Simonin, O. [EDF/LNH, Chatou (France); Cherbonnel, C. [EDF/SEPTEN, Villeurbanne (France)


    This paper deals with numerical predictions of the influence of water sprays on stratification in a containment building, using a two-dimensional two-phase flow code. Basic equations and closure assumptions are briefly presented. A test case involving spray evaporation is first detailed to illustrate the validation step. Then results are presented for a compressible recirculating flow in a containment building with condensation phenomena.

  17. Technology of Strengthening Steel Details by Surfacing Composite Coatings

    Burov, V. G.; Bataev, A. A.; Rakhimyanov, Kh M.; Mul, D. O.


    The article considers the problem of forming wear-resistant metal-ceramic coatings on steel surfaces, drawing on the results of our own investigations and an analysis of achievements made at home and abroad. Increased wear resistance of the surface layers of steel details is achieved by surfacing composite coatings with metal carbides or borides as the disperse particles of the strengthening phase. The use of surfacing on wearing machine details and mechanisms has a history of more than 100 years, yet engineering investigations in this field continue to this day. Heating sources that provide a high power density make it possible to ensure temperature and time conditions of surfacing under which composites with particular service and functional properties are formed. High concentration of energy in the melt zone, created from powder mixtures and the hardened surface layer, allows producing a transition zone between the base material and the surfaced coating. Surfacing by an electron beam directed from vacuum into the atmosphere offers considerable technological advantages: it makes it possible to strengthen the surface layers of large-sized details by surfacing powder mixtures without their preliminary compacting. A modified layer of the base metal with ceramic particles distributed in it is created as a result of heating the surfaced powders and the detail's surface layer by the electron beam. The technology allows using powders of refractory metals and graphite in the powder mixtures; these interact with one another and form the particles of the hardening phase of the composite coating. The chemical composition of the base and surfaced materials is the main factor determining the character of metallurgical processes in local melt zones as well as the structure and properties of the surfaced composite.

  18. New trends in Internet attacks: Clickjacking in detail

    Thoresen, Torgeir Dahlqvist


    While the complexity of web applications and their functionality continually increases, so do the opportunities for an attacker to launch successful attacks against a web application's users. In this thesis we investigate and describe clickjacking in great detail. To our knowledge, this work represents the first systematic scientific approach to assessing clickjacking that also considers the attack's social consequences for users' security, through an experiment and survey. We address the...

  19. Properties of quantum Markovian master equations. [Semigroup law, detailed balance

    Gorini, V.; Frigerio, A.; Verri, M.; Kossakowski, A.; Sudarshan, E.C.G.


    An essentially self-contained account is given of some general structural properties of the dynamics of quantum open Markovian systems. Some recent results regarding the classification of quantum Markovian master equations and the limiting conditions under which the dynamical evolution of a quantum open system obeys an exact semigroup law (the weak coupling limit and the singular coupling limit) are reviewed. A general form of quantum detailed balance and its relation to thermal relaxation and to microreversibility is discussed.
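    As background, the detailed balance notion discussed above generalizes a classical condition. The following is one standard formulation among several conventions, stated here as orientation rather than as the paper's exact definition:

```latex
% Classical detailed balance for a stationary distribution p and rates w:
p_i \, w_{i \to j} \;=\; p_j \, w_{j \to i} \qquad \forall\, i, j .

% A common quantum analogue: the dissipative generator \mathcal{L} is
% self-adjoint with respect to the scalar product weighted by the
% stationary state \rho:
\langle A, \mathcal{L}(B) \rangle_\rho \;=\; \langle \mathcal{L}(A), B \rangle_\rho ,
\qquad
\langle A, B \rangle_\rho \;\equiv\; \operatorname{Tr}\!\left( \rho\, A^\dagger B \right).
```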

  20. Detailed models for timing and efficiency in resistive plate chambers

    Riegler, Werner


    We discuss detailed models for detector physics processes in Resistive Plate Chambers, in particular including the effect of attachment on the avalanche statistics. In addition, we present analytic formulas for average charges and intrinsic RPC time resolution. Using a Monte Carlo simulation including all the steps from primary ionization to the front-end electronics we discuss the dependence of efficiency and time resolution on parameters like primary ionization, avalanche statistics and threshold.

  1. Detailed reduction of reaction mechanisms for flame modeling

    Wang, Hai; Frenklach, Michael


    A method for the reduction of detailed chemical reaction mechanisms, introduced earlier for ignition systems, was extended to laminar premixed flames. The reduction is based on testing the reaction and reaction-enthalpy rates of the 'full' reaction mechanism using a zero-dimensional model with the flame temperature profile as a constraint. The technique is demonstrated with numerical tests performed on the mechanism of methane combustion.
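    The rate-based screening idea behind such reductions can be sketched simply: in a zero-dimensional model with the flame temperature imposed, drop reactions whose peak rate stays below a fraction of the dominant one. The reaction names and rate values below are made up for illustration and are not a real methane mechanism.

```python
# Sketch of rate-threshold mechanism reduction (hypothetical numbers).
peak_rates = {                     # mol/(cm^3 s), invented per-reaction maxima
    "CH4+OH=CH3+H2O": 3.0e-4,
    "CH3+O2=CH2O+OH": 8.0e-5,
    "C2H6+H=C2H5+H2": 4.0e-7,
}

def reduce_mechanism(rates, threshold=1e-2):
    """Keep reactions whose peak rate exceeds threshold * the maximum rate."""
    r_max = max(rates.values())
    return [name for name, r in rates.items() if r >= threshold * r_max]

print(reduce_mechanism(peak_rates))
```

    The published method additionally tests reaction-enthalpy rates against the imposed temperature profile; the thresholding structure is the same.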

  2. Shading-based Surface Detail Recovery under General Unknown Illumination.

    Xu, Di; Duan, Qi; Zheng, Jianmin; Zhang, Juyong; Cai, Jianfei; Cham, Tat-Jen


    Reconstructing the shape of a 3D object from multi-view images under unknown, general illumination is a fundamental problem in computer vision, and high-quality reconstruction is usually challenging, especially when fine detail is needed and the albedo of the object is non-uniform. This paper introduces vertex overall illumination vectors to model the illumination effect and presents a total variation (TV) based approach for recovering surface details using shading and multi-view stereo (MVS). Behind the approach are two important observations: (1) the illumination over the surface of an object often appears to be piecewise smooth, and (2) the recovery of surface orientation is not sufficient for reconstructing the surface, which was often overlooked previously. Thus we propose to use TV to regularize the overall illumination vectors and to use the visual hull to constrain partial vertices. The reconstruction is formulated as a constrained TV-minimization problem that simultaneously treats the shape and illumination vectors as unknowns. An augmented Lagrangian method is proposed to quickly solve the TV-minimization problem. As a result, our approach is robust and stable and is able to efficiently recover high-quality surface details even when starting with a coarse model obtained using MVS. These advantages are demonstrated by extensive experiments on a state-of-the-art MVS database, which includes challenging objects with varying albedo.
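    Schematically, a constrained TV-minimization of this kind can be written as follows; the notation is assumed for illustration and is not the paper's exact formulation. Here L_i is the overall illumination vector at vertex i, n_i(V) the normal induced by the vertex positions V, ρ_i the albedo, I_i the observed intensity, and H the visual hull:

```latex
\min_{V,\;L}\;
\underbrace{\sum_{(i,j) \in \mathcal{E}} \bigl\| L_i - L_j \bigr\|_2}_{\text{TV of the illumination vectors}}
\;+\;
\lambda \sum_i \Bigl( I_i - \rho_i \,\langle L_i,\, n_i(V) \rangle \Bigr)^2
\qquad
\text{s.t.}\quad V_k \in \mathcal{H} \ \ \text{for the constrained vertices } k .
```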

  3. Digital Tectonic Tools

    Schmidt, Anne Marie Due


    in particular. A model of the aspects of the term tectonics – representation, ontology and culture – will be presented and used to discuss the ability of current digital tools in tectonics. Furthermore, it will be discussed what a digital tectonic tool is and could be, and how a connection between the digital...

  4. Sight Application Analysis Tool

    Bronevetsky, G. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)


    The scale and complexity of scientific applications makes it very difficult to optimize, debug and extend them to support new capabilities. We have developed a tool that supports developers’ efforts to understand the logical flow of their applications and interactions between application components and hardware in a way that scales with application complexity and parallelism.

  5. Tools for Authentication

    White, G


    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  6. Incident Information Management Tool

    Pejovic, Vladimir


    Flaws of current incident information management at CMS and CERN are discussed. A new data model for a future incident database is proposed and briefly described. A recently developed draft version of a GIS-based tool for incident tracking is presented.

  7. Photutils: Photometry tools

    Bradley, Larry; Sipocz, Brigitta; Robitaille, Thomas; Tollerud, Erik; Deil, Christoph; Vinícius, Zè; Barbary, Kyle; Günther, Hans Moritz; Bostroem, Azalee; Droettboom, Michael; Bray, Erik; Bratholm, Lars Andersen; Pickering, T. E.; Craig, Matt; Pascual, Sergio; Greco, Johnny; Donath, Axel; Kerzendorf, Wolfgang; Littlefair, Stuart; Barentsen, Geert; D'Eugenio, Francesco; Weaver, Benjamin Alan


    Photutils provides tools for detecting and performing photometry of astronomical sources. It can estimate the background and background rms in astronomical images, detect sources in astronomical images, estimate morphological parameters of those sources (e.g., centroid and shape parameters), and perform aperture and PSF photometry. Written in Python, it is an affiliated package of Astropy (ascl:1304.002).
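    The core idea of aperture photometry mentioned above can be illustrated without the library: sum the background-subtracted pixels inside a circular aperture. This NumPy-only sketch uses whole-pixel membership; the actual photutils API handles partial-pixel overlap and much more, so treat this purely as a conceptual example.

```python
# Conceptual aperture photometry sketch (NumPy only, not the photutils API).
import numpy as np

def aperture_sum(image, x0, y0, radius, background=0.0):
    """Background-subtracted flux inside a circular aperture
    (whole-pixel membership only)."""
    yy, xx = np.indices(image.shape)
    mask = (xx - x0) ** 2 + (yy - y0) ** 2 <= radius ** 2
    return float(((image - background) * mask).sum())

img = np.full((11, 11), 5.0)   # flat background of 5 counts
img[5, 5] += 100.0             # a synthetic point source at the centre
print(aperture_sum(img, 5, 5, 3, background=5.0))   # prints 100.0
```

    Estimating the background level itself (here assumed known) is the other half of the task that the package automates.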

  8. Change Detection Tools

    Dekker, R.J.; Kuenzer, C.; Lehner, M.; Reinartz, P.; Niemeyer, I.; Nussbaum, S.; Lacroix, V.; Sequeira, V.; Stringa, E.; Schöpfer, E.


    In this chapter a wide range of change detection tools is addressed. They are grouped into methods suitable for optical and multispectral data, synthetic aperture radar (SAR) images, and 3D data. Optical and multispectral methods include unsupervised approaches, supervised and knowledge-based approa
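    One of the simplest unsupervised approaches for co-registered optical images is to threshold the difference image; the sketch below flags pixels whose difference deviates more than k standard deviations from the mean. The data are synthetic and the method is a generic baseline, not a specific tool from the chapter.

```python
# Minimal unsupervised change detection: threshold the difference image.
import numpy as np

def change_mask(img_t1, img_t2, k=2.0):
    """Flag pixels whose difference deviates more than k sigma from the mean."""
    diff = img_t2.astype(float) - img_t1.astype(float)
    return np.abs(diff - diff.mean()) > k * diff.std()

rng = np.random.default_rng(0)
before = rng.normal(100.0, 2.0, size=(64, 64))       # synthetic scene
after = before + rng.normal(0.0, 1.0, size=(64, 64)) # same scene plus noise
after[10:20, 10:20] += 25.0                          # one genuinely changed block

mask = change_mask(before, after)
print(mask[12, 12], mask[40, 40])
```

    SAR data need a different statistic (e.g. log-ratio) because of multiplicative speckle, which is one reason the chapter groups methods by data type.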

  9. Field Information Support Tool


    assessment and analysis tool developed by CASOS at Carnegie Mellon. According to the developer's Web site (Carley, 2010): It [ORA] contains hundreds of... clinics, and the availability of pharmacies for medical supplies, and were mapped in both Google Earth and FusionView. Once collected, the information

  10. Clean Cities Tools



    The U.S. Department of Energy's Clean Cities offers a large collection of Web-based tools on the Alternative Fuels Data Center. These calculators, interactive maps, and data searches can assist fleets, fuels providers, and other transportation decision makers in their efforts to reduce petroleum use.

  11. Tools and Concepts.

    Artis, Margaret, Ed.; And Others

    This guide provides enrichment for students to develop tools and concepts used in various areas of mathematics. The first part presents arithmetic progressions, geometric progressions, and harmonic progression. In the second section, the concept of mathematic induction is developed from intuitive induction, using concrete activities, to the…

  12. The science writing tool

    Schuhart, Arthur L.

    This is a two-part dissertation. The primary part is the text of a science-based composition rhetoric and reader called The Science Writing Tool. This textbook has seven chapters dealing with topics in science rhetoric. Each chapter includes a variety of examples of science writing, discussion questions, writing assignments, and instructional resources. The purpose of this text is to introduce lower-division college science majors to the role that rhetoric and communication play in the conduct of science, and how these skills contribute to a successful career in science. The text is designed as a "tool kit" for use by an instructor constructing a science-based composition course or a writing-intensive science course. The second part of this dissertation reports on student reactions to draft portions of The Science Writing Tool text. In this report, students of English Composition II at Northern Virginia Community College-Annandale were surveyed about their attitudes toward the course materials and topics included. The findings were used to revise and expand The Science Writing Tool.

  13. Balancing the tools

    Leroyer, Patrick


    The purpose of this article is to describe the potential of a new combination of functions in lexicographic tools for tourists. So far lexicography has focused on the communicative information needs of tourists, i.e. helping tourists decide what to say in a number of specific tourist situations, ...

  14. Apple Shuns Tracking Tool


    Apple Inc. is advising software developers to stop using a feature in software for its iPhones and iPads that has been linked to privacy concerns, a move that would also take away a widely used tool for tracking users and their behavior. Developers who write programs for Apple's iOS operating system have been using a unique...

  15. Nitrogen Trading Tool (NTT)

    The Natural Resources Conservation Service (NRCS) recently developed a prototype web-based nitrogen trading tool to facilitate water quality credit trading. The development team has worked closely with the Agriculture Research Service Soil Plant Nutrient Research Unit (ARS-SPNR) and the Environmenta...

  16. Risk Management Implementation Tool

    Wright, Shayla L.


    Continuous Risk Management (CRM) is a software engineering practice with processes, methods, and tools for managing risk in a project. It provides a controlled environment for practical decision making, in order to continually assess what could go wrong, determine which risks are important to deal with, implement strategies to deal with those risks, and measure the effectiveness of the implemented strategies. Continuous Risk Management provides many training workshops and courses to teach staff how to apply risk management to their various experiments and projects. The steps of the CRM process are identification, analysis, planning, tracking, and control. With these steps and the various methods and tools that go along with them, identifying and dealing with risk is clear-cut. The office that I worked in was the Risk Management Office (RMO). The RMO at NASA works hard to uphold NASA's mission of exploration and advancement of scientific knowledge and technology by defining and reducing program risk. The RMO is one of the divisions that fall under the Safety and Assurance Directorate (SAAD). I worked under Cynthia Calhoun, Flight Software Systems Engineer. My task was to develop a help screen for the Continuous Risk Management Implementation Tool (RMIT). The Risk Management Implementation Tool will be used by many NASA managers to identify, analyze, track, control, and communicate risks in their programs and projects. The RMIT will provide a means for NASA to continuously assess risks. The goals and purposes of this tool are to provide a simple means to manage risks, to be used by program and project managers throughout NASA for managing risk, and to take an aggressive approach to advertise and advocate the use of RMIT at each NASA center.

  17. A Real-Time Tool Positioning Sensor for Machine-Tools

    Vicente Mico Serrano


    In machining, natural oscillations and elastic, gravitational or temperature deformations still make it difficult to guarantee the quality of fabricated parts. In this paper we present an optical measurement system designed to track and localize in 3D a reference retro-reflector close to the machine tool's drill. The complete system and its components are described in detail. Several tests, some static (including impacts and rotations) and others dynamic (executing linear and circular trajectories), were performed on two different machine tools. For the first time, a laser tracking system has been integrated into the position control loop of a machine tool. Results indicate that oscillations and deformations close to the tool can be estimated with micrometric resolution and a bandwidth from 0 to more than 100 Hz. This sensor therefore opens the possibility of on-line compensation of oscillations and deformations.

  18. Method for including detailed evaluation of daylight levels in Be06

    Petersen, Steffen


    Good daylight conditions in office buildings have become an important issue due to new European regulatory demands which include energy consumption for electrical lighting in the building energy frame. Good daylight conditions in offices are thus in increased focus as an energy conserving measure. In order to evaluate whether a certain design is good daylight design or not, building designers must perform detailed evaluation of daylight levels, including the daylight performance of dynamic solar shadings, and include these in the energy performance evaluation. However, the mandatory national calculation tool in Denmark (Be06) for evaluating the energy performance of buildings currently uses a simple representation of available daylight in a room and simple assumptions regarding the control of shading devices. In a case example, this leads to an overestimation of the energy consumption...

  19. The Importance of Detail and Organization of Technical Drawings for the Project Management

    Diego de Avila


    This research takes into account the process of technical improvement through the development and interpretation of technical drawings with appropriate software. Although technical drawing is no longer new in business, it affects the production process in a clear and objective way, yet there is a concern about training in the use of this standardized tool. This article presents a case study detailing the continuous use of standardized technical drawing in a CAD system, seeking to improve quality control, production flexibility, cost reduction, documentation and even security, in a company in the northern region of the State of Rio Grande do Sul.


    Ivan Kuric


    This paper deals with aspects of the quality and accuracy of machine tools. As the accuracy of machine tools is a key factor in product quality, it is important to know the methods for evaluating machine tool quality and accuracy. Several aspects of machine tool diagnostics, such as reliability, are described.

  1. Web Tools: The Second Generation

    Pascopella, Angela


    Web 2.0 tools and technologies, or second-generation tools, help districts to save time and money, and eliminate the need to transfer or move files back and forth across computers. Many Web 2.0 tools help students think critically and solve problems, abilities that fall under 21st-century skills. The second-generation tools are growing in popularity…

  2. The Effects of Seductive Details in an Inflatable Planetarium

    Gillette, Sean

    Astronomy is becoming a forgotten science, which is evident from its relatively low enrollment figures compared to biology, chemistry, and physics. A portable inflatable planetarium brings relevance back to astronomy and offers support to students and educators by simulating realistic astronomical environments. This study sought to determine if learning is improved in an inflatable planetarium by adhering to the design principles of the cognitive theory of multimedia learning (CTML), specifically the coherence principle, in an authentic classroom. Two groups of 5th grade students of similar ability were purposefully assigned using a 1-teacher-to-many-students format with mean lesson lengths of 34 minutes. The experimental group was differentiated with seductive details, defined as interesting but irrelevant facts that can distract from learning. The control group (n = 28), with seductive details excluded, outperformed the experimental group (n = 28), validating the coherence principle and producing a Cohen's effect size of medium practical significance (d = 0.4). These findings suggest that CTML, when applied to planetarium instruction, does increase student learning and that seductive details do have a negative effect on learning. An adult training project was created to instruct educators on the benefits of CTML in astronomy education. This study leads to positive social change by highlighting astronomy education while providing educators with design principles of CTML in authentic settings to maximize learning, aiding the creation of digital media (astronomical simulations and instructional lessons for planetariums), and providing valuable training for owners of inflatable planetariums, with the eventual goal of increasing student enrollment in astronomy courses at the local level.

  3. Geospatial Visualization Tool Kit for Scientists Using Fortran

    Chiang, Gen-Tao; White, Toby O. H.; Dove, Martin T.


    In recent years, visualization for the Earth and environmental sciences has developed significantly. Among the most notable advances has been the rise of Web-based tools colloquially called “geobrowsers.” These tools enable users from a range of sciences to access an enormous quantity of satellite and aerial photography with detailed maps to create a high-quality “virtual Earth” [e.g., McCaffrey et al., 2008; Oberlies et al., 2009]. One important geobrowser is Google Earth™, which provides free tools for most major computing platforms and handheld devices, together with the ability to incorporate data from users.

  4. Selecting and effectively using a computer aided software engineering tool

    Kuhn, D.L.


    Software engineering is a science by which user requirements are translated into a quality software product. Computer Aided Software Engineering (CASE) is the scientific application of a set of tools and methods to software development which results in high-quality, defect-free, and maintainable software products. The Computer Systems Engineering (CSE) group of Separations Technology at the Savannah River Site has successfully used CASE tools to produce high-quality, reliable, and maintainable software products. This paper details the selection process CSE used to acquire a commonly available CASE product and how the CSE group effectively used this CASE tool to consistently produce quality software. 9 refs.

  5. Design principles of metal-cutting machine tools

    Koenigsberger, F


    Design Principles of Metal-Cutting Machine Tools discusses the fundamentals aspects of machine tool design. The book covers the design consideration of metal-cutting machine, such as static and dynamic stiffness, operational speeds, gearboxes, manual, and automatic control. The text first details the data calculation and the general requirements of the machine tool. Next, the book discusses the design principles, which include stiffness and rigidity of the separate constructional elements and their combined behavior under load, as well as electrical, mechanical, and hydraulic drives for the op

  6. Aircraft wing structural detail design (wing, aileron, flaps, and subsystems)

    Downs, Robert; Zable, Mike; Hughes, James; Heiser, Terry; Adrian, Kenneth


    The goal of this project was to design, in detail, the wing, flaps, and ailerons for a primary flight trainer. Integrated in this design are provisions for the fuel system, the electrical system, and the fuselage/cabin carry-through interface structure. This conceptual design displays the general arrangement of all major components in the wing structure, taking into consideration the requirements set forth by the appropriate sections of Federal Aviation Regulation Part 23 (FAR23) as well as those established in the statement of work.

  7. Interfacing heat exchanger network synthesis and detailed heat exchanger design

    Polley, G.T.; Panjeh Shahi, M.H. (Manchester Univ. (United Kingdom). Inst. of Science and Technology)


    Current heat exchanger network synthesis targeting and design procedures involve the use of assumed stream heat transfer coefficients. However, during detailed heat exchanger design, allowable pressure drops are often the most critical factors. The result can be big differences between the exchanger sizes and costs anticipated by the network designer and those realised by the exchanger designer. This in turn prejudices any optimisation attempted at the network design stage. In this paper it is shown how allowable pressure drop can be used as a basis of network design and consistency between expectation and realisation achieved. (author).
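
    The mismatch described above starts from assumed stream heat transfer coefficients: whatever film coefficients the network designer adopts fix the overall coefficient and hence the anticipated exchanger area. A minimal counter-current sizing sketch with invented stream data (not from the paper) shows this dependence:

```python
from math import log

def lmtd(t_hot_in, t_hot_out, t_cold_in, t_cold_out):
    # Log-mean temperature difference for a counter-current exchanger.
    dt1 = t_hot_in - t_cold_out
    dt2 = t_hot_out - t_cold_in
    return (dt1 - dt2) / log(dt1 / dt2) if dt1 != dt2 else dt1

# Illustrative (made-up) stream data: duty and assumed film coefficients.
duty_kw = 500.0
h_hot, h_cold = 1500.0, 900.0           # W/m^2K, assumed film coefficients
u = 1.0 / (1.0 / h_hot + 1.0 / h_cold)  # overall coefficient, fouling ignored
dtlm = lmtd(150.0, 90.0, 30.0, 80.0)
area_m2 = duty_kw * 1e3 / (u * dtlm)
print(f"U = {u:.0f} W/m2K, LMTD = {dtlm:.1f} K, area = {area_m2:.1f} m2")
```

    If the detailed design later realises different coefficients because of pressure-drop limits, the area (and cost) shifts in proportion, which is precisely the gap the pressure-drop-based targeting in this paper is meant to close.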

  8. Detailed model of bouncing drops on a bounded, vibrated bath

    Blanchette, Francois; Gilet, Tristan


    We present a detailed model of drops bouncing on a bounded vibrated bath. These drops are known to bounce indefinitely and to exhibit complex and varied vertical dynamics depending on the acceleration of the bath. In addition, in a narrow parameter regime, these drops travel horizontally while being guided by the waves they generate. Our model tracks the drop's vertical radius and position, as well as the eigenmodes of the waves generated via ordinary differential equations only. We accurately capture the vertical dynamics, as well as some of the horizontal dynamics. Our model may be extended to account for interactions with other drops or obstacles, such as slits and corrals.
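
    As a loose illustration of the vertical dynamics such models capture, here is a much cruder textbook "bouncer": a rigid ball on a sinusoidally vibrated plate with a restitution law, integrated with explicit Euler. All numbers are invented, and it omits the drop deformation and wave coupling central to the paper's model.

```python
import math

# Toy bouncer: rigid ball above a vibrating plate (assumed parameters).
A, f, g, e = 0.2e-3, 50.0, 9.81, 0.6    # plate amplitude (m), frequency (Hz),
w = 2.0 * math.pi * f                   # gravity (m/s^2), restitution coeff.
dt = 1e-6                               # time step (s)
t, z, v = 0.0, 1e-3, 0.0                # time, ball height, ball velocity
bounces = 0
for _ in range(int(0.2 / dt)):          # simulate 0.2 s of flight and impacts
    plate = A * math.sin(w * t)
    plate_v = A * w * math.cos(w * t)
    z += v * dt                         # free flight under gravity
    v -= g * dt
    if z <= plate and v < plate_v:      # impact: reflect the relative velocity
        v = plate_v - e * (v - plate_v)
        z = plate
        bounces += 1
    t += dt
print(f"{bounces} impacts in 0.2 s")
```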

  9. Po Superconducting Magnet:detail of the windings


    The Po superconducting dipole was built as a prototype beam transport magnet for the SPS extracted proton beam Po. Its main features were: coil aperture 72 mm, length 5 m, room-temperature yoke, NbTi cable conductor impregnated with solder, nominal field 4.2 T at 4.7 K (87% of critical field). It reached its nominal field without any quench. The photo shows a detail of the inner layer winding before superposing the outer layer to form the complete coil of a pole. Worth noticing is the interleaved glass-epoxy sheet (white) with grooved channels for the flow of cooling helium. See also 8307552X.

  10. Detailed observations of the source of terrestrial narrowband electromagnetic radiation

    Kurth, W. S.


    Detailed observations are presented of a region near the terrestrial plasmapause where narrowband electromagnetic radiation (previously called escaping nonthermal continuum radiation) is being generated. These observations show a direct correspondence between the narrowband radio emissions and electron cyclotron harmonic waves near the upper hybrid resonance frequency. In addition, electromagnetic radiation propagating in the Z-mode is observed in the source region which provides an extremely accurate determination of the electron plasma frequency and, hence, density profile of the source region. The data strongly suggest that electrostatic waves and not Cerenkov radiation are the source of the banded radio emissions and define the coupling which must be described by any viable theory.

  11. Regional and detailed research studies for stone resources in Korea



    This report consists of 7 articles. 1) Detailed drilling research on the granodiorite stock of the Cheonan area near Onyang city in Chungnam province. 2) Regional research studies on granites distributed in Kimje - Jeongeup. 3) Regional survey and feasibility study on a diorite rock mass in Kohyeng, Cheonnam province. 4) Regional research study on the stone resources of the Hamyang area. 5) A study on variation trends of physical properties of 5 kinds of building stone by means of Weather-Ometer experiments. 6) Borehole radar survey at the granodiorite quarry mine, Cheonan, Chungnam province. 7) Radar velocity tomography in anisotropic media. (author). refs., tabs., figs.

  12. Detail study of SiC MOSFET switching characteristics

    Li, Helong; Munk-Nielsen, Stig


    This paper presents a detailed study of the switching characteristics of the latest SiC MOSFETs in relation to gate driver maximum current, gate resistance, common source inductance and parasitic switching loop inductance. The switching performance of SiC MOSFETs in terms of turn-on and turn-off voltage...... and current is presented. A switching loss analysis is made according to the experimental results. The switching characteristics study and switching loss analysis can give guidelines for gate driver IC and gate resistance selection, switching loss estimation and circuit design of SiC MOSFETs....
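
    For orientation, switching losses of the kind analyzed in such studies are often first estimated with the standard triangular voltage-current overlap approximation; the device numbers below are illustrative assumptions, not measurements from the paper.

```python
# Rough hard-switching loss estimate (first-order overlap model).
v_ds, i_d = 600.0, 20.0         # blocking voltage (V), load current (A), assumed
t_r, t_f = 20e-9, 15e-9         # effective rise/fall times (s), assumed
f_sw = 100e3                    # switching frequency (Hz)
e_on = 0.5 * v_ds * i_d * t_r   # turn-on energy (J), triangular V-I overlap
e_off = 0.5 * v_ds * i_d * t_f  # turn-off energy (J)
p_sw = (e_on + e_off) * f_sw    # average switching loss (W)
print(f"E_on={e_on*1e6:.0f} uJ, E_off={e_off*1e6:.0f} uJ, P_sw={p_sw:.1f} W")
```

    Parasitic loop inductance and gate resistance modify the effective rise and fall times, which is why the paper's detailed measurements matter for realistic loss estimates.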

  13. Number plates – confidentiality and your personal details

    Since 15 May 2008, Geneva’s ‘Service des automobiles et de la navigation’ has offered the possibility of finding out the name and address of the owner of any vehicle registered in the Canton of Geneva, simply by sending an SMS with the car’s number plate to 939. However, owners of number plates registered in Geneva can block the divulgence of their personal details by filling out the following form, available at:

  14. Physics and mathematical tools methods and examples

    Alastuey, Angel; Magro, Marc; Pujol, Pierre


    This book presents mathematical methods and tools which are useful for physicists and engineers: response functions, Kramers-Kronig relations, Green's functions, saddle point approximation. The derivations emphasize the underlying physical arguments and interpretations without any loss of rigor. General introductions describe the main features of the methods, while connections and analogies between a priori different problems are discussed. They are completed by detailed applications in many topics including electromagnetism, hydrodynamics, statistical physics, quantum mechanics, etc. Exercises are also proposed, and their solutions are sketched. A self-contained reading of the book is favored by avoiding too technical derivations, and by providing a short presentation of important tools in the appendices. It is addressed to undergraduate and graduate students in physics, but it can also be used by teachers, researchers and engineers.
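
    One of the tools the book covers, the saddle point approximation, can be demonstrated numerically: applied to the integral representation of n!, it yields Stirling's formula, whose relative error shrinks roughly like 1/(12n). A short check:

```python
import math

def stirling(n):
    # Saddle-point (Laplace) approximation to n! = Gamma(n+1):
    # n! ~ sqrt(2*pi*n) * n^n * e^(-n).
    return math.sqrt(2 * math.pi * n) * n ** n * math.exp(-n)

for n in (5, 10, 20):
    exact = math.factorial(n)
    approx = stirling(n)
    print(n, exact, f"{approx:.4g}", f"rel err {(exact - approx) / exact:.2%}")
```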

  15. Machine tool metrology an industrial handbook

    Smith, Graham T


    Maximizing reader insights into the key scientific disciplines of machine tool metrology, this text will prove useful for the industrial practitioner and those interested in the operation of machine tools. Reflecting the current level of industrial content, this book incorporates significant usage of the existing published literature and valid information obtained from a wide spectrum of manufacturers of plant, equipment and instrumentation before putting forward novel ideas and methodologies. Providing easy-to-understand bullet points and lucid descriptions of metrological and calibration subjects, this book aids reader understanding of the topics discussed, while a voluminous number of footnotes used throughout the chapters adds detail to the subject. Featuring extensive photographic support, this book will serve as a key reference text for all those involved in the field.

  16. A Support Tool for Tagset Mapping

    Teufel, S


    Many different tagsets are used in existing corpora; these tagsets vary according to the objectives of specific projects (which may be as far apart as robust parsing vs. spelling correction). In many situations, however, one would like to have uniform access to the linguistic information encoded in corpus annotations without having to know the classification schemes in detail. This paper describes a tool which maps unstructured morphosyntactic tags to a constraint-based, typed, configurable specification language, a “standard tagset”. The mapping relies on a manually written set of mapping rules, which is automatically checked for consistency. In certain cases, unsharp mappings are unavoidable, and noise, i.e. groups of word forms not conforming to the specification, will appear in the output of the mapping. The system automatically detects such noise and informs the user about it. The tool has been tested with rules for the UPenn and SUSANNE tagsets.
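
    The rule-based mapping with noise reporting can be pictured with a toy sketch; the rules and "standard" tag names below are invented stand-ins, not the tool's actual specification language.

```python
# Hypothetical mapping rules from UPenn-style tags to a "standard" tagset.
MAPPING_RULES = {
    "NN": "noun[common,sg]",
    "NNS": "noun[common,pl]",
    "VBD": "verb[past]",
    "JJ": "adj[base]",
}

def map_tags(tagged_tokens):
    # Apply the rules; tokens with no rule are collected as "noise"
    # and reported rather than silently dropped.
    mapped, noise = [], []
    for word, tag in tagged_tokens:
        if tag in MAPPING_RULES:
            mapped.append((word, MAPPING_RULES[tag]))
        else:
            noise.append((word, tag))
    return mapped, noise

mapped, noise = map_tags([("dogs", "NNS"), ("barked", "VBD"), ("xyzzy", "FW")])
print(mapped, noise)
```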

  17. Anatomically detailed and large-scale simulations studying synapse loss and synchrony using NeuroBox

    Markus Breit


    Full Text Available The morphology of neurons and networks plays an important role in processing electrical and biochemical signals. Based on neuronal reconstructions, which are becoming abundantly available through public databases, numerical simulations of Hodgkin-Huxley-type equations, coupled to biochemical models, can be performed in order to systematically investigate the influence of cellular morphology and the connectivity pattern in networks on the underlying function. Development in the area of synthetic neural network generation and morphology reconstruction from microscopy data has brought forth the software tool NeuGen. Coupling this morphology data (whether from databases, synthetic, or reconstructed) to the simulation platform UG 4 (which harbors a neuroscientific portfolio) and VRL-Studio has brought forth the extendible toolbox NeuroBox. NeuroBox allows users to perform numerical simulations on hybrid-dimensional morphology representations. The code basis is designed in a modular way, such that e.g. new channel or synapse types can be added to the library. Workflows can be specified through scripts or through the VRL-Studio graphical workflow representation. Third-party tools, such as ImageJ, can be added to NeuroBox workflows. In this paper, NeuroBox is used to study the electrical and biochemical effects of synapse loss vs. synchrony in neurons, to investigate large morphology data sets within detailed biophysical simulations, and to demonstrate the capability of utilizing high-performance computing infrastructure for large-scale network simulations. Using new synapse distribution methods and Finite Volume based numerical solvers for compartment-type models, our results demonstrate how an increase in synaptic synchronization can compensate synapse loss at the electrical and calcium level, and how detailed neuronal morphology can be integrated in large-scale network simulations.

  18. Straightforward statistics understanding the tools of research

    Geher, Glenn


    Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.

  19. Advanced free-form micro tooling

    Tosello, Guido; Gavillet, J.


    The present deliverable contains the report of the work and results achieved within the framework of WP 2.2, Task 2.2.4 “Advanced free-form micro tooling”, in experimental research regarding practical applications of methods for applying nanostructures to tooling solutions. As part of Task 2...... nanometre features can affect physical and optical properties of the surface [Liu03][Por99]. Since sub-μm feature details with ultra-low tolerances have to be manufactured, these structures are usually fabricated using clean room technologies or direct ultra precision machining procedures. Methods such as e...

  20. Algal functional annotation tool

    Lopez, D. [UCLA; Casero, D. [UCLA; Cokus, S. J. [UCLA; Merchant, S. S. [UCLA; Pellegrini, M. [UCLA


    The Algal Functional Annotation Tool is a web-based comprehensive analysis suite integrating annotation data from several pathway, ontology, and protein family databases. The current version provides annotation for the model alga Chlamydomonas reinhardtii, and in the future will include additional genomes. The site allows users to interpret large gene lists by identifying associated functional terms, and their enrichment. Additionally, expression data for several experimental conditions were compiled and analyzed to provide an expression-based enrichment search. A tool to search for functionally-related genes based on gene expression across these conditions is also provided. Other features include dynamic visualization of genes on KEGG pathway maps and batch gene identifier conversion.
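
    Term-enrichment searches of the kind described are commonly ranked by a hypergeometric tail probability: how surprising it is that a gene list contains so many genes carrying a given term. A self-contained sketch with invented gene counts (not the tool's actual statistics):

```python
from math import comb

def enrichment_p(k, n, K, N):
    # Hypergeometric tail P(X >= k): a list of n genes drawn from N total,
    # of which K carry the term, contains k or more annotated genes.
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Toy numbers (assumed): 40 of 2000 genes carry a pathway term,
# and a 100-gene list of interest contains 8 of them (2 expected by chance).
p = enrichment_p(8, 100, 40, 2000)
print(f"enrichment p-value = {p:.3g}")
```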

  1. Building energy analysis tool

    Brackney, Larry; Parker, Andrew; Long, Nicholas; Metzger, Ian; Dean, Jesse; Lisell, Lars


    A building energy analysis system includes a building component library configured to store a plurality of building components, a modeling tool configured to access the building component library and create a building model of a building under analysis using building spatial data and using selected building components of the plurality of building components stored in the building component library, a building analysis engine configured to operate the building model and generate a baseline energy model of the building under analysis and further configured to apply one or more energy conservation measures to the baseline energy model in order to generate one or more corresponding optimized energy models, and a recommendation tool configured to assess the one or more optimized energy models against the baseline energy model and generate recommendations for substitute building components or modifications.
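
    The baseline-versus-optimized comparison described in the claim can be caricatured in a few lines; the component library, "engine", and climate constant below are invented placeholders, not the patented system.

```python
# Toy component library with assumed window U-values (W/m^2K).
component_library = {
    "window_double": {"u_value": 2.8},
    "window_triple": {"u_value": 1.1},
}

def annual_energy(model):
    # Stand-in "analysis engine": heating loss proportional to envelope UA.
    degree_hours = 80_000.0  # K*h/year, assumed climate
    ua = model["window_area"] * component_library[model["window"]]["u_value"]
    return ua * degree_hours / 1000.0  # kWh/year

baseline = {"window": "window_double", "window_area": 120.0}
ecm = dict(baseline, window="window_triple")  # an energy conservation measure
saving = annual_energy(baseline) - annual_energy(ecm)
print(f"baseline {annual_energy(baseline):.0f} kWh/yr, ECM saves {saving:.0f} kWh/yr")
```

    A recommendation step would simply rank such substitute components by modeled savings against the baseline.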

  2. Automated Standard Hazard Tool

    Stebler, Shane


    The current system used to generate standard hazard reports is considered cumbersome and iterative. This study defines a structure for this system's process in a clear, algorithmic way so that standard hazard reports and basic hazard analysis may be completed using a centralized, web-based computer application. To accomplish this task, a test server is used to host a prototype of the tool during development. The prototype is configured to easily integrate into NASA's current server systems with minimal alteration. Additionally, the tool is easily updated and provides NASA with a system that may grow to accommodate future requirements and possibly, different applications. Results of this project's success are outlined in positive, subjective reviews completed by payload providers and NASA Safety and Mission Assurance personnel. Ideally, this prototype will increase interest in the concept of standard hazard automation and lead to the full-scale production of a user-ready application.

  3. Automatically-Programed Machine Tools

    Purves, L.; Clerman, N.


    Software produces cutter location files for numerically-controlled machine tools. APT, an acronym for Automatically Programed Tools, is among the most widely used software systems for computerized machine tools. APT was developed for the explicit purpose of providing an effective software system for programing NC machine tools. The APT system includes a specification of the APT programing language and a language processor, which executes APT statements and generates the NC machine-tool motions specified by those statements.
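
    To give a flavor of what a language processor turning motion statements into cutter location data does, here is a toy reader for APT-style FROM/GOTO point statements; real APT geometry and motion commands are far richer, so this is only an illustrative sketch.

```python
# A tiny APT-flavored program: a start point and three cutter moves.
program = """
FROM/0,0,10
GOTO/0,0,0
GOTO/25,0,0
GOTO/25,15,0
"""

def to_cl_points(src):
    # Translate each statement into a cutter-location (CL) record:
    # (verb, x, y, z).
    points = []
    for line in src.strip().splitlines():
        verb, coords = line.split("/")
        x, y, z = (float(c) for c in coords.split(","))
        points.append((verb, x, y, z))
    return points

for p in to_cl_points(program):
    print(p)
```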

  4. Micromachining with Nanostructured Cutting Tools

    Jackson, Mark J


    The purpose of the brief is to explain how nanostructured tools can be used to machine materials at the microscale.  The aims of the brief are to explain to readers how to apply nanostructured tools to micromachining applications. This book describes the application of nanostructured tools to machining engineering materials and includes methods for calculating basic features of micromachining. It explains the nature of contact between tools and work pieces to build a solid understanding of how nanostructured tools are made.

  5. Acoustic characterization of pneumatic percussion tools

    Muract, Jorge; Kadam, Rahul; Burdisso, Ricardo; Johnson, Marty


    Pneumatic percussion tools are extensively used in the construction industry. They are among the noisiest machines in construction, generating noise levels well above 110 dBA, far beyond the permissible exposure limit (PEL) of 85 dBA. The paper presents a comprehensive analysis of the noise generated by these percussion tools. Noise tests were carried out on different percussion tools, ranging from small chipping hammers to rock drills, from two major construction equipment manufacturers. These tests were carried out in an anechoic room as well as under simulated operating conditions to determine the overall sound power levels. A spherical array of microphones was used to obtain an accurate estimate of the overall sound power levels and the directivity. The overall sound power radiation was found to be in the range of 105-115 dBA. An advanced 63-microphone phased array was used to successfully locate and identify the major sources of noise from these tools. The outcome of the tests is illustrated in detail in the paper, which also suggests noise control methods to reduce overall sound power radiation and discusses potential performance levels.
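
    Overall levels like those quoted are energy sums over frequency bands. A small helper shows the standard decibel summation; the band levels below are invented, chosen only to land in the reported 105-115 dBA range.

```python
import math

def overall_level(band_levels_db):
    # Overall level from incoherent band levels: energy (not arithmetic) sum,
    # L = 10*log10(sum 10^(Li/10)).
    return 10 * math.log10(sum(10 ** (l / 10) for l in band_levels_db))

# Illustrative octave-band sound power levels for a percussion tool (dBA).
bands = [95.0, 101.0, 104.0, 107.0, 103.0, 98.0]
print(f"overall: {overall_level(bands):.1f} dBA")
```

    Note that two equal sources add only 3 dB, which is why a handful of loud bands dominates the overall figure.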

  6. Social Data Analytics Tool

    Hussain, Abid; Vatrapu, Ravi


    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and social...... media conversations about organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work....

  7. Channel nut tool

    Olson, Marvin


    A method, system, and apparatus for installing channel nuts includes a shank, a handle formed on a first end of the shank, and an end piece with a threaded shaft, configured to receive a channel nut, formed on the second end of the shank. The tool can be used to insert or remove a channel nut in a channel framing system and is then removed from the channel nut.

  8. Program Management Tool

    Gawadiak, Yuri; Wong, Alan; Maluf, David; Bell, David; Gurram, Mohana; Tran, Khai Peter; Hsu, Jennifer; Yagi, Kenji; Patel, Hemil


    The Program Management Tool (PMT) is a comprehensive, Web-enabled business intelligence software tool for assisting program and project managers within NASA enterprises in gathering, comprehending, and disseminating information on the progress of their programs and projects. The PMT provides planning and management support for implementing NASA programmatic and project management processes and requirements. It provides an online environment for program and line management to develop, communicate, and manage their programs, projects, and tasks in a comprehensive tool suite. The information managed by use of the PMT can include monthly reports as well as data on goals, deliverables, milestones, business processes, personnel, task plans, and budgetary allocations. The PMT provides an intuitive and enhanced Web interface to automate the tedious process of gathering and sharing monthly progress reports, task plans, financial data, and other information on project resources based on technical, schedule, budget, and management criteria and merits. The PMT is consistent with the latest Web standards and software practices, including the use of Extensible Markup Language (XML) for exchanging data and the WebDAV (Web Distributed Authoring and Versioning) protocol for collaborative management of documents. The PMT provides graphical displays of resource allocations in the form of bar and pie charts using Microsoft Excel Visual Basic for Application (VBA) libraries. The PMT has an extensible architecture that enables integration of PMT with other strategic-information software systems, including, for example, the Erasmus reporting system, now part of the NASA Integrated Enterprise Management Program (IEMP) tool suite, at NASA Marshall Space Flight Center (MSFC). The PMT data architecture provides automated and extensive software interfaces and reports to various strategic information systems to eliminate duplicative human entries and minimize data integrity

  9. Udder Hygiene Analysis tool


    In this report, the pilot of UHC is described. The main objective of the pilot is to make farmers more aware of how to increase udder health in dairy herds. This goes through changing management aspects related to hygiene. This report firstly provides general information about antibiotics and the processes that influence udder health. Secondly, six subjects are described related to udder health. Thirdly, the tools (checklists and roadmap) are shown and fourthly, advises that are written by UH...

  10. Knowing about tools: neural correlates of tool familiarity and experience.

    Vingerhoets, Guy


    The observation of tools is known to elicit a distributed cortical network that reflects close-knit relations of semantic, action-related, and perceptual knowledge. The neural correlates underlying the critical knowledge of skilled tool use, however, remain to be elucidated. In this study, functional magnetic resonance imaging in 14 volunteers compares neural activation during the observation of familiar tools versus equally graspable unfamiliar tools of which the observers have little, if any, functional knowledge. In a second paradigm, the level of tool experience is investigated by comparing the neural effects of observing frequently versus infrequently used familiar tools. Both familiar and unfamiliar tools activate the classic neural network associated with tool representations. Direct comparison of the activation patterns during the observation of familiar and unfamiliar tools in a priori determined regions of interest points to experience-based tool-use knowledge, with the supramarginal gyrus storing information about limb and hand positions, and the precuneus storing visuospatial information about hand-tool interactions. As no frontal activation survived this contrast, it appears that premotor activity is unrelated to experience-based motor knowledge of tool use/function, but rather is elicited by any graspable tool. Confrontation with unfamiliar or infrequently used tools reveals an increase in inferior temporal and medial and lateral occipital activation, predominantly in the left hemisphere, suggesting that these regions reflect visual feature processing for tool identification.

  11. Dynamic Contingency Analysis Tool


    The Dynamic Contingency Analysis Tool (DCAT) is an open-platform and publicly available methodology to help develop applications that aim to improve the capabilities of power system planning engineers to assess the impact and likelihood of extreme contingencies and potential cascading events across their systems and interconnections. Outputs from the DCAT will help find mitigation solutions to reduce the risk of cascading outages in technically sound and effective ways. The current prototype DCAT implementation has been developed as a Python code that accesses the simulation functions of the Siemens PSS®E planning tool (PSS/E). It has the following features: It uses a hybrid dynamic and steady-state approach to simulating the cascading outage sequences that includes fast dynamic and slower steady-state events. It integrates dynamic models with protection scheme models for generation, transmission, and load. It models special protection systems (SPSs)/remedial action schemes (RASs) and automatic and manual corrective actions. Overall, the DCAT attempts to bridge multiple gaps in cascading-outage analysis in a single, unique prototype tool capable of automatically simulating and analyzing cascading sequences in real systems using multiprocessor computers. While the DCAT has been implemented using PSS/E in Phase I of the study, other commercial software packages with similar capabilities can be used within the DCAT framework.
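
    The cascading loop such a tool automates, alternating a network solve with protection checks until no further elements trip, can be sketched without PSS/E; the three-line toy grid and the crude flow-redistribution rule below are invented stand-ins for the real solver and protection models.

```python
# Toy grid: line -> (capacity_mw, initial_flow_mw); all numbers invented.
lines = {
    "L1": (100.0, 80.0),
    "L2": (90.0, 60.0),
    "L3": (250.0, 50.0),
}
flows = {k: v[1] for k, v in lines.items()}

def redistribute(tripped, flows):
    # Crude stand-in for a steady-state solve: shift the lost flow
    # evenly onto the surviving lines.
    share = flows.pop(tripped) / max(len(flows), 1)
    for k in flows:
        flows[k] += share

cascade = ["L1"]                  # initiating contingency
redistribute("L1", flows)
while True:                       # steady-state stage with protection checks
    overloaded = [k for k, f in flows.items() if f > lines[k][0]]
    if not overloaded:
        break                     # system settles; cascade sequence complete
    tripped = overloaded[0]
    cascade.append(tripped)
    redistribute(tripped, flows)
print("cascade sequence:", cascade)
```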

  12. Detailed kinetic modeling study of n-pentanol oxidation

    Heufer, Karl Alexander


    To help overcome the world's dependence upon fossil fuels, suitable biofuels are promising alternatives that can be used in the transportation sector. Recent research on internal combustion engines shows that short alcoholic fuels (e.g., ethanol or n-butanol) have reduced pollutant emissions and increased knock resistance compared to fossil fuels. Although higher molecular weight alcohols (e.g., n-pentanol and n-hexanol) exhibit higher reactivity that lowers their knock resistance, they are suitable for diesel engines or advanced engine concepts, such as homogeneous charge compression ignition (HCCI), where higher reactivity at lower temperatures is necessary for engine operation. The present study presents a detailed kinetic model for n-pentanol based on modeling rules previously presented for n-butanol. This approach was initially validated using quantum chemistry calculations to verify the most stable n-pentanol conformation and to obtain C-H and C-C bond dissociation energies. The proposed model has been validated against ignition delay time data, speciation data from a jet-stirred reactor, and laminar flame velocity measurements. Overall, the model shows good agreement with the experiments and permits a detailed discussion of the differences between alcohols and alkanes. © 2012 American Chemical Society.
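
    Detailed mechanisms of this kind are assembled from elementary reactions with modified-Arrhenius rate coefficients, k = A·T^n·exp(−Ea/RT). A quick sketch with assumed (illustrative) parameters shows how strongly such a rate grows over a combustion-relevant temperature span:

```python
import math

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius(A, n, Ea, T):
    # Modified Arrhenius form used in detailed kinetic mechanisms.
    return A * T ** n * math.exp(-Ea / (R * T))

# Illustrative parameters for an assumed H-abstraction step
# (A in cm^3/mol/s, Ea in J/mol) -- not values from the paper.
k800 = arrhenius(2.0e13, 0.0, 120e3, 800.0)
k1000 = arrhenius(2.0e13, 0.0, 120e3, 1000.0)
print(f"k(1000 K)/k(800 K) = {k1000 / k800:.1f}")
```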

  13. HDR Imaging for Feature Detection on Detailed Architectural Scenes

    Kontogianni, G.; Stathopoulou, E. K.; Georgopoulos, A.; Doulamis, A.


    3D reconstruction relies on accurate detection, extraction, description and matching of image features. This is even truer for complex architectural scenes that pose needs for 3D models of high quality, without any loss of detail in geometry or color. Illumination conditions influence the radiometric quality of images, as standard sensors cannot depict properly a wide range of intensities in the same scene. Indeed, overexposed or underexposed pixels cause irreplaceable information loss and degrade digital representation. Images taken under extreme lighting environments may be thus prohibitive for feature detection/extraction and consequently for matching and 3D reconstruction. High Dynamic Range (HDR) images could be helpful for these operators because they broaden the limits of illumination range that Standard or Low Dynamic Range (SDR/LDR) images can capture and increase in this way the amount of details contained in the image. Experimental results of this study prove this assumption as they examine state of the art feature detectors applied both on standard dynamic range and HDR images.
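
    The core HDR argument, that fusing bracketed exposures preserves detail a single clipped frame loses, can be demonstrated on a synthetic one-dimensional "scene"; the well-exposedness window used as a weight here is a simple illustrative choice, not the radiometric calibration a real HDR pipeline uses.

```python
import numpy as np

# Synthetic scene radiance spanning a range no single SDR frame can hold.
scene = np.linspace(0.0, 4.0, 256)
exposures = [0.25, 1.0, 4.0]                   # bracketed exposure factors
shots = [np.clip(scene * e, 0.0, 1.0) for e in exposures]  # clipped SDR frames

def weight(img):
    # Trust only well-exposed pixels; near-clipped ones get ~zero weight.
    return np.where((img > 0.02) & (img < 0.98), 1.0, 1e-6)

# Weighted fusion of exposure-normalized frames -> radiance estimate.
num = sum(weight(s) * (s / e) for s, e in zip(shots, exposures))
den = sum(weight(s) for s in shots)
fused = num / den
err = float(np.max(np.abs(fused - scene)[scene < 3.9]))
print(f"max recovery error below the clip limit: {err:.2e}")
```

    Any single frame loses either the dark or the bright end to clipping, whereas the fused estimate tracks the full range, which is why feature detectors find more matches on HDR input.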



  15. The detailed nature of active central cluster galaxies

    Loubser, S I


    We present detailed integral field unit (IFU) observations of the central few kiloparsecs of the ionised nebulae surrounding four active central cluster galaxies (CCGs) in cooling flow clusters (Abell 0496, 0780, 1644 and 2052). Our sample consists of CCGs with Hα filaments for which X-ray data are available. Here, we present detailed optical emission-line (and simultaneous absorption-line) data over a broad wavelength range to probe the dominant ionisation processes, excitation sources, morphology and kinematics of the hot gas (as well as the morphology and kinematics of the stars). This, combined with the other multiwavelength data, will form a complete view of the different phases (hot and cold gas and stars) and how they interact in the processes of star formation and feedback detected in central galaxies in cooling flow clusters, as well as the influence of the host cluster. We derive the optical dust extinction maps of the four nebulae. We also derive a range of different...

  16. Manufacturing details by Neutron Radiography of Archaeological Pottery

    Bernedo, Alfredo Victor Bellido; Latini, Rose Mary [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil); Souza, Maria Ines Silvani; Vinagre Filho, Ubirajara Maribondo [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)


    The aim of the present work was to investigate manufacturing details of archaeological pottery sherds by neutron radiography. Pottery is perhaps the most important artefact found in excavations. Its archaeological importance relies on the fact that it can reveal cultural traditions and commercial influences in ancient communities. This pottery was recently discovered in archaeological earth circular structure sites in Acre state, Brazil, and the characteristics of the clay used in its manufacture have been studied by modern scientific techniques such as Instrumental Neutron Activation Analysis (INAA), thermoluminescence dating and Moessbauer spectroscopy. Different fragments of pottery were submitted to a neutron flux of the order of 10^5 n cm^-2 s^-1 for 3 minutes in the research reactor Argonauta at the Instituto de Engenharia Nuclear/CNEN. Digital processing techniques using an imaging plate were applied to the images of the selected samples. The neutron radiographs show two different manufacturing techniques: palette and rollers. The fragment made by the palette technique shows a homogeneous mass, while in the neutron radiographs of ceramic fragments made by the roller technique (funerary pottery) horizontal traces of the junctions of the rollers can be seen, overlapping and forming layers supported on each other; this technique allows more stable structures to be created. Thus, both the palette and the roller techniques can be characterized by neutron radiography. (author)

  17. On detailed 3D reconstruction of large indoor environments

    Bondarev, Egor


    In this paper we present techniques for highly detailed 3D reconstruction of extra large indoor environments. We discuss the benefits and drawbacks of low-range, far-range and hybrid sensing and reconstruction approaches. The proposed techniques for low-range and hybrid reconstruction, enabling a reconstruction density of 125 points/cm3 on large 100,000 m3 models, are presented in detail. The techniques tackle the core challenges for the above requirements, such as multi-modal data fusion (fusion of LIDAR data with Kinect data), accurate sensor pose estimation, high-density scanning and depth data noise filtering. Other important aspects for extra large 3D indoor reconstruction are point cloud decimation and real-time rendering. In this paper, we present a method for planar-based point cloud decimation, allowing the point cloud size to be reduced by 80-95%. Besides this, we introduce a method for online rendering of extra large point clouds enabling real-time visualization of huge cloud spaces in conventional web browsers.
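    The intuition behind planar-based decimation can be sketched in a few lines: points lying on a fitted plane carry little extra geometric information and can be thinned aggressively, while off-plane detail is kept. The following is a minimal hypothetical illustration (not the authors' implementation) using a single least-squares plane fit via SVD:

```python
import numpy as np

def decimate_planar(points, tol=0.05, keep_every=20):
    """Toy planar decimation: fit one plane by least squares (SVD),
    thin points near the plane aggressively, keep off-plane points."""
    centroid = points.mean(axis=0)
    # Plane normal = right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(points - centroid, full_matrices=False)
    normal = vt[-1]
    dist = np.abs((points - centroid) @ normal)
    on_plane = dist < tol
    keep = ~on_plane                      # keep all off-plane points
    idx = np.flatnonzero(on_plane)
    keep[idx[::keep_every]] = True        # keep every k-th on-plane point
    return points[keep]

# Example: 1000 points on the z=0 plane plus 10 off-plane points.
rng = np.random.default_rng(0)
flat = np.c_[rng.random((1000, 2)), np.zeros(1000)]
bumps = np.c_[rng.random((10, 2)), np.ones(10)]
cloud = np.vstack([flat, bumps])
out = decimate_planar(cloud)
```

In this toy case the cloud shrinks by roughly 94%, in line with the 80-95% reduction range quoted above; a real implementation would segment the cloud into many local planar patches first.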

  18. Infants Encode Phonetic Detail during Cross-Situational Word Learning

    Escudero, Paola; Mulak, Karen E.; Vlach, Haley A.


    Infants often hear new words in the context of more than one candidate referent. In cross-situational word learning (XSWL), word-object mappings are determined by tracking co-occurrences of words and candidate referents across multiple learning events. Research demonstrates that infants can learn words in XSWL paradigms, suggesting that it is a viable model of real-world word learning. However, these studies have all presented infants with words that have no or minimal phonological overlap (e.g., BLICKET and GAX). Words often contain some degree of phonological overlap, and it is unknown whether infants can simultaneously encode fine phonological detail while learning words via XSWL. We tested 12-, 15-, 17-, and 20-month-olds’ XSWL of eight words that, when paired, formed non-minimal pairs (e.g., BON–DEET) or minimal pairs (MPs; e.g., BON–TON, DEET–DIT). The results demonstrated that infants are able to learn word-object mappings and encode them with sufficient phonetic detail as to identify words in both non-minimal and MP contexts. Thus, this work suggests that infants are able to simultaneously discriminate phonetic differences between words and map words to referents in an implicit learning paradigm such as XSWL. PMID:27708605

  19. Effects of Geometric Details on Slat Noise Generation and Propagation

    Khorrami, Mehdi R.; Lockard, David P.


    The relevance of geometric details to the generation and propagation of noise from leading-edge slats is considered. Typically, such details are omitted in computational simulations and model-scale experiments thereby creating ambiguities in comparisons with acoustic results from flight tests. The current study uses two-dimensional, computational simulations in conjunction with a Ffowcs Williams-Hawkings (FW-H) solver to investigate the effects of previously neglected slat "bulb" and "blade" seals on the local flow field and the associated acoustic radiation. The computations show that the presence of the "blade" seal at the cusp in the simulated geometry significantly changes the slat cove flow dynamics, reduces the amplitudes of the radiated sound, and to a lesser extent, alters the directivity beneath the airfoil. Furthermore, the computations suggest that a modest extension of the baseline "blade" seal further enhances the suppression of slat noise. As a side issue, the utility and equivalence of FW-H methodology for calculating far-field noise as opposed to a more direct approach is examined and demonstrated.

  20. Detailed analysis of Balmer lines in cool dwarf stars

    Barklem, P S; Allende-Prieto, C; Kochukhov, O P; Piskunov, N; O'Mara, B J


    An analysis of H alpha and H beta spectra in a sample of 30 cool dwarf and subgiant stars is presented using MARCS model atmospheres based on the most recent calculations of the line opacities. A detailed quantitative comparison of the solar flux spectra with model spectra shows that Balmer line profile shapes, and therefore the temperature structure in the line formation region, are best represented under the mixing length theory by any combination of a low mixing-length parameter alpha and a low convective structure parameter y. A slightly lower effective temperature is obtained for the Sun than the accepted value, which we attribute to errors in the models and line opacities. The programme stars span temperatures from 4800 to 7100 K and include a small number of population II stars. Effective temperatures have been derived using a quantitative fitting method with a detailed error analysis. Our temperatures agree well with those from the Infrared Flux Method (IRFM) near solar metallicity but show diffe...

  1. Detailed computation of hot-plasma atomic spectra

    Pain, Jean-Christophe; Blenski, Thomas


    We present recent evolutions of the detailed opacity code SCO-RCG which combines statistical modelings of levels and lines with fine-structure calculations. The code now includes the Partially-Resolved-Transition-Array model, which allows one to replace a complex transition array by a small-scale detailed calculation preserving energy and variance of the genuine transition array and yielding improved high-order moments. An approximate method for studying the impact of strong magnetic field on opacity and emissivity was also recently implemented. The Zeeman line profile is modeled by fourth-order Gram-Charlier expansion series, which is a Gaussian multiplied by a linear combination of Hermite polynomials. Electron collisional line broadening is often modeled by a Lorentzian function and one has to calculate the convolution of a Lorentzian with Gram-Charlier distribution for a huge number of spectral lines. Since the numerical cost of the direct convolution would be prohibitive, we propose, in order to obtain t...
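    The profile construction described above can be illustrated numerically. The sketch below (a hypothetical illustration, not the SCO-RCG code) builds a fourth-order Gram-Charlier profile, i.e. a Gaussian multiplied by a linear combination of Hermite polynomials, and convolves it directly with a Lorentzian on a grid; it is exactly this direct convolution, repeated over a huge number of lines, whose cost motivates the faster method the abstract alludes to:

```python
import numpy as np

def gram_charlier4(x, sigma, c3=0.0, c4=0.0):
    """Fourth-order Gram-Charlier series: a Gaussian multiplied by a
    linear combination of probabilists' Hermite polynomials He3, He4."""
    t = x / sigma
    gauss = np.exp(-0.5 * t**2) / (sigma * np.sqrt(2.0 * np.pi))
    he3 = t**3 - 3.0 * t
    he4 = t**4 - 6.0 * t**2 + 3.0
    return gauss * (1.0 + c3 * he3 / 6.0 + c4 * he4 / 24.0)

def lorentzian(x, gamma):
    """Unit-area Lorentzian modeling electron collisional broadening."""
    return (gamma / np.pi) / (x**2 + gamma**2)

# Direct numerical convolution of the two profiles on a uniform grid.
x = np.linspace(-20.0, 20.0, 4001)
dx = x[1] - x[0]
profile = np.convolve(gram_charlier4(x, sigma=1.0, c4=0.1),
                      lorentzian(x, gamma=0.5), mode="same") * dx
# Both inputs integrate to ~1, so the convolved profile does too.
area = profile.sum() * dx
```

Since both factors are normalized, the convolved line profile conserves area (up to grid truncation of the Lorentzian tails), which is the sanity check a production opacity code would also enforce.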

  2. Statistical Analysis of Detailed 3-D CFD LES Simulations with Regard to CCV Modeling

    Vítek Oldřich


    The paper deals with statistical analysis of a large amount of detailed 3-D CFD data in terms of cycle-to-cycle variations (CCVs). These data were obtained by means of LES calculations of many consecutive cycles. Due to the non-linear nature of the Navier-Stokes equation set, there is relatively significant CCV. Hence, every cycle is slightly different; this leads to the requirement to perform statistical analysis based on an ensemble-averaging procedure, which enables better understanding of CCV in ICE, including its quantification. The data obtained from the averaging procedure provide results on different levels of spatial resolution. The procedure is applied locally, i.e., in every cell of the mesh, so there is detailed CCV information on the local level; such information can be compared with RANS simulations. Next, volume/mass averaging provides information at specific locations, e.g., the gap between the electrodes of a spark plug. Finally, volume/mass averaging of the whole combustion chamber leads to global information which can be compared with experimental data or results of system simulation tools (which are based on a 0-D/1-D approach).
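    The cell-wise ensemble-averaging step is simple to state concretely. The sketch below is a hypothetical illustration (synthetic data, not the paper's solver output): fields from N cycles are stacked as a (cycles x cells) array, averaged over the cycle axis to get local mean and variation, then volume-averaged for a global value comparable to 0-D/1-D results:

```python
import numpy as np

# Hypothetical data: a scalar field (e.g., velocity magnitude) per mesh
# cell for n_cycles consecutive LES cycles, shape (n_cycles, n_cells).
rng = np.random.default_rng(1)
n_cycles, n_cells = 40, 1000
base = rng.random(n_cells) * 10.0                      # "mean" flow field
cycles = base + rng.normal(0.0, 0.5, (n_cycles, n_cells))  # CCV noise

# Ensemble averaging applied locally, i.e., in every cell of the mesh:
ens_mean = cycles.mean(axis=0)
ens_std = cycles.std(axis=0, ddof=1)   # local CCV magnitude

# Volume/mass averaging of the whole domain gives a global quantity
# comparable with experiments or 0-D/1-D system simulation tools
# (uniform cell weights assumed here for simplicity):
global_mean = ens_mean.mean()
```

In a real mesh the uniform weights would be replaced by cell volumes or masses, and the same reduction would be repeated for each quantity of interest.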

  3. Engineering tool for the evaluation of global IED effects

    N. Heider


    The engineering tool is validated with small-size generic vehicle tests in which jump height and vehicle motion are compared. The software allows a detailed analysis of global IED effects and can additionally be used in an inverse mode for the analysis of incidents, determining the HE masses used in an IED attack.

  4. The most important tool in MLM-business

    N.G. Gromyko


    This article considers one of the most important tools in a distributor's work. It sets out the aims pursued in examining and applying the author's approach to this question, and describes in detail the stages of the proposed approach, their results, and their role and value in the MLM industry.

  5. Software Tools: A One-Semester Secondary School Computer Course.

    Bromley, John; Lakatos, John


    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  6. Tool-Based Curricula and Visual Learning

    Dragica Vasileska


    In the last twenty years nanotechnology has revolutionized the world of information theory, computers and other important disciplines, such as medicine, where it has contributed significantly to the creation of more sophisticated diagnostic tools. Therefore, it is important for people working in nanotechnology to better understand basic concepts in order to be more creative and productive. To further foster progress in nanotechnology in the USA, the National Science Foundation has created the Network for Computational Nanotechnology (NCN), and the dissemination of all the information from member and non-member participants of the NCN is enabled by the community website nanoHUB. nanoHUB's signature service, online simulation, enables the operation of sophisticated research and educational simulation engines with a common browser; no software installation or local computing power is needed. The simulation tools as well as nano-concepts are augmented by educational materials, assignments, and tool-based curricula, which are assemblies of tools that help students excel in a particular area. As elaborated later in the text, it is the visual mode of learning that we exploit in achieving faster and better results with students who go through simulation tool-based curricula. There are several tool-based curricula already developed on the nanoHUB and undergoing further development, of which five are directly related to nanoelectronics. They are: ABACUS – device simulation module; ACUTE – Computational Electronics module; ANTSY – bending toolkit; and AQME – quantum mechanics module. The methodology behind tool-based curricula is discussed in detail. Then, the current status of each module is presented, including user statistics and student learning indicators. Particular simulation tools are explored further to demonstrate the ease with which students can grasp information. Representative of ABACUS is PN-Junction Lab; representative of AQME is the PCPBT tool; and

  7. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.


    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code needs to focus primarily on the tool functionality, thus greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  8. Detailed dynamic solid oxide fuel cell modeling for electrochemical impedance spectra simulation

    Hofmann, Ph. [Laboratory of Steam Boilers and Thermal Plants, School of Mechanical Engineering, Thermal Engineering Section, National Technical University of Athens, Heroon Polytechniou 9, 15780 Athens (Greece); Panopoulos, K.D. [Institute for Solid Fuels Technology and Applications, Centre for Research and Technology Hellas, 4th km. Ptolemais-Mpodosakeio Hospital, Region of Kouri, P.O. Box 95, GR 502, 50200 Ptolemais (Greece)


    This paper presents a detailed, flexible mathematical model for planar solid oxide fuel cells (SOFCs), which allows the simulation of steady-state performance characteristics, i.e. voltage-current density (V-j) curves, and dynamic operating behavior, with a special capability of simulating electrochemical impedance spectroscopy (EIS). The model is based on physico-chemical governing equations coupled with a detailed multi-component gas diffusion mechanism (the Dusty-Gas Model (DGM)) and a multi-step heterogeneous reaction mechanism implicitly accounting for the water-gas-shift (WGS), methane reforming and Boudouard reactions. Spatial discretization can be applied for 1D (button-cell approximation) up to quasi-3D (full-size anode-supported cell in cross-flow configuration) geometries and is resolved with the finite difference method (FDM). The model is built and implemented on the commercially available modeling and simulation platform gPROMS. Different fuels based on hydrogen, methane and syngas with inert diluents are run. The model is applied to demonstrate a detailed analysis of the SOFC inherent losses and their attribution to the EIS. This is achieved by means of a step-by-step analysis of the involved transient processes such as gas conversion in the main gas chambers/channels, gas diffusion through the porous electrodes together with the heterogeneous reactions on the nickel catalyst, and the double-layer current within the electrochemical reaction zone. The model is an important tool for analyzing SOFC performance fundamentals as well as for design and optimization of material and operational parameters. (author)

  9. Multiparameter double hole contrast detail phantom: Ability to detect image displacement due to off position anode stem

    Pauzi, Nur Farahana; Majid, Zafri Azran Abdul; Sapuan, Abdul Halim; Junet, Laila Kalidah [Department of Diagnostic Imaging and Radiotherapy, Kulliyyah of Allied Health Sciences, International Islamic University Malaysia, Jalan Istana, 25200, Kuantan, Pahang (Malaysia); Azemin, Mohd Zulfaezal Che [Department of Optometry and Visual Science, Kulliyyah of Allied Health Sciences, International Islamic University Malaysia, Jalan Istana, 25200, Kuantan, Pahang (Malaysia)


    A contrast-detail phantom is a quality control tool for analyzing the performance of imaging devices. Currently, its function is solely to evaluate the contrast-detail characteristics of an imaging system. It consists of drilled holes, through whose bases the diverging x-ray beam must pass. This divergence can displace the image of a hole from its original location, but the displacement is not visualized in the radiograph. In this study, a new contrast-detail phantom hole design consisting of a double-hole construction has been developed. It can detect the image displacement caused by an off-position anode stem. The double hole differs from the previously milled hole in that it combines different hole diameters: a small hole (3 mm diameter) is positioned on top of a larger hole (10 mm diameter). The thickness of the double-hole acrylic blocks is 13 mm. Results revealed that the Multiparameter Double Hole Contrast Detail phantom can visualize the shifted flaw image produced by an x-ray machine when the anode stem, which is attached to the rotor and stator, is improperly positioned. The effective focal spot of the x-ray beam is also shifted from the center of the collimator as a result of the off-position anode stem. In conclusion, the new double-hole contrast-detail phantom design is able to measure these parameters well.

  10. Introducción a phpMyAdmin (4/4)

    Suárez Cueto, Armando


    From the course "Introducción al desarrollo web" (Introduction to web development): inserting rows into a relationship table, modifying the structure of a table, and defining the values to display when inserting rows into a relationship table. More information at

  11. Fit to Fight: Admin or Ethos? Embedding Fitness in Air Force Culture


    cycle--from the pool of discharge candidates to commander-initiated discharge actions to the number of completed discharges--but such is not available...physical fitness program, and Hollywood feels three workouts should be "a mandatory part of the workweek." Here again, commanders do not have to

  12. Detailed Behavioral Assessment Promotes Accurate Diagnosis in Patients with Disorders of Consciousness

    Yael eGilutz


    Introduction: Assessing the level of awareness in patients with disorders of consciousness (DOC) is done on the basis of exhibited behaviors. However, since motor signs of awareness (i.e., non-reflex motor responses) can be very subtle, differentiating the vegetative from the minimally conscious state (a distinction which is in itself not clear-cut) is often challenging. Even the careful clinician relying on standardized scales may arrive at a wrong diagnosis. Aim: To report our experience in tackling this problem by using two in-house assessment procedures developed at Reuth Rehabilitation Hospital, and to demonstrate their clinical significance by reviewing two cases. Methods: 1. Reuth DOC Response Assessment (RDOC-RA), administered in addition to the standardized tools, which emphasizes the importance of assessing a wide range of motor responses. In our experience, in some patients the only evidence of awareness may be a private, specific movement that is not assessed by standard assessment tools. 2. Reuth DOC Periodic Intervention Model (RDOC-PIM): current literature regarding assessment and diagnosis in DOC refers mostly to the acute phase of up to one year post injury. However, we have found major changes in responsiveness occurring one year or more post injury in many patients. Therefore, we conduct periodic assessments at predetermined time points to ensure patients are not misdiagnosed and neurological changes are not overlooked. Results: In the first case the RDOC-RA promoted a more accurate diagnosis than that based on standardized scales alone. The second case shows how the RDOC-PIM allowed us to recognize late recovery and promoted reinstatement of treatment with good results. Conclusions: Adding a detailed periodic assessment of DOC patients to existing scales can yield critical information, promoting better diagnosis, treatment and clinical outcomes. We discuss the implications of this observation for the future development and validation of assessment tools in

  13. Detailed ultraviolet asymptotics for AdS scalar field perturbations

    Evnin, Oleg


    We present a range of methods suitable for accurate evaluation of the leading asymptotics for integrals of products of Jacobi polynomials in limits when the degrees of some or all polynomials inside the integral become large. The structures in question have recently emerged in the context of effective descriptions of small amplitude perturbations in anti-de Sitter (AdS) spacetime. The limit of high degree polynomials corresponds in this situation to effective interactions involving extreme short-wavelength modes, whose dynamics is crucial for the turbulent instabilities that determine the ultimate fate of small AdS perturbations. We explicitly apply the relevant asymptotic techniques to the case of a self-interacting probe scalar field in AdS and extract a detailed form of the leading large degree behavior, including closed form analytic expressions for the numerical coefficients appearing in the asymptotics.

  14. Thermal and mechanical analysis for the detailed model using submodel

    Kuh, Jung Eui; Kang, Chul Hyung; Park, Jeong Hwa [Korea Atomic Energy Research Institute, Taejon (Korea)


    A very large model is required for the thermo-mechanical (TM) analysis of an HLRW repository, yet a very small mesh size is needed to simulate precisely the main parts of the analysis, e.g., the canister and buffer. Satisfying both requirements directly is practically impossible due to memory size and computing time. In this report, the submodel concept in ABAQUS is used to handle this difficulty. In the submodel approach, only the part of interest is modelled in detail, and the solution of the full-scale model is applied as the boundary condition of this detailed submodel. Following this computational procedure, the temperature distribution in the buffer and canister can be computed precisely. This approach can be applied to TM analysis of the buffer and canister, or to a finite-size repository. 12 refs., 28 figs., 9 tabs. (Author)
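    The driven-boundary idea behind submodelling can be sketched in one dimension: solve (or take) the coarse global field, then interpolate it onto the boundary nodes of a fine submodel covering only the region of interest. This is a hypothetical toy illustration of the data flow, not the ABAQUS implementation; the linear temperature field and grid sizes are invented:

```python
import numpy as np

# Coarse global solution: temperature along a line through the
# repository (toy linear field on a coarse grid).
x_global = np.linspace(0.0, 1.0, 11)
T_global = 100.0 - 60.0 * x_global

# The submodel covers only the region of interest, with a fine mesh.
x_sub = np.linspace(0.2, 0.4, 81)

# Submodelling step: interpolate the global solution onto the
# submodel boundary (here its two end nodes) as driven values.
T_boundary = np.interp([x_sub[0], x_sub[-1]], x_global, T_global)

# The fine submodel is then solved with these boundary conditions;
# for steady 1-D conduction the interior solution is simply linear.
T_sub = np.interp(x_sub, [x_sub[0], x_sub[-1]], T_boundary)
```

In the real 3-D analysis the "interpolation" is ABAQUS's mapping of the global displacement/temperature solution onto the cut surfaces of the submodel, but the direction of information flow (global model drives submodel boundary) is the same.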


    Radha Bhati


    Oral mucosal drug delivery is widely applicable as a novel route of drug administration for immediate and controlled release action, avoiding first-pass metabolism and the enzymatic degradation caused by GI microbial flora. Oral mucosal drug delivery systems provide both local and systemic action. In this review, attention is focused on the physiology of the oral mucosa, including tissue permeability, barriers to permeation and routes of permeation; the biopharmaceutics of buccal and sublingual absorption; factors affecting drug absorption; detailed information on penetration enhancers; the design of oral mucosal drug delivery systems; and the role of mucoadhesion and various theories of bioadhesion. Evaluation techniques and the selection of animal models for in-vivo studies are also discussed.

  16. Horava-Lifshitz early universe phase transition beyond detailed balance

    Kheyri, F.; Khodadi, M.; Sepangi, H.R. [Shahid Beheshti University, Department of Physics, Tehran (Iran, Islamic Republic of)


    The early universe is believed to have undergone a QCD phase transition to hadrons at about 10 μs after the big bang. We study such a transition in the context of the non-detailed balance Horava-Lifshitz theory by investigating the effects of the dynamical coupling constant λ in a flat universe. The evolution of the relevant physical quantities, namely the energy density ρ, temperature T, scale factor a and the Hubble parameter H, is investigated before, during and after the phase transition, assumed to be of first order. Also, in view of the recent lattice QCD simulation data, we study a cross-over phase transition of the early universe whose results are based on two different sets of lattice data. (orig.)

  17. Detailed HI kinematics of Tully-Fisher calibrator galaxies

    Ponomareva, Anastasia A; Bosma, Albert


    We present spatially-resolved HI kinematics of 32 spiral galaxies which have Cepheid and/or Tip of the Red Giant Branch distances, and define a calibrator sample for the Tully-Fisher relation. The interferometric HI data for this sample were collected from available archives and supplemented with new GMRT observations. This paper describes a uniform analysis of the HI kinematics of this inhomogeneous data set. Our main result is an atlas for our calibrator sample that presents global HI profiles, integrated HI column-density maps, HI surface density profiles and, most importantly, detailed kinematic information in the form of high-quality rotation curves derived from highly-resolved, two-dimensional velocity fields and position-velocity diagrams.

  18. A detailed and verified wind resource atlas for Denmark

    Mortensen, N.G.; Landberg, L.; Rathmann, O.; Nielsen, M.N. [Risoe National Lab., Roskilde (Denmark); Nielsen, P. [Energy and Environmental Data, Aalberg (Denmark)


    A detailed and reliable wind resource atlas covering the entire land area of Denmark has been established. Key words of the methodology are wind atlas analysis, interpolation of wind atlas data sets, automated generation of digital terrain descriptions and modelling of local wind climates. The atlas contains wind speed and direction distributions, as well as mean energy densities of the wind, for 12 sectors and four heights above ground level: 25, 45, 70 and 100 m. The spatial resolution is 200 meters in the horizontal. The atlas has been verified by comparison with actual wind turbine power productions from over 1200 turbines. More than 80% of these turbines were predicted to within 10%. The atlas will become available on CD-ROM and on the Internet. (au)
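    A wind atlas of this kind typically stores, per grid point and sector, a Weibull scale A and shape k for the wind-speed distribution; the mean energy density then follows from the standard relation E = ½ρ⟨u³⟩ = ½ρA³Γ(1 + 3/k). The sketch below illustrates that calculation with invented sector values (the frequencies, A and k are hypothetical, not taken from the Danish atlas):

```python
import math

def mean_energy_density(A, k, rho=1.225):
    """Mean wind power density (W/m^2) for a Weibull wind-speed
    distribution with scale A (m/s) and shape k:
    E = 0.5 * rho * <u^3> = 0.5 * rho * A^3 * Gamma(1 + 3/k)."""
    return 0.5 * rho * A**3 * math.gamma(1.0 + 3.0 / k)

# Hypothetical climatology at one height: (sector frequency, A, k).
sectors = [(0.10, 6.5, 2.1),
           (0.15, 7.2, 2.0),
           (0.75, 8.0, 1.9)]

# Frequency-weighted mean energy density over all sectors.
E = sum(f * mean_energy_density(A, k) for f, A, k in sectors)
```

Because E scales with the cube of the Weibull scale parameter, small errors in the predicted wind speed translate into much larger errors in predicted energy, which is why verification against actual turbine production (as done here for over 1200 turbines) matters.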

  19. Detailed analysis of observed antiprotons in cosmic rays

    P Davoudifar


    In the present work, the origin of the antiprotons observed in cosmic rays (above the atmosphere) is analyzed in detail. We take the origin of the primaries (whose interactions with the interstellar medium are one of the most important sources of antiprotons) to be type II supernovae, and then use a diffusion model for their propagation. We have used the latest parameterization of the antiproton production cross section in pp collisions (instead of the well-known parameterization introduced by Tan et al.) as well as our calculated residence time for the primaries. The resulting intensity shows that the secondary antiprotons produced in pp collisions in the Galaxy have a high enough population that one cannot infer an excess of extragalactic antiprotons. There is also a high degree of uncertainty in the different parameters.

  20. In Brief: Report details climate change effects on cultural sites

    Zielinski, Sarah


    A new report from UNESCO (United Nations Educational, Scientific, and Cultural Organization) details how 26 World Heritage sites could be affected by coming climate changes. The 26 examples, which are meant to be representative of the range of threats to the 830 sites inscribed in the World Heritage List, are divided into five types: archaeological sites, glaciers, historic cities and settlements, marine biodiversity, and terrestrial biodiversity. Some of the examples include the Great Barrier Reef, which is expected to experience more frequent episodes of coral bleaching; Timbuktu in Mali, threatened by desertification; and the Chavín Archaeological Site in the Peruvian Central Andes, one of the earliest and best-known pre-Columbian sites, which could be affected by glacier melting and flooding. The report, ``Case Studies on Climate Change and World Heritage,'' is available at

  1. The AECL study for an intense neutron - generator (technical details)

    Bartholomew, G.A.; Tunnicliffe, P.R.


    The AECL study for an intense neutron generator has been in progress for two years. Recently the scientific and technical details and the conceptual designs were compiled in a report supporting proposals, addressed to AECL's Board of Directors, for further work. The compilation is being issued in this form to permit further discussion of the technical aspects. However, readers are asked to appreciate that it was written primarily for an AECL audience, and specifically that those chapters giving tentative information about costs, the rate of investment and similar items have been omitted or modified. Many references have been made to interim internal reports in order to complete the local documentation, but these references do not imply that the reports themselves can be made generally available. (author)

  2. Detailed explicit solution of the electrodynamic wave equations

    Iryna Yu. Dmitrieva


    The present results concern the general scientific tendency of mathematical modeling and analytical study of electromagnetic field phenomena described by systems of partial differential equations. A specific electrodynamic engineering process with expofunctional influences is simulated by the differential Maxwell system, whose effective study is equivalent to the rigorous solution of the general wave partial differential equation for all scalar components of the electromagnetic field vector intensities. The given equation is solved explicitly and in detail using the method of integral transforms, independently of the concrete boundary conditions. The specific cases of unexcited vacuum and an isotropic homogeneous medium are considered. The proposed approach can be applied to any finite-dimensional system of partial differential equations with piecewise constant coefficients and to its corresponding scalar equations representing mathematical models in modern electrodynamics. In comparison with known results, the current research is thorough and accurate, which implies its direct practical application.

  3. A detailed study of patent system for protection of inventions

    Tulasi G


    Creations of the mind are called intellect. Since these creations have commercial value, they are called property. Inventions are intellectual property and can be protected by patents, provided the invention is novel, non-obvious, useful and enabled. To ensure fair trade among member countries, the World Trade Organisation proposed the TRIPS agreement. India took the necessary initiative by signing the World Trade Organisation agreement and adapting to global needs. The aim of this article is to enlighten pharmaceutical professionals, especially in the field of research and development, about planning inventions through a thorough review of the prior art, which saves time and money. A thorough understanding is made possible by providing details of the origin and the present governing bodies and their roles, along with the Act that safeguards the patent system.

  4. San Joaquin-Tulare Conjunctive Use Model: Detailed model description

    Quinn, N.W.T.


    The San Joaquin-Tulare Conjunctive Use Model (SANTUCM) was originally developed for the San Joaquin Valley Drainage Program to evaluate possible scenarios for long-term management of drainage and drainage-related problems in the western San Joaquin Valley of California. A unique aspect of this model is its coupling of a surface water delivery and reservoir operations model with a regional groundwater model. The model also performs salinity balances along the tributaries and along the main stem of the San Joaquin River to allow assessment of compliance with State Water Resources Control Board water quality objectives for the San Joaquin River. This document is a detailed description of the various subroutines, variables and parameters used in the model.

  5. The Detailed Chemical Abundance Patterns of M31 Globular Clusters

    Colucci, J E; Cohen, J


    We present detailed chemical abundances for $>$20 elements in $\sim$30 globular clusters in M31. These results have been obtained using high resolution ($\lambda/\Delta\lambda\sim$24,000) spectra of their integrated light, analyzed using our original method. The globular clusters have galactocentric radii between 2.5 kpc and 117 kpc, and therefore provide abundance patterns for different phases of galaxy formation recorded in the inner and outer halo of M31. We find that the clusters in our survey have a range in metallicity of $-2.2 <$ [Fe/H] $< -0.1$, and that the clusters at $>$20 kpc have a small range in abundance of [Fe/H]$=-1.6 \pm 0.10$. We also measure abundances of alpha, r- and s-process elements. These results constitute the first abundance pattern constraints for old populations in M31 that are comparable to those known for the Milky Way halo.

  6. A Detailed Strategy for Managing Corporation Cyber War Security

    Walid Al-Ahmad


    Modern corporations depend heavily on information and communication technologies and are becoming increasingly interconnected locally and internationally. This interconnectedness and dependency on information technology make corporations vulnerable to cyber attacks. Corporate managers therefore need to understand the growing cyber war threats and implement appropriate strategies to mitigate the risks. This research work is an attempt to develop a generic and detailed strategy to assist corporations in managing cyber war security. The implementation of such a strategy will lead to a more secure business environment and, as a result, will attract foreign investment to the Arab countries in the Middle East. Such a strategy can be considered a first step toward protecting corporations from cyber war threats in an effective manner.

  7. Detailed CFD Modelling of Open Refrigerated Display Cabinets

    Pedro Dinis Gaspar


    A comprehensive and detailed computational fluid dynamics (CFD) modelling of air flow and heat transfer in an open refrigerated display cabinet (ORDC) is performed in this study. The physical-mathematical model considers the flow through the internal ducts, across the fans and evaporator, and includes the thermal response of food products. The air humidity effect and thermal radiation heat transfer between surfaces are taken into account. Experimental tests were performed to characterize the phenomena near the physical extremities and to validate the numerical predictions of air temperature, relative humidity, and velocity. Comparison of the numerical and experimental results reveals the predictive capabilities of the computational model for the optimized conception and development of this type of equipment. Numerical predictions are used to propose geometrical and functional parametric studies that improve the thermal performance of the ORDC and, consequently, food safety.

  8. Detailed 3-D nuclear analysis of ITER outboard blanket modules

    Bohm, Tim, E-mail: [Fusion Technology Institute, University of Wisconsin-Madison, Madison, WI (United States); Davis, Andrew; Sawan, Mohamed; Marriott, Edward; Wilson, Paul [Fusion Technology Institute, University of Wisconsin-Madison, Madison, WI (United States); Ulrickson, Michael; Bullock, James [Formerly, Fusion Technology, Sandia National Laboratories, Albuquerque, NM (United States)


    Highlights: • Nuclear analysis was performed on detailed CAD models placed in a 40 degree model of ITER. • The regions examined include BM09, the upper ELM coil region (BM11–13), the neutral beam (NB) region (BM13–16), and BM18. • The results show that VV nuclear heating exceeds limits in the NB and upper ELM coil regions. • The results also show that the level of He production in parts of BM18 exceeds limits. • These calculations are being used to modify the design of the ITER blanket modules. - Abstract: In the ITER design, the blanket modules (BM) provide thermal and nuclear shielding for the vacuum vessel (VV), magnets, and other components. We used the CAD based DAG-MCNP5 transport code to analyze detailed models inserted into a 40 degree partially homogenized ITER global model. The regions analyzed include BM09, BM16 near the heating neutral beam injection (HNB) region, BM11–13 near the upper ELM coil region, and BM18. For the BM16 HNB region, the VV nuclear heating behind the NB region exceeds the design limit by up to 80%. For the BM11–13 region, the nuclear heating of the VV exceeds the design limit by up to 45%. For BM18, the results show that He production does not meet the limit necessary for re-welding. The results presented in this work are being used by the ITER Organization Blanket and Tokamak Integration groups to modify the BM design in the cases where limits are exceeded.

  9. Rockballer Sample Acquisition Tool

    Giersch, Louis R.; Cook, Brant T.


    It would be desirable to acquire rock and/or ice samples that extend below the surface of the parent rock or ice in extraterrestrial environments such as the Moon, Mars, comets, and asteroids. Such samples would allow measurements to be made further back into the geologic history of the rock, providing critical insight into the history of the local environment and the solar system. Such samples could also be necessary for sample return mission architectures that would acquire samples from extraterrestrial environments for return to Earth for more detailed scientific investigation.

  10. Gigantic Cosmic Corkscrew Reveals New Details About Mysterious Microquasar


    Making an extra effort to image a faint, gigantic corkscrew traced by fast protons and electrons shot out from a mysterious microquasar paid off for a pair of astrophysicists who gained new insights into the beast's inner workings and also resolved a longstanding dispute over the object's distance. [VLA image of microquasar SS 433. Credit: Blundell & Bowler, NRAO/AUI/NSF] The astrophysicists used the National Science Foundation's Very Large Array (VLA) radio telescope to capture the faintest details yet seen in the plasma jets emerging from the microquasar SS 433, an object once dubbed the "enigma of the century." As a result, they have changed scientists' understanding of the jets and settled the controversy over its distance "beyond all reasonable doubt," they said. SS 433 is a neutron star or black hole orbited by a "normal" companion star. The powerful gravity of the neutron star or black hole draws material from the stellar wind of its companion into an accretion disk of material tightly circling the dense central object prior to being pulled onto it. This disk propels jets of fast protons and electrons outward from its poles at about a quarter of the speed of light. The disk in SS 433 wobbles like a child's top, causing its jets to trace a corkscrew in the sky every 162 days. The new VLA study indicates that the speed of the ejected particles varies over time, contrary to the traditional model for SS 433. "We found that the actual speed varies between 24 percent to 28 percent of light speed, as opposed to staying constant," said Katherine Blundell, of the University of Oxford in the United Kingdom. "Amazingly, the jets going in both directions change their speeds simultaneously, producing identical speeds in both directions at any given time," Blundell added. Blundell worked with Michael Bowler, also of Oxford. The scientists' findings have been accepted by the Astrophysical Journal Letters.

  11. Micro-computed tomography for small animal imaging: Technological details

    Hao Li; Hui Zhang; Zhiwei Tang; Guangshu Hu


    The high-resolution micro-computed tomography (micro-CT) system has now become an important tool for biological research. The micro-CT system enables non-invasive inspection to screen anatomical changes in small animals. The promising advantages include high spatial resolution, high sensitivity to bone and lung, short scan time and cost-effectiveness. The dose received by the small animal might be a critical concern in the research. In this article, the choice of components, fundamental physical problems, the image reconstruction algorithm and representative applications of micro-CT are summarized. Some results from our research group are also presented to show the high-resolution images obtained by the micro-CT system.

  12. The Design of Tools for Sketching Sensor-Based Interaction

    Brynskov, Martin; Lunding, Rasmus; Vestergaard, Lasse Steenbock


    ...flexibility and cost, aimed at wearable and ultra-mobile prototyping where fast reaction is needed (e.g. in controlling sound), and we discuss the general issues facing this category of embodied interaction design tools. We then present the platform in more detail, both regarding hardware and software... In the brief evaluation, we present our initial experiences with the platform both in design projects and in teaching. We conclude that DUL Radio does seem to be a relatively easy-to-use tool for sketching sensor-based interaction compared to other solutions, but that there are many ways to improve it. Target users include designers, students, artists etc. with minimal programming and hardware skills, but this paper addresses the issues with designing the tools, which includes technical details...

  13. APT: Aperture Photometry Tool

    Laher, Russ


    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including an image histogram, aperture slices, a source scatter plot, a sky scatter plot, a sky histogram, a radial profile, a curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
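
    The core calculation behind each mouse click described above can be sketched simply: sum the pixel values inside a circular aperture, estimate the per-pixel sky level as the median of an annulus, and subtract. The Python function below is a hypothetical illustration of that calculation, not APT's actual code.

```python
import math
import statistics

def aperture_photometry(image, cx, cy, r_ap, r_in, r_out):
    """Sum pixel values inside a circular aperture centered at (cx, cy)
    and subtract the local sky level, estimated as the median of an
    annulus between radii r_in and r_out."""
    aperture, annulus = [], []
    for y, row in enumerate(image):
        for x, val in enumerate(row):
            d = math.hypot(x - cx, y - cy)
            if d <= r_ap:
                aperture.append(val)
            elif r_in <= d <= r_out:
                annulus.append(val)
    sky = statistics.median(annulus)              # per-pixel background
    intensity = sum(aperture) - sky * len(aperture)
    return intensity, sky

# Synthetic 21x21 image: flat sky of 10 with a bright 3x3 source of +100.
image = [[10.0] * 21 for _ in range(21)]
for y in range(9, 12):
    for x in range(9, 12):
        image[y][x] += 100.0

intensity, sky = aperture_photometry(image, 10, 10, 4.0, 6.0, 9.0)
print(round(intensity, 1), sky)   # → 900.0 10.0
```

    On this synthetic image the 3x3 source contributes exactly 900 counts above the flat sky, which the calculation recovers; real tools such as APT add uncertainty estimates, outlier rejection and source models on top of this basic scheme.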

  14. Social Data Analysis Tool

    Hussain, Abid; Vatrapu, Ravi; Hardt, Daniel


    As governments, citizens and organizations have moved online, there is an increasing need for academic enquiry to adapt to this new context for communication and political action. This adaptation is crucially dependent on researchers being equipped with the necessary methodological tools to extract and analyze web data in the process of investigating substantive questions, and to analyze and visualize patterns of web activity. This volume profiles the latest techniques being employed by social scientists to collect and interpret data from some of the most popular social media applications, the political parties' own online activist spaces, and the wider system of hyperlinks...

  15. Water Powered Tools


    Space Spin-Offs, Inc. under a contract with Lewis Research Center and Marshall Space Flight Center produced a new water-powered saw that cuts through concrete and steel plate reducing danger of explosion or electric shock in rescue and other operations. In prototype unit efficient water-powered turbine drives an 8 inch diameter grinding disk at 6,600 rpm. Exhaust water cools disk and workpiece quenching any sparks produced by cutting head. At maximum power, tool easily cuts through quarter inch steel plate. Adapter heads for chain saws, impact wrenches, heavy duty drills, and power hack saws can be fitted.

  16. Cyber Security Evaluation Tool


    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  17. Jupiter Environment Tool

    Sturm, Erick J.; Monahue, Kenneth M.; Biehl, James P.; Kokorowski, Michael; Ngalande, Cedrick; Boedeker, Jordan


    The Jupiter Environment Tool (JET) is a custom UI plug-in for STK that provides an interface to Jupiter environment models for visualization and analysis. Users can visualize the different magnetic field models of Jupiter through various rendering methods, which are fully integrated within STK's 3D Window. This allows users to take snapshots and make animations of their scenarios with magnetic field visualizations. Analytical data can be accessed in the form of custom vectors. Given these custom vectors, users have access to magnetic field data in custom reports, graphs, access constraints, coverage analysis, and anywhere else vectors are used within STK.

  18. Geometric reasoning about assembly tools

    Wilson, R.H.


    Planning for assembly requires reasoning about various tools used by humans, robots, or other automation to manipulate, attach, and test parts and subassemblies. This paper presents a general framework to represent and reason about geometric accessibility issues for a wide variety of such assembly tools. Central to the framework is a use volume encoding a minimum space that must be free in an assembly state to apply a given tool, and placement constraints on where that volume must be placed relative to the parts on which the tool acts. Determining whether a tool can be applied in a given assembly state is then reduced to an instance of the FINDPLACE problem. In addition, the author presents more efficient methods to integrate the framework into assembly planning. For tools that are applied either before or after their target parts are mated, one method pre-processes a single tool application for all possible states of assembly of a product in polynomial time, reducing all later state-tool queries to evaluations of a simple expression. For tools applied after their target parts are mated, a complementary method guarantees polynomial-time assembly planning. The author presents a wide variety of tools that can be described adequately using the approach, and surveys tool catalogs to determine coverage of standard tools. Finally, the author describes an implementation of the approach in an assembly planning system and experiments with a library of over one hundred manual and robotic tools and several complex assemblies.
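
    The central test in this framework, whether a tool's use volume can be placed free of interference in a given assembly state, can be sketched with axis-aligned boxes standing in for real geometry. This is a simplified illustration under stated assumptions: real FINDPLACE instances involve full part geometry and richer placement constraints, and all names below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box given by its min and max corners."""
    lo: tuple
    hi: tuple

    def intersects(self, other):
        # Boxes overlap iff their intervals overlap on every axis.
        return all(self.lo[i] < other.hi[i] and other.lo[i] < self.hi[i]
                   for i in range(3))

def tool_applicable(use_volume, placement_offset, parts):
    """A tool can be applied if its use volume, translated to the
    placement dictated by the target parts, overlaps no part geometry."""
    placed = Box(
        tuple(use_volume.lo[i] + placement_offset[i] for i in range(3)),
        tuple(use_volume.hi[i] + placement_offset[i] for i in range(3)),
    )
    return not any(placed.intersects(p) for p in parts)

# A wrench needs a 2x2x5 clearance column above the bolt it acts on.
use_volume = Box((-1, -1, 0), (1, 1, 5))
parts = [Box((0, 0, 0), (10, 10, 1)),    # base plate
         Box((4, 3, 1), (6, 5, 4))]      # bracket next to the first bolt
print(tool_applicable(use_volume, (4, 4, 1), parts),
      tool_applicable(use_volume, (8, 8, 1), parts))   # → False True
```

    The first query fails because the bracket occupies the clearance column; the second bolt, away from the bracket, is accessible. Pre-processing in the paper amounts to answering many such queries cheaply across assembly states.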

  19. Tool for system optimisation; Electric motors; Vaerktoej til brug ved systemoptimering

    Nielsen, Sandie B.; Jespersen, P.T.; Hvenegaard, C.M. (Teknologisk Institut, Taastrup (Denmark))


    The report describes in detail the development of models of belt transmissions, gears and motor and frequency converter, which is the core of the tool. The report also contains a detailed example of using the program for optimization of a ventilation system. With the tool it is possible to design an energy efficient system, where each individual component is energy efficient and where the components are adjusted relative to demand. (LN)

  20. Tools for Social Construction

    Brynskov, Martin

    In this dissertation, I propose a new type of playful media for children. Based on field work, prototypes, and theoretical development, I define, build, and explore a distinct cross-section of existing and new digital media, tools, and communication devices in order to assess the characteristics... as transformative social play. Through a series of three experimental prototype systems (StarCatcher, DARE!, and Pervasive Storyboard), I conduct an iterative design process leading to a formulation of pervasive narratives as the general concept which, I argue, holds promise as the core of future tools... The theoretical concepts and analyses are based on theories of play and games and on the activity-based digital habitat framework. The design activities result in experimental prototypes which realize the concepts and, in turn, allow for iterative evaluation and validation of the theoretical models...

  1. Materials science tools for regenerative medicine

    Richardson, Wade Nicholas

    Regenerative therapies originating from recent technological advances in biology could revolutionize medicine in the coming years. In particular, the advent of human pluripotent stem cells (hPSCs), with their ability to become any cell in the adult body, has opened the door to an entirely new way of treating disease. However, currently these medical breakthroughs remain only a promise. To make them a reality, new tools must be developed to surmount the new technical hurdles that have arisen from the dramatic departure from convention that this field represents. The collected work presented in this dissertation covers several projects that seek to apply the skills and knowledge of materials science to this tool-synthesizing effort. The work is divided into three chapters. The first deals with our work to apply Raman spectroscopy, a tool widely used for materials characterization, to degeneration in cartilage. We have shown that Raman can effectively distinguish the matrix material of healthy and diseased tissue. The second area of work covered is the development of a new confocal image analysis for studying hPSC colonies that are chemically confined to uniform growth regions. This tool has important application in understanding the heterogeneity that may slow the development of hPSC-based treatment, as well as the use of such confinement in the eventual large-scale manufacture of hPSCs for therapeutic use. Third, the use of structural templating in tissue engineering scaffolds is detailed. We have utilized templating to tailor scaffold structures for engineering of constructs mimicking two tissues: cartilage and lung. The work described here represents several important early steps towards large goals in regenerative medicine. These tools show a great deal of potential for accelerating progress in this field that seems on the cusp of helping a great many people with otherwise incurable disease.

  2. Chair designed by a Finn wins an award


    The chair Haiku, created by Isku OY designer Tapio Anttila, won the Adex Silver award in the furniture category at the international design competition ADEX 2004. Last year the same chair received the Good Design Award from the Chicago museum of architecture and design.

  3. Hinged Shields for Machine Tools

    Lallande, J. B.; Poland, W. W.; Tull, S.


    Flaps guard against flying chips, but fold away for tool setup. Clear plastic shield in position to intercept flying chips from machine tool and retracted to give operator access to workpiece. Machine shops readily make such shields for own use.

  4. Transformational Tools and Technologies Project

    National Aeronautics and Space Administration — The Transformational Tools and Technologies (TTT) Project advances state-of-the-art computational and experimental tools and technologies that are vital to aviation...

  5. New Tool to Predict Glaucoma

    ... determine if a patient has glaucoma. Recently, a new tool has become available to eye care specialists ...

  6. Case and Administrative Support Tools

    U.S. Environmental Protection Agency — Case and Administrative Support Tools (CAST) is the secure portion of the Office of General Counsel (OGC) Dashboard business process automation tool used to help...

  7. Detailed source process of the 2007 Tocopilla earthquake.

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.


    We investigated the detailed rupture process of the Tocopilla earthquake (Mw 7.7) of 14 November 2007 and of the main aftershocks that occurred in the southern part of the North Chile seismic gap, using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the last large thrust subduction earthquake to occur since the major Iquique earthquake of 1877, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes had already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake did not extend further north and, to the south, what role the Mejillones peninsula plays, as it seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data clearly show two S...

  8. Petri Net Tool Overview 1986

    Jensen, Kurt; Feldbrugge, Frits


    This paper provides an overview of the characteristics of all currently available net based tools. It is a compilation of information provided by tool authors or contact persons. A concise one page overview is provided as well.

  9. Online Algebraic Tools for Teaching

    Kurz, Terri L.


    Many free online tools exist to complement algebraic instruction at the middle school level. This article presents findings that analyzed the features of algebraic tools to support learning. The findings can help teachers select appropriate tools to facilitate specific topics. (Contains 1 table and 4 figures.)

  10. TNO monitoring plan development tool

    Sijacic, D.; Wildenborg, T.; Steeghs, P.


    TNO has developed a software tool that supports the design of a risk-based monitoring plan for a CO2 storage site. The purpose of the tool is to aid storage site operators by facilitating a structured monitoring technologies selection or evaluation process. The tool makes a selection this recommende

  11. Security for ICT collaboration tools

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.


    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, much research is done on tooling to create such a common information base in a collaboration tool. However, security is often no...

  12. Productivity Tools for the Classroom.

    Schiffman, Shirl S.


    Presents rationale for including use of productivity tool software--database management systems, spreadsheets, graphics software, word processing--in classrooms and reviews appropriate strategies for introducing students to these tools. Discussion covers adaptability of these tools to various academic disciplines and illustrates how students…

  13. Detailed thermodynamic analyses of high-speed compressible turbulence

    Towery, Colin; Darragh, Ryan; Poludnenko, Alexei; Hamlington, Peter


    Interactions between high-speed turbulence and flames (or chemical reactions) are important in the dynamics and description of many different combustion phenomena, including autoignition and deflagration-to-detonation transition. The probability of these phenomena to occur depends on the magnitude and spectral content of turbulence fluctuations, which can impact a wide range of science and engineering problems, from the hypersonic scramjet engine to the onset of Type Ia supernovae. In this talk, we present results from new direct numerical simulations (DNS) of homogeneous isotropic turbulence with turbulence Mach numbers ranging from 0.05 to 1.0 and Taylor-scale Reynolds numbers as high as 700. A set of detailed analyses are described in both Eulerian and Lagrangian reference frames in order to assess coherent (structural) and incoherent (stochastic) thermodynamic flow features. These analyses provide direct insights into the thermodynamics of strongly compressible turbulence. Furthermore, presented results provide a non-reacting baseline for future studies of turbulence-chemistry interactions in DNS with complex chemistry mechanisms. This work was supported by the Air Force Office of Scientific Research (AFOSR) under Award No. FA9550-14-1-0273, and the Department of Defense (DoD) High Performance Computing Modernization Program (HPCMP) under a Frontier project award.

  14. Lévy targeting and the principle of detailed balance.

    Garbaczewski, Piotr; Stephanovich, Vladimir


    We investigate confining mechanisms for Lévy flights under premises of the principle of detailed balance. In this case, the master equation of the jump-type process admits a transformation to the Lévy-Schrödinger semigroup dynamics akin to a mapping of the Fokker-Planck equation into the generalized diffusion equation. This sets a correspondence between the above two stochastic dynamical systems, within which we address a (stochastic) targeting problem for an arbitrary stability index μ ∈ (0,2) of symmetric Lévy drivers. Namely, given a probability density function, specify the semigroup potential, and thence the jump-type dynamics for which this PDF is actually a long-time asymptotic (target) solution of the master equation. Here, an asymptotic behavior of different μ-motion scenarios ceases to depend on μ. That is exemplified by considering Gaussian and Cauchy family target PDFs. A complementary problem of reverse engineering is analyzed: given a priori a semigroup potential, quantify how sensitive upon the choice of the μ driver is an asymptotic behavior of solutions of the associated master equation and thus an invariant PDF itself. This task is accomplished for the so-called μ family of Lévy oscillators.
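
    The targeting construction the abstract describes can be sketched schematically; the notation below is our illustrative reconstruction of the standard detailed-balance mapping, not necessarily the paper's own.

```latex
% Given a target PDF \rho_*(x) and stability index \mu \in (0,2),
% define the semigroup potential from the fractional generator:
V(x) \;=\; -\,\frac{\bigl(|\Delta|^{\mu/2}\,\rho_*^{1/2}\bigr)(x)}{\rho_*^{1/2}(x)} .
% The substitution \rho(x,t) = \theta(x,t)\,\rho_*^{1/2}(x) turns the
% detailed-balance master equation into semigroup dynamics
\partial_t \theta \;=\; -\bigl(|\Delta|^{\mu/2} + V\bigr)\,\theta ,
% for which \rho_*^{1/2} is, by construction, a zero-energy ground state:
\bigl(|\Delta|^{\mu/2} + V\bigr)\,\rho_*^{1/2} \;=\; 0 ,
% so that \theta \to \rho_*^{1/2}, and hence \rho \to \rho_*, as t \to \infty.
```

    Reading the first line forward solves the targeting problem (PDF given, potential sought); reading it backward is the reverse-engineering problem the abstract also addresses.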

  15. Detailed study of flat bands appearing in metallic photonic crystals

    Vala, Ali Soltani [Department of Solid State Physics, Faculty of Physics, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of); Research Institute for Applied Physics and Astronomy, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of); Sedghi, Aliasghar; Hosseini, Naser [Department of Solid State Physics, Faculty of Physics, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of); Kalafi, Manouchehr [Department of Solid State Physics, Faculty of Physics, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of); Research Institute for Applied Physics and Astronomy, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of); Excellence Centre for Photonics, University of Tabriz, PO Box 51665-163, Tabriz (Iran, Islamic Republic of)


    It has been difficult to compute the band structures of metallic photonic crystals for H-polarization. The existence of surface plasmon modes is the major reason for this difficulty, due to the localized nature of these modes. In this study, by virtue of the efficiency of the newly developed Dirichlet-to-Neumann map method, we are able to investigate the details of the flat bands in a two-dimensional square lattice with metallic cylinders. We have obtained a fine band structure for H polarization around the flat band region, which has not been reported before to the best of our knowledge. Our numerical results show that for frequencies around the surface plasmon frequency the modes are highly localized at the interface of the cylindrical metallic rods and the air background, and also that as the modes approach the surface plasmon frequency the localization length decreases and the number of field nodes increases considerably. (copyright 2011 WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim) (orig.)

  16. Urban scale air quality modelling using detailed traffic emissions estimates

    Borrego, C.; Amorim, J. H.; Tchepel, O.; Dias, D.; Rafael, S.; Sá, E.; Pimentel, C.; Fontes, T.; Fernandes, P.; Pereira, S. R.; Bandeira, J. M.; Coelho, M. C.


    The atmospheric dispersion of NOx and PM10 was simulated with a second generation Gaussian model over a medium-size south-European city. Microscopic traffic models calibrated with GPS data were used to derive typical driving cycles for each road link, while instantaneous emissions were estimated applying a combined Vehicle Specific Power/Co-operative Programme for Monitoring and Evaluation of the Long-range Transmission of Air Pollutants in Europe (VSP/EMEP) methodology. Site-specific background concentrations were estimated using time series analysis and a low-pass filter applied to local observations. Air quality modelling results are compared against measurements at two locations for a 1 week period. 78% of the results are within a factor of two of the observations for 1-h average concentrations, increasing to 94% for daily averages. Correlation significantly improves when background is added, with an average of 0.89 for the 24 h record. The results highlight the potential of detailed traffic and instantaneous exhaust emissions estimates, together with filtered urban background, to provide accurate input data to Gaussian models applied at the urban scale.
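    The factor-of-two agreement statistic quoted above (commonly called FAC2) is straightforward to compute; the sketch below is illustrative, using made-up paired model/observation values rather than the actual study data.

```python
import numpy as np

def fac2(model, obs):
    """Fraction of model predictions within a factor of two of the
    paired observations -- the FAC2 model-skill metric."""
    model = np.asarray(model, dtype=float)
    obs = np.asarray(obs, dtype=float)
    ratio = model / obs
    return float(np.mean((ratio >= 0.5) & (ratio <= 2.0)))

# Toy example: predictions of 10, 40 and 5 against observations of 10.
score = fac2([10.0, 40.0, 5.0], [10.0, 10.0, 10.0])
```

Only the first and third predictions fall within a factor of two, so the toy score is 2/3.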

  17. Detailed 3-D nuclear analysis of ITER blanket modules

    Bohm, T.D., E-mail: [University of Wisconsin-Madison, Madison, WI (United States); Sawan, M.E.; Marriott, E.P.; Wilson, P.P.H. [University of Wisconsin-Madison, Madison, WI (United States); Ulrickson, M.; Bullock, J. [Sandia National Laboratories, Albuquerque, NM (United States)


    In ITER, the blanket modules (BM) are arranged around the plasma to provide thermal and nuclear shielding for the vacuum vessel (VV), magnets, and other components. As a part of the BM design process, nuclear analysis is required to determine the level of nuclear heating, helium production, and radiation damage in the BM. Additionally, nuclear heating in the VV is also important for assessing the BM design. We used the CAD-based DAG-MCNP5 transport code to analyze detailed models inserted into a 40-degree partially homogenized ITER global model. The regions analyzed include BM01, the neutral beam injection (NBI) region, and the upper port region. For BM01, the results show that He production meets the limit necessary for re-welding, and the VV heating behind BM01 is acceptable. For the NBI region, the VV nuclear heating behind it exceeds the design limit by a factor of two. For the upper port region, the nuclear heating of the VV exceeds the design limit by up to 20%. The results presented in this work are being used to modify the BM design in the cases where limits are exceeded.

  18. Detailed analysis of structure and particle trajectories in sheared suspensions

    Morris, Jeffrey; Katyal, Bhavana


    The structure and particle dynamics of sheared suspensions of hard spheres over a range of the ratio of shear strength to Brownian motion (Péclet number, Pe) have been studied by detailed analysis of extended sampling of Stokesian Dynamics simulations of simple shear. The emphasis is upon large Pe. The structure has been analyzed by decomposition of the pair distribution function, g(r), into spherical harmonics; the harmonics form a complete set for the decomposition. The results indicate a profound change in structure due to shearing. It is shown that as Pe increases, the structure is increasingly distorted from the equilibrium spherical symmetry and the number of harmonics required to recompose the original data to within an arbitrary accuracy increases; this variation depends upon particle fraction. We present information on the content of the dominant harmonics as a function of the radial distance for a pair, and interpret the results in terms of preferred directions in the material. Dynamic particle trajectories at time scales long relative to that used for the Brownian step are analyzed in a novel fashion by simple differential-geometric measures, such as root-mean-square path curvature and torsion. Preliminary results illustrate that the path variation from the mean flow correlates with the particle stress.
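    The harmonic decomposition described above amounts to projecting angular data onto an orthonormal basis on the sphere. A minimal numerical sketch is shown below; the choice of the single Y_2,0 harmonic and the simple midpoint quadrature grid are illustrative assumptions, not the paper's actual procedure.

```python
import numpy as np

def y20(phi):
    """Real spherical harmonic Y_2,0 (phi is the polar angle)."""
    return np.sqrt(5.0 / (16.0 * np.pi)) * (3.0 * np.cos(phi) ** 2 - 1.0)

def project(f, basis, n=400):
    """Coefficient <basis, f> by midpoint quadrature over the sphere,
    with solid-angle element dOmega = sin(phi) dphi dtheta."""
    theta = (np.arange(n) + 0.5) * 2.0 * np.pi / n   # azimuth midpoints
    phi = (np.arange(n) + 0.5) * np.pi / n           # polar midpoints
    T, P = np.meshgrid(theta, phi, indexing="ij")
    dtheta, dphi = 2.0 * np.pi / n, np.pi / n
    return float(np.sum(f(T, P) * basis(P) * np.sin(P)) * dtheta * dphi)

# Projecting Y_2,0 onto itself recovers a coefficient of 1 (orthonormality).
coeff = project(lambda t, p: y20(p), y20)
```

In the paper's setting, f would be the sampled g(r) on a shell of fixed pair separation, and the projection would be repeated over many harmonics.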

  19. Flow effects on jet energy loss with detailed balance

    CHENG Luan; LIU Jia; WANG EnKe


    In the presence of collective flow, a new model potential describing the interaction of a hard jet with scattering centers is derived based on the static color-screened Yukawa potential. The flow effect on jet quenching with detailed balance is investigated in pQCD. It turns out that, considering collective flow with velocity vz along the jet direction, the flow decreases the LPM destructive interference compared to that in a static medium. Gluon absorption plays a more important role in the moving medium. The collective flow increases the energy gain from gluon absorption but decreases the energy loss from gluon radiation, which is (1 - vz) times that in the static medium to first order in opacity. In the presence of collective flow, the second-order-in-opacity correction is relatively small compared to the first order, so the total effective energy loss is decreased. The flow dependence of the energy loss will affect the suppression of the high-pT hadron spectrum and the anisotropy parameter v2 in high-energy heavy-ion collisions.

  20. Detailed balance limit efficiency of silicon intermediate band solar cells

    Cao Quan; Ma Zhi-Hua; Xue Chun-Lai; Zuo Yu-Hua; Wang Qi-Ming


    The detailed balance method is used to study the potential of the intermediate band solar cell (IBSC), which can improve the efficiency of a Si-based solar cell with a bandgap between 1.1 eV and 1.7 eV. It shows that a crystalline silicon solar cell with an intermediate band located 0.36 eV below the conduction band or above the valence band can reach a limiting efficiency of 54% at maximum light concentration, a great improvement over the 40.7% Shockley-Queisser limit for the single-junction Si solar cell. The simulation also shows that the limiting efficiency of the silicon-based solar cell increases as the bandgap increases from 1.1 eV to 1.7 eV, and that an amorphous Si solar cell with a bandgap of 1.7 eV exhibits a radiative limiting efficiency of 62.47%, giving it even better potential.
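    As a rough illustration of the detailed-balance approach, the "ultimate efficiency" step (every above-gap photon delivers exactly the gap energy, with recombination neglected) can be computed numerically. The 6000 K blackbody sun and the integration grid below are simplifying assumptions; this sketch does not reproduce the paper's full IBSC calculation.

```python
import numpy as np

def ultimate_efficiency(eg_ev, t_sun=6000.0):
    """Fraction of blackbody power converted when each photon with
    energy above the bandgap eg_ev contributes exactly eg_ev.
    This is only the first, idealized step of a detailed-balance
    calculation (no recombination, full concentration)."""
    k_b = 8.617e-5                        # Boltzmann constant, eV/K
    xg = eg_ev / (k_b * t_sun)            # reduced bandgap energy
    x = np.linspace(1e-6, 50.0, 200_000)  # reduced photon energy grid
    dx = x[1] - x[0]
    photons = x ** 2 / np.expm1(x)        # blackbody photon flux spectrum
    power = x ** 3 / np.expm1(x)          # blackbody power spectrum
    absorbed = photons[x >= xg].sum() * dx
    return xg * absorbed / (power.sum() * dx)
```

For a 1.1 eV gap this yields roughly 44%; the full detailed-balance treatment, which adds radiative recombination, lowers this to the familiar Shockley-Queisser figure.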

  1. Detailed Measurements of Structure Functions from Nucleons and Nuclei


    The experiment will study deep inelastic muon-nucleon scattering over a wide range of Q² (1-200 (GeV/c)²) and x (0.005-0.75). The main aims of the experiment are: a) Detailed measurements of the nuclear dependence of the structure function F2^A, of R = σ_L/σ_T and of the cross-section for J/ψ production. They will provide a basis for the understanding of the EMC effect: the modification of quark and gluon distributions due to the nuclear environment. b) A simultaneous high-luminosity measurement of the structure function F2 on hydrogen and deuterium. This will provide substantially improved accuracy in the knowledge of the neutron structure function F2^n, of F2^p - F2^n and F2^n/F2^p, and of their Q² dependence. Furthermore, the data will allow a determination of the strong coupling constant αs(Q²) with reduced experimental and theoretical uncertainties, as well as of the ratio of the down to up quark distributions in the valence region. Due to the large x range covered by the experim...

  2. Further ALMA observations and detailed modeling of the Red Rectangle

    Bujarrabal, V; Alcolea, J; Santander-Garcia, M; Van Winckel, H; Contreras, C Sanchez


    We present new high-quality ALMA observations of the Red Rectangle (a well known post-AGB object) in C17O J=6-5 and H13CN J=4-3 line emission and results from a new reduction of already published 13CO J=3-2 data. A detailed model fitting of all the molecular line data, including previous maps and single-dish spectra, was performed using a sophisticated code. These observations and the corresponding modeling allowed us to deepen the analysis of the nebular properties. We also stress the uncertainties in the model fitting. We confirm the presence of a rotating equatorial disk and an outflow, which is mainly formed of gas leaving the disk. The mass of the disk is ~ 0.01 Mo, and that of the CO-rich outflow is ~ 10 times smaller. High temperatures of ~ 100 K are derived for most components. From comparison of the mass values, we roughly estimate the lifetime of the rotating disk, which is found to be of about 10000 yr. Taking data of a few other post-AGB composite nebulae into account, we find that the lifetimes o...

  3. Detailed temporal structure of communication networks in groups of songbirds.

    Stowell, Dan; Gill, Lisa; Clayton, David


    Animals in groups often exchange calls, in patterns whose temporal structure may be influenced by contextual factors such as physical location and the social network structure of the group. We introduce a model-based analysis for temporal patterns of animal call timing, originally developed for networks of firing neurons. This has advantages over cross-correlation analysis in that it can correctly handle common-cause confounds and provides a generative model of call patterns with explicit parameters for the influences between individuals. It also has advantages over standard Markovian analysis in that it incorporates detailed temporal interactions which affect timing as well as sequencing of calls. Further, a fitted model can be used to generate novel synthetic call sequences. We apply the method to calls recorded from groups of domesticated zebra finch (Taeniopygia guttata) individuals. We find that the communication network in these groups has stable structure that persists from one day to the next, and that 'kernels' reflecting the temporal range of influence have a characteristic structure for a calling individual's effect on itself, its partner and on others in the group. We further find characteristic patterns of influences by call type as well as by individual.

  4. Sleep-dependent facilitation of episodic memory details.

    Els van der Helm

    While a role for sleep in declarative memory processing is established, the qualitative nature of this consolidation benefit, and the physiological mechanisms mediating it, remain debated. Here, we investigate the impact of sleep physiology on characteristics of episodic memory using an item- (memory elements) and context- (contextual details associated with those elements) learning paradigm; the latter being especially dependent on the hippocampus. Following back-to-back encoding of two word lists, each associated with a different context, participants were assigned to either a Nap group, who obtained a 120-min nap, or a No-Nap group. Six hours post-encoding, participants performed a recognition test involving item-memory and context-memory judgments. In contrast to item memory, which demonstrated no between-group differences, a significant benefit in context memory developed in the Nap group, the extent of which correlated both with the amount of stage-2 NREM sleep and with frontal fast sleep spindles. Furthermore, a difference was observed on the basis of word-list order, with the sleep benefit and associated physiological correlations being selective for the second word list, learned last (most proximal to sleep). These findings suggest that sleep may preferentially benefit contextual (hippocampus-dependent) aspects of memory, supported by sleep-spindle oscillations, and that the temporal order of initial learning differentially determines subsequent offline consolidation.

  5. A detailed study of the enigmatic cluster M82F

    Bastian, N; Smith, L J; Trancho, G; Westmoquette, M S; Gallagher, J S


    We present a detailed study of the stellar cluster M82F, using multi-band high resolution HST imaging and deep ground based optical slit and integral field spectroscopy. Using the imaging we create colour maps of the cluster and surrounding region in order to search for substructure. We find a large amount of substructure, which we interpret as the result of differential extinction across the projected face of the cluster. With this interpretation, we are able to construct a spatially resolved extinction map across the cluster which is used to derive the intrinsic flux distribution. Fitting cluster profiles (King and EFF) to the intrinsic images we find that the cluster is 15-30% larger than previous estimates, and that no strong evidence of mass segregation in this cluster exists. Using the optical spectra, we find that the age of M82F is 60-80 Myr and from its velocity conclude that the cluster is not physically associated with a large HII region that it is projected upon, both in agreement with previous st...

  6. A detailed framework to incorporate dust in hydrodynamical simulations

    Grassi, T; Haugboelle, T; Schleicher, D R G


    Dust plays a key role in the evolution of the ISM and its correct modelling in numerical simulations is therefore fundamental. We present a new and self-consistent model that treats grain thermal coupling with the gas, radiation balance, and surface chemistry for molecular hydrogen. This method can be applied to any dust distribution with an arbitrary number of grain types without affecting the overall computational cost. In this paper we describe in detail the physics and the algorithm behind our approach, and in order to test the methodology, we present some examples of astrophysical interest, namely (i) a one-zone collapse with complete gas chemistry and thermochemical processes, (ii) a 3D model of a low-metallicity collapse of a minihalo starting from cosmological initial conditions, and (iii) a turbulent molecular cloud with H-C-O chemistry (277 reactions), together with self-consistent cooling and heating solved on the fly. Although these examples employ the publicly available code KROME, our approach c...

  7. Detailed model for practical pulverized coal furnaces and gasifiers

    Smith, P.J.; Smoot, L.D.


    This study has been supported by a consortium of nine industrial and governmental sponsors. Work was initiated on May 1, 1985 and completed August 31, 1989. The central objective of this work was to develop, evaluate and apply a practical combustion model for utility boilers, industrial furnaces and gasifiers. Key accomplishments have included: Development of an advanced first-generation computer model for combustion in three-dimensional furnaces; development of a new first-generation fouling and slagging submodel; detailed evaluation of an existing NO{sub x} submodel; development and evaluation of an improved radiation submodel; preparation and distribution of a three-volume final report: (a) Volume 1: General Technical Report; (b) Volume 2: PCGC-3 User's Manual; (c) Volume 3: Data Book for Evaluation of Three-Dimensional Combustion Models; and organization of a user's workshop on the three-dimensional code. The furnace computer model developed under this study requires further development before it can be applied generally to all applications; however, it can be used now by specialists for many specific applications, including non-combusting systems and combusting gaseous systems. A new combustion center was organized and work was initiated to continue the important research effort initiated by this study. 212 refs., 72 figs., 38 tabs.

  8. Halving the Casimir force with conductive oxides: experimental details

    de Man, Sven; Iannuzzi, Davide


    This work is an extended version of a paper published last year in Physical Review Letters [S. de Man et al., Phys. Rev. Lett. 103, 040402 (2009)], where we presented measurements of the Casimir force between a gold coated sphere and a plate coated with either gold or an indium-tin-oxide (ITO) layer. The experiment, which was performed in air, showed that ITO is sufficiently conducting to prevent charge accumulation, but still transparent enough to halve the Casimir attraction when compared to gold. Here, we report all the experimental details that, due to the limited space available, were omitted in the previous article. We discuss the performance of our setup in terms of stability of the calibration procedure and reproducibility of the Casimir force measurement. We also introduce and demonstrate a new technique to obtain the spring constant of our force sensor. Furthermore, we present a thorough description of the experimental method, a comprehensive explanation of data elaboration and error analysis, and a...

  9. Detailed Spectroscopy of 46Ca with the GRIFFIN Spectrometer

    Pore, Jennifer; Griffin Collaboration Collaboration


    The neutron-rich calcium isotopes are currently a new frontier for modern ab-initio calculations based on NN and 3N forces. Detailed experimental data from these nuclei is necessary for a comprehensive understanding of the region. Many excited states in 46Ca have been previously identified by various reaction mechanisms, most notably from (p ,p') and (p , t) reactions, but many spins are only tentatively assigned or not measured and very few gamma-ray transitions have been placed in the level scheme. A high-statistics data set of the 46K decay into low-lying levels of 46Ca was taken with the new GRIFFIN spectrometer located at TRIUMF-ISAC. The level scheme of 46Ca has been greatly expanded to include 160 new gamma-ray transitions and 12 new excited states. Angular correlations between cascading gamma rays have been investigated to obtain information about the spins of the excited states. An overview of the experiment and a discussion of the results will be presented.

  10. Formation of Jupiter using opacities based on detailed grain physics

    Movshovitz, Naor; Podolak, Morris; Lissauer, Jack J


    Numerical simulations, based on the core-nucleated accretion model, are presented for the formation of Jupiter at 5.2 AU in 3 primordial disks with three different assumed values of the surface density of solid particles. The grain opacities in the envelope of the protoplanet are computed using a detailed model that includes settling and coagulation of grains and that incorporates a recalculation of the grain size distribution at each point in time and space. We generally find lower opacities than the 2% of interstellar values used in previous calculations [Hubickyj, O., Bodenheimer, P., Lissauer, J. J., 2005. Icarus 179, 415--431; Lissauer, J. J., Hubickyj, O., D'Angelo, G., Bodenheimer, P., 2009. Icarus 199, 338-350]. These lower opacities result in more rapid heat loss from and more rapid contraction of the protoplanetary envelope. For a given surface density of solids, the new calculations result in a substantial speedup in formation time as compared with those previous calculations. Formation times are c...

  11. Analysis of Fatigue Crack Growth in Ship Structural Details

    Leheta Heba W.


    Fatigue failure avoidance is a goal that can be achieved only if fatigue design is an integral part of the original design program. The purpose of fatigue design is to ensure that the structure has adequate fatigue life. Calculated fatigue life can form the basis for meaningful and efficient inspection programs during fabrication and throughout the life of the ship. The main objective of this paper is to develop an add-on program for the analysis of fatigue crack growth in ship structural details. The developed program will be an add-on script in a pre-existing package. Crack propagation in a tanker side connection is analyzed using the developed program, based on linear elastic fracture mechanics (LEFM) and the finite element method (FEM). The basic idea of the developed application is that a finite element model of this side connection is first analyzed using ABAQUS, and the results of this analysis reveal the location of the highest stresses. At this location, an initial crack is introduced into the finite element model, and the results of the new crack model give the direction of crack propagation and the values of the stress intensity factors. Using the calculated direction of propagation, a new segment is added to the crack and the model is analyzed again. This step is repeated until the calculated stress intensity factors reach the critical value.
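    The iterative grow-and-resolve loop described above can be caricatured in one dimension with the Paris law, the standard LEFM crack-growth relation. The material constants C and m, the geometry factor Y, and the edge-crack stress-intensity formula below are illustrative placeholder values, not data from the paper.

```python
import math

def paris_law_life(a0, a_crit, delta_sigma, C=1e-11, m=3.0, Y=1.12, da=1e-4):
    """Cycles to grow a crack from length a0 to a_crit (metres) by
    stepping Paris' law da/dN = C * (dK)^m, where the stress-intensity
    range is dK = Y * delta_sigma * sqrt(pi * a) with stress in MPa
    (so dK is in MPa*sqrt(m)).  C, m and Y are illustrative values."""
    a, cycles = a0, 0.0
    while a < a_crit:
        dK = Y * delta_sigma * math.sqrt(math.pi * a)
        growth_per_cycle = C * dK ** m     # da/dN for the current length
        cycles += da / growth_per_cycle    # cycles spent growing by da
        a += da
    return cycles
```

As expected physically, raising the stress range shortens the predicted life; in the paper's workflow, the analytic dK formula is replaced by stress intensity factors extracted from the FEM crack model at each step.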

  12. Communicating one's local address and emergency contact details

    Information Technology Department, AIS (Administrative Information Services) Group; Human Resources Department, SPS (Services, Procedures and Social) Group


    As part of the ongoing simplification of procedures and rationalisation of administrative processes, the IT, PH (Users Office) and HR Departments have developed two new EDH forms for communicating or updating one's local address and emergency contact details. This is the first time that the forms relating to an official HR procedure can be accessed on a self-service basis and directly updated by the members of personnel themselves. The information recorded remains confidential and may only be accessed by the authorised administrative services and the emergency services. Local address: Members of the personnel must declare any change in their local address (Art. R V 1.38 of the Staff Regulations). This declaration is henceforth made by directly filling out the EDH document indicated below, and without requiring any other spontaneous formality vis-à-vis the department secretariat or the Users Office. It is also possible for any member of the personnel to check whether the local address in the Organizati...

  13. Accounting for highly excited states in detailed opacity calculations

    Pain, Jean-Christophe


    In multiply-charged ion plasmas, a significant number of electrons may occupy high-energy orbitals. These "Rydberg" electrons, when they act as spectators, are responsible for a number of satellites of X-ray absorption or emission lines, yielding a broadening of the red wing of the resonance lines. The contribution of such satellite lines may be important, because the high degeneracy of the relevant excited configurations gives them large Boltzmann weights. However, it is in general difficult to take these configurations into account, since they are likely to give rise to a large number of lines. We propose to model the perturbation induced by the spectators in a way similar to the Partially-Resolved-Transition-Array approach recently published by C. Iglesias. It consists in a partial detailed-line-accounting calculation in which the effect of the Rydberg spectators is included through a shift and width, expressed in terms of the canonical partition functions, which are key ingredients of the Super-Tr...

  14. Aperture Photometry Tool

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.


    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel "picking" and "zapping," and a selection of source and sky models. The radial-profile-interpolation source model
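    The core computation that APT automates, summing the pixels inside a circular aperture and subtracting a sky level estimated from a surrounding annulus, can be sketched in a few lines. This toy version with a synthetic image is illustrative only and is not APT's actual implementation.

```python
import numpy as np

def aperture_photometry(img, cx, cy, r_ap, r_in, r_out):
    """Background-subtracted flux in a circular aperture at (cx, cy).
    The local sky is the median pixel value in the annulus r_in..r_out."""
    yy, xx = np.indices(img.shape)
    dist = np.hypot(xx - cx, yy - cy)
    aperture = dist <= r_ap
    annulus = (dist >= r_in) & (dist <= r_out)
    sky = float(np.median(img[annulus]))           # per-pixel sky level
    flux = float(img[aperture].sum() - sky * aperture.sum())
    return flux, sky

# Synthetic image: flat sky of 10 plus a 3x3 "source" block of +100.
img = np.full((50, 50), 10.0)
img[24:27, 24:27] += 100.0
flux, sky = aperture_photometry(img, 25, 25, 5, 8, 12)
```

With this synthetic scene the recovered sky is 10 and the flux is 9 pixels x 100, i.e. 900.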

  15. Sandia PUF Analysis Tool


    This program is a graphical user interface for measuring and performing inter-active analysis of physical unclonable functions (PUFs). It is intended for demonstration and education purposes. See license.txt for license details. The program features a PUF visualization that demonstrates how signatures differ between PUFs and how they exhibit noise over repeated measurements. A similarity scoreboard shows the user how close the current measurement is to the closest chip signatures in the database. Other metrics such as average noise and inter-chip Hamming distances are presented to the user. Randomness tests published in NIST SP 800-22 can be computed and displayed. Noise and inter-chip histograms for the sample of PUFs and repeated PUF measurements can be drawn.
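    The similarity-scoreboard idea, comparing a fresh PUF measurement against enrolled signatures by fractional Hamming distance, can be illustrated in miniature. The signatures below are made-up bit strings, not real PUF data.

```python
def hamming(a, b):
    """Fractional Hamming distance between two equal-length bit strings."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b)) / len(a)

def closest_chip(measurement, database):
    """Return the enrolled chip whose signature is nearest to the
    measurement -- the 'similarity scoreboard' in miniature."""
    return min(database, key=lambda chip: hamming(measurement, database[chip]))

db = {"chipA": "1011001110", "chipB": "0100110001"}
noisy = "1011001010"   # chipA's signature with one bit flipped by noise
```

A noisy re-measurement still scores closest to its own chip (distance 0.1 to chipA versus 0.9 to chipB), which is exactly the separation between intra-chip noise and inter-chip distance that the tool's histograms visualize.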

  16. Tools for Understanding Identity

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael; Hodges, Duncan; Kim, Dee DH; Love, Oriana J.; Nurse, Jason R.; Pike, William A.; Scholtz, Jean


    Identity attribution and enrichment is critical to many aspects of law-enforcement and intelligence gathering; this identity typically spans a number of domains in the natural-world such as biographic information (factual information – e.g. names, addresses), biometric information (e.g. fingerprints) and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber-world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface exposed. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames, the ‘path’ created by inferring new elements of identity is highlighted in the ‘critical information’ panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively the entire realizable ‘capability’ can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper, the first is a working prototype, the second is expected to reach prototype by July 2013: Understanding the Model The tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take
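    The directed-graph formulation described above, with identity elements as nodes and inferences as edges, can be sketched with a toy adjacency list and a breadth-first search over reachable elements. Every element name below is illustrative, not taken from the actual model.

```python
from collections import deque

# Toy identity-inference graph: nodes are identity elements, edges are
# inferences.  All names here are hypothetical examples.
inferences = {
    "username": ["real_name", "forum_posts"],
    "forum_posts": ["email"],
    "email": ["real_name", "phone"],
    "real_name": [],
    "phone": [],
}

def reachable(start, graph):
    """Breadth-first search: every element an investigator (or an
    attacker) could attribute starting from one known element."""
    seen, queue = {start}, deque([start])
    while queue:
        for nxt in graph[queue.popleft()]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen
```

The set returned for a given starting element is a crude measure of the attack surface it exposes; the paths through the graph correspond to the attribution "paths" discussed in the text.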

  17. Decision support tools with an economic flavor

    Bomber, T.M.; Baxter, J.


    This paper discusses criteria for selecting analytical support tools for manufacturing engineering in the early phases of product development, and the lessons learned at Sandia National Laboratories in selecting and applying these tools. The IPPD (Integrated Product and Process Design) process requires manufacturing process developers to be involved earlier than ever before in product development. Operating in an IPPD environment, Sandia's manufacturing engineers were required to develop early estimates of the cost and performance of manufacturing plans. In early pre-production, there are very little actual data on manufacturing processes and almost no detailed data on the performance of various manufacturing process steps. The manufacturing engineer needs the capability to analyze various manufacturing process flows over a large set of assumptions involving capacity, resource requirements (equipment, labor, material, utilities,...), yields, product designs, etc. If the manufacturing process involves many process steps, or if there are multiple products in a single manufacturing area that share resources, or there are multiple part starts resulting in merged flow for final assembly, then this analysis capability must somehow be mechanized. This situation led them to look to modeling and simulation tools for a solution. Example analyses of manufacturing issues for two product sets in the early phases of product development are presented.

  18. Understanding and Using the Fermi Science Tools

    Asercion, Joseph; Fermi Science Support Center


    The Fermi Science Support Center (FSSC) provides information, documentation, and tools for the analysis of Fermi science data, including both the Large-Area Telescope (LAT) and the Gamma-ray Burst Monitor (GBM). Source and binary versions of the Fermi Science Tools can be downloaded from the FSSC website, and are supported on multiple platforms. An overview document, the Cicerone, provides details of the Fermi mission, the science instruments and their response functions, the science data preparation and analysis process, and interpretation of the results. Analysis Threads and a reference manual available on the FSSC website provide the user with step-by-step instructions for many different types of data analysis: point source analysis - generating maps, spectra, and light curves, pulsar timing analysis, source identification, and the use of python for scripting customized analysis chains. We present an overview of the structure of the Fermi science tools and documentation, and how to acquire them. We also provide examples of standard analyses, including tips and tricks for improving Fermi science analysis.

  19. Virtual Reality Educational Tool for Human Anatomy.

    Izard, Santiago González; Juanes Méndez, Juan A; Palomera, Pablo Ruisoto


    Virtual Reality is becoming widespread in our society within very different areas, from industry to entertainment. It has many advantages in education as well, since it allows visualizing almost any object or going anywhere in a unique way. We focus on medical education, and more specifically on anatomy, where its use is especially interesting because it allows studying any structure of the human body by placing the user inside each one. By allowing virtual immersion in a body structure such as the interior of the cranium, stereoscopic vision goggles make these innovative teaching technologies a powerful tool for training in all areas of health sciences. The aim of this study is to illustrate the teaching potential of applying Virtual Reality in the field of human anatomy, where it can be used as a tool for education in medicine. Virtual Reality software was developed as an educational tool. This technological procedure is based entirely on software which runs in stereoscopic goggles to give users the sensation of being in a virtual environment, clearly showing the different bones and foramina which make up the cranium, accompanied by audio explanations. In the results, the structure of the cranium is described in detail, both inside and out. The importance of an exhaustive morphological knowledge of the cranial fossae is further discussed. Applications to the design of microsurgery are also considered.


    Duško Pavletić


    Full Text Available The paper deals with one segment of broader research on the universality and systematic application of the seven basic quality tools (7QC tools), which can be used in different areas: power plants, process industry, government, and health and tourism services. The aim of the paper was to show through practical examples that there is a real possibility of applying the 7QC tools. Furthermore, the research has to show to what extent the selected tools are in use and what the reasons for avoiding their broader application are. A simple example of successful application of the quality tools is shown for a selected company in the process industry.

  1. Big Data Mining: Tools & Algorithms

    Adeel Shiraz Hashmi


    Full Text Available We are now in Big Data era, and there is a growing demand for tools which can process and analyze it. Big data analytics deals with extracting valuable information from that complex data which can’t be handled by traditional data mining tools. This paper surveys the available tools which can handle large volumes of data as well as evolving data streams. The data mining tools and algorithms which can handle big data have also been summarized, and one of the tools has been used for mining of large datasets using distributed algorithms.
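
    A technique underlying many stream-capable mining tools of the kind surveyed is reservoir sampling, which maintains a fixed-size uniform random sample of a stream of unknown (possibly unbounded) length in a single pass. A minimal sketch, not tied to any particular tool mentioned in the survey:

```python
import random

def reservoir_sample(stream, k, rng=None):
    """Keep a uniform random sample of size k from a single pass over a stream."""
    rng = rng or random.Random()
    reservoir = []
    for i, item in enumerate(stream):
        if i < k:
            reservoir.append(item)          # fill the reservoir first
        else:
            j = rng.randint(0, i)           # replace with probability k/(i+1)
            if j < k:
                reservoir[j] = item
    return reservoir

sample = reservoir_sample(range(10_000), 5, random.Random(0))
print(len(sample))  # 5
```

    Each element of the stream ends up in the sample with equal probability k/n, which is why the method suits evolving data streams that cannot be stored in full.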

  2. Detailed characterization of the substrate specificity of mouse wax synthase.

    Miklaszewska, Magdalena; Kawiński, Adam; Banaś, Antoni


    Wax synthases are membrane-associated enzymes catalysing the esterification reaction between fatty acyl-CoA and a long-chain fatty alcohol. In living organisms, wax esters function as storage materials or provide protection against harmful environmental influences. In industry, they are used as ingredients for the production of lubricants, pharmaceuticals, and cosmetics. Currently the biological sources of wax esters are limited to jojoba oil. In order to establish large-scale production of desired wax esters in transgenic high-yielding oilseed plants, enzymes involved in wax ester synthesis from different biological sources should be characterized in detail, taking into consideration their substrate specificity. This study therefore aims at determining the substrate specificity of one such enzyme, the mouse wax synthase. The gene encoding this enzyme was expressed heterologously in Saccharomyces cerevisiae. In in vitro assays (using the microsomal fraction from transgenic yeast), we evaluated the preferences of mouse wax synthase towards a set of combinations of 11 acyl-CoAs with 17 fatty alcohols. The highest activity was observed for 14:0-CoA, 12:0-CoA, and 16:0-CoA in combination with medium-chain alcohols (up to 5.2, 3.4, and 3.3 nmol wax esters/min/mg microsomal protein, respectively). Unsaturated alcohols longer than C18 were better utilized by the enzyme than the saturated ones. Combinations of all tested alcohols with 20:0-CoA, 22:1-CoA, or Ric-CoA were poorly utilized by the enzyme, and conjugated acyl-CoAs were not utilized at all. Apart from the wax synthase activity, mouse wax synthase also exhibited a very low acyl-CoA:diacylglycerol acyltransferase activity. However, it displayed neither acyl-CoA:monoacylglycerol acyltransferase nor acyl-CoA:sterol acyltransferase activity.

  3. The impact of model detail on power grid resilience measures

    Auer, S.; Kleis, K.; Schultz, P.; Kurths, J.; Hellmann, F.


    Extreme events are a challenge to natural as well as man-made systems. For critical infrastructure like power grids, we need to understand their resilience against large disturbances. Recently, new measures of the resilience of dynamical systems have been developed in the complex systems literature. Basin stability and survivability respectively assess the asymptotic and transient behavior of a system when subjected to arbitrary, localized but large perturbations in frequency and phase. To employ these methods to assess power grid resilience, we need to choose a certain level of model detail for the power grid. For the grid topology we considered the Scandinavian grid and an ensemble of power grids generated with a random growth model. So far the most popular model that has been studied is the classical swing equation model for the frequency response of generators and motors. In this paper we study a more sophisticated model of synchronous machines that also takes voltage dynamics into account, and compare it to the previously studied model. In the engineering literature, this model has been found to give an accurate picture of the long-term evolution of synchronous machines in post-fault studies. We find evidence that some stable fixed points of the swing equation become unstable when we add voltage dynamics. If this occurs, the asymptotic behavior of the system can be dramatically altered, and basin stability estimates obtained with the swing equation alone can be severely wrong. We also find that the survivability does not change significantly when taking the voltage dynamics into account. Further, the limit-cycle-type asymptotic behaviour is strongly correlated with transient voltages that violate typical operational voltage bounds. Thus, transient voltage bounds are dominated by transient frequency bounds and play no large role for realistic parameters.
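
    Basin stability is estimated by Monte Carlo: draw random phase/frequency perturbations, integrate the model, and count the fraction of runs that return to the synchronous state. A toy single-machine sketch using the classical swing equation (all parameter values are illustrative, not those used in the paper):

```python
import math
import random

def swing_converges(phi0, omega0, P=0.5, alpha=0.1, K=1.0,
                    dt=0.01, t_max=200.0, tol=1e-2):
    """Euler-integrate dphi/dt = omega, domega/dt = P - alpha*omega - K*sin(phi)
    and test whether the trajectory returns to the synchronous state (omega = 0)."""
    phi, omega = phi0, omega0
    for _ in range(int(t_max / dt)):
        dphi = omega
        domega = P - alpha * omega - K * math.sin(phi)
        phi += dt * dphi
        omega += dt * domega
    return abs(omega) < tol

def basin_stability(n_samples=200, seed=1):
    """Fraction of random (phase, frequency) perturbations that resynchronize."""
    rng = random.Random(seed)
    hits = sum(
        swing_converges(rng.uniform(-math.pi, math.pi), rng.uniform(-10.0, 10.0))
        for _ in range(n_samples)
    )
    return hits / n_samples
```

    Trajectories that do not resynchronize settle on the limit cycle near omega = P/alpha, which is exactly the alternative asymptotic state the paper's voltage-aware model re-examines.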

  4. Detailed temperature mapping-Warming characterizes archipelago zones

    Veneranta, L.; Vanhatalo, J.; Urho, L.


    Rapidly warming shallow archipelago areas have the best energetic conditions for high ecological production. We analyzed and visualized the spring and summer temperature development in the Finnish coastal areas of the Northern Baltic Sea. Typical for the Baltic is a high annual periodicity and variability in water temperatures: the maximum difference between single-day average temperatures across the study area was 28.3 °C. During wintertime the littoral water temperature can decrease below zero in outer archipelago or open water areas when the protective ice cover is not present; the lowest observed value was -0.5 °C. Depth and exposure are the most important variables explaining the coastal temperature gradients from the innermost to the outermost areas in springtime, when water is heated by increasing solar radiation. Temperature differs more within the coastal area than between basins. The water temperature sum was highest in the innermost areas and lowest in open water areas, and the variation in daily averages was highest in the middle region. At the end of the warming period, the difference in surface water temperatures between the innermost and outermost areas had diminished by the time cooling began in August-September. These clear temperature gradients enabled us to use the cumulative water temperature to classify the coastal zones in a biologically sensible manner into five regions. Our study presents a novel approach to studying detailed spatial variations in water temperatures. The results can further be used, for example, to model and predict the spatial distribution of aquatic biota and to determine appropriate spatio-temporal designs for aquatic biota surveys. The new spatial knowledge of temperature regions will also help in evaluating possible causes of larger-scale climatological changes in a biological context, including productivity.
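
    The cumulative-water-temperature classification used above can be sketched in a few lines: accumulate daily mean temperatures into a temperature sum, then map that sum onto zone boundaries. The boundary values below are hypothetical placeholders, not the study's actual thresholds.

```python
def cumulative_temperature(daily_means, base=0.0):
    """Sum of daily mean temperatures above a base level (a degree-day sum)."""
    return sum(max(t - base, 0.0) for t in daily_means)

def classify_zone(temp_sum, bounds):
    """Assign a zone index (0..len(bounds)) given ascending boundary values."""
    for i, b in enumerate(bounds):
        if temp_sum < b:
            return i
    return len(bounds)

# Hypothetical boundaries splitting the coast into five regions
bounds = [800, 1100, 1400, 1700]
print(classify_zone(cumulative_temperature([15.0] * 90), bounds))  # 1350 -> zone 2
```

    Warmer inner-archipelago sites accumulate a higher temperature sum over the same period and so land in a higher zone, mirroring the inner-to-outer gradient described in the abstract.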

  5. Energy Issues In Mobile Telecom Network: A Detailed Analysis

    P. Balagangadhar Rao


    Full Text Available Diesel and conventional energy costs are increasing at twice the growth rate of revenues of the mobile telecom network infrastructure industry, so there is an urgent need to reduce operating expenditure (OPEX) on this front. While bridging the rural and urban divide, telecom operators should adopt stronger regulations for climate control by reducing greenhouse gases like CO2. This strengthens the business case for renewable energy technology. Solutions like solar, fuel cells, wind, biomass, and geothermal can be explored and implemented in the energy-starved telecom sector. Such sources provide clean and green energy; they are free and infinitely available. These technologies, which use natural resources, are not only suitable for stand-alone applications but also have a long life span, and their maintenance cost is minimal. The most important advantage of using these natural resources is a low carbon footprint, and they are silent energy sources. Among these, solar-based solutions are available as ground- or tower-mounted variants. Hybrid solutions like solar-solar, solar-DCDG (direct current diesel generators), or solar-battery bank can be put into use to cut down OPEX. Further, a single multi-fuel cell can be used, which can run on ethanol, biofuel, compressed natural gas (CNG), liquefied petroleum gas (LPG), or pyrolysis oil. Storage solutions like lithium-ion batteries reduce diesel generator run hours, offering about fifty percent savings on the operating expenditure front. A detailed analysis is made in this paper of the energy requirements of the mobile telecom network; minimising operating costs by using technologies that harvest natural resources; sharing infrastructure between operators; and bringing energy efficiency by adopting the latest storage backup technologies.

  6. Lightning climatology in the Congo Basin: detailed analysis

    Soula, Serge; Kigotsi, Jean; Georgis, Jean-François; Barthe, Christelle


    The lightning climatology of the Congo Basin, including several countries of Central Africa, is analyzed in detail for the first time. It is based on World Wide Lightning Location Network (WWLLN) data for the period from 2005 to 2013. A comparison of these data with Lightning Imaging Sensor (LIS) data for the same period shows the WWLLN detection efficiency (DE) in the region increased from about 1.70 % at the beginning of the period to 5.90 % in 2013, relative to LIS data, but not uniformly over the whole 2750 km × 2750 km area. Both the annual flash density and the number of stormy days show sharp maxima localized in the east of the Democratic Republic of Congo (DRC), west of Lake Kivu, regardless of the reference year and the period of the year. These maxima reach 12.86 fl km-2 and 189 days, respectively, in 2013, and correspond to a very active region located at the rear of the Virunga mountain range, characterised by summits that can reach 3000 m. The presence of this range plays a role in thunderstorm development throughout the year. Correcting this local maximum of the lightning density for the DE leads to a value consistent with the global climatology of Christian et al. (2003) and other authors: a mean maximum of about 157 fl km-2 y-1 is found for the annual lightning density. The zonal distribution of the lightning flashes exhibits a maximum between 1°S and 2°S, with about 56 % of the flashes located below the equator in the 10°S - 10°N interval. The diurnal evolution of the flash rate has a maximum between 1400 and 1700 UTC, according to the reference year, in agreement with previous work in other regions of the world.
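
    The detection-efficiency correction referred to above is, to first order, a division of the observed density by the DE. A minimal sketch; note that the paper's 157 fl km-2 y-1 figure also involves averaging over the study period and area, which this one-liner does not capture:

```python
def corrected_flash_density(observed_density, detection_efficiency):
    """Scale an observed flash density up by the network detection efficiency."""
    if not 0.0 < detection_efficiency <= 1.0:
        raise ValueError("detection efficiency must be in (0, 1]")
    return observed_density / detection_efficiency

# 12.86 fl km-2 observed at the 2013 DE of ~5.9 %
print(round(corrected_flash_density(12.86, 0.059), 1))  # -> 218.0
```

    The single-year corrected value exceeds the reported multi-year mean, which is expected for a first-order point estimate.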

  7. Enabling Detailed Energy Analyses via the Technology Performance Exchange: Preprint

    Studer, D.; Fleming, K.; Lee, E.; Livingood, W.


    One of the key tenets to increasing adoption of energy efficiency solutions in the built environment is improving confidence in energy performance. Current industry practices make extensive use of predictive modeling, often via the use of sophisticated hourly or sub-hourly energy simulation programs, to account for site-specific parameters (e.g., climate zone, hours of operation, and space type) and arrive at a performance estimate. While such methods are highly precise, they invariably provide less than ideal accuracy due to a lack of high-quality, foundational energy performance input data. The Technology Performance Exchange was constructed to allow the transparent sharing of foundational, product-specific energy performance data, and leverages significant, external engineering efforts and a modular architecture to efficiently identify and codify the minimum information necessary to accurately predict product energy performance. This strongly-typed database resource represents a novel solution to a difficult and established problem. One of the most exciting benefits is the way in which the Technology Performance Exchange's application programming interface has been leveraged to integrate contributed foundational data into the Building Component Library. Via a series of scripts, data is automatically translated and parsed into the Building Component Library in a format that is immediately usable to the energy modeling community. This paper (1) presents a high-level overview of the project drivers and the structure of the Technology Performance Exchange; (2) offers a detailed examination of how technologies are incorporated and translated into powerful energy modeling code snippets; and (3) examines several benefits of this robust workflow.

  8. Detailed genetic structure of European bitterling populations in Central Europe

    Veronika Bartáková


    Full Text Available The European bitterling (Rhodeus amarus) is a small cyprinid fish whose populations declined markedly between 1950 and 1980. However, its range is currently expanding, partly due to human-assisted introductions. We determined the genetic variability and detailed spatial structure of bitterling populations in Central Europe and tested alternative hypotheses about the colonization of this area. Twelve polymorphic microsatellite loci in a large sample of 688 individuals were used to analyse genetic variability and population structure. Samples originated from 27 localities, with emphasis on the Czech Republic, where three major sea drainages (Black, Baltic, and North Seas) meet. A highly variable level of intrapopulation genetic variability was detected, and a recent decrease in numbers (“bottleneck”) was indicated by genetic data in six populations. A high level of interpopulation differentiation was identified even within basins, with a significant role of genetic drift and indications of low dispersal ability in R. amarus. Surprisingly, the Odra River was inhabited by two distinct populations without any genetic signatures of a secondary contact. The Czech part of the Odra (Baltic basin) was colonized from the Danubian refugium (similarly to adjacent Danubian basin rivers, including the Morava), while the Polish part of the Odra was genetically similar to the populations in the Vistula River (Baltic basin), which was colonized by a different (Eastern) phylogeographic lineage of R. amarus. Most Czech R. amarus populations were colonized from the Danubian refugium, suggesting potential for a human-mediated colonization of the Odra or Elbe Rivers by R. amarus. One Elbe basin population was genetically mixed from the two (Danubian and Eastern) phylogeographic lineages. In general, the Czech populations of R. amarus were genetically stable except for a single population which has probably been recently introduced. This research

  9. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    Tanaka, Hiroki; Aizawa, Yoji


    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients: one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all numerical data in our analysis well, with several common features or invariant aspects clearly observed. In particular, in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving-time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
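
    The Weibull distribution mentioned above has a simple closed form. A small sketch of its CDF and PDF for interoccurrence times, with beta and tau as the usual shape and scale parameters (the parameterization, not the paper's fitted values):

```python
import math

def weibull_cdf(t, beta, tau):
    """P(T <= t) for a Weibull-distributed interoccurrence time T, t >= 0."""
    return 1.0 - math.exp(-((t / tau) ** beta))

def weibull_pdf(t, beta, tau):
    """Density of the Weibull distribution with shape beta and scale tau."""
    return (beta / tau) * (t / tau) ** (beta - 1) * math.exp(-((t / tau) ** beta))
```

    For beta = 1 the distribution reduces to the exponential (memoryless) case; beta < 1, typically reported for interoccurrence times, implies clustering of events at short intervals.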

  10. Mobile Dichotomous Key Application as a Scaffolding Tool in the Museum Setting

    Knight, Kathryn


    This study explored the use of a dichotomous key as a scaffolding tool in the museum setting. The dichotomous key was designed as a scaffolding tool to help students make more detailed observations as they identified various species of birds on display. The dichotomous key was delivered to groups of fifth and seventh graders in two ways: on a…
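
    A dichotomous key is, structurally, a binary decision tree: each step poses a yes/no question and leads either to another question or to an identification. A minimal sketch of how such a scaffold might be represented and traversed; the questions and species below are invented examples, not the study's bird key:

```python
# Each internal node is (question, yes_branch, no_branch); leaves are species names.
KEY = ("webbed feet?",
       ("long neck?", "swan", "duck"),
       ("hooked beak?", "hawk", "sparrow"))

def identify(node, answers):
    """Walk the key, consuming one yes/no answer per question, until a leaf."""
    while isinstance(node, tuple):
        question, yes_branch, no_branch = node
        node = yes_branch if answers[question] else no_branch
    return node

print(identify(KEY, {"webbed feet?": True, "long neck?": False}))  # duck
```

    Each question forces the student to check one observable trait, which is exactly the scaffolding effect the study attributes to the mobile key.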

  11. Guidebook for Using the Tool BEST Cement: Benchmarking and Energy Savings Tool for the Cement Industry

    Galitsky, Christina; Price, Lynn; Zhou, Nan; Fuqiu , Zhou; Huawen, Xiong; Xuemin, Zeng; Lan, Wang


    year); (5) the amount of production of cement by type and grade (in tonnes per year); (6) the electricity generated onsite; (7) the energy used by fuel type; and the amount (in RMB per year) spent on energy. The tool offers the user the opportunity to do a quick assessment or a more detailed assessment; this choice determines the level of detail of the energy input. The detailed assessment requires energy data for each stage of production, while the quick assessment requires only the total energy used at the entire facility (see Section 6 for more details on quick versus detailed assessments). The benchmarking tool provides two benchmarks: one for Chinese best practices and one for international best practices. Section 2 describes the differences between these two and how each benchmark was calculated. The tool also asks the user for a target input, with which the user can set goals for the facility.
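
    The quick-assessment arithmetic behind such a benchmarking tool reduces to an energy intensity and a gap to the benchmark. A hedged sketch of that calculation; the function names, units, and figures are hypothetical illustrations, not BEST Cement's actual factors:

```python
def energy_intensity(total_energy_gj, cement_tonnes):
    """Facility-level energy intensity in GJ per tonne of cement."""
    return total_energy_gj / cement_tonnes

def savings_vs_benchmark(intensity, benchmark_intensity, cement_tonnes):
    """Potential annual energy savings (GJ) if the plant met the benchmark."""
    return max(intensity - benchmark_intensity, 0.0) * cement_tonnes

# Hypothetical plant: 3.5 GJ/t actual vs a 3.0 GJ/t best-practice benchmark
print(savings_vs_benchmark(3.5, 3.0, 1_000_000))  # 500000.0 GJ per year
```

    The detailed assessment performs the same comparison per production stage rather than for the facility total.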

  12. Implications of Model Structure and Detail for Utility Planning: Scenario Case Studies Using the Resource Planning Model

    Mai, Trieu [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barrows, Clayton [National Renewable Energy Lab. (NREL), Golden, CO (United States); Lopez, Anthony [National Renewable Energy Lab. (NREL), Golden, CO (United States); Hale, Elaine [National Renewable Energy Lab. (NREL), Golden, CO (United States); Dyson, Mark [National Renewable Energy Lab. (NREL), Golden, CO (United States); Eurek, Kelly [National Renewable Energy Lab. (NREL), Golden, CO (United States)


    In this report, we analyze the impacts of model configuration and detail in capacity expansion models, computational tools used by utility planners looking to find the least cost option for planning the system and by researchers or policy makers attempting to understand the effects of various policy implementations. The present analysis focuses on the importance of model configurations — particularly those related to capacity credit, dispatch modeling, and transmission modeling — to the construction of scenario futures. Our analysis is primarily directed toward advanced tools used for utility planning and is focused on those impacts that are most relevant to decisions with respect to future renewable capacity deployment. To serve this purpose, we develop and employ the NREL Resource Planning Model to conduct a case study analysis that explores 12 separate capacity expansion scenarios of the Western Interconnection through 2030.

  13. WP3 Prototype development for operational planning tool

    Kristoffersen, Trine; Meibom, Peter; Apfelbeck, J.

    of electricity load and wind power production, and to cover forced outages of power plants and transmission lines. Work has been carried out to include load uncertainty and forced outages in the two main components of the Wilmar Planning tool namely the Scenario Tree Tool and the Joint Market Model. This work...... is documented in chapter 1 and 2. The inclusion of load uncertainty and forced outages in the Scenario Tree Tool enables calculation of the demand for reserve power depending on the forecast horizon. The algorithm is given in Section 3.1. The design of a modified version of the Joint Market Model enabling....... Further, the methodology to identify extreme events on the basis of the existing tools is described. Within the SUPWIND consortium there has been an interest in using the Joint Market Model to model smaller parts of a power system but with more detailed representation of the transmission and distribution...

  14. DASY-Based Tool for the Design of ICE Mechanisms

    Tichánek Radek


    Full Text Available This article presents a tool for designing new mechanisms of internal combustion engines based on the DASY knowledge database. An OHC valve train was chosen for developing and testing the presented tool. The tool includes both a kinematic and a dynamic model connected to a crank train. Values of unknown parameters were obtained through detailed calibration and subsequent validation of three dynamic models against measured data. The values remain stored in DASY, and many of them can be used directly to design new mechanisms, even in cases where the geometries of some parts differ. The paper presents three methods which were used not only for the calibration, but also to identify the influence of unknown parameters on valve acceleration and its vibration. The tool has been used to design the cam shapes for a prototype of the new mechanism.

  15. On Computational Fluid Dynamics Tools in Architectural Design

    Kirkegaard, Poul Henning; Hougaard, Mads; Stærdahl, Jesper Winther

    In spite of being apparently easy to use, computational fluid dynamics (CFD) based tools require specialist knowledge for modeling as well as for the interpretation of results. This point of view implies also that users of CFD based tools have to be carefully choosing and using them. Especially...... engineering computational fluid dynamics (CFD) simulation program ANSYS CFX and a CFD based representative program RealFlow are investigated. These two programs represent two types of CFD based tools available for use during phases of an architectural design process. However, as outlined in two case studies...... Centre in Aalborg, Denmark. The obtained results show that detailed and accurate flow predictions can be obtained using a simulation tool like ANSYS CFX. On the other hand RealFlow provides satisfactory flow results for evaluation of a proposed building shape in an early phase of a design process...

  16. Predictions of titanium alloy properties using thermodynamic modeling tools

    Zhang, F.; Xie, F.-Y.; Chen, S.-L.; Chang, Y. A.; Furrer, D.; Venkatesh, V.


    Thermodynamic modeling tools have become essential in understanding the effect of alloy chemistry on the final microstructure of a material. Implementation of such tools to improve titanium processing via parameter optimization has resulted in significant cost savings through the elimination of shop/laboratory trials and tests. In this study, a thermodynamic modeling tool developed at CompuTherm, LLC, is being used to predict β transus, phase proportions, phase chemistries, partitioning coefficients, and phase boundaries of multicomponent titanium alloys. This modeling tool includes Pandat, software for multicomponent phase equilibrium calculations, and PanTitanium, a thermodynamic database for titanium alloys. Model predictions are compared with experimental results for one α-β alloy (Ti-64) and two near-β alloys (Ti-17 and Ti-10-2-3). The alloying elements, especially the interstitial elements O, N, H, and C, have been shown to have a significant effect on the β transus temperature, and are discussed in more detail herein.

  17. Medical text analytics tools for search and classification.

    Huang, Jimmy; An, Aijun; Hu, Vivian; Tu, Karen


    A text-analytic tool has been developed that accepts clinical medical data as input and produces patient details. The integrated tool has the following four characteristics. 1) It has a graphical user interface. 2) It has a free-text search tool that is designed to retrieve records using keywords such as "MI" for myocardial infarction; the result set is a display of those sentences in the medical records that contain the keywords. 3) It has three tools to classify patients based on the likelihood of being diagnosed with myocardial infarction or hypertension, or on their smoking status. 4) A summary is generated for each patient selected. Large medical data sets provided by the Institute for Clinical Evaluative Sciences were used during the project.
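
    The free-text search behavior described in point 2, retrieving the sentences that contain a keyword such as "MI", can be sketched as follows. The abbreviation map is a hypothetical stand-in for the tool's actual clinical lexicon:

```python
import re

# Hypothetical abbreviation expansions; the real tool's lexicon is not public.
SYNONYMS = {"MI": ["MI", "myocardial infarction"]}

def search_sentences(text, keyword):
    """Return sentences containing the keyword or any mapped expansion."""
    terms = SYNONYMS.get(keyword, [keyword])
    sentences = re.split(r"(?<=[.!?])\s+", text)
    return [s for s in sentences
            if any(re.search(r"\b" + re.escape(t) + r"\b", s, re.IGNORECASE)
                   for t in terms)]

record = ("Patient admitted with chest pain. History of myocardial infarction. "
          "No known allergies.")
print(search_sentences(record, "MI"))  # -> ['History of myocardial infarction.']
```

    The word-boundary anchors prevent the two-letter abbreviation from matching inside longer words such as "admitted".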

  18. The retrieval of fingerprint friction ridge detail from elephant ivory using reduced-scale magnetic and non-magnetic powdering materials.

    Weston-Ford, Kelly A; Moseley, Mark L; Hall, Lisa J; Marsh, Nicholas P; Morgan, Ruth M; Barron, Leon P


    An evaluation of reduced-size particle powdering methods for the recovery of usable fingermark ridge detail from elephant ivory is presented herein for the first time as a practical and cost-effective tool in forensic analysis. Of two reduced-size powder material types tested, powders with particle sizes ≤ 40 μm offered better chances of recovering ridge detail from unpolished ivory in comparison to a conventional powder material. The quality of developed ridge detail of these powders was also assessed for comparison and automated search suitability. Powder materials and the enhanced ridge detail on ivory were analysed by scanning electron microscopy and energy dispersive X-ray spectroscopy and interactions between their constituents and the ivory discussed. The effect of ageing on the quality of ridge detail recovered showed that the best quality was obtained within 1 week. However, some ridge detail could still be developed up to 28 days after deposition. Cyanoacrylate and fluorescently-labelled cyanoacrylate fuming of ridge detail on ivory was explored and was less effective than reduced-scale powdering in general. This research contributes to the understanding and potential application of smaller scale powdering materials for the development of ridge detail on hard, semi-porous biological material typically seized in wildlife-related crimes.

  19. Relative urban ecosystem health assessment: a method integrating comprehensive evaluation and detailed analysis.

    Su, Meirong; Yang, Zhifeng; Chen, Bin


    Regarding the basic roles of urban ecosystem health assessment (i.e., discovering the comprehensive health status and diagnosing the limiting factors of urban ecosystems), a general framework integrating comprehensive evaluation and detailed analysis is established, from both bottom-up and top-down directions. Emergy-based health indicators are established to reflect urban ecosystem health status from a biophysical viewpoint. Considering the intrinsic uncertainty and relativity of urban ecosystem health, set pair analysis is combined with the emergy-based indicators to fill the general framework and evaluate the relative health level of urban ecosystems. These techniques are well suited to understanding the overall urban ecosystem health status and confirming the limiting factors of the urban ecosystems concerned from a biophysical perspective. Moreover, clustering analysis is applied by combining the health status with spatial geographical conditions. Choosing 26 typical Chinese cities in 2005, relative comprehensive urban ecosystem health levels were evaluated. The higher health levels of Xiamen, Qingdao, Shenzhen, and Zhuhai are in particular contrast to those of Wuhan, Beijing, Yinchuan, and Harbin, which are relatively poor. In addition, the condition of each factor and related indicators is investigated through set pair analysis, from which the critical limiting factors of Beijing are confirmed. According to the clustering analysis results, the urban ecosystems studied are divided into four groups. It is concluded that the proposed framework of urban ecosystem health assessment, which integrates comprehensive evaluation and detailed analysis and is fulfilled by emergy synthesis and set pair analysis, can serve as a useful tool for diagnosing urban ecosystem health.
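
    Set pair analysis expresses the relation between a system and a reference as a connection degree mu = a + b*i + c*j, where a, b, and c are the identical, discrepant, and contrary fractions summing to one. A minimal sketch of that normalization; the indicator counts in the example are invented, not the study's data:

```python
def connection_degree(identical, discrepant, contrary):
    """Return normalized set-pair components (a, b, c) with a + b + c = 1."""
    total = identical + discrepant + contrary
    if total <= 0:
        raise ValueError("at least one indicator count must be positive")
    return (identical / total, discrepant / total, contrary / total)

# e.g. 6 indicators matching a healthy reference, 3 uncertain, 1 contrary
a, b, c = connection_degree(6, 3, 1)
print(a, b, c)  # 0.6 0.3 0.1
```

    A larger a relative to c indicates a healthier ecosystem relative to the reference; the uncertain term b carries the coefficient i, whose value is chosen by the analyst.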

  20. Co-Simulation of Detailed Whole Building with the Power System to Study Smart Grid Applications

    Makhmalbaf, Atefe; Fuller, Jason C.; Srivastava, Viraj; Ciraci, Selim; Daily, Jeffrey A.


    Modernization of the power system in a way that ensures a sustainable energy system is arguably one of the most pressing concerns of our time. Buildings are important components of the power system: first, they are the main consumers of electricity, and second, they do not have constant energy demand. Conventionally, electricity has been difficult to store and must be consumed as it is generated, so maintaining the balance of demand and supply is critical in the power system. However, to reduce the complexity of power models, buildings (i.e., end-use loads) are traditionally modeled and represented as aggregated “dumb” nodes in the power system. This means we lack effective detailed whole-building energy models that can support the requirements and emerging technologies of the smart power grid. To gain greater insight into the relationship between building energy demand and power system performance, it is important to constitute a co-simulation framework that supports detailed building energy modeling and simulation within the power system, in order to study the capabilities promised by the modern power grid. This paper discusses ongoing work at Pacific Northwest National Laboratory and presents the underlying tools and framework needed to enable co-simulation of buildings, building energy systems, and their control in the power system, to study applications such as demand response, grid-based HVAC control, and deployment of buildings for ancillary services. The ultimate goal is to develop an integrated modeling and simulation platform that is flexible, reusable, and scalable. Results of this work will contribute to future building and power system studies, especially those related to the integrated ‘smart grid’. Results are also expected to advance power resiliency and local (micro) scale grid studies where several building and renewable energy systems transact energy directly. This paper also reviews some applications that can be supported and studied using the framework introduced.

  1. Integrating a Decision Management Tool with UML Modeling Tools

    Könemann, Patrick

    Numerous design decisions are made while developing software systems, which influence the architecture of these systems as well as following decisions. A number of decision management tools already exist for capturing, documenting, and maintaining design decisions, but also for guiding developers...... the development process. In this report, we propose an integration of a decision management and a UML-based modeling tool, based on use cases we distill from a case study: the modeling tool shall show all decisions related to a model and allow its users to extend or update them; the decision management tool shall...... trigger the modeling tool to realize design decisions in the models. We define tool-independent concepts and architecture building blocks supporting these use cases and present how they can be implemented in the IBM Rational Software Modeler and Architectural Decision Knowledge Wiki. This seamless...

  2. Reliability/Risk Methods and Design Tools for Application in Space Programs

    Townsend, John S.; Smart, Christian


    Since 1984 NASA has funded several major programs to develop Reliability/Risk Methods and tools for engineers to apply in the design and assessment of aerospace hardware. Two probabilistic software tools that show great promise for practical application are the finite element code NESSUS and the system risk analysis code QRAS. This paper examines NASA's past, present, and future directions in reliability and risk engineering applications. Both the NESSUS and QRAS software tools are detailed.

  3. Hurricane Data Analysis Tool

    Liu, Zhong; Ostrenga, Dana; Leptoukh, Gregory


    In order to facilitate Earth science data access, the NASA Goddard Earth Sciences Data and Information Services Center (GES DISC) has developed a web prototype, the Hurricane Data Analysis Tool (HDAT), to allow users to conduct online visualization and analysis of several remote sensing and model datasets for educational activities and studies of tropical cyclones and other weather phenomena. With a web browser and a few mouse clicks, users can have full access to terabytes of data and generate 2-D or time-series plots and animations without downloading any software or data. HDAT includes data from the NASA Tropical Rainfall Measuring Mission (TRMM), the NASA Quick Scatterometer (QuikSCAT), the NCEP Reanalysis, and the NCEP/CPC half-hourly, 4-km global (60 N - 60 S) IR dataset. The GES DISC archives TRMM data. The daily global rainfall product derived from the 3-hourly multi-satellite precipitation product (3B42 V6) is available in HDAT. The TRMM Microwave Imager (TMI) sea surface temperature from Remote Sensing Systems is in HDAT as well. The NASA QuikSCAT ocean surface wind and the NCEP Reanalysis provide ocean surface and atmospheric conditions, respectively. The global merged IR product, also known as the NCEP/CPC half-hourly, 4-km global (60 N - 60 S) IR dataset, is one of the TRMM ancillary datasets. It consists of globally merged, pixel-resolution IR brightness temperature data (equivalent blackbody temperatures), merged from all available geostationary satellites (GOES-8/10, METEOSAT-7/5 & GMS). The GES DISC has collected over 10 years of these data, beginning in February 2000. This high temporal resolution (every 30 minutes) dataset not only provides additional background information for TRMM and other satellite missions, but also allows observing a wide range of meteorological phenomena from space, such as hurricanes, typhoons, tropical cyclones, and mesoscale convective systems. Basic functions include selection of area of

  4. Further ALMA observations and detailed modeling of the Red Rectangle

    Bujarrabal, V.; Castro-Carrizo, A.; Alcolea, J.; Santander-García, M.; Van Winckel, H.; Sánchez Contreras, C.


    Aims: We aim to study the rotating and expanding gas in the Red Rectangle, a well-known object that recently left the asymptotic giant branch (AGB) phase. We analyze the properties of both components and the relation between them. Rotating disks have been very elusive in post-AGB nebulae, in which gas is almost always found to be in expansion. Methods: We present new high-quality ALMA observations of C17O J=6−5 and H13CN J=4−3 line emission and results from a new reduction of already published 13CO J=3−2 data. A detailed model fitting of all the molecular line data, including previous maps and single-dish observations of lines of CO, CII, and CI, was performed using a sophisticated code that includes an accurate nonlocal treatment of radiative transfer in 2D. These observations (of low- and high-opacity lines requiring various degrees of excitation) and the corresponding modeling allowed us to deepen the analysis of the nebular properties. We also stress the uncertainties, particularly in the determination of the boundaries of the CO-rich gas and some properties of the outflow. Results: We confirm the presence of a rotating equatorial disk and an outflow, which is mainly formed of gas leaving the disk. The mass of the disk is ~ 0.01 M⊙, and that of the CO-rich outflow is around ten times smaller. High temperatures of ≳ 100 K are derived for most components. From a comparison of the mass values, we roughly estimate the lifetime of the rotating disk, which is found to be about 10000 yr. Taking data of a few other post-AGB composite nebulae into account, we find that the lifetimes of disks around post-AGB stars typically range between 5000 and more than 20000 yr. The angular momentum of the disk is found to be high, ~ 9 M⊙ AU km s−1, comparable to that of the stellar system at present. Our observations of H13CN show a particularly wide velocity dispersion and indicate that this molecule is only abundant in the inner Keplerian disk, at

  5. Cloud Imagers Offer New Details on Earth's Health


    , limited scientists' ability to acquire detailed information about individual particles. Now, experiments with specialized equipment can be flown on standard jets, making it possible for researchers to monitor and more accurately anticipate changes in Earth's atmosphere and weather patterns.

  6. Discerning The Details Of The Cosmic Dark Sector

    Bean, Rachel

    Central objectives: This proposal focuses on the exciting prospect of the existence of new dark sector degrees of freedom and the impact of these on observations of, and inter-relationships between, baryons, dark matter, and dark energy. The suggestion of new long-range forces, manifesting themselves as apparent changes to how gravity and visible matter are related, along with possible screening mechanisms in dense regions, opens up the possibility of testing this new physics in vastly different regimes. The proposed research includes the development and investigation of dark sector physics theories and their implications for observations from solar system to cosmic scales. It will be part of an ongoing, highly productive collaboration between the PI and Co-Is at Cornell and U. Penn and their students. Methods/techniques: The PI and Co-Is have a proven track record of establishing theoretically consistent field-theoretic descriptions of the dark sector and interaction screening mechanisms. They have established experience in developing novel astrophysical tests for dark sector theories and developing computational forecasting and analysis tools to establish if, and how well, a theory is testable with current and upcoming NASA surveys. Significance to NASA solicitation/interests: The proposed research directly supports the NASA ROSES Science Goal for Astrophysics: to “discover how the universe works, explore how the universe began and evolved.” Specifically, it addresses the science question “how do matter, energy, space and time behave under the extraordinarily diverse conditions of the cosmos?” and the science objective to “understand the origin and destiny of the universe, and the nature of black holes, dark energy, dark matter, and gravity”. The proposed research will be of significant relevance for current and planned NASA cosmological missions, Euclid and WFIRST, as well as current, and to be released, Planck CMB temperature and polarization data.

  7. Discovery Mondays: Surveyors' Tools


    Surveyors of all ages, have your rulers and compasses at the ready! This sixth edition of Discovery Monday is your chance to learn about the surveyor's tools - the state of the art in measuring instruments - and see for yourself how they work. With their usual daunting precision, the members of CERN's Surveying Group have prepared some demonstrations and exercises for you to try. Find out the techniques for ensuring accelerator alignment and learn about high-tech metrology systems such as deviation indicators, tracking lasers and total stations. The surveyors will show you how they precisely measure magnet positioning, with accuracy of a few thousandths of a millimetre. You can try your hand at precision measurement using different types of sensor and a modern-day version of the Romans' bubble level, accurate to within a thousandth of a millimetre. You will learn that photogrammetry techniques can transform even a simple digital camera into a remarkable measuring instrument. Finally, you will have a chance t...

  8. Tool and Fixture Design

    Graham, Mark W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)


    In a manufacturing process, a need is identified and a product is created to fill this need. While design and engineering of the final product is important, the tools and fixtures that aid in the creation of the final product are just as important, if not more so. Power supplies assembled at the TA-55 PF-5 have been designed by an excellent engineering team. The task in PF-5 now is to ensure that all steps of the assembly and manufacturing process can be completed safely, reliably, and in a repeatable, high-quality manner. One of these process steps involves soldering fine wires to an electrical connector. During the process development phase, the method of soldering involved placing the power supply in a vice in order to manipulate it into a position conducive to soldering. This method is unacceptable from a reliability, repeatability, and ergonomics standpoint. To combat these issues, a fixture was designed to replace the current method. To do so, a twelve-step engineering design process was used to create a fixture that would provide a solution to a multitude of problems and increase the safety and efficiency of production.

  9. CSAM Metrology Software Tool

    Vu, Duc; Sandor, Michael; Agarwal, Shri


    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners; hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring the inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any database-processing software.
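
    A minimal sketch of the color-based delamination detection idea (not CMeST itself): classify each false-color pixel with a simple predicate and report the delaminated area fraction. The color rule and the toy image below are assumptions for illustration only.

    ```python
    # Illustrative sketch (not CMeST): flag delamination pixels in a
    # false-color image by a per-pixel color rule and report the area
    # fraction. The color rule and image data are hypothetical.

    def delamination_fraction(pixels, is_delaminated):
        """pixels: iterable of (r, g, b) tuples; is_delaminated: predicate."""
        pixels = list(pixels)
        flagged = sum(1 for p in pixels if is_delaminated(p))
        return flagged / len(pixels)

    # Toy 2x2 image: red-ish pixels taken to mark delamination.
    img = [(250, 20, 20), (10, 200, 10), (240, 30, 25), (15, 15, 220)]
    frac = delamination_fraction(img, lambda p: p[0] > 200 and p[1] < 100)
    print(frac)  # 0.5
    ```

    Exporting per-region fractions like this to a spreadsheet is the kind of downstream statistical processing the record describes.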

  10. Java Radar Analysis Tool

    Zaczek, Mariusz P.


    Java Radar Analysis Tool (JRAT) is a computer program for analyzing two-dimensional (2D) scatter plots derived from radar returns showing pieces of the disintegrating Space Shuttle Columbia. JRAT can also be applied to similar plots representing radar returns showing aviation accidents, and to scatter plots in general. The 2D scatter plots include overhead map views and side altitude views. The superposition of points in these views makes searching difficult. JRAT enables three-dimensional (3D) viewing: by use of a mouse and keyboard, the user can rotate to any desired viewing angle. The 3D view can include overlaid trajectories and search footprints to enhance situational awareness in searching for pieces. JRAT also enables playback: time-tagged radar-return data can be displayed in time order and an animated 3D model can be moved through the scene to show the locations of the Columbia (or other vehicle) at the times of the corresponding radar events. The combination of overlays and playback enables the user to correlate a radar return with a position of the vehicle to determine whether the return is valid. JRAT can optionally filter single radar returns, enabling the user to selectively hide or highlight a desired radar return.

  11. Tools of radio astronomy

    Wilson, Thomas L; Hüttemeister, Susanne


    This 6th edition of “Tools of Radio Astronomy”, the most used introductory text in radio astronomy, has been revised to reflect the current state of this important branch of astronomy. This includes the use of satellites, low radio frequencies, the millimeter/sub-mm universe, the Cosmic Microwave Background and the increased importance of mm/sub-mm dust emission. Several derivations and presentations of technical aspects of radio astronomy and receivers, such as receiver noise, the Hertz dipole and  beam forming have been updated, expanded, re-worked or complemented by alternative derivations. These reflect advances in technology. The wider bandwidths of the Jansky-VLA and long wave arrays such as LOFAR and mm/sub-mm arrays such as ALMA required an expansion of the discussion of interferometers and aperture synthesis. Developments in data reduction algorithms have been included. As a result of the large amount of data collected in the past 20 years, the discussion of solar system radio astronomy, dust em...

  12. Archiving tools for EOS

    Sindrilaru, Elvin-Alin; Peters, Andreas-Joachim; Duellmann, Dirk


    Archiving data to tape is a critical operation for any storage system, especially for the EOS system at CERN, which holds production data for all major LHC experiments. Each collaboration has an allocated quota it can use at any given time; therefore, a mechanism for archiving "stale" data is needed so that storage space is reclaimed for online analysis operations. The archiving tool that we propose for EOS aims to provide a robust client interface for moving data between EOS and CASTOR (a tape-backed storage system) while enforcing best practices for data integrity and verification. All data transfers are done using a third-party copy mechanism which ensures point-to-point communication between the source and destination, thus providing maximum aggregate throughput. Using the ZMQ message-passing paradigm and a process-based approach enabled us to achieve optimal utilisation of resources and a stateless architecture which can easily be tuned during operation. The modular design and an implementation in a high-level language like Python have enabled us to easily extend the code base to address new demands, such as offering full and incremental backup capabilities.
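
    The integrity-verification practice mentioned above can be illustrated with a minimal sketch (this is not the EOS tool's code): after a third-party copy, the archive step is accepted only if the source and destination checksums agree. The function names and the choice of MD5 are assumptions for illustration.

    ```python
    # Hedged sketch of post-transfer integrity verification (not the EOS
    # archiving tool itself): compare source and destination digests.
    import hashlib

    def checksum(data: bytes, algo: str = "md5") -> str:
        """Digest of a byte payload; in practice this would stream a file."""
        h = hashlib.new(algo)
        h.update(data)
        return h.hexdigest()

    def verify_transfer(src_bytes: bytes, dst_bytes: bytes) -> bool:
        """Accept the archive step only if both ends hold identical data."""
        return checksum(src_bytes) == checksum(dst_bytes)

    print(verify_transfer(b"payload", b"payload"))  # True
    print(verify_transfer(b"payload", b"payl0ad"))  # False
    ```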

  13. New Tools for Managing Agricultural P

    Nieber, J. L.; Baker, L. A.; Peterson, H. M.; Ulrich, J.


    Best management practices (BMPs) generally focus on retaining nutrients (especially P) after they enter the watershed. This approach is expensive, unsustainable, and has not led to reductions of P pollution at large scales (e.g., the Mississippi River). Although source reduction, which reduces inputs of nutrients to a watershed, has long been cited as a preferred approach, we have not had tools to guide source reduction efforts at the watershed level. To augment conventional TMDL tools, we developed an "actionable" watershed P balance approach, based largely on watershed-specific information, yet simple enough to be utilized as a practical tool. Interviews with farmers were used to obtain detailed farm management data, data from livestock permits were adjusted based on site visits, stream P fluxes were calculated from 3 years of monitoring data, and expert knowledge was used to model P fluxes through animal operations. The overall P use efficiency, Puse, was calculated as the sum of deliberate exports (P in animals, milk, eggs, and crops) divided by the sum of deliberate inputs (P in fertilizer, feed, and nursery animals) x 100. The crop P use efficiency was 1.7, meaning that more P was exported as products than was deliberately imported; we estimate that this mining would have resulted in a loss of 6 mg P/kg across the watershed. Despite the negative P balance, the equivalent of 5% of watershed input was lost via stream export. Tile drainage, the presence of buffer strips, and relatively flat topography result in dominance of P loads by ortho-P (66%) and low particulate P. This, together with ongoing geochemical analysis, suggests that biological processes may be at least as important as sediment transport in controlling P loads. We have developed a P balance calculator tool to enable watershed management organizations to develop watershed P balances and identify opportunities for improving the efficiency of P utilization.
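
    The P balance arithmetic described above can be sketched in a few lines. The function mirrors the stated formula (deliberate exports divided by deliberate inputs, times 100); all flux values below are hypothetical examples, not the study's data.

    ```python
    # Illustrative sketch (not the authors' P balance calculator): watershed
    # P use efficiency = 100 * sum(deliberate exports) / sum(deliberate inputs).
    # All quantities are hypothetical example values in kg P per year.

    def p_use_efficiency(exports_kg, inputs_kg):
        """Return P use efficiency (%) from dicts of export and input fluxes."""
        return 100.0 * sum(exports_kg.values()) / sum(inputs_kg.values())

    exports = {"animals": 40.0, "milk": 30.0, "eggs": 5.0, "crops": 95.0}  # hypothetical
    inputs = {"fertilizer": 60.0, "feed": 35.0, "nursery_animals": 5.0}    # hypothetical

    eff = p_use_efficiency(exports, inputs)
    print(f"P use efficiency: {eff:.0f}%")  # P use efficiency: 170%
    ```

    A value above 100% means more P leaves in products than is deliberately imported, i.e. the watershed is mining soil P, as in the record's reported efficiency of 1.7.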

  14. Dataflow Design Tool: User's Manual

    Jones, Robert L., III


    The Dataflow Design Tool is a software tool for selecting a multiprocessor scheduling solution for a class of computational problems. The problems of interest are those that can be described with a dataflow graph and are intended to be executed repetitively on a set of identical processors. Typical applications include signal processing and control law problems. The software tool implements graph-search algorithms and analysis techniques based on the dataflow paradigm. Dataflow analyses provided by the software are introduced and shown to effectively determine performance bounds, scheduling constraints, and resource requirements. The software tool provides performance optimization through the inclusion of artificial precedence constraints among the schedulable tasks. The user interface and tool capabilities are described. Examples are provided to demonstrate the analysis, scheduling, and optimization functions facilitated by the tool.
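
    As a hedged illustration of the kind of dataflow-graph analysis described (not the tool's own algorithm), a lower bound on schedule length can be obtained from the graph's critical path via a topological traversal. The task graph below is a hypothetical signal-processing example.

    ```python
    # Minimal sketch (not the Dataflow Design Tool): model a dataflow graph
    # as a DAG and compute the critical-path length, a lower bound on the
    # makespan of any schedule. Tasks and durations are hypothetical.
    from collections import defaultdict

    def critical_path_length(tasks, edges):
        """tasks: {name: duration}; edges: [(u, v)] meaning u precedes v."""
        succ = defaultdict(list)
        indeg = {t: 0 for t in tasks}
        for u, v in edges:
            succ[u].append(v)
            indeg[v] += 1
        # Kahn's topological sort, accumulating earliest finish times.
        finish = {t: tasks[t] for t in tasks}
        ready = [t for t in tasks if indeg[t] == 0]
        while ready:
            u = ready.pop()
            for v in succ[u]:
                finish[v] = max(finish[v], finish[u] + tasks[v])
                indeg[v] -= 1
                if indeg[v] == 0:
                    ready.append(v)
        return max(finish.values())

    # Hypothetical graph: fir feeds fft and gain, both feed detect.
    tasks = {"fir": 3, "fft": 5, "gain": 2, "detect": 1}
    edges = [("fir", "fft"), ("fir", "gain"), ("fft", "detect"), ("gain", "detect")]
    print(critical_path_length(tasks, edges))  # 9
    ```

    The record's "artificial precedence constraints" would correspond to extra edges added to such a graph to steer the scheduler.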

  15. New tools for learning.

    Dickinson, D


    In the last twenty-five years more has been learned about the human brain than in the past history of mankind. Through the use of new technologies such as PET and CAT scans and functional MRIs, it is now possible to see and learn much about the human brain while it is in the process of thinking. The research of neuroscientists such as Marian Diamond has demonstrated that the brain changes physiologically as a result of learning and experience--for better or worse--and that plasticity can continue throughout the lifespan. It appears that there are particular kinds of environments that are most conducive to the development of good mental equipment. They are positive, nurturing, stimulating, and encourage action and interaction. Many of the most effective schools and training programs have created such high-challenge, low-threat environments. It is also very clear that intelligence is not a static structure, but an open, dynamic system that can continue to develop throughout life. This understanding is being utilized not only in school systems but in the workplace, where training programs show that even at the adult level people are able to develop their intelligence more fully. Corporations such as Motorola have implemented programs in which they train their employees, managers, and executives to think, problem-solve, and create more effectively using strategies developed by such educational innovators as Reuven Feuerstein, J.P. Guilford, and Edward de Bono. A most recent development is the new kinds of technology that make it possible for people to take responsibility for their own learning as they access and process information through the internet, communicate with experts anywhere in the world, and use software that facilitates higher-order thinking and problem-solving. Computers are in no way replacing teachers; rather, these new tools allow them to spend more time being facilitators, mentors, and guides.
As a result, teachers and students are able

  16. A computer tool to support in design of industrial Ethernet.

    Lugli, Alexandre Baratella; Santos, Max Mauro Dias; Franco, Lucia Regina Horta Rodrigues


    This paper presents a computer tool to support the design and development of an industrial Ethernet network, verifying the physical layer (cable resistance and capacitance, scan time, network power supply including PoE, "Power over Ethernet", and wireless) and the occupation rate (the amount of information transmitted on the network versus the controller's network scan time). These functions are accomplished without a single physical element installed in the network, using simulation only. The tool's software presents a detailed view of the network to the user, flags possible problems in the network, and offers an extremely friendly environment.
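
    As a rough illustration of the occupation-rate idea above (not the paper's actual tool), the ratio can be sketched as the time needed to transmit one scan's worth of cyclic data divided by the controller's scan time. The device count, payload size, and link speed below are hypothetical.

    ```python
    # Hedged sketch: network occupation rate = (time to transmit the cyclic
    # payload) / (controller scan time). All values are illustrative
    # assumptions, not data from the paper.

    def occupation_rate(payload_bits_per_scan, link_bits_per_s, scan_time_s):
        tx_time_s = payload_bits_per_scan / link_bits_per_s  # serialization time
        return tx_time_s / scan_time_s

    # Hypothetical network: 50 devices x 100 bytes each per scan,
    # a 100 Mbit/s link, and a 10 ms controller scan time.
    rate = occupation_rate(50 * 100 * 8, 100e6, 0.010)
    print(f"occupation rate: {rate:.1%}")  # occupation rate: 4.0%
    ```

    A rate approaching 1 would signal that the cyclic traffic cannot fit within the scan time, one of the problems such a tool is meant to flag before deployment.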

  17. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Mihaela-Laura IVAN


    The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth, and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by new tools such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. Finally, the birth rate is analyzed in detail, along with the contribution of different factors worldwide.

  18. Physics validation of detector simulation tools for LHC

    Beringer, J


    Extensive studies aimed at validating the physics processes built into the detector simulation tools Geant4 and Fluka are in progress within all Large Hadron Collider (LHC) experiments, within the collaborations developing these tools, and within the LHC Computing Grid (LCG) Simulation Physics Validation Project, which has become the primary forum for these activities. This work includes detailed comparisons with test beam data, as well as benchmark studies of simple geometries and materials with single incident particles of various energies for which experimental data is available. We give an overview of these validation activities with emphasis on the latest results.

  19. CATIA Core Tools Computer Aided Three-Dimensional Interactive Application

    Michaud, Michel


    CATIA Core Tools: Computer-Aided Three-Dimensional Interactive Application explains how to use the essential features of this cutting-edge solution for product design and innovation. The book begins with the basics, such as launching the software, configuring the settings, and managing files. Next, you'll learn about sketching, modeling, drafting, and visualization tools and techniques. Easy-to-follow instructions along with detailed illustrations and screenshots help you get started using several CATIA workbenches right away. Reverse engineering--a valuable product development skill--is also covered in this practical resource.

  20. SSC (Superconducting Super Collider) dipole coil production tooling

    Carson, J.A.; Barczak, E.J.; Bossert, R.C.; Brandt, J.S.; Smith, G.A.


    Superconducting Super Collider dipole coils must be produced to high precision to ensure uniform prestress and even conductor distribution within the collared coil assembly. Tooling is being prepared at Fermilab for the production of high-precision 1M and 16.6M SSC dipole coils suitable for mass production. The design and construction methods build on the Tevatron tooling and production experience. Details of the design and construction methods and the measured coil uniformity of 1M coils will be presented. 4 refs., 10 figs.

  1. Digital Sculpting with Mudbox Essential Tools and Techniques for Artists

    de la Flor, Mike


    Digital sculpting is the use of tools to push, pull, smooth, grab, pinch or otherwise manipulate a digital object as if it were made of a real-life substance such as clay. Mudbox is the premier sculpting solution for digital artists, allowing them to naturally and easily sculpt detailed, organic characters and models in a way that feels like traditional sculpting. This book guides CG professionals through the process of creating amazing digital sculptures using the Mudbox arsenal of ground-breaking digital sculpting and 3D painting tools, and porting the models into their Maya or Max work. Artis

  2. Stone tools, language and the brain in human evolution.

    Stout, Dietrich; Chaminade, Thierry


    Long-standing speculations and more recent hypotheses propose a variety of possible evolutionary connections between language, gesture and tool use. These arguments have received important new support from neuroscientific research on praxis, observational action understanding and vocal language demonstrating substantial functional/anatomical overlap between these behaviours. However, valid reasons for scepticism remain as well as substantial differences in detail between alternative evolutionary hypotheses. Here, we review the current status of alternative 'gestural' and 'technological' hypotheses of language origins, drawing on current evidence of the neural bases of speech and tool use generally, and on recent studies of the neural correlates of Palaeolithic technology specifically.

  3. Friction stir welding tool and process for welding dissimilar materials

    Hovanski, Yuri; Grant, Glenn J; Jana, Saumyadeep; Mattlin, Karl F


    A friction stir welding tool and process for lap welding dissimilar materials are detailed. The invention includes a cutter scribe that penetrates and extrudes a first material of a lap weld stack to a preselected depth and further cuts a second material to provide a beneficial geometry defined by a plurality of mechanically interlocking features. The tool backfills the interlocking features generating a lap weld across the length of the interface between the dissimilar materials that enhances the shear strength of the lap weld.

  4. Turn over folders: a proven tool in succession management planning.

    Engells, Thomas E


    The dual challenges of succession management and succession management planning are considerable. A tool, the Turn over Folder, was introduced and described in detail as a useful first step in succession management planning. The adoption of that tool will not in itself produce a succession management plan, but it will orientate the organization and its members to the reality of succession management in all important leadership and critical positions. Succession management is an important consideration in all progressive organizations and well worth the effort.

  5. Environmental tools in product development

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen


    A precondition for design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical...... University of Denmark, in cooperation with 5 major Danish companies aiming at the development and testing of such tools. These tools are presented in this paper...

  6. Tools for Distributed Systems Monitoring

    Kufel Łukasz


    The management of distributed systems infrastructure requires a dedicated set of tools. One tool that helps visualize the current operational state of all systems and notifies when a failure occurs is available within a monitoring solution. This paper provides an overview of monitoring approaches for gathering data from distributed systems and the major factors to consider when choosing a monitoring solution. Finally, we discuss the tools currently available on the market.

  7. A comparison of MSA tools.

    Essoussi, Nadia; Boujenfa, Khaddouja; Limam, Mohamed


    Multiple sequence alignment (MSA) is essential in phylogenetic, evolutionary and functional analysis. Several MSA tools are available in the literature. Here, we use several MSA tools such as ClustalX, Align-m, T-Coffee, SAGA, ProbCons, MAFFT, MUSCLE and DIALIGN to illustrate comparative phylogenetic trees analysis for two datasets. Results show that there is no single MSA tool that consistently outperforms the rest in producing reliable phylogenetic trees.

  8. Blog: occupational therapy tool

    Miryam Bonadiu Pelosi


    A blog is a site whose structure allows fast posting and updating of articles, or posts. In a blog, texts, images, videos, and links to other blogs can be combined, and what makes it a potential therapeutic working tool are the user-friendly creation and editing systems, which do not require complex knowledge of programming. As in any activity, the professional needs to know it well before being able to use it as a therapeutic resource, but this one is free of charge and can be adopted in gradual steps at different complexity levels. A blog can be used as an idea organizer, a site where the user decides what is pertinent to post, which vocabulary to use, and the target public, as a space to share experiences, or as an integral part of the elaboration of a history book. To support this discussion, three cases where blogs were used as therapeutic resources are presented. In the first case, a 12-year-old boy diagnosed with Asperger's Syndrome created a blog about children's characters; in the second, a group of adolescents with school and social inclusion difficulties developed a blog to tell their adventures; finally, the third case tells the story of a girl with cerebral palsy who wrote and published a book and a blog with the help of alternative communication strategies. The following themes were addressed as topics for discussion: activity as a therapeutic resource; the different uses of blogs on the worldwide computer network; and the possibility of using blogs as occupational therapy resources for people with singular life histories, with very diverse abilities, limitations, and desires.

  9. Tools: Stuff: Art

    David Kirshner


    Between 1890 and 1898 Erik Satie lived at 6 rue Cortot: ‘in a wardrobe’. Satie was a collector […]. After his death his wardrobe was found to contain 84 handkerchiefs besides 12 identical velvet suits and dozens of umbrellas. 
Trois morceaux en forme de poire […] three pieces in the form of a pear. The title of a piano piece in seven parts by Erik Satie. Satie composed this piece in response to Debussy's criticism that his works lacked a 'sense of form'. What exactly did Debussy mean by this? Where and what actually was this scene of formlessness? 
The first part of the Paper will advance some possible reasoning behind Debussy's comments. Was Debussy questioning Satie's attitude to what Heidegger [in The Origin of the Work of Art] would term the 'thingly' element of the Work of Art, or more precisely – the relationship between 'things' and the 'thing in itself'?
Heidegger's contemplation of 'Form' and his writings on 'tools', 'material' and 'art', together with the section dealing with the Temple, provide an interesting locus in which to discuss Debussy's comments.
The second section gives some ideas of how I reinterpreted this argument to produce a series of visual works inspired by another of Satie's works, Furniture Music – Musique d' ameublement, a piece of music that was not to be listened to. 
Milhaud later recounted: ‘It was no use Satie shouting: “Talk for heaven's sake! Move around! Don't listen!” They kept quiet. They listened. The whole thing went wrong.’

  10. Solar Indices Forecasting Tool

    Henney, Carl John; Shurkin, Kathleen; Arge, Charles; Hill, Frank


    Progress to forecast key space weather parameters using SIFT (Solar Indices Forecasting Tool) with the ADAPT (Air Force Data Assimilative Photospheric flux Transport) model is highlighted in this presentation. Using a magnetic flux transport model, ADAPT, we estimate the solar near-side field distribution that is used as input into empirical models for predicting F10.7 (solar 10.7 cm, 2.8 GHz, radio flux), the Mg II core-to-wing ratio, and selected bands of solar far ultraviolet (FUV) and extreme ultraviolet (EUV) irradiance. Input to the ADAPT model includes the inferred photospheric magnetic field from the NISP ground-based instruments, GONG & VSM. Besides a status update regarding the ADAPT and SIFT models, we will summarize the findings that: 1) the sum of the absolute value of strong magnetic fields, associated with sunspots, is shown to correlate well with the observed daily F10.7 variability (Henney et al. 2012); and 2) the sum of the absolute value of weak magnetic fields, associated with plage regions, is shown to correlate well with EUV and FUV irradiance variability (Henney et al. 2015). This work utilizes data produced collaboratively between the Air Force Research Laboratory (AFRL) and the National Solar Observatory (NSO). The ADAPT model development is supported by AFRL. The input data utilized by ADAPT is obtained by NISP (NSO Integrated Synoptic Program). NSO is operated by the Association of Universities for Research in Astronomy (AURA), Inc., under a cooperative agreement with the National Science Foundation (NSF). The 10.7 cm solar radio flux data service, utilized by the ADAPT/SIFT F10.7 forecasting model, is operated by the National Research Council of Canada and Natural Resources Canada, with the support of the Canadian Space Agency.

  11. Handbook of Open Source Tools

    Koranne, Sandeep


    Handbook of Open Source Tools introduces a comprehensive collection of advanced open source tools useful in developing software applications. The book contains information on more than 200 open-source tools, including software construction utilities for compilers, virtual machines, databases, graphics, high-performance computing, OpenGL, geometry, algebra, graph theory, GUIs and more. Special highlights for software construction utilities and application libraries are included. Each tool is covered in the context of a real-life application development setting. This unique handbook presents

  12. Simulation of Oscillatory Working Tool

    Carmen Debeleac


    The paper presents a study of the resistance forces in soil cutting, with emphasis on their dependence on working-tool motion during the loading process and on the dynamic regimes. The periodic process of cutting of soil by a tool (blade) is described. Different intervals in the cycle of steady-state motion of the tool, and several interaction regimes, are considered. The analysis is based on a non-linear approximation of the dependence of the soil resistance force on tool motion. Finally, the influence of frequency on the laws governing the interaction in the cyclic process is established.

  13. Oscillation Baselining and Analysis Tool


    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) project (GM0072, “Suite of open-source applications and models for advanced synchrophasor analysis”) and is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also performs oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
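    As a rough illustration of the kind of modal estimate such a tool produces, the sketch below extracts the frequency and damping ratio of a single-mode ringdown. This is a hypothetical minimal method (FFT peak plus log-envelope fit), not the OBAT algorithm, and `estimate_mode` is an invented name; it assumes one well-separated mode.

```python
import numpy as np

def estimate_mode(x, fs):
    """Estimate the dominant oscillation frequency (Hz) and damping
    ratio of a ringdown signal. Assumes a single well-separated mode.
    Illustrative sketch only -- not the OBAT implementation."""
    n = len(x)
    x = np.asarray(x, float) - np.mean(x)
    # dominant frequency from the FFT magnitude peak (skip DC)
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    f = freqs[np.argmax(spec[1:]) + 1]
    # analytic signal via FFT (discrete Hilbert transform)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0
    if n % 2 == 0:
        h[n // 2] = 1.0
    env = np.abs(np.fft.ifft(X * h))  # amplitude envelope
    # decay rate sigma from a linear fit to the log-envelope,
    # using the middle 60% to avoid edge effects
    i0, i1 = int(0.2 * n), int(0.8 * n)
    t = np.arange(n) / fs
    sigma = -np.polyfit(t[i0:i1], np.log(env[i0:i1]), 1)[0]
    omega = 2.0 * np.pi * f
    zeta = sigma / np.hypot(sigma, omega)  # damping ratio
    return f, zeta
```

For a synthetic 0.8 Hz mode decaying as exp(-0.2 t), this recovers the frequency and a damping ratio near 4%.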

  14. Robotic Mission Simulation Tool Project

    National Aeronautics and Space Administration — Energid Technologies proposes a software tool to predict robotic mission performance and support supervision of robotic missions even when environments and...

  15. Detailed modelling of photon recycling: application to GaAs solar cells

    Balenzategui, J.L. [CIEMAT, Division de Energias Renovables, Avda. Complutense, 22, E-28040 Madrid (Spain); Marti, A. [Instituto de Energia Solar, ETSIT, UPM, Ciudad Universitaria s/n, E-28040 Madrid (Spain)


    The re-absorption of photons emitted in a semiconductor material as a consequence of radiative recombination, a process referred to as photon recycling (PR), has been researched for several decades because of its primary influence in increasing the minority carrier lifetime and related parameters. Solar cells with direct-bandgap materials and high absorption coefficients are firm candidates to show PR effects, leading to an improvement in the conversion efficiency of up to 1-2% in absolute terms for cells with conventional designs. However, the formal modelling of PR effects requires the inclusion of additional terms in the standard set of semiconductor equations, and researchers usually tend to neglect its influence because of the lack of available tools for an easy evaluation of this phenomenon in their particular devices. This paper describes a detailed model of PR which allows the incorporation of specific characteristics and optics of GaAs solar cells and, at the same time, solves some of the problems found in previous developments of these numerical models. The methodology for the calculation is based on the use of commercially available programs for semiconductor device simulation that do not initially have the potential for PR modelling and, thus, it can be extended to and applied by other researchers wishing to compare its relative influence on the performance of different structures and materials. (author)
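    In simplified treatments of photon recycling (a standard textbook relation, not necessarily the detailed model of this paper), the lifetime enhancement is often summarized by a recycling factor $\phi \ge 1$ that stretches the radiative lifetime:

```latex
\frac{1}{\tau_{\mathrm{eff}}} = \frac{1}{\phi\,\tau_{\mathrm{rad}}} + \frac{1}{\tau_{\mathrm{nr}}}, \qquad \phi \ge 1,
```

where $\tau_{\mathrm{rad}}$ and $\tau_{\mathrm{nr}}$ are the radiative and non-radiative minority-carrier lifetimes; re-absorption of internally emitted photons makes the effective radiative lifetime $\phi\,\tau_{\mathrm{rad}}$ longer than $\tau_{\mathrm{rad}}$ alone.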

  16. a Detailed Proof of the Fundamental Theorem of STF Multipole Expansion in Linearized Gravity

    Zschocke, Sven


    The linearized field equations of general relativity in harmonic coordinates are given by an inhomogeneous wave equation. In the region exterior to the matter field, the retarded solution of this wave equation can be expanded in terms of 10 Cartesian symmetric and tracefree (STF) multipoles in post-Minkowskian approximation. For such a multipole decomposition only three, rather weak, assumptions are required: (1) the no-incoming-radiation condition; (2) the matter source is spatially compact; (3) a spherical expansion for the metric outside the matter source is possible. During the last decades, the STF multipole expansion has been established as a powerful tool in several fields of gravitational physics: celestial mechanics, the theory of gravitational waves, and the theory of light propagation and astrometry. But despite its formidable importance, an explicit proof of the fundamental theorem of STF multipole expansion has not been presented so far, while only parts of it are scattered across several publications. In a technical but more didactical form, an explicit and detailed mathematical proof of each individual step of this important theorem of STF multipole expansion is presented.
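    For orientation, the linearized field equations referred to above read, for the trace-reversed metric perturbation $\bar{h}^{\mu\nu}$ in harmonic coordinates,

```latex
\Box\,\bar{h}^{\mu\nu} = -\frac{16\pi G}{c^{4}}\,T^{\mu\nu},
\qquad \partial_{\nu}\,\bar{h}^{\mu\nu} = 0,
```

and it is the retarded solution of this wave equation, evaluated outside the spatially compact source, that the STF multipole expansion decomposes.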

  17. Modelling The Photooxidation of Toluene: New Concepts For Developing and Validating Detailed Mechanisms

    Wagner, V.; Jenkin, M. E.; Saunders, S.; Stanton, J.; Pilling, M. J.

    The photooxidation of aromatic hydrocarbons contributes significantly to the formation of photochemical smog. The master chemical mechanism (MCM3) contains detailed mechanisms for a variety of aromatics, which represent the current understanding of the atmospheric photochemical oxidation of these compounds. The comparison of MCM3 simulations with smog chamber experiments has revealed large discrepancies, particularly in the ozone concentration-time profiles, which suggest that these mechanisms are not yet suitable for application in atmospheric models. We will present a variety of tools that help to give a more quantitative understanding of the radical transformation and the breakdown of the carbon skeleton in the aromatic systems. The toluene mechanism is chosen as an example, and the significant intermediates, which have most impact on ozone formation, are identified by sensitivity analysis. Furthermore, with the aid of budget calculations, we investigate the effect of each of the major reaction channels on the global reactivity of the reaction system. The gathered information is then discussed in terms of validation concepts for detailed mechanisms with the aid of smog chamber experiments. In this way, we try to explore new concepts in which mechanism development becomes an interactive procedure between kinetic studies, chamber experiments and modelling.

  18. Stark effect modeling in the detailed opacity code SCO-RCG

    Pain, Jean-Christophe; Gilles, Dominique


    The broadening of lines by the Stark effect is an important tool for inferring electron density and temperature in plasmas. Stark-effect calculations often rely on atomic data (transition rates, energy levels, ...) that are not always exhaustive and/or valid for isolated atoms. We present a recent development in the detailed opacity code SCO-RCG for K-shell spectroscopy (hydrogen- and helium-like ions). This approach is adapted from the work of Gilles and Peyrusse. Neglecting non-diagonal terms in the dipolar and collision operators, the line profile is expressed as a sum of Voigt functions associated with the Stark components. The formalism relies on the use of parabolic coordinates within SO(4) symmetry. The relativistic fine structure of Lyman lines is included by diagonalizing the Hamiltonian matrix associated with quantum states having the same principal quantum number $n$. The resulting code enables one to investigate plasma environment effects, the impact of the microfield distribution, the decoupling between electron and i...
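    The Voigt functions mentioned above are commonly evaluated through the Faddeeva function $w(z)$. The sketch below illustrates that standard identity using SciPy; it is not code from SCO-RCG, and the function name is ours.

```python
import numpy as np
from scipy.special import wofz

def voigt(x, sigma, gamma):
    """Voigt profile: convolution of a Gaussian (std. dev. sigma)
    with a Lorentzian (HWHM gamma), normalized to unit area.
    Uses V(x) = Re[w(z)] / (sigma*sqrt(2*pi)),
    with z = (x + i*gamma) / (sigma*sqrt(2))."""
    z = (x + 1j * gamma) / (sigma * np.sqrt(2.0))
    return wofz(z).real / (sigma * np.sqrt(2.0 * np.pi))
```

In the limit gamma → 0 the profile reduces to a pure Gaussian, and for sigma small it approaches a Lorentzian, which makes it a convenient building block for summing Stark components.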

  19. Restoring Detailed Geomagnetic and Environmental Information from Continuous Sediment Paleomagnetic Measurement through Optimised Deconvolution

    Xuan, C.; Oda, H.


    The development of pass-through cryogenic magnetometers has greatly improved our efficiency in collecting paleomagnetic and rock magnetic data from continuous samples such as sediment half-core sections and u-channels. During a pass-through measurement, the magnetometer sensor response inevitably convolves with the remanence of the continuous sample. The convolution process results in smoothed measurements and can seriously distort the paleomagnetic signal due to differences in sensor response along different measurement axes. Previous studies have demonstrated that deconvolution can effectively overcome the convolution effect of the sensor response and improve the resolution of continuous paleomagnetic data. However, the lack of an easy-to-use deconvolution tool and the difficulty of accurately measuring the magnetometer sensor response have greatly hindered the application of deconvolution. Here, we acquire a reliable estimate of the sensor response of a pass-through cryogenic magnetometer at Oregon State University by integrating repeated measurements of a magnetic point source. The point source is fixed in the center of a polycarbonate cube with 5 mm edge length, and measured at every 1 mm position along a 40-cm interval while placing the polycarbonate cube at each of the 5 × 5 grid positions over a 2 × 2 cm² area on the cross section. The acquired sensor response reveals that cross terms (i.e. the response of the pick-up coil for one axis to magnetic signal along the other axes) that were often omitted in previous deconvolution practices are clearly not negligible. Utilizing the detailed estimate of the magnetometer sensor response, we present UDECON, a graphical tool for convenient application of optimised deconvolution based on Akaike's Bayesian Information Criterion (ABIC) minimization (Oda and Shibuya, 1996). UDECON directly reads a paleomagnetic measurement file, and allows users to view, compare, and save data before and after deconvolution. Optimised deconvolution
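    To see why removing the sensor response sharpens a pass-through record, a minimal frequency-domain deconvolution with Tikhonov regularization can be sketched as follows. This is an illustrative stand-in, not the ABIC-minimization scheme used by UDECON, and the function name is ours.

```python
import numpy as np

def deconvolve(measured, response, eps=1e-6):
    """Tikhonov-regularized FFT deconvolution (illustrative sketch,
    not UDECON's ABIC scheme). `response` is the sensor response
    sampled on the same grid as `measured`; `eps` suppresses noise
    amplification at frequencies where the response has little power."""
    n = len(measured)
    R = np.fft.rfft(response, n)
    M = np.fft.rfft(measured, n)
    # Wiener-like inverse filter: conj(R)/( |R|^2 + regularization )
    S = M * np.conj(R) / (np.abs(R) ** 2 + eps * np.max(np.abs(R)) ** 2)
    return np.fft.irfft(S, n)
```

Convolving a synthetic remanence record with a Gaussian sensor response and then applying `deconvolve` recovers the original features almost exactly when the record is smoother than the regularization cutoff.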

  20. Ethernet Networks: Current Trends and Tools

    El-Sayed, Abdulqader M


    Ethernet topology discovery has gained increasing interest in recent years. This trend is motivated mostly by the increasing number of carrier Ethernet networks, as well as the size of these networks, and consequently the increasing sales of these networks. To manage these networks efficiently, detailed and accurate knowledge of their topology is needed. Knowledge of a network's entities and the physical connections between them can be useful from various perspectives. Administrators can use topology information for network planning and fault detection. Topology information can also be used during protocol and routing algorithm development, for performance prediction and as a basis for accurate network simulations. From a network security perspective, threat detection, network monitoring, network access control and forensic investigations can benefit from accurate network topology information. In this paper, we analyze market trends and investigate current tools available for both research and commercial purposes...

  1. IT Data Mining Tool Uses in Aerospace

    Monroe, Gilena A.; Freeman, Kenneth; Jones, Kevin L.


    Data mining has a broad spectrum of uses throughout the realms of aerospace and information technology. Each of these areas has useful methods for processing, distributing, and storing its corresponding data. This paper focuses on ways to leverage the data mining tools and resources used in NASA's information technology area to meet the similar data mining needs of aviation and aerospace domains. This paper details the searching, alerting, reporting, and application functionalities of the Splunk system, used by NASA's Security Operations Center (SOC), and their potential shared solutions to address aircraft and spacecraft flight and ground systems data mining requirements. This paper also touches on capacity and security requirements when addressing sizeable amounts of data across a large data infrastructure.

  2. Analytical and numerical tools for vacuum systems

    Kersevan, R


    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail, from simple back-of-the-envelope calculations to more complex numerical analyses, are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.
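    A classic example of the back-of-the-envelope end of this spectrum is the molecular-flow conductance of a round tube for room-temperature air, C ≈ 12.1 d³/L litres per second with d and L in centimetres. This is a standard vacuum-technology approximation (valid for long tubes, L ≫ d, in the molecular-flow regime); the function name is ours.

```python
def tube_conductance_air(d_cm, l_cm):
    """Back-of-the-envelope molecular-flow conductance of a round
    tube for room-temperature air, in litres/second:
        C ~ 12.1 * d**3 / L   (d, L in cm)
    Valid for long tubes (L >> d) in the molecular-flow regime."""
    return 12.1 * d_cm ** 3 / l_cm
```

For example, a 10 cm diameter, 1 m long tube comes out at about 121 l/s, the sort of number a vacuum engineer checks before reaching for a Monte Carlo code.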

  3. Using the General Mission Analysis Tool (GMAT)

    Hughes, Steven P.; Conway, Darrel J.; Parker, Joel


    This is a software tutorial and presentation demonstrating the application of the General Mission Analysis Tool (GMAT). These slides will be used to accompany the demonstration. The demonstration discusses GMAT basics, then presents a detailed example of GMAT application to the Transiting Exoplanet Survey Satellite (TESS) mission. This talk is a combination of existing presentations and material; system user guide and technical documentation; a GMAT basics and overview, and technical presentations from the TESS projects on their application of GMAT to critical mission design. The GMAT basics slides are taken from the open source training material. The TESS slides are a streamlined version of the CDR package provided by the project with SBU and ITAR data removed by the TESS project. Slides for navigation and optimal control are borrowed from system documentation and training material.

  4. Measurement of tool forces in diamond turning

    Drescher, J.; Dow, T.A.


    A dynamometer has been designed and built to measure forces in diamond turning. The design includes a 3-component, piezoelectric transducer. Initial experiments with this dynamometer system included verification of its predicted dynamic characteristics as well as a detailed study of cutting parameters. Many cutting experiments have been conducted on OFHC Copper and 6061-T6 Aluminum. Tests have involved investigation of velocity effects, and the effects of depth and feedrate on tool forces. Velocity has been determined to have negligible effects between 4 and 21 m/s. Forces generally increase with increasing depth of cut. Increasing feedrate does not necessarily lead to higher forces. Results suggest that a simple model may not be sufficient to describe the forces produced in the diamond turning process.

  5. Improving the Formatting Tools of CDS Invenio

    Caffaro, J; Pu Faltings, Pearl


    CDS Invenio is the web-based integrated digital library system developed at CERN. It is a strategic tool that supports the archival and open dissemination of documents produced by CERN researchers. This paper reports on my Master’s thesis work on BibFormat, a module in CDS Invenio which formats document metadata. The goal of this project was to implement a completely new formatting module for CDS Invenio. In this report a strong emphasis is put on the user-centered design of the new BibFormat. The bibliographic formatting process and its requirements are discussed. The task analysis and its resulting interaction model are detailed. The document also shows the implemented user interface of BibFormat and gives the results of the user evaluation of this interface. Finally, the results of a small usability study of the formats included in CDS Invenio are discussed.

  6. GIS learning tool for world's largest earthquakes and their causes

    Chatterjee, Moumita

    The objective of this thesis is to increase awareness about earthquakes among people, especially young students, by showing the five largest and two most predictable earthquake locations in the world and their plate tectonic settings. This is a geographic, interactive tool which can be used for learning about the causes of great earthquakes in the past and the safest places on Earth to avoid the direct effects of earthquakes. This approach provides an effective way of learning for students as it is very user friendly and more aligned with the interests of the younger generation. In this tool the user can click on various points located on the world map, which opens a picture and a link to the webpage for that point, showing detailed information on the earthquake history of that place, including the magnitudes of past quakes, the years they occurred and the plate tectonic settings that made the place earthquake-prone. Apart from earthquake-related information, students will also be able to customize the tool to suit their needs or interests. Students will be able to add/remove layers, measure the distance between any two points on the map, select any place on the map and learn more about it, create a layer for detailed analysis, run a query, change display settings, etc. At the end, the user has to go through the earthquake safety guidelines in order to be safe during an earthquake. This tool uses Java as its programming language and uses Map Objects Java Edition (MOJO), provided by ESRI. It was developed for educational purposes, and hence its interface has been kept simple and easy to use so that students can gain maximum knowledge from it instead of having a hard time installing it. There are many details to explore which show what a GIS-based tool is capable of. The only thing needed to run this tool is the latest Java edition installed on the machine.
This approach makes study more fun and

  7. Harnessing Theories for Tool Support

    Liu, Zhiming; Mencl, Vladimir; Ravn, Anders Peter;


    Software development tools need to support more and more phases of the entire development process, because applications must be developed more correctly and efficiently. The tools therefore need to integrate sophisticated checkers, generators and transformations. A feasible approach to ensure hig...

  8. Multi-purpose tool mitten

    Wilcomb, E. F.


    Tool mitten provides a low reaction torque source of power for wrench, screwdriver, or drill activities. The technique employed prevents the attachments from drifting away from the operator. While the tools are specifically designed for space environments, they can be used on steel scaffolding, in high building maintenance, or underwater environments.

  9. WCET Tool Challenge 2011: Report

    Bonenfant, Armelle; Cassé, Hugues; Bünte, Sven;


    Following the successful WCET Tool Challenges in 2006 and 2008, the third event in this series was organized in 2011, again with support from the ARTIST DESIGN Network of Excellence. Following the practice established in the previous Challenges, the WCET Tool Challenge 2011 (WCC’11) defined two k...

  10. Measuring Light with Useful Tools

    Peek, Gina; Hebert, Paulette; Frazier, Robert Scott; Knag, Mihyun


    Lighting, a necessary part of our home and work environment, is often considered as an afterthought. This article describes tools that Extension educators (Agriculture, Family and Consumer Sciences, and 4-H) can use to measure light levels. 4-H youth may also participate. These tools include light meters and Illuminating Engineering Society (IES)…

  11. Disclosure as a regulatory tool


    The chapter analyses how disclosure can be used as a regulatory tool and how it has been applied so far in the areas of financial market law and consumer law.

  12. Ludic Educational Game Creation Tool

    Vidakis, Nikolaos; Syntychakis, Efthimios; Kalafatis, Konstantinos


    creation tool features a web editor, where the game narrative can be manipulated, according to specific needs. Moreover, this tool is applied for creating an educational game according to a reference scenario namely teaching schoolers road safety. A ludic approach is used both in game creation and play...

  13. Water management tools for Mississippi

    Our goal is to equip crop producers in the Southeast with tools to improve crop production and management, including: • knowledge of crop and soil water relations, • irrigation scheduling tools for better water management, and • the economic benefits of water conservation technologies. Crop performance can...

  14. Integrated Wind Power Planning Tool

    Rosgaard, Martin H.; Hahmann, Andrea N.; Nielsen, Torben S.;

    This poster presents the Public Service Obligation (PSO) funded project PSO 10464, "Integrated Wind Power Planning Tool". The project goal is to integrate a Numerical Weather Prediction (NWP) model with statistical tools in order to assess wind power fluctuations, with focus on short-term forecasting for existing wind farms, as well as long-term power system planning for future wind farms.

  15. Battery switch for downhole tools

    Boling, Brian E.


    An electrical circuit for a downhole tool may include a battery, a load electrically connected to the battery, and at least one switch electrically connected in series with the battery and to the load. The at least one switch may be configured to close when a tool temperature exceeds a selected temperature.

  16. Diagnostic Tools for Learning Organizations.

    Moilanen, Raili


    The Learning Organization Diamond Tool was designed for holistic analysis of 10 learning organization elements at the individual and organizational levels. A test in 25 Finnish organizations established validity. Comparison with existing tools showed that differences derive from their different purposes. (Contains 33 references.) (SK)

  17. System Maturity and Architecture Assessment Methods, Processes, and Tools


    1 For a detailed description of the SRL methodology see Sauser, B., J.E. Ramirez-Marquez, D. Nowicki, A... and Ramirez-Marquez 2009; Magnaye, Sauser et al. 2010). Although there are guidelines and tools to support the assessment process (Nolte, Kennedy... employ these metrics (Tan, Sauser et al. 2011). Graettinger et al. (Graettinger, Garcia et al. 2002) report that approaches for readiness level

  18. NASA Planetary Visualization Tool

    Hogan, P.; Kim, R.


    NASA World Wind allows one to zoom from satellite altitude into any place on Earth, leveraging the combination of high-resolution LandSat imagery and SRTM elevation data to experience Earth in visually rich 3D, as if one were really there. NASA World Wind combines LandSat 7 imagery with Shuttle Radar Topography Mission (SRTM) elevation data for a dramatic view of the Earth at eye level. Users can literally fly across the world's terrain from any location in any direction. Particular focus was put on ease of use so that people of all ages can enjoy World Wind. All one needs to control World Wind is a two-button mouse. Additional guides and features can be accessed through a simplified menu. Navigation is automated with single clicks of a mouse, as well as the ability to type in any location and automatically zoom to it. NASA World Wind was designed to run on recent PC hardware with the same technology used by today's 3D video games. NASA World Wind delivers the NASA Blue Marble, spectacular true-color imagery of the entire Earth at 1 kilometer per pixel. Using NASA World Wind, you can continue to zoom past Blue Marble resolution to seamlessly experience the extremely detailed mosaic of LandSat 7 data at an impressive 15 meters per pixel. NASA World Wind also delivers other color bands such as the infrared spectrum. The NASA Scientific Visualization Studio at Goddard Space Flight Center (GSFC) has produced a set of visually intense animations that demonstrate a variety of subjects such as hurricane dynamics and seasonal changes across the globe. NASA World Wind takes these animations and plays them directly on the world. The NASA Moderate Resolution Imaging Spectroradiometer (MODIS) produces a set of time-relevant planetary imagery that is updated every day. MODIS catalogs fires, floods, dust, smoke, storms and volcanic activity. NASA World Wind produces an easily customized view of this information and marks it directly on the globe.
When one

  19. Proceedings Fifth Transformation Tool Contest

    Van Gorp, Pieter; Rose, Louis


    The aim of the Transformation Tool Contest (TTC) series is to compare the expressiveness, the usability and the performance of graph and model transformation tools along a number of selected case studies. Participants want to learn about the pros and cons of each tool considering different applications. A deeper understanding of the relative merits of different tool features will help to further improve graph and model transformation tools and to indicate open problems. TTC 2011 involved 25 offline case study solutions: 12 solutions to the Hello World case, 2 solutions to the GMF Model Migration case, 5 solutions to the Compiler Optimization case, and 7 solutions to the Reengineering (i.e., Program Understanding) case. This volume contains the submissions that have passed an additional (post-workshop) reviewing round.

  20. VISTA - computational tools for comparative genomics

    Frazer, Kelly A.; Pachter, Lior; Poliakov, Alexander; Rubin,Edward M.; Dubchak, Inna


    Comparison of DNA sequences from different species is a fundamental method for identifying functional elements in genomes. Here we describe the VISTA family of tools created to assist biologists in carrying out this task. Our first VISTA server was launched in the summer of 2000 and was designed to align long genomic sequences and visualize these alignments with associated functional annotations. Currently the VISTA site includes multiple comparative genomics tools and provides users with rich capabilities to browse pre-computed whole-genome alignments of large vertebrate genomes and other groups of organisms with VISTA Browser, submit their own sequences of interest to several VISTA servers for various types of comparative analysis, and obtain detailed comparative analysis results for a set of cardiovascular genes. We illustrate the capabilities of the VISTA site by the analysis of a 180 kilobase (kb) interval on human chromosome 5 that encodes the kinesin family member 3A (KIF3A) protein.
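    The conservation curves that VISTA-style plots display boil down to percent identity over sliding windows of an alignment. A minimal sketch of that quantity (our own illustration, not the VISTA implementation; window sizes and gap handling are simplified assumptions) might look like:

```python
def windowed_identity(a, b, window=5, step=1):
    """Percent identity of two aligned, equal-length sequences in
    sliding windows -- the quantity a VISTA-style conservation plot
    displays. Positions where either sequence has a gap ('-') are
    never counted as matches. Illustrative sketch only."""
    assert len(a) == len(b), "sequences must be aligned to equal length"
    out = []
    for i in range(0, len(a) - window + 1, step):
        wa, wb = a[i:i + window], b[i:i + window]
        matches = sum(x == y and x != '-' for x, y in zip(wa, wb))
        out.append(100.0 * matches / window)
    return out
```

For two 8-base sequences differing at one position, a 4-base window yields 100% identity until the window covers the mismatch, then 75%.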