WorldWideScience

Sample records for source tool written

  1. Improving the use of historical written sources in paleopathology.

    Science.gov (United States)

    Mitchell, Piers D

    2017-12-01

    The texts written by the people of past societies can provide key information that enhances our understanding of disease in the past. Written sources and art can describe cultural contexts that not only help us interpret lesions in excavated human remains, but also provide evidence for past disease events themselves. However, in recent decades many biohistorical articles have been published that claim to diagnose diseases present in past celebrities or well known individuals, often using less than scholarly methodology. This article aims to help researchers use historical written sources and artwork responsibly, thus improving our understanding of health and disease in the past. It explores a broad range of historical sources, from medical texts and histories to legal documents and tax records, and it highlights how the key to interpreting any past text is to understand who wrote it, when it was written, and why it was written. Case studies of plague epidemics, crucifixion, and the spinal deformity of King Richard III are then used to highlight how we might better integrate archaeological and historical evidence. When done well, integrating evidence from both archaeological and historical sources increases the probability of a complete and well-balanced understanding of disease in past societies. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Pylinguistics: an open source library for readability assessment of texts written in Portuguese

    Directory of Open Access Journals (Sweden)

    Castilhos, S.

    2016-12-01

Full Text Available Readability assessment is an important task in automatic text simplification that aims to identify text complexity by computing a set of metrics. In this paper, we present the development and assessment of an open source library called Pylinguistics for the readability assessment of texts written in Portuguese. Additionally, to illustrate the possibilities of our tool, this work also presents an empirical analysis of the readability of Brazilian scientific news dissemination.
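
    A minimal sketch of the kind of surface statistics such a readability library computes. This is illustrative only: the function name and the crude vowel-group syllable counter are our assumptions, not Pylinguistics' actual API.

```python
import re

def surface_metrics(text):
    """Simple surface statistics of the kind readability formulas build on."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()] or [text]
    words = re.findall(r"\w+", text, flags=re.UNICODE) or ["palavra"]
    # Crude syllable proxy: count vowel groups (works roughly for Portuguese).
    syllables = sum(len(re.findall(r"[aeiouáéíóúâêôãõ]+", w.lower()))
                    for w in words)
    return {"words_per_sentence": len(words) / len(sentences),
            "syllables_per_word": syllables / len(words)}

print(surface_metrics("Este é um exemplo simples. Ele tem duas frases."))
```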

  3. An open-source optimization tool for solar home systems: A case study in Namibia

    International Nuclear Information System (INIS)

    Campana, Pietro Elia; Holmberg, Aksel; Pettersson, Oscar; Klintenberg, Patrik; Hangula, Abraham; Araoz, Fabian Benavente; Zhang, Yang; Stridh, Bengt; Yan, Jinyue

    2016-01-01

Highlights: • An open-source optimization tool for solar home system (SHS) design is developed. • The optimization tool is written in MS Excel-VBA. • The optimization tool is validated against commercial and open-source software. • The optimization tool has the potential of improving future SHS installations. - Abstract: Solar home systems (SHSs) represent a viable technical solution for providing electricity to households and improving living conditions in areas not reached by the national grid or local grids. For this reason, several rural electrification programmes in developing countries, including Namibia, have been relying on SHSs to electrify rural off-grid communities. However, the limited technical know-how of service providers, often resulting in over- or under-sized SHSs, is an issue that has to be solved to avoid dissatisfaction among SHS users. The solution presented here is to develop an open-source software package that service providers can use to optimally design SHS components based on the specific electricity requirements of the end-user. The aim of this study is to develop and validate an optimization model, written in MS Excel-VBA, which calculates the optimal capacities of SHS components, guaranteeing minimum cost and maximum system reliability. The results obtained with the developed tool showed good agreement with a commercial software package and a computational code used in research activities. When the developed optimization tool was applied to existing systems, the results showed that several components were incorrectly sized. The tool thus has the potential to improve future SHS installations, contributing to increased end-user satisfaction.
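
    The core of such a sizing optimization can be sketched as a brute-force search over component capacities, minimizing cost subject to a reliability constraint. Everything below (load profile, PV yield, unit costs, the 5% unserved-energy target) is an invented illustration, not the paper's Excel-VBA model:

```python
# Invented one-day profiles: hourly load (kWh) and PV yield per kW installed.
load = [0.1] * 18 + [0.5] * 6                 # evening-peak household load
pv_yield_per_kw = [0.0] * 6 + [0.4] * 12 + [0.0] * 6

PV_COST, BATT_COST = 900, 200                 # assumed cost per kW / per kWh

def unserved_fraction(pv_kw, batt_kwh, days=30):
    """Simulate the SHS and return the fraction of load left unserved."""
    soc, unmet, total = batt_kwh, 0.0, 0.0
    for _ in range(days):
        for demand, yield_kw in zip(load, pv_yield_per_kw):
            gen = pv_kw * yield_kw
            if gen >= demand:
                soc = min(soc + gen - demand, batt_kwh)   # charge battery
            else:
                drawn = min(soc, demand - gen)            # discharge battery
                soc -= drawn
                unmet += demand - gen - drawn
            total += demand
    return unmet / total

# Cheapest PV/battery combination meeting a 5% maximum unserved-energy target.
feasible = ((pv, b)
            for pv in [x / 10 for x in range(1, 21)]
            for b in [x / 10 for x in range(1, 41)]
            if unserved_fraction(pv, b) <= 0.05)
best = min(feasible, key=lambda s: PV_COST * s[0] + BATT_COST * s[1],
           default=None)
print("cheapest sizing (kW PV, kWh battery):", best)
```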

  4. Mushu, a free- and open source BCI signal acquisition, written in Python.

    Science.gov (United States)

    Venthur, Bastian; Blankertz, Benjamin

    2012-01-01

The following paper describes Mushu, a signal acquisition software for the retrieval and online streaming of electroencephalography (EEG) data. It is written for, but not limited to, the needs of Brain Computer Interfacing (BCI). Its main goal is to provide a unified interface to EEG data regardless of the amplifier used. It runs under all major operating systems, such as Windows, Mac OS and Linux, is written in Python, and is free and open source software licensed under the terms of the GNU General Public License.
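
    The "unified interface regardless of amplifier" idea can be illustrated with an abstract driver class; the class and method names here are hypothetical, not Mushu's actual API:

```python
from abc import ABC, abstractmethod

class Amplifier(ABC):
    """Amplifier-agnostic acquisition interface (illustrative names only)."""

    @abstractmethod
    def configure(self, fs, channels): ...

    @abstractmethod
    def start(self): ...

    @abstractmethod
    def get_data(self):
        """Return (samples, markers) acquired since the last call."""

class ReplayAmplifier(Amplifier):
    """Dummy driver replaying pre-recorded EEG, useful for offline tests."""

    def __init__(self, recording):
        self.recording, self.pos = recording, 0

    def configure(self, fs, channels):
        self.fs, self.channels = fs, channels

    def start(self):
        self.pos = 0

    def get_data(self):
        chunk = self.recording[self.pos:self.pos + 10]
        self.pos += 10
        return chunk, []

# A BCI application only ever sees the Amplifier interface:
amp = ReplayAmplifier([[0.1, 0.2]] * 100)
amp.configure(fs=100, channels=["C3", "C4"])
amp.start()
samples, markers = amp.get_data()
```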

  5. The students’ use of written and internet sources and electronic media for assessment in slovene

    Directory of Open Access Journals (Sweden)

    Petra Hromin

    2015-06-01

Full Text Available The article presents how frequently secondary school students use written and online sources as well as electronic media when preparing for in-class examinations in Slovene language and literature. Within this scope we controlled for students' age and for the type of secondary school programme. In the first part of the article the concepts of information and communication technology/multimedia, e-learning, and student activity are defined. In the second part we present the results of the research, which show the frequency of use of written and web sources as well as electronic media. The results show that for oral examinations in grammar and literature the use of the notebook prevails, while for written examinations the use of the course book predominates. The frequency of use of World Wide Web sources and electronic media increases with age and with the difficulty level of the education programme. Thus the use of the notebook is most prevalent in vocational schools, whereas the use of the course book predominates in technical gimnazija and general gimnazija programmes.

  6. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...
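
    A sketch of the server-side layer described above, using SQLAlchemy to reflect an existing table and hand its rows to the browser client as JSON. The connection string and table name are placeholders; this is not jSPyDB's actual code:

```python
import json
from sqlalchemy import create_engine, MetaData, Table, select

# Placeholders: any SQLAlchemy-supported database URL and an existing table.
engine = create_engine("sqlite:///example.db")
results = Table("results", MetaData(), autoload_with=engine)  # reflection

def fetch_table_as_json(limit=100):
    """Read rows over a short-lived connection and serialize them for the
    browser client, mirroring the three-layer split described above."""
    with engine.connect() as conn:            # avoids long-lived sessions
        rows = conn.execute(select(results).limit(limit))
        return json.dumps([dict(row._mapping) for row in rows], default=str)
```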

  7. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers, contains readers and writers for the mzML data format. The software has been released under the Apache v2 license specifically to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and XCode on OSX) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.
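
    ProteoWizard itself is C++, but the pluggable, unified file-access idea it describes can be sketched in a few lines of Python; the registry, function names, and placeholder parsers below are ours, not the library's API:

```python
# Illustrative reader-registry pattern behind unified data file access.
READERS = {}

def register_reader(extension):
    """Plugin hook: associate a file extension with a parser function."""
    def deco(fn):
        READERS[extension] = fn
        return fn
    return deco

@register_reader(".mzml")
def read_mzml(path):
    return f"spectra parsed from mzML file {path}"   # placeholder parser

@register_reader(".mgf")
def read_mgf(path):
    return f"spectra parsed from MGF file {path}"    # placeholder parser

def open_dataset(path):
    """Dispatch to whichever reader is registered for the file extension."""
    ext = path[path.rfind("."):].lower()
    if ext not in READERS:
        raise ValueError(f"no reader registered for {ext}")
    return READERS[ext](path)

print(open_dataset("run01.mzML"))
```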

  8. Intrusion Detection using Open Source Tools

    OpenAIRE

    Jack TIMOFTE

    2008-01-01

    We have witnessed in the recent years that open source tools have gained popularity among all types of users, from individuals or small businesses to large organizations and enterprises. In this paper we will present three open source IDS tools: OSSEC, Prelude and SNORT.

  9. Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)

    Science.gov (United States)

    Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.

    2017-07-27

A sound understanding of the sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time-intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the steps necessary to quantify the relative contributions of sediment sources in a given watershed. The model is written in the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface, but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a "Bracket Test," identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting approach.
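
    The unmixing step at the heart of sediment fingerprinting can be sketched as a constrained least-squares problem: find non-negative source proportions, summing to one, that best reproduce the target sample's tracer signature. The tracer values below are invented, and Sed_SAT's outlier screening, size/organic corrections, bracket test, and Monte Carlo error analysis are omitted:

```python
import numpy as np
from scipy.optimize import minimize

# Invented tracer means: three sources x three tracers, plus one target sample.
sources = np.array([[12.0, 3.1, 40.0],   # source A
                    [ 8.0, 5.5, 55.0],   # source B
                    [15.0, 2.0, 30.0]])  # source C
target = np.array([11.0, 3.8, 43.0])

def misfit(p):
    """Squared error between the mixed signature and the target sample."""
    return float(np.sum((sources.T @ p - target) ** 2))

res = minimize(misfit, x0=np.full(3, 1 / 3), method="SLSQP",
               bounds=[(0.0, 1.0)] * 3,
               constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1.0}])
print("estimated source proportions:", res.x.round(3))
```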

  10. Overview of the tool-flow for the Montium Processing Tile

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Rosien, M.A.J.; Guo, Y.; Heysters, P.M.

This paper presents an overview of a tool chain to support a transformational design methodology. The tool can be used to compile code written in a high-level source language, like C, to a coarse-grained reconfigurable architecture. The source code is first translated into a Control Data Flow Graph (CDFG).

  11. [Written personalized action plan for atopic dermatitis: a patient education tool].

    Science.gov (United States)

    Gabeff, R; Assathiany, R; Barbarot, S; Salinier, C; Stalder, J-F

    2014-07-01

    Atopic dermatitis (AD) is the most frequent children's chronic skin disease. Management of AD can be difficult because local treatments must be adapted to the skin's condition. Between consultations, sudden changes in the state of the disease can make it difficult to manage local treatment. Parents and children need information that will help them adapt their treatment to the course of their disease. Aiming to enable parents to better treat their atopic child by themselves, we have developed a personalized action plan in order to simplify, personalize, and adapt the medical prescription to the state of the disease. The Personalized Written Action Plan for Atopics (PA2P) is based on the model used in the treatment of asthma, with integrated specificities for AD in children. The aim of this study was to assess the feasibility and pertinence of the PA2P for pediatricians to use in private practice. A total of 479 pediatricians answered a questionnaire sent by e-mail. The vast majority of the respondents gave positive reviews of the tool: 99% of the pediatricians declared the tool to be pertinent, qualifying it as clear and logical. The PA2P appeared to be appropriate for the atopic patient because it improves the families' involvement in the application of local treatment by offering personalized care and by simplifying the doctor's prescription. Finally, 72% of doctors responding to the questionnaire were willing to take part in future studies involving parents. More than a gadget, the PA2P could become a useful tool for therapeutic patient education. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  12. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...

  13. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports the export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced, since we do not give users the possibility to execute any SQL statement directly.

  14. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only to a specific kind of database, and they are platform-dependent and very CPU- and memory-consuming. jSPyDB is a free web-based tool written in Python and Javascript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Such a tool, exploiting fast access libraries such as SQLAlchemy, is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a back-end server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports the export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced, since we do not give users the possibility to execute any SQL statement directly.

  15. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity

    Science.gov (United States)

    Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.

    2017-12-01

Water availability and water demand are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for six major end-use sectors (irrigation, livestock, domestic, electricity generation, mining, and manufacturing) from the region scale to the scale of 0.5 degrees. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional, or basin-scale water scarcity or water stress. Both tools are extensible to ensure flexibility and promote contributions from researchers who utilize GCAM and study global water use and supply.
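
    The temporal-downscaling step can be sketched in a few lines: split each grid cell's annual demand across months with a weight profile. The demand values and weights below are invented; GCAM-STWD derives its profiles from sector-specific drivers rather than fixing them:

```python
import numpy as np

# Invented annual demand on a tiny 2 x 2 grid of 0.5-degree cells.
annual_demand = np.array([[120.0, 60.0],
                          [ 90.0, 30.0]])

# Assumed monthly weight profile (sums to 1).
weights = np.array([6, 6, 7, 8, 10, 12, 13, 12, 9, 7, 5, 5]) / 100.0

monthly = annual_demand[..., np.newaxis] * weights   # shape (2, 2, 12)
assert np.allclose(monthly.sum(axis=-1), annual_demand)  # mass is conserved
```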

  17. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)]

    2017-02-10

The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  18. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very...

  19. A study of potential sources of linguistic ambiguity in written work instructions.

    Energy Technology Data Exchange (ETDEWEB)

    Matzen, Laura E.

    2009-11-01

    This report describes the results of a small experimental study that investigated potential sources of ambiguity in written work instructions (WIs). The English language can be highly ambiguous because words with different meanings can share the same spelling. Previous studies in the nuclear weapons complex have shown that ambiguous WIs can lead to human error, which is a major cause for concern. To study possible sources of ambiguity in WIs, we determined which of the recommended action verbs in the DOE and BWXT writer's manuals have numerous meanings to their intended audience, making them potentially ambiguous. We used cognitive psychology techniques to conduct a survey in which technicians who use WIs in their jobs indicated the first meaning that came to mind for each of the words. Although the findings of this study are limited by the small number of respondents, we identified words that had many different meanings even within this limited sample. WI writers should pay particular attention to these words and to their most frequent meanings so that they can avoid ambiguity in their writing.
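
    One way to operationalize the ambiguity screen (our illustration, not the report's actual method) is to count WordNet verb senses for each recommended action verb. This requires the NLTK WordNet data (nltk.download('wordnet')), and the verb list is hypothetical:

```python
from nltk.corpus import wordnet as wn

action_verbs = ["check", "secure", "set", "clear", "install"]  # hypothetical
# Rank verbs by how many distinct verb senses WordNet records for them.
for verb in sorted(action_verbs,
                   key=lambda v: -len(wn.synsets(v, pos=wn.VERB))):
    senses = wn.synsets(verb, pos=wn.VERB)
    print(f"{verb}: {len(senses)} verb senses, "
          f"e.g. '{senses[0].definition()}'")
```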

  20. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    Science.gov (United States)

    Saito, Jim

    1987-01-01

The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included to augment the information in NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  1. Integrating Technology Tools for Students Struggling with Written Language

    Science.gov (United States)

    Fedora, Pledger

    2015-01-01

    This exploratory study was designed to assess the experience of preservice teachers when integrating written language technology and their likelihood of applying that technology in their future classrooms. Results suggest that after experiencing technology integration, preservice teachers are more likely to use it in their future teaching.

  2. Plasma sources for EUV lithography exposure tools

    International Nuclear Information System (INIS)

    Banine, Vadim; Moors, Roel

    2004-01-01

The source is an integral part of an extreme ultraviolet lithography (EUVL) tool. Such a source, as well as the EUVL tool, has to fulfil extremely high demands, both technical and cost-oriented. The EUVL tool operates at a wavelength in the range 13-14 nm, which requires a major re-thinking of state-of-the-art lithography systems operating in the DUV range. The light production mechanism changes from conventional lamps and lasers to relatively high-temperature emitting plasmas. The light transport, mainly refractive for DUV, becomes reflective for EUV. The source specifications are derived from the customer requirements for the complete tool, which are: throughput, cost of ownership (CoO) and imaging quality. The EUVL system is considered a follow-up of existing DUV-based lithography technology and, while improving the feature resolution, it has to maintain high wafer throughput performance, which is driven by the overall CoO picture. This in turn puts quite high requirements on the collectable in-band power produced by an EUV source. Tighter critical dimension (CD) control requirements resulting from the improved feature resolution, together with the restrictions of reflective optics, necessitate pulse-to-pulse repeatability, spatial stability control and repetition rates substantially better than those of current optical systems. Altogether, the following aspects of the source specification will be addressed: the operating wavelength, the EUV power, the hot-spot size, the collectable angle, the repetition rate, the pulse-to-pulse repeatability and the debris-limited lifetime of components.

  3. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools, or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses for, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  4. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  5. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools as early as 2004. Operation of these tools in a research environment gave early learning for the development of EUV sources for Alpha and Beta tools. Further experiments with these sources were performed for a basic understanding of EUV source technology and its limits, especially the achievable power and reliability. The intermediate focus power of Alpha-Tool sources under development is measured at values above 10 W. Debris mitigation schemes were successfully integrated into the sources, leading to reasonable collector mirror lifetimes, with a target of 10 billion pulses, due to the effective debris flux reduction. Source collector mirrors which withstand the radiation and temperature load of Xenon-fueled sources have been developed in cooperation with MediaLario Technologies to support an intermediate focus power well above 10 W. To fulfill the requirements of high-volume chip manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser-excited droplet targets, and is called the laser-assisted droplet RDE source. The fuel of these sources is Tin. The conversion efficiency achieved with the laser-assisted droplet RDE source is 2-3x higher than with Xenon. Very high pulse energies, well above 200 mJ / 2π sr, have been measured with the first prototypes of the laser-assisted droplet RDE source. If these high pulse energies can be maintained at higher repetition rates, a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data, the new concept is expected to be scalable to an intermediate focus power at the 300 W level.
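
    The 2000 W figure is simple arithmetic on the quoted numbers (our check, not an additional result): average power is pulse energy times repetition rate,

$$P_{\mathrm{avg}} = E_{\mathrm{pulse}} \, f_{\mathrm{rep}} = 0.2\ \mathrm{J} \times 10^{4}\ \mathrm{Hz} = 2000\ \mathrm{W} \quad (\text{per } 2\pi\ \mathrm{sr}).$$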

  6. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we co...

  7. The Exercise: An Exercise Generator Tool for the SOURCe Project

    Science.gov (United States)

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  8. Deserts and holy mountains of medieval Serbia: Written sources, spatial patterns, architectural designs

    Directory of Open Access Journals (Sweden)

    Popović Danica

    2007-01-01

Full Text Available Essential concepts in Christian thought and practice, the desert and the holy mountain denote a particular kind of monastic and sacral space. They are secluded from the world, intended for asceticism, and ambivalent in nature: they are inhospitable and menacing zones populated with demons, but also a monastic paradise, places for spiritual conversion and encounter with the divine. From earliest times, deserts and holy mountains had a few distinguishing characteristics. All forms of monastic life, from communal to solitary, were practiced there side by side. Monks of a special make-up and distinction, known as holy men, who were often founders of illustrious communities and future saints and miracle-workers, acted there. Furthermore, these locales were important spiritual and book-making centres and, therefore, strongholds of Orthodoxy. When trying to research the Serbian material on this topic, we face a specific situation: few surviving sources on the one hand, and devastated monuments on the other. The ultimate consequence is that the entire subject has been neglected. The study of the Serbian deserts and holy mountains therefore requires a complex interdisciplinary approach with systematic fieldwork as its essential part. It should address the following issues: corroboration, on the basis of written sources, of the reception of the concept of the monastic desert and holy mountain in a particular, regional, context; the distinct means and mechanisms employed in their physical realization; interpretation of their function; and the recognition of patterns preserved in the surviving physical structures. Even the results obtained so far appear relevant enough to be included in the sacral topography of the Christian world. The author of this study gives particular attention to the detailed analysis of written sources of various genres - diplomatic sources, hagiographic material, liturgical texts, observation notes - in order to establish the

  9. Reasons for the fall: Written sources and Material evidence for the collapse of Great Moravia

    Directory of Open Access Journals (Sweden)

    Maddalena Betti

    2016-09-01

    Full Text Available This paper re-examines the causes of the fall of Great Moravia, traditionally associated with the expansion of the Magyars into the Danube basin between the end of the ninth and the beginning of the tenth century. It first analyses the written sources, and in particular the Annals of Fulda, which it is argued describe the gradual marginalisation of the polity’s political influence and agency in the region. Second, on the basis of archaeological evidence, the paper attempts to demonstrate that Moravia’s political crisis was closely tied to its fragile socio-economic foundations.

  10. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    Science.gov (United States)

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

To validate the utility and effectiveness of a standardized tool for the prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The Proc Mixed procedure with a random-effect statement and SAS macros were used to compute multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources and 72.5% for a subset of five sources, indicating moderate to substantial agreement and validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
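
    Agreement statistics of this kind can be recomputed with standard tools; the sketch below uses statsmodels' Fleiss-kappa implementation on an invented (subjects x raters) rating matrix, not the study's data:

```python
import numpy as np
from statsmodels.stats.inter_rater import aggregate_raters, fleiss_kappa

# Invented example: 4 information sources (rows) rated by 5 raters (columns)
# on a 3-point category scale.
ratings = np.array([[1, 1, 2, 1, 1],
                    [2, 2, 2, 3, 2],
                    [3, 3, 3, 3, 3],
                    [1, 2, 1, 1, 2]])

table, _ = aggregate_raters(ratings)   # counts of each category per subject
print("Fleiss' kappa:", round(fleiss_kappa(table), 3))
```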

  11. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider a number of Extract-Transform-Load (ETL) tools, database management systems (DBMSs), On-Line Analytical Processing (OLAP) servers, and OLAP clients. We find that, unlike the situation a few years ago, there now...

  12. Text mining and visualization case studies using open-source tools

    CERN Document Server

    Chisholm, Andrew

    2016-01-01

Text Mining and Visualization: Case Studies Using Open-Source Tools provides an introduction to text mining using some of the most popular and powerful open-source tools: KNIME, RapidMiner, Weka, R, and Python. The contributors, all highly experienced with text mining and open-source software, explain how text data are gathered and processed from a wide variety of sources, including books, server access logs, websites, social media sites, and message boards. Each chapter presents a case study that you can follow as part of a step-by-step, reproducible example. You can also easily apply and extend the techniques to other problems. All the examples are available on a supplementary website. The book shows you how to exploit your text data, offering successful application examples and blueprints for you to tackle your text mining tasks and benefit from open and freely available tools. It gets you up to date on the latest and most powerful tools, the data mining process, and specific text mining activities.

  13. Open Source and Proprietary Project Management Tools for SMEs.

    Directory of Open Access Journals (Sweden)

    Veronika Abramova

    2017-05-01

Full Text Available The dimensional growth and increasing difficulty of projects have promoted the development of different tools that serve to facilitate project management and track project schedule, resources, and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management tools are available to assist project teams during the entire project lifecycle. We present the main differences between open source and proprietary project management tools and why those differences could be important for SMEs, describing the key features and how they can assist the project manager and the development team. In this paper, we analyse four open-source project management tools: OpenProject, ProjectLibre, Redmine, and LibrePlan, and four proprietary tools: Bitrix24, JIRA, Microsoft Project, and Asana.

  14. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    Science.gov (United States)

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  15. Methods and apparatus for safely handling radioactive sources in measuring-while-drilling tools

    International Nuclear Information System (INIS)

    Wraight, P.D.

    1989-01-01

This patent describes a method for removing a chemical radioactive source from a MWD tool which is coupled in a drill string supported by a drilling rig while a borehole is drilled, and which includes logging means for measuring formation characteristics in response to irradiation of the adjacent formations by the radioactive source during the drilling operation. The steps of the method are: halting the drilling operation and then removing the drill string from the borehole, and moving the MWD tool to a work station at the surface where the source is at a safe working distance from the drilling rig and will be accessible by way of one end of the MWD tool; positioning a radiation shield at a location adjacent to the one end of the MWD tool, where the shield is ready to receive the source as it is moved away from the other end of the MWD tool, and then moving the source away from the other end of the MWD tool so as to enclose the source within the shield; and, once the source is enclosed within the shield, removing the shield together with the enclosed source from the MWD tool for transferring the enclosed source to another work station

  16. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan, Emilio Molina, Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows to define multiple ontologies to adapt to multiple tasks and offers the possibility to cross-annotate audio data. Moreover, it is easy to install and deploy...

  17. Building Eclectic Personal Learning Landscapes with Open Source Tools

    NARCIS (Netherlands)

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005,

  18. Building Eclectic Personal Learning Landscapes with Open Source Tools

    OpenAIRE

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005, Heerlen, The Netherlands.

  19. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    Science.gov (United States)

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

The ability to trace fecal indicators and food-borne pathogens to their point of origin has major ramifications for the food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which currently is largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community with tools for tracking both fecal bacteria and food-borne pathogen contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including use for regulatory compliance, pollution remediation, and risk assessment, and they will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Future directions of MST research are also discussed.

  20. Introducing Product Lines through Open Source Tools

    OpenAIRE

    Haugen, Øystein

    2008-01-01

We present an approach to introducing product lines to companies that lowers their initial risk by applying open source tools and a smooth learning curve to the use and creation of domain-specific modeling combined with standardized variability modeling.

  1. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  2. Megaliths as land-marks. Chronicle of the territorial role of the megalithic monuments through written sources

    Directory of Open Access Journals (Sweden)

    Martinón-Torres, Marcos

    2001-06-01

Full Text Available Megalithic monuments have played different roles throughout history. One of them is a spatial function, i.e. serving as landmarks. The aim of this paper has been to collect and analyse every written reference concerning Galician megaliths operating as landmarks between the 6th and 19th centuries AD. On this basis, the evolution of this social-territorial function of the monuments through time is reconstructed, and an interpretative hypothesis for this phenomenon is proposed. Finally, the importance of reviewing written sources as a methodology for archaeological survey and for studies of the topographic settings of monuments is emphasised.

Throughout history, megalithic monuments have fulfilled, among other roles, a spatial function as territorial markers. For this article, the written references to Galician megaliths functioning as spatial markers or identifiers between the 6th and 19th centuries AD are collected and analysed. From this record of sources, the evolution of this social-territorial role of the monuments across different periods is reconstructed. An interpretative model for this phenomenon is proposed, and the review of written sources is appraised as a methodology for archaeological survey and for studies of the siting of megaliths.

  3. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

This paper presents a system-level design methodology and modeling framework, called ForSyDe. ForSyDe is available under the open source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system-level modeling of a simple industrial use case, and we...

  4. Sports metaphors in Polish written commentaries on politics

    Directory of Open Access Journals (Sweden)

    Jarosław Wiliński

    2015-12-01

Full Text Available This paper seeks to investigate what sports metaphors are used in Polish written commentaries on politics and what special purpose they serve. In particular, the paper examines structural metaphors that come from the lexicon of popular sports, such as boxing, racing, track and field athletics, sailing, etc. The language data, derived from Polish Internet websites, have been grouped and discussed according to source domains. Applying George Lakoff and Mark Johnson's approach to metaphor, the paper attempts to determine both the kinds of source domains from which common metaphors are drawn and the degree to which structural metaphors are used. The data suggest that many structural metaphors can be found in the language of politics. They are drawn from a wide variety of sports source domains, although the domains of boxing, racing, sailing, and soccer are particularly prominent. The primary function of structural metaphors in written commentaries seems to be to facilitate the interpretation of facts in a way that is highly appealing to the reader.

  5. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as a gold standard was obtained (R(2)>0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published under an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
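
    The agreement analysis mentioned above can be reproduced generically; this sketch computes the Bland-Altman bias and 95% limits of agreement for two sets of paired perfusion values (the numbers are invented examples, not the study's data):

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bias and 95% limits of agreement (bias +/- 1.96 SD of differences)."""
    diff = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Invented paired perfusion values: new tool vs. reference software.
tool = [1.02, 0.98, 1.10, 0.95, 1.05]
reference = [1.00, 1.00, 1.08, 0.97, 1.02]
print(bland_altman_limits(tool, reference))
```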

  6. E-Sourcing platforms as reasonable marketing tools for suppliers

    OpenAIRE

    Göbl, Martin; Greiter, Thomas

    2014-01-01

Research questions: E-sourcing platforms often offer purchasing organisations easy access to a high number of relevant suppliers, their goods and services, and the according prices. For the suppliers, e-sourcing platforms are a good and easy possibility to present their products and services to the relevant buyers and to get in contact with potential customers. The subject of this research is the question of whether e-sourcing platforms are also a reasonable marketing tool for suppliers in or...

  7. Students' Engagement with a Collaborative Wiki Tool Predicts Enhanced Written Exam Performance

    Science.gov (United States)

    Stafford, Tom; Elgueta, Herman; Cameron, Harriet

    2014-01-01

    We introduced voluntary wiki-based exercises to a long-running cognitive psychology course, part of the core curriculum for an undergraduate degree in psychology. Over 2 yearly cohorts, students who used the wiki more also scored higher on the final written exam. Using regression analysis, it is possible to account for students' tendency to score…

  8. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, and especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The application of standards-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standards-based medical data.

  9. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  10. Readability of Written Materials for CKD Patients: A Systematic Review.

    Science.gov (United States)

    Morony, Suzanne; Flynn, Michaela; McCaffery, Kirsten J; Jansen, Jesse; Webster, Angela C

    2015-06-01

    The "average" patient has a literacy level of US grade 8 (age 13-14 years), but this may be lower for people with chronic kidney disease (CKD). Current guidelines suggest that patient education materials should be pitched at a literacy level of around 5th grade (age 10-11 years). This study aims to evaluate the readability of written materials targeted at patients with CKD. Systematic review. Patient information materials aimed at adults with CKD and written in English. Patient education materials designed to be printed and read, sourced from practices in Australia and online at all known websites run by relevant international CKD organizations during March 2014. Quantitative analysis of readability using Lexile Analyzer and Flesch-Kincaid tools. We analyzed 80 materials. Both Lexile Analyzer and Flesch-Kincaid analyses suggested that most materials required a minimum of grade 9 (age 14-15 years) schooling to read them. Only 5% of materials were pitched at the recommended level (grade 5). Readability formulas have inherent limitations and do not account for visual information. We did not consider other media through which patients with CKD may access information. Although the study covered materials from the United States, United Kingdom, and Australia, all non-Internet materials were sourced locally, and it is possible that some international paper-based materials were missed. Generalizability may be limited due to exclusion of non-English materials. These findings suggest that patient information materials aimed at patients with CKD are pitched above the average patient's literacy level. This issue is compounded by cognitive decline in patients with CKD, who may have lower literacy than the average patient. It suggests that information providers need to consider their audience more carefully when preparing patient information materials, including user testing with a low-literacy patient population. Copyright © 2015 National Kidney Foundation, Inc. Published by

  11. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term 'Open Source' says nothing about the information itself; it only refers to whether or not the information is classified.

  12. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information as well as other unclassified information that has limited public distribution or access to it. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term 'Open Source' says nothing about the information itself; it only refers to whether or not the information is classified.

  13. Speech-language therapy for adolescents with written-language difficulties: The South African context

    Directory of Open Access Journals (Sweden)

    Danel Erasmus

    2013-11-01

    Method: A survey study was conducted, using a self-administered questionnaire. Twenty-two currently practising speech-language therapists who are registered members of the South African Speech-Language-Hearing Association (SASLHA) participated in the study. Results: The respondents indicated that they are aware of their role regarding adolescents with written-language difficulties. However, they feel that South African speech-language therapists are not fulfilling this role. Existing assessment tools and interventions for written-language difficulties are described as inadequate and as culturally and age-inappropriate. Yet the majority of the respondents feel that they are adequately equipped to work with adolescents with written-language difficulties, based on their own experience, self-study and secondary training. The respondents feel that training regarding effective collaboration with teachers is necessary to establish specific roles, and to promote speech-language therapy for adolescents among teachers. Conclusion: Further research is needed into developing appropriate assessment and intervention tools, as well as into improving training at an undergraduate level.

  14. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    The tool described here is focused on vapor-phase diffusion from the current vadose zone source. The estimated soil gas contaminant concentration for a building is obtained from pre-modeled scenarios; a full site-specific numerical model is needed to assess impacts beyond the current vadose zone source.

  15. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the practice of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  16. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to more than 170 of these free and open source tools.

  17. Piloting a Structured Practice Audit to Assess ACGME Milestones in Written Handoff Communication in Internal Medicine.

    Science.gov (United States)

    Martin, Shannon K; Farnan, Jeanne M; McConville, John F; Arora, Vineet M

    2015-06-01

    Written communication skills are integral to patient care handoffs. Residency programs require feasible assessment tools that provide timely formative and summative feedback, ideally linked to the Accreditation Council for Graduate Medical Education Milestones. We describe the use of 1 such tool-UPDATED-to assess written handoff communication skills in internal medicine interns. During 2012-2013, the authors piloted a structured practice audit at 1 academic institution to audit written sign-outs completed by 45 interns, using the UPDATED tool, which scores 7 aspects of sign-out communication linked to milestones. Intern sign-outs were audited by trained faculty members throughout the year. Results were incorporated into intern performance reviews and Clinical Competency Committees. A total of 136 sign-outs were audited (averaging 3.1 audits per intern). In the first trimester, 14 interns (31%) had satisfactory audit results. Five interns (11%) had critical deficiencies and received immediate feedback, and the remaining 26 (58%) were assigned future audits due to missing audits or unsatisfactory scores. In the second trimester, 21 interns (68%) had satisfactory results, 1 had critical deficiencies, and 9 (29%) required future audits. Nine of the 10 remaining interns in the final trimester had satisfactory audits. Faculty time was estimated at 10 to 15 minutes per sign-out audited. The UPDATED audit is a milestone-based tool that can be used to assess written sign-out communication skills in internal medicine residency programs. Future work is planned to adapt the tool for use by senior supervisory residents to appraise sign-outs in real time.

  18. Justify Your Answer: The Role of Written Think Aloud in Script Concordance Testing.

    Science.gov (United States)

    Power, Alyssa; Lemay, Jean-Francois; Cooke, Suzette

    2017-01-01

    Construct: Clinical reasoning assessment is a growing area of interest in the medical education literature. Script concordance testing (SCT) evaluates clinical reasoning in conditions of uncertainty and has emerged as an innovative tool in the domain of clinical reasoning assessment. SCT quantifies the degree of concordance between a learner and an experienced clinician and attempts to capture the breadth of responses of expert clinicians, acknowledging the significant yet acceptable variation in practice under situations of uncertainty. SCT has been shown to be a valid and reliable clinical reasoning assessment tool. However, as SCT provides only quantitative information, it may not provide a complete assessment of clinical reasoning. Think aloud (TA) is a qualitative research tool used in clinical reasoning assessment in which learners verbalize their thought process around an assigned task. This study explores the use of TA, in the form of written reflection, in SCT to assess resident clinical reasoning, hypothesizing that the information obtained from the written TA would enrich the quantitative data obtained through SCT. Ninety-one pediatric postgraduate trainees and 21 pediatricians from 4 Canadian training centers completed an online test consisting of 24 SCT cases immediately followed by retrospective written TA. Six of 24 cases were selected to gather TA data. These cases were chosen to allow all phases of clinical decision making (diagnosis, investigation, and treatment) to be represented in the TA data. Inductive thematic analysis was employed when systematically reviewing TA responses. Three main benefits of adding written TA to SCT were identified: (a) uncovering instances of incorrect clinical reasoning despite a correct SCT response, (b) revealing sound clinical reasoning in the context of a suboptimal SCT response, and (c) detecting question misinterpretation. Written TA can optimize SCT by demonstrating when correct examinee responses are based on flawed reasoning.

  19. EUV sources for the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Damen, Marcel; Derra, Günther; Franken, Oliver; Janssen, Maurice; Jonkers, Jeroen; Klein, Jürgen; Kraus, Helmar; Krücken, Thomas; List, Andreas; Loeken, Micheal; Mader, Arnaud; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prümmer, Ralph; Rosier, Oliver; Schwabe, Stefan; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2006-03-01

    In this paper, we report on the recent progress of the Philips Extreme UV source. The Philips source concept is based on a discharge plasma ignited in a Sn vapor plume that is ablated by a laser pulse. Using rotating electrodes covered with a regenerating tin surface, the problems of electrode erosion and power scaling are fundamentally solved. Most of the work of the past year has been dedicated to developing a lamp system which operates very reliably and stably under full scanner remote control. Topics addressed were the development of the scanner interface, a dose control system, thermo-mechanical design, positional stability of the source, tin handling, and many more. The resulting EUV source, the Philips NovaTin(R) source, can operate at more than 10 kW electrical input power and delivers 200 W in-band EUV into 2π continuously. The source is very small, so nearly 100% of the EUV radiation can be collected within etendue limits. The lamp system is fully automated and can operate unattended under full scanner remote control. 500 million shots of continuous operation without interruption have been realized; electrode lifetime is at least 2 billion shots. Three sources are currently being prepared, two of which will be integrated into the first EUV Alpha Demonstration tools of ASML. The debris problem was reduced to a level which is well acceptable for scanner operation. First, a considerable reduction of the Sn emission of the source was realized. The debris mitigation system is based on a two-step concept using a foil-trap-based stage and a chemical cleaning stage. Both steps were improved considerably. A collector lifetime of 1 billion shots is achieved; after this operating time a cleaning would be applied. The cleaning step has been verified to work with tolerable Sn residues. From the experimental results, a total collector lifetime of more than 10 billion shots can be expected.

  20. Serbian Written Sources on the Tatars and the Golden Horde (first half of the 14th century)

    Directory of Open Access Journals (Sweden)

    Aleksandar Uzelac

    2014-01-01

    Full Text Available Serbian narrative and documentary texts, written in the first half of the XIV century, represent valuable source material for the research of Tatar political and military influence in the Balkan lands. Most important among them are the Vita of King Stephen Uroš II Milutin (1282–1321), extant in three different editions, and the Vita of Archbishop Daniel II (1324–1337). The first one offers insight into the relations between the Kingdom of Serbia and the powerful Juchid prince Nogai, while in the latter, the key role of Tatar contingents in the internal power struggle between King Milutin and his brother Stephen Dragutin is mentioned. The presence of Tatars in the Battle of Velbazhd (1330), fought between Serbia and the Bulgarian Empire, is also attested in various sources, including the so-called Old Serbian chronicles and the Code of Law of Emperor Stephen Dušan (1349). Another group of sources analyzed in the text are several apocryphal writings of South Slavic literature. Their value lies in the fact that they reflect the image of the Tatars in the eyes of the Balkan Slavs. Last, but not least important, testimony of Tatar activities in Serbian lands is preserved in place-names of Tatar origin, recorded in royal charters issued by Milutin's son Stephen (1321–1331) and grandson Stephen Dušan (1331–1355).

  1. Managing research and surveillance projects in real-time with a novel open-source eManagement tool designed for under-resourced countries.

    Science.gov (United States)

    Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas

    2016-09-01

    A software tool is developed to facilitate data entry and to monitor research projects in under-resourced countries in real-time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open source software Open Data Kit (ODK). The users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania successfully using the eManagement tool: 1) clinical trial; 2) longitudinal Tuberculosis (TB) Cohort Study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitalized X-rays, and send text message reminders to patients; 3) intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients, and sent automated messages to remind pharmacy clients to visit a TB Clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples, and collected clinical and laboratory data. The user friendly, open source odk_planner is a simple, but multi-functional, Web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
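
    For illustration, the missing-form check at the heart of such monitoring can be sketched in a few lines of Python. The data structures below are hypothetical stand-ins; the real odk_planner reads submissions from Open Data Kit and is configured rather than hard-coded:

    ```python
    from datetime import date

    # Hypothetical visit schedule and submitted forms; odk_planner's real data
    # model (PHP/Python against ODK submissions) is more elaborate.
    schedule = {"patient-001": ["baseline", "month1", "month2"],
                "patient-002": ["baseline", "month1"]}
    submitted = {("patient-001", "baseline"), ("patient-002", "baseline"),
                 ("patient-002", "month1")}

    def missing_forms(schedule, submitted):
        """Return (patient, visit) pairs with no case report form submitted."""
        return [(pid, visit) for pid, visits in schedule.items()
                for visit in visits if (pid, visit) not in submitted]

    for pid, visit in missing_forms(schedule, submitted):
        print(f"Reminder: {pid} is missing the '{visit}' form as of {date.today()}")
    ```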

  2. Decision support tool for diagnosing the source of variation

    Science.gov (United States)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner, and lack of knowledge in process engineering can lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is embedded in a CCP recognition scheme. The design methodology involves analyzing the relationships between geometrical features, manufacturing processes and CCPs. The DST contains information about CCPs and their possible root-cause errors, together with descriptions of SOV phenomena such as process deterioration through tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful for industrial practitioners in effective troubleshooting.
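
    For context, the Shewhart control limits underlying such chart patterns are straightforward to compute. A minimal sketch of an individuals chart (mean ± 3σ), not taken from the authors' DST:

    ```python
    import statistics

    def shewhart_limits(samples, sigma_mult=3.0):
        """Center line and control limits for an individuals chart (mean +/- 3 sigma)."""
        mean = statistics.mean(samples)
        sd = statistics.stdev(samples)
        return mean - sigma_mult * sd, mean, mean + sigma_mult * sd

    # Toy shaft-diameter measurements; limits are set from the first six points
    # and the seventh is then checked against them.
    measurements = [10.02, 9.98, 10.05, 10.01, 9.97, 10.04, 10.31]
    lcl, cl, ucl = shewhart_limits(measurements[:-1])
    for x in measurements:
        if not lcl <= x <= ucl:
            print(f"{x} is outside ({lcl:.3f}, {ucl:.3f}) - check tool wear or offset")
    ```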

  3. STRATEGIES OF EXPRESSING WRITTEN APOLOGIES IN THE ONLINE NEWSPAPERS

    Directory of Open Access Journals (Sweden)

    Cipto Wardoyo

    2015-12-01

    Full Text Available Expressing apology is a universal activity, although people have different strategies or ways of expressing apology depending on culture, situation, and context. Apology plays a vital role in verbal politeness; it is certainly impolite when someone does not express an apology after committing an offence against others. Apologies in pragmatics are classified under speech act theory. An apology, according to Searle (1969), is classified as an expressive speech act because it expresses the speaker's psychological attitude: an apology expresses the speaker's sorrow and regret at having offended hearers or readers. This paper discusses the strategies editors use to express written apologies in online newspapers. The objective of this paper is to explain what the strategies of written apologies in online newspapers are. This study uses a qualitative method; the writer chose a descriptive-interpretative technique for analyzing the data. Four written apologies from online newspapers serve as data sources in this paper; the data are taken from The Jakarta Post, The Daily Express, The Sun, and Brisbane Times. The writer describes and analyzes utterances in the data sources based on the theory of Olshtain & Cohen (1986). There are five main strategies for expressing apologies according to Olshtain & Cohen (1986): Illocutionary Force Indicating Device (IFID), expression of responsibility, explanation/justification, offer of repair, and promise of forbearance. The writer found that all of the written apologies used combined strategies: they used IFID with the performative verbs 'apologize' and 'be sorry', followed by expression of responsibility, explanation, offer of repair, and promise of forbearance. Keywords: apologies, speech acts, politeness, pragmatics

  4. Airline Transport Pilot-Airplane (Air Carrier) Written Test Guide.

    Science.gov (United States)

    Federal Aviation Administration (DOT), Washington, DC. Flight Standards Service.

    Presented is information useful to applicants who are preparing for the Airline Transport Pilot-Airplane (Air Carrier) Written Test. The guide describes the basic aeronautical knowledge and associated requirements for certification, as well as information on source material, instructions for taking the official test, and questions that are…

  5. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

    Full Text Available In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multiple translations of database entries in a Ruby on Rails application. Research regarding successes of similar tools has been utilized in providing a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and provide opportunities for cost-effective, multi-language publishing.

  6. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set protocol), which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it have been largely automated. Complicated measurements are feasible with a combination of tools running independently

  7. Plasma diagnostic tools for optimizing negative hydrogen ion sources

    International Nuclear Information System (INIS)

    Fantz, U.; Falter, H.D.; Franzen, P.; Speth, E.; Hemsworth, R.; Boilson, D.; Krylov, A.

    2006-01-01

    The powerful diagnostic tool of optical emission spectroscopy is used to measure the plasma parameters in negative hydrogen ion sources based on the surface mechanism. Results for electron temperature, electron density, atomic-to-molecular hydrogen density ratio, and gas temperature are presented for two types of sources, a rf source and an arc source, which are currently under development for a neutral beam heating system of ITER. The amount of cesium in the plasma volume is obtained from cesium radiation: the Cs neutral density is five to ten orders of magnitude lower than the hydrogen density and the Cs ion density is two to three orders of magnitude lower than the electron density in front of the grid. It is shown that monitoring of cesium lines is very useful for monitoring the cesium balance in the source. From a line-ratio method negative ion densities are determined. In a well-conditioned source the negative ion density is of the same order of magnitude as the electron density and correlates with extracted current densities

  8. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1, 2 h etc.) and the calculation time required is very short. The heating and cooling loads of the building, at the aforementioned time step, are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operating characteristic curves of the system's heat pumps and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large scale ground coupled heat pump installation over a three-year period. (author)
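
    The per-time-step energy balance such a tool evaluates can be illustrated with a simplified sketch. Constant COP/EER values are assumed here purely for illustration; the actual tool uses the heat pumps' operating characteristic curves:

    ```python
    def gshp_step(heating_load_kwh, cooling_load_kwh, cop_heating=4.0, eer_cooling=5.0):
        """One time step of a simplified ground-source heat pump energy balance.

        Heating: electricity = load/COP, ground heat absorbed = load - electricity.
        Cooling: electricity = load/EER, ground heat rejected = load + electricity.
        """
        elec = heating_load_kwh / cop_heating + cooling_load_kwh / eer_cooling
        q_absorbed = heating_load_kwh * (1 - 1 / cop_heating)
        q_rejected = cooling_load_kwh * (1 + 1 / eer_cooling)
        return elec, q_absorbed, q_rejected

    elec, q_abs, q_rej = gshp_step(heating_load_kwh=50.0, cooling_load_kwh=0.0)
    print(f"electricity {elec:.1f} kWh, absorbed {q_abs:.1f} kWh, rejected {q_rej:.1f} kWh")
    ```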

  9. “Materials for the Dictionary of the Old Russian Language in the Written Records” by I.I. Sreznevskiy As the Source of Diachronic Research of the Substantive Word-Formation

    Directory of Open Access Journals (Sweden)

    Anastasiya Yuryevna Vekolova

    2015-12-01

    Full Text Available The article presents the results of diachronic research on word-formation based on the «Materials for the dictionary of the Old Russian language in the written records» by I.I. Sreznevskiy, which is characterized as a key source of lexicographical material for diachronic research. The dictionary is the only completed lexicographical source reflecting the language of the XI-XVII centuries. It includes samples of Old Slavic and Old Russian written monuments, thus documenting lexis from a variety of sources, and its entries represent data on the lexical, and in particular the word-building, system of the Old Russian language. The significance of the «Materials for the dictionary of the old Russian language in the written records» for diachronic research on substantive word-formation is demonstrated through the system of Old Russian substantive derivatives with evaluative suffixes identified in the research. Productive modification formants are revealed and their morphological characteristics are considered. Special attention is paid to the analysis of suffix frequency. On the basis of the dictionary data, the connotation of affixes is characterized and variants of suffixes are given; it is noted that these morphemes carry a positive or negative assessment, to which the compiler of the dictionary pays attention. The labeling suggested in the dictionary makes it possible to define the boundaries of suffixes, and examples of derivatives with evaluative affixes are given in context. It is emphasized that the presence of usage examples supports a systematic comprehension of the material.

  10. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests already in the public literature, as well as from NIST and other highly reputable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
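
    As an example of the shape-specific calculations such a tool performs, the steady radial heat leak through cylindrical insulation follows the standard conduction formula. A minimal sketch (not TISTool's code), assuming a constant apparent thermal conductivity:

    ```python
    import math

    def cylinder_heat_leak(k_apparent, length_m, r_inner_m, r_outer_m, t_warm_k, t_cold_k):
        """Steady radial heat leak (W) through cylindrical insulation:
        Q = 2*pi*k*L*(T_warm - T_cold) / ln(r_outer/r_inner)."""
        return (2 * math.pi * k_apparent * length_m * (t_warm_k - t_cold_k)
                / math.log(r_outer_m / r_inner_m))

    # Example: 1 m of pipe insulation at an assumed apparent conductivity of
    # 0.03 W/m-K, warm boundary at 300 K, cold boundary at 77 K.
    print(f"{cylinder_heat_leak(0.03, 1.0, 0.05, 0.10, 300.0, 77.0):.1f} W")
    ```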

  11. WRITTEN COMMUNICATION IN BUSINESS

    OpenAIRE

    Oana COSMAN

    2013-01-01

    The article examines the work of researchers primarily interested in the investigation of written communication in business settings. The author regards 'business discourse' as a field of study with distinct features in the domain of discourse analysis. Thus, the paper overviews the most important contributions to the development of written business discourse with a number of landmark studies. To gain a greater understanding of the written business discourse, the author also investigates some...

  12. VSEARCH: a versatile open source tool for metagenomics.

    Science.gov (United States)

    Rognes, Torbjørn; Flouri, Tomáš; Nichols, Ben; Quince, Christopher; Mahé, Frédéric

    2016-01-01

    VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling, while on a par with USEARCH for paired-end read merging.
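
    Since VSEARCH is a command-line tool, it is commonly driven from scripts. A minimal sketch of a global-alignment search invoked from Python, assuming the vsearch binary is on PATH and that queries.fasta and reference.fasta exist (option names follow the VSEARCH manual):

    ```python
    import subprocess

    # Global-alignment search of query reads against a reference database,
    # keeping hits with >= 97% identity.
    cmd = [
        "vsearch",
        "--usearch_global", "queries.fasta",
        "--db", "reference.fasta",
        "--id", "0.97",
        "--blast6out", "hits.b6",   # tabular BLAST-like output
        "--threads", "4",
    ]
    subprocess.run(cmd, check=True)
    ```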

  13. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context Magneto/Electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) solution of the inverse problem to localize/reconstruct the cortical sources, iii) computation of functional connectivity among signals collected at surface electrodes or/and time courses of reconstructed sources, and iv) computation of network measures based on graph theory analysis. EEGNET is the only tool that combines M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
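
    Steps iii and iv of the pipeline described above can be illustrated independently of EEGNET's MATLAB interface. A toy sketch using correlation as the connectivity measure and node strength as the graph measure:

    ```python
    import numpy as np

    # Toy data: 4 channels x 1000 samples (stand-ins for M/EEG signals or
    # reconstructed source time courses).
    rng = np.random.default_rng(0)
    signals = rng.standard_normal((4, 1000))

    # Step iii: functional connectivity as absolute pairwise correlation.
    conn = np.abs(np.corrcoef(signals))
    np.fill_diagonal(conn, 0.0)

    # Step iv: a basic graph measure - weighted degree (node strength).
    strength = conn.sum(axis=1)
    print("node strengths:", np.round(strength, 3))
    ```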

  14. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to added-value natural hazard and disaster risk information is a key issue for various stakeholders of the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims at providing users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool, and discuss limitations from a scientific as well as an operational perspective.

  15. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature.

  16. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  18. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify all of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
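
    Reading a C3D acquisition through BTK's Python bindings takes only a few lines. A minimal sketch, assuming the btk module is installed and a file named trial01.c3d exists:

    ```python
    import btk  # BTK's Python bindings

    # Read a C3D acquisition (file name is a placeholder).
    reader = btk.btkAcquisitionFileReader()
    reader.SetFilename("trial01.c3d")
    reader.Update()
    acq = reader.GetOutput()

    print("point frequency (Hz):", acq.GetPointFrequency())
    print("number of frames:", acq.GetPointFrameNumber())
    marker = acq.GetPoint(0)                            # first marker trajectory
    print(marker.GetLabel(), marker.GetValues().shape)  # (frames, 3) array
    ```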

  19. Rapid development of medical imaging tools with open-source libraries.

    Science.gov (United States)

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community and the number of open source libraries and freely available frameworks for biomedical research have grown significantly. What they offer are now considered standards in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  20. Radiotherapy: an interactive learning tool

    International Nuclear Information System (INIS)

    Frenzel, T.; Kruell, A.; Schmidt, R.; Dobrucki, W.; Malys, B.

    1998-01-01

    The program is primarily intended for radiological medical technicians, student nurses, students of medicine and physics, and doctors. It is designed as a tool for vocational and further training and gives comprehensive insight into the daily routines of a radiotherapy unit. The chapters deal with: fundamental biological aspects - fundamental physical aspects - radiation sources and irradiation systems - preparatory examinations - therapies and concepts - irradiation planning - irradiation performance - termination of irradiation treatment. Each page combines spoken texts, written on-screen keywords, illustrations, animated sequences and a large number of videos in an easily digestible way. The software also permits handling by learners less familiar with computer-based learning. (orig.)

  1. VSEARCH: a versatile open source tool for metagenomics

    Directory of Open Access Journals (Sweden)

    Torbjørn Rognes

    2016-10-01

    Full Text Available Background VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. Methods When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. Results VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling.

  2. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    Full Text Available This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, within the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the user's high-level block description to blif format (sci2blif), which acts as input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list for the configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, giving users and future researchers access to the basic analog operations/computations that are possible.

  3. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  4. Sports metaphors in Polish written commentaries on politics

    OpenAIRE

    Jarosław Wiliński

    2015-01-01

    This paper seeks to investigate what sports metaphors are used in Polish written commentaries on politics and what special purpose they serve. In particular, the paper examines structural metaphors that come from the lexicon of popular sports, such as boxing, racing, track and field athletics, sailing, etc. The language data, derived from English Internet websites, has been grouped and discussed according to source domains. Applying George Lakoff and Mark Johnson’s approach to metaphor, the p...

  5. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated to the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools and data sources.
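
    The connector idea described above can be sketched generically: a connector wraps a data source and applies field-mapping transformation rules so that records are exposed in the terms of a shared ontology. The classes and rules below are illustrative, not the authors' framework:

    ```python
    class Connector:
        """Minimal software-connector sketch: adapts a data source to a shared
        vocabulary by applying field-mapping transformation rules."""

        def __init__(self, source, rules):
            self.source = source   # object with a fetch() method
            self.rules = rules     # {target_term: (source_field, transform)}

        def fetch(self):
            for record in self.source.fetch():
                yield {term: transform(record[field])
                       for term, (field, transform) in self.rules.items()}

    class CsvLikeSource:
        def fetch(self):
            yield {"gene": "tp53", "expr": "2.31"}   # toy expression record

    rules = {"gene_symbol": ("gene", str.upper),
             "expression_level": ("expr", float)}
    for rec in Connector(CsvLikeSource(), rules).fetch():
        print(rec)
    ```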

  6. Picante: R tools for integrating phylogenies and ecology.

    Science.gov (United States)

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).

  7. Glimpses into the transition world: New graduate nurses' written reflections.

    Science.gov (United States)

    Walton, Jo Ann; Lindsay, Natalie; Hales, Caz; Rook, Helen

    2018-01-01

    This study was born out of our reflections as educators responsible for helping new graduate nurses transition into their first year of professional practice through a formal education programme. Finding ourselves wondering about many of the questions the students raised with us, we set about looking more closely at what could be gleaned from the students' experience, captured in their written work over the course of a year. To identify the challenges and learning experiences revealed in reflective assignments written by new graduate nurses undertaking a postgraduate course as part of their transition to registered nurse practice. Data consisted of the written work of two cohorts of students who had completed a postgraduate university course as part of their transition to new graduate practice in New Zealand. Fifty-four reflective essays completed by twenty-seven participating students were collected and their contents analysed thematically. Five key themes were identified. The students' reflections noted individual attributes - personal and professional strengths and weaknesses; professional behaviour - actions such as engaging help and support, advocating for patients' needs and safety and putting their own feelings aside; situational challenges such as communication difficulties, both systemic and interpersonal, and the pressure of competing demands. Students also identified rewards - results they experienced such as achieving the nursing outcomes they desired, and commented on reflection as a useful tool. The findings shed light on the experiences of new graduates, and how they fare through this critical phase of career development. Challenges relating to the emotional labour of nursing work are particularly evident. In addition, the reflective essay is shown to be a powerful tool for assisting both new graduate nurses and their lecturers to reflect on the learning opportunities inherent in current clinical practice environments. Copyright © 2017 Elsevier Ltd.

  8. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current open-source WebGIS tools are evaluated in order to give an overview and to contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, along with the reasons for its development. Two examples of mineral characterization of different hydrosilicates, such as chlorites, prehnites and kaolinites, in the Nili Fossae area on Mars are presented. As the results obtained show positive outcomes for hyperspectral analysis and visualization compared to previous literature, we suggest using the PlanetServer approach for such investigations.

  9. An Analysis of Written Feedback on a PhD Thesis

    Science.gov (United States)

    Kumar, Vijay; Stracke, Elke

    2007-01-01

    This paper offers an interim analysis of written feedback on a first draft of a PhD thesis. It first looks at two sources of data: in-text feedback and overall feedback. Looking at how language is used in its situational context, we then coded the feedback and developed a model for analysis based on three fundamental functions of speech:…

  10. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    Science.gov (United States)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform 2- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
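
    Basic two-body propagation, one of the capabilities described above, can be reproduced with a standard numerical integrator. A minimal sketch in Python with SciPy (illustrative of the technique, not the SCENIC or ODTBX code):

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    MU_EARTH = 398600.4418  # Earth's gravitational parameter, km^3/s^2

    def two_body(t, y):
        """Two-body point-mass dynamics: r'' = -mu * r / |r|^3."""
        r, v = y[:3], y[3:]
        return np.concatenate([v, -MU_EARTH * r / np.linalg.norm(r) ** 3])

    # Circular low Earth orbit, ~500 km altitude.
    r0 = np.array([6878.0, 0.0, 0.0])
    v0 = np.array([0.0, np.sqrt(MU_EARTH / 6878.0), 0.0])
    sol = solve_ivp(two_body, (0.0, 5400.0), np.concatenate([r0, v0]),
                    rtol=1e-9, atol=1e-9)
    print("position after 90 min (km):", np.round(sol.y[:3, -1], 1))
    ```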

  11. SPPTOOLS: Programming tools for the IRAF SPP language

    Science.gov (United States)

    Fitzpatrick, M.

    1992-01-01

    An IRAF package to assist in SPP code development and debugging is described. SPP is the machine-independent programming language used by virtually all IRAF tasks. Tools have been written to aid both novice and advanced SPP programmers with development and debugging by providing tasks that check the code for the number and type of arguments in all calls to IRAF VOS library procedures, list the calling sequences of IRAF tasks, create a database of identifiers for quick access, check for memory which is not freed, and format source code. Debugging is simplified since the programmer is able to get a better understanding of the structure of his/her code, and IRAF library procedure calls (probably the most common source of errors) are automatically checked for correctness.

  12. IB: A Monte Carlo simulation tool for neutron scattering instrument design under PVM and MPI

    International Nuclear Information System (INIS)

    Zhao Jinkui

    2011-01-01

    Design of modern neutron scattering instruments relies heavily on Monte Carlo simulation tools for optimization. IB is one such tool written in C++ and implemented under Parallel Virtual Machine and the Message Passing Interface. The program was initially written for the design and optimization of the EQ-SANS instrument at the Spallation Neutron Source. One of its features is the ability to group simple instrument components into more complex ones at the user input level, e.g. grouping neutron mirrors into neutron guides and curved benders. The simulation engine manages the grouped components such that neutrons entering a group are properly operated upon by all components, multiple times if needed, before exiting the group. Thus, only a few basic optical modules are needed at the programming level. For simulations that require higher computer speeds, the program can be compiled and run in parallel modes using either the PVM or the MPI architectures.
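
    The grouping feature described above can be illustrated with a toy sketch: a group keeps applying its member components to a neutron, multiple times if needed, until the neutron leaves all of them. This is an illustration of the concept only, not IB's C++ implementation:

    ```python
    class Mirror:
        """Toy optical element: advances the neutron along x and reflects its
        vertical velocity at the channel walls (y in [0, 1])."""
        def __init__(self, x_start, x_end):
            self.x_start, self.x_end = x_start, x_end
        def contains(self, n):
            return self.x_start <= n["x"] < self.x_end
        def act(self, n):
            n["x"] += 0.5
            n["y"] += 0.5 * n["vy"]
            if not 0.0 <= n["y"] <= 1.0:
                n["vy"] = -n["vy"]                       # reflect at the wall
                n["y"] = min(max(n["y"], 0.0), 1.0)

    class Group:
        """Neutrons entering a group are operated on by its components,
        repeatedly, until they exit the group (each act advances x, so the
        loop terminates)."""
        def __init__(self, components):
            self.components = components
        def act(self, neutron):
            while any(c.contains(neutron) for c in self.components):
                for c in self.components:
                    if c.contains(neutron):
                        c.act(neutron)

    n = {"x": 0.0, "y": 0.5, "vy": 0.4}
    Group([Mirror(0.0, 5.0), Mirror(5.0, 10.0)]).act(n)  # two segments form a guide
    print(n)  # neutron state after traversing the grouped components
    ```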

  13. Toward better Alzheimer's research information sources for the public.

    Science.gov (United States)

    Payne, Perry W

    2013-03-01

    The National Plan to Address Alzheimer's Disease calls for a new relationship between researchers and members of the public. This relationship is one that provides research information to patients and allows patients to provide ideas to researchers. One way to describe it is a "bidirectional translational relationship." Despite the numerous sources of online and offline information about Alzheimer's disease, there is no information source which currently provides this interaction. This article proposes the creation of an Alzheimer's research information source dedicated to monitoring Alzheimer's research literature and providing user-friendly, publicly accessible summaries of data written specifically for a lay audience. This information source should contain comprehensive, updated, user-friendly, publicly available reviews of Alzheimer's research and utilize existing online multimedia/social networking tools to provide information in useful formats that help patients, caregivers, and researchers learn rapidly from one another.

  14. Methods and tools to evaluate the availability of renewable energy sources

    International Nuclear Information System (INIS)

    Angelis-Dimakis, Athanasios; Kartalidis, Avraam; Biberacher, Markus; Gadocha, Sabine; Dominguez, Javier; Pinedo, Irene; Fiorese, Giulia; Gnansounou, Edgard; Panichelli, Luis; Guariso, Giorgio; Robba, Michela

    2011-01-01

    The recent statements of both the European Union and the US Presidency pushed in the direction of using renewable forms of energy, in order to act against climate change induced by the growing concentration of carbon dioxide in the atmosphere. In this paper, a survey of the methods and tools presently available to determine the potential and exploitable energy in the most important renewable sectors (i.e., solar, wind, wave, biomass and geothermal energy) is presented. Moreover, challenges for each renewable resource are highlighted, as well as the available tools that can help in evaluating the use of a mix of different sources. (author)

  15. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    Full Text Available We present early results of our multiband study of the RATAN Cold Revised (RCR catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources with no detectable candidate in the optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS, 2MASS surveys and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and also other Virtual Observatory (VO tools and resources, such as CASJobs, NED, Vizier, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  16. Students’ engagement with a collaborative wiki tool predicts enhanced written exam performance

    Directory of Open Access Journals (Sweden)

    Tom Stafford

    2014-08-01

    Full Text Available We introduced voluntary wiki-based exercises to a long-running cognitive psychology course, part of the core curriculum for an undergraduate degree in psychology. Over two yearly cohorts, students who used the wiki more also scored higher on the final written exam. Using regression analysis, it is possible to account for students’ tendency to score well on other psychology exams, thus statistically removing some obvious candidate third factors, such as general talent or enthusiasm for psychology, which might drive this correlation. Such an analysis shows that both high- and low-grading students who used the wiki got higher scores on the final exam, with engaged wiki users scoring an average of an extra 5 percentage points. We offer an interpretation of the mechanisms of action in terms of the psychological literature on learning and memory.
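
    The regression logic is straightforward to reproduce. A minimal sketch with statsmodels follows; the column names and data are hypothetical stand-ins for the course records.

        # Predict final-exam score from wiki use, controlling for other exams.
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.DataFrame({
            "final_exam":  [62, 58, 71, 49, 66, 75, 55, 68],
            "other_exams": [60, 57, 65, 50, 63, 70, 56, 64],  # mean of other modules
            "wiki_edits":  [5, 0, 12, 1, 7, 15, 2, 9],
        })
        model = smf.ols("final_exam ~ other_exams + wiki_edits", data=df).fit()
        # Coefficient on wiki_edits = wiki effect net of general ability.
        print(model.params)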

  17. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present an open-source software package, WannierTools, a tool for the investigation of novel topological materials. The code works in the tight-binding framework; the tight-binding model can be generated by another software package, Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can compute the surface-state spectrum, which is probed in angle-resolved photoemission (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).
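
    As an illustration of the Berry-phase computation mentioned above, the sketch below evaluates the discrete Berry phase of the occupied band of a toy two-band model around a closed momentum loop. This is the generic textbook construction (the phase of the product of wavefunction overlaps around the loop), not WannierTools' API, and the model parameters are arbitrary.

        # Discrete Berry phase of the lower band of a toy two-band Hamiltonian.
        import numpy as np

        def lower_eigvec(kx, ky, m=1.0):
            d = np.array([np.sin(kx), np.sin(ky), m + np.cos(kx) + np.cos(ky)])
            sigma = [np.array([[0, 1], [1, 0]]), np.array([[0, -1j], [1j, 0]]),
                     np.array([[1, 0], [0, -1]])]
            H = sum(di * si for di, si in zip(d, sigma))
            w, v = np.linalg.eigh(H)
            return v[:, 0]  # eigenvector of the lower band

        # Small circular loop in the Brillouin zone around (0, 0).
        thetas = np.linspace(0.0, 2 * np.pi, 201)
        loop = [lower_eigvec(0.5 * np.cos(t), 0.5 * np.sin(t)) for t in thetas[:-1]]

        # Gauge-invariant phase of the closed product of overlaps <u_i|u_{i+1}>.
        prod = 1.0 + 0.0j
        for u1, u2 in zip(loop, loop[1:] + loop[:1]):
            prod *= np.vdot(u1, u2)
        print(f"Berry phase around the loop: {-np.angle(prod):.4f} rad")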

  18. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    Science.gov (United States)

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database search and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community-defined standard - the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.

  19. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

    The Open Visualization Tool (OVITO) is a new 3D visualization software designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, controllable via Python scripts and easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/

  20. Validation of the translation of an instrument to measure reliability of written information on treatment choices: a study on attention deficit/hyperactivity disorder (ADHD).

    Science.gov (United States)

    Montoya, A; Llopis, N; Gilaberte, I

    2011-12-01

    DISCERN is an instrument designed to help patients assess the reliability of written information on treatment choices. Originally created in English, there is no validated Spanish version of this instrument. This study seeks to validate the Spanish translation of the DISCERN instrument used as a primary measure in a multicenter study aimed at assessing the reliability of web-based information on treatment choices for attention deficit/hyperactivity disorder (ADHD). We used a modified version of a method for validating translated instruments in which the original source-language version is formally compared with the back-translated source-language version. Each item was ranked in terms of comparability of language, similarity of interpretability, and degree of understandability. Responses used Likert scales ranging from 1 to 7, where 1 indicates the best interpretability, language and understandability, and 7 indicates the worst. Assessments were performed by 20 raters fluent in the source language. The Spanish translation of DISCERN, based on ratings of comparability, interpretability and degree of understandability (mean score (SD): 1.8 (1.1), 1.4 (0.9) and 1.6 (1.1), respectively), was considered extremely comparable. All items received a score of less than three, therefore no further revision of the translation was needed. The validation process showed that the quality of the DISCERN translation was high, confirming that the translated tool is comparable to the original for assessing written information on treatment choices for ADHD.

  1. A Large-Scale Analysis of Variance in Written Language.

    Science.gov (United States)

    Johns, Brendan T; Jamieson, Randall K

    2018-01-22

    The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers, & Tenenbaum; Jones & Mewhort; Landauer & Dumais; Mikolov, Sutskever, Chen, Corrado, & Dean). The models treat knowledge as an interaction of processing mechanisms and the structure of language experience. But language experience is often treated agnostically. We report a distributional semantic analysis that shows written language in fiction books varies appreciably between books from different genres, between books from the same genre, and even between books written by the same author. Given that current theories assume that word knowledge reflects an interaction between processing mechanisms and the language environment, the analysis shows the need for the field to engage in a more deliberate consideration and curation of the corpora used in computational studies of natural language processing. Copyright © 2018 Cognitive Science Society, Inc.
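
    A minimal sketch of this kind of corpus comparison, with toy snippets standing in for whole books and scikit-learn assumed available:

        # Represent texts as word-count vectors and compare them by cosine similarity.
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.metrics.pairwise import cosine_similarity

        books = [
            "the detective examined the body and lit a cigarette",
            "the detective questioned the suspect about the body",
            "the starship dropped out of warp near the silent colony",
        ]
        X = CountVectorizer().fit_transform(books)
        print(cosine_similarity(X))  # within-genre pairs score higher than across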

  2. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the evaluation of model performance. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation (p_unc) is the best choice for model performance evaluation when a conservative approach is adopted.
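
    The source-identification step can be sketched as follows. The species list, reference profiles and factor values below are hypothetical, and Pearson correlation stands in as one plausible similarity measure; it is an illustration of the idea, not DeltaSA's actual implementation.

        # Assign a factor to the reference source with the most similar profile.
        import numpy as np

        species = ["OC", "EC", "K", "Ni", "V"]
        references = {
            "traffic":        np.array([0.45, 0.30, 0.05, 0.01, 0.01]),
            "biomass":        np.array([0.55, 0.10, 0.20, 0.00, 0.00]),
            "oil combustion": np.array([0.20, 0.15, 0.02, 0.10, 0.25]),
        }
        factor = np.array([0.50, 0.12, 0.18, 0.01, 0.01])  # output of a factor model

        def pearson(a, b):
            return np.corrcoef(a, b)[0, 1]

        best = max(references, key=lambda name: pearson(factor, references[name]))
        print("Factor assigned to:", best)  # -> biomass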

  3. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing; consequently, there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used... In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling...; an important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration...

  4. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGA and/or DSP processors. An implementation based on VEditor, a free-license program, is described; the work presented in this paper thus supplements and extends this free-license tool. The introduction briefly characterizes the tools available on the market which serve to aid the design process of electronic systems in VHDL. Particular attention is paid to plug-ins for the Eclipse environment and the Emacs program. The detailed properties of the written plug-in are presented, such as the programming extension concept, and the results of the activities of the formatter, refactorer, code hider, and other new additions to the VEditor program.

  5. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter of both their orientations and strength parameters. The attitude and persistence of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, a rock block may eventually split into several fragments during its propagation downhill due to its impact with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information System) using open source Python programming. The tool under development is simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after a suitable adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation can be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls are being developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).

  6. Eyewitness Culture and History: Primary Written Sources. The Iconoclast.

    Science.gov (United States)

    McMurtry, John

    1995-01-01

    Asserts that contemporary history and historiography constitute "official" history that ignores the daily struggles of people for their continued survival. Argues that, while public illiteracy has nearly disappeared, individuals are ignorant of the wealth of primary-source materials of other cultures' histories. (CFR)

  7. 47 CFR 76.936 - Written decision.

    Science.gov (United States)

    2010-10-01

    ... CABLE TELEVISION SERVICE Cable Rate Regulation § 76.936 Written decision. (a) A franchising authority... of interested parties. A franchising authority is not required to issue a written decision that...

  8. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using APT (Automatically Programmed Tool) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of the APT programming language for the control of metal-cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  9. For whom were Gospels written?

    Directory of Open Access Journals (Sweden)

    Richard Bauckham

    1999-12-01

    Full Text Available This article challenges the current consensus in Gospels scholarship that each Gospel was written for a specific church or group of churches. It argues that, since all our evidence about the early Christian movement shows it to have been a network of communities in constant, close communication; since all our evidence about early Christian leaders, such as might have written Gospels, shows them to have been typically people who travelled widely around the churches; and since, moreover, the evidence we have about early Christian literature shows that it did in fact circulate rapidly and widely, the strong probability is that the Gospels were written for general circulation around all the churches.

  10. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact on the task of PPI extraction: it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
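
    The combinatorial core of such a "shot-gun" approach can be sketched in a few lines; the constituent variants below are toy placeholders, not BioSimplify's actual grammar rules.

        # Generate simpler sentence variants by combining constituent variants.
        from itertools import product

        constituents = [
            ["RAD51", "the RAD51 protein"],
            ["interacts with", "binds"],
            ["BRCA2", "the BRCA2 tumor suppressor"],
        ]
        variants = [" ".join(parts) for parts in product(*constituents)]
        for v in variants:
            print(v)  # 2 x 2 x 2 = 8 candidate sentences for the extractor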

  11. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    According to the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used which is developed independently from the development of the code generator. For this purpose ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  12. Minimal Poems Written in 1979 Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I have seen a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  13. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information handling (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. It is therefore of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: the source code, oriented to developers, and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability-indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  14. Talking to Texts and Sketches: The Function of Written and Graphic Mediation in Engineering Design.

    Science.gov (United States)

    Lewis, Barbara

    2000-01-01

    Describes the author's research that explores the role of language, particularly texts, in the engineering design process. Notes that results of this case study support a new "mediated" model of engineering design as an inventional activity in which designers use talk, written language, and other symbolic representations as tools to think about…

  15. A spectroscopic tool for identifying sources of origin for materials of military interest

    Science.gov (United States)

    Miziolek, Andrzej W.; De Lucia, Frank C.

    2014-05-01

    There is a need to identify the source of origin for many items of military interest, including ammunition and weapons that may be circulated and traded in illicit markets. Both fieldable systems (man-portable or handheld) as well as benchtop systems in field and home base laboratories are desired for screening and attribution purposes. Laser Induced Breakdown Spectroscopy (LIBS) continues to show significant capability as a promising new tool for materials identification, matching, and provenance. With the use of the broadband, high resolution spectrometer systems, the LIBS devices can not only determine the elemental inventory of the sample, but they are also capable of elemental fingerprinting to signify sources of origin of various materials. We present the results of an initial study to differentiate and match spent cartridges from different manufacturers and countries. We have found that using Partial Least Squares Discriminant Analysis (PLS-DA) we are able to achieve on average 93.3% True Positives and 5.3% False Positives. These results add to the large body of publications that have demonstrated that LIBS is a particularly suitable tool for source of origin determinations.
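
    A PLS-DA classifier of the kind reported above can be sketched with scikit-learn by regressing onto one-hot class labels and predicting by the largest response. The spectra below are synthetic stand-ins for LIBS data, and the class count and component count are arbitrary.

        # PLS-DA via PLS regression onto one-hot labels (synthetic spectra).
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression

        rng = np.random.default_rng(0)
        n_per_class, n_channels = 30, 200
        X = np.vstack([rng.normal(loc=mu, size=(n_per_class, n_channels))
                       for mu in (0.0, 0.3, 0.6)])   # 3 "manufacturers"
        y = np.repeat(np.arange(3), n_per_class)
        Y = np.eye(3)[y]                              # one-hot encoding

        pls = PLSRegression(n_components=5).fit(X, Y)
        pred = pls.predict(X).argmax(axis=1)          # class = largest response
        print("Training accuracy:", (pred == y).mean())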

  16. GNU Data Language (GDL) - a free and open-source implementation of IDL

    Science.gov (United States)

    Arabas, Sylwester; Schellens, Marc; Coulais, Alain; Gales, Joel; Messmer, Peter

    2010-05-01

    GNU Data Language (GDL) is developed with the aim of providing an open-source drop-in replacement for the ITTVIS's Interactive Data Language (IDL). It is free software developed by an international team of volunteers led by Marc Schellens - the project's founder (a list of contributors is available on the project's website). The development is hosted on SourceForge, where GDL continuously ranks in the 99th percentile of most active projects. GDL with its library routines is designed as a tool for numerical data analysis and visualisation. Like its proprietary counterparts (IDL and PV-WAVE), GDL is used particularly in geosciences and astronomy. GDL is dynamically-typed, vectorized and has object-oriented programming capabilities. The library routines handle numerical calculations, data visualisation, signal/image processing, interaction with the host OS and data input/output. GDL supports several data formats such as netCDF, HDF4, HDF5, GRIB, PNG, TIFF, DICOM, etc. Graphical output is handled by X11, PostScript, SVG or z-buffer terminals, the last one allowing output to be saved in a variety of raster graphics formats. GDL is an incremental compiler with integrated debugging facilities. It is written in C++ using the ANTLR language-recognition framework. Most of the library routines are implemented as interfaces to open-source packages such as GNU Scientific Library, PLPlot, FFTW, ImageMagick, and others. GDL features a Python bridge (Python code can be called from GDL; GDL can be compiled as a Python module). Extensions to GDL can be written in C++, GDL, and Python. A number of open software libraries written in IDL, such as the NASA Astronomy Library, MPFIT, CMSVLIB and TeXtoIDL are fully or partially functional under GDL. Packaged versions of GDL are available for several Linux distributions and Mac OS X. The source code compiles on some other UNIX systems, including BSD and OpenSolaris. The presentation will cover the current status of the project, the key

  17. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (Java, C++, Python, etc.) and open-source geospatial libraries (GDAL/OGR, GEOS, GeoTools, etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets, such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data, for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps which are carried out repeatedly for the same study region. The developed tool is user-friendly and can be used efficiently for these repetitive processes by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
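
    A minimal QGIS plugin skeleton of the kind described looks roughly like the sketch below. It assumes the QGIS 3 Python API; a real plugin also ships a metadata.txt file, and the class and action names here are hypothetical.

        # Minimal QGIS plugin skeleton (hypothetical names; QGIS 3 API assumed).
        from qgis.PyQt.QtWidgets import QAction, QMessageBox

        class DataAssimilationPlugin:
            def __init__(self, iface):
                self.iface = iface  # handle to the QGIS application interface
                self.action = None

            def initGui(self):
                # Add a toolbar button that triggers the assimilation workflow.
                self.action = QAction("Assimilate gridded data",
                                      self.iface.mainWindow())
                self.action.triggered.connect(self.run)
                self.iface.addToolBarIcon(self.action)

            def unload(self):
                self.iface.removeToolBarIcon(self.action)

            def run(self):
                QMessageBox.information(self.iface.mainWindow(), "Demo",
                                        "Load rainfall/DEM/soil grids here.")

        def classFactory(iface):
            # Entry point QGIS calls when loading the plugin.
            return DataAssimilationPlugin(iface)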

  18. An Extensible Open-Source Compiler Infrastructure for Testing

    Energy Technology Data Exchange (ETDEWEB)

    Quinlan, D; Ur, S; Vuduc, R

    2005-12-09

    Testing forms a critical part of the development process for large-scale software, and there is growing need for automated tools that can read, represent, analyze, and transform the application's source code to help carry out testing tasks. However, the support required to compile applications written in common general purpose languages is generally inaccessible to the testing research community. In this paper, we report on an extensible, open-source compiler infrastructure called ROSE, which is currently in development at Lawrence Livermore National Laboratory. ROSE specifically targets developers who wish to build source-based tools that implement customized analyses and optimizations for large-scale C, C++, and Fortran90 scientific computing applications (on the order of a million lines of code or more). However, much of this infrastructure can also be used to address problems in testing, and ROSE is by design broadly accessible to those without a formal compiler background. This paper details the interactions between testing of applications and the ways in which compiler technology can aid in the understanding of those applications. We emphasize the particular aspects of ROSE, such as support for the general analysis of whole programs, that are particularly well-suited to the testing research community and the scale of the problems that community solves.

  19. Impact of design research on industrial practice tools, technology, and training

    CERN Document Server

    Lindemann, Udo

    2016-01-01

    Showcasing exemplars of how various aspects of design research were successfully transitioned into, and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the impact of design research on industrial practice. Chapters written by internationally acclaimed researchers of design analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform the industrial practice of engineering design, whilst the chapters written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice, as well as guidelines and platforms for successf...

  20. Model-based evaluation of the use of polycyclic aromatic hydrocarbons molecular diagnostic ratios as a source identification tool

    International Nuclear Information System (INIS)

    Katsoyiannis, Athanasios; Breivik, Knut

    2014-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) molecular diagnostic ratios (MDRs) are unitless concentration ratios of pair-PAHs with the same molecular weight (MW); MDRs have long been used as a tool for PAHs source identification purposes. In the present paper, the efficiency of the MDR methodology is evaluated through the use of a multimedia fate model, the calculation of characteristic travel distances (CTD) and the estimation of air concentrations for individual PAHs as a function of distance from an initial point source. The results show that PAHs with the same MW are sometimes characterized by substantially different CTDs and therefore their air concentrations and hence MDRs are predicted to change as the distance from the original source increases. From the assessed pair-PAHs, the biggest CTD difference is seen for Fluoranthene (107 km) vs. Pyrene (26 km). This study provides a strong indication that MDRs are of limited use as a source identification tool. -- Highlights: • Model-based evaluation of the PAHs molecular diagnostic ratios efficiency. • Individual PAHs are characterized by different characteristic travel distances. • MDRs are proven to be a limited tool for source identification. • Use of MDRs for other environmental media is likely unfeasible. -- PAHs molecular diagnostic ratios which change greatly as a function of distance from the emitting source are improper for source identification purposes
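
    For illustration, the MDR computation itself is a one-liner. The Flt/(Flt+Pyr) interpretation thresholds below are values commonly cited in the literature, quoted only to show how such ratios are read in practice, which is precisely the usage the modelling above calls into question; the concentrations are hypothetical.

        # Compute a common MDR and apply commonly cited (illustrative) thresholds.
        def mdr_flt_pyr(flt_conc, pyr_conc):
            return flt_conc / (flt_conc + pyr_conc)

        ratio = mdr_flt_pyr(flt_conc=1.8, pyr_conc=2.4)  # hypothetical ng/m3 values
        if ratio < 0.4:
            label = "petrogenic (e.g., unburned petroleum)"
        elif ratio > 0.5:
            label = "grass/wood/coal combustion"
        else:
            label = "liquid fossil fuel combustion"
        print(f"Flt/(Flt+Pyr) = {ratio:.2f}: {label}")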

  1. Industrial ion sources broadbeam gridless ion source technology

    CERN Document Server

    Zhurin, Viacheslav V

    2012-01-01

    Due to the large number of uses of ion sources in academia and industry, those who utilize these sources need up-to-date and coherent information to keep themselves abreast of developments and options, and to choose ideal solutions for quality and cost-effectiveness. This book, written by an author with a strong industrial background and excellent standing, is the comprehensive guide users and developers of ion sources have been waiting for. Providing a thorough refresher on the physics involved, this resource systematically covers the source types, components, and the operational parameters.

  2. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability, to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  3. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the automated image analysis tools available to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  4. Flux and brightness calculations for various synchrotron radiation sources

    International Nuclear Information System (INIS)

    Weber, J.M.; Hulbert, S.L.

    1991-11-01

    Synchrotron radiation (SR) storage rings are powerful scientific and technological tools. The first generation of storage rings in the US, e.g., SURF (Washington, D.C.), Tantalus (Wisconsin), SSRL (Stanford), and CHESS (Cornell), revolutionized VUV, soft X-ray, and hard X-ray science. The second (present) generation of storage rings, e.g. the NSLS VUV and XRAY rings and Aladdin (Wisconsin), have sustained the revolution by providing higher stored currents and up to a factor of ten smaller electron beam sizes than the first generation sources. This has made possible a large number of experiments that could not be performed using first generation sources. In addition, the NSLS XRAY ring design optimizes the performance of wigglers (high field periodic magnetic insertion devices). The third generation storage rings, e.g. ALS (Berkeley) and APS (Argonne), are being designed to optimize the performance of undulators (low field periodic magnetic insertion devices). These extremely high brightness sources will further revolutionize x-ray science by providing diffraction-limited x-ray beams. The output of undulators and wigglers is distinct from that of bending magnets in magnitude, spectral shape, and in spatial and angular size. Using published equations, we have developed computer programs to calculate the flux, central intensity, and brightness output of bending magnets and selected wigglers and undulators of the NSLS VUV and XRAY rings, the Advanced Light Source (ALS), and the Advanced Photon Source (APS). Following is a summary of the equations used, the graphs and data produced, and the computer codes written. These codes, written in the C programming language, can be used to calculate the flux, central intensity, and brightness curves for bending magnets and insertion devices on any storage ring
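
    As a sketch of the kind of published formulas such codes implement, the snippet below evaluates the standard bending-magnet expressions for the critical photon energy and the vertically integrated flux per horizontal mrad per 0.1% bandwidth, with G1(y) = y * integral from y to infinity of K_{5/3}(x) dx. The ring parameters are illustrative, not those of any particular facility.

        # Standard bending-magnet critical energy and flux (textbook constants).
        import numpy as np
        from scipy.integrate import quad
        from scipy.special import kv

        def critical_energy_keV(E_GeV, B_T):
            return 0.665 * E_GeV**2 * B_T

        def g1(y):
            return y * quad(lambda x: kv(5.0 / 3.0, x), y, np.inf)[0]

        def bm_flux(E_GeV, I_A, eps_keV, B_T):
            """Photons / s / mrad(horizontal) / 0.1% BW, integrated vertically."""
            y = eps_keV / critical_energy_keV(E_GeV, B_T)
            return 2.457e13 * E_GeV * I_A * g1(y)

        # Illustrative numbers loosely in the range of a second-generation ring.
        print("critical energy [keV]:", critical_energy_keV(2.5, 1.2))
        print("flux @ 5 keV:", bm_flux(2.5, 0.25, 5.0, 1.2))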

  5. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    Full Text Available In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from the data stored. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one best suited to the extraction of useful knowledge from clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  6. Written Language Shift among Norwegian Youth

    Directory of Open Access Journals (Sweden)

    Kamil ÖZERK

    2013-07-01

    Full Text Available In Norway there are two written Norwegian languages, Bokmål and Nynorsk. Of these two written languages, Bokmål is used by the majority of the people, and Bokmål has the higher prestige in society. This article is about the shift of written language from Nynorsk to Bokmål among young people in a traditional Nynorsk district in the country. Drawing on empirical data, we conclude that many adolescents are experiencing written language shift. We discuss various reasons for this phenomenon in the linguistic landscape of Norway. In our discussion we emphasize the importance of the school with regard to language maintenance and language revitalization. We call for a new language policy in the educational system that can prevent language shift. Having several dialects and two officially written forms of Norwegian creates a special linguistic landscape in Norway. Despite the fact that the Norwegian language situation is in several ways unique, very little research has been done on how the existing policy works in practice. Our research reveals that the existing language policy and practice in the school system is not powerful enough to prevent language shift and language decay among the youngsters. The school system functions like a fabric for language shift.

  7. Feedforward: helping students interpret written feedback

    OpenAIRE

    Hurford, Donna; Read, Andrew

    2008-01-01

    "Assessment for Learning is the process of seeking and interpreting evidence for use by learners... "(Assessment Reform Group, 2002, p.2): for the Higher Education tutor, written feedback forms an integral part of this. This short article reports on teaching methods to engage students in feedback and assessment of their written work.

  8. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and to allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context, but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
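
    The core of a raster travel-time model can be sketched as a Dijkstra cost-distance spread from the service locations. The per-cell crossing times below are hypothetical, and real tools add the multi-modal and temporal detail discussed above.

        # Dijkstra cost-distance spread over a grid of per-cell crossing times.
        import heapq
        import numpy as np

        def travel_time(cost, sources):
            """cost[i,j] = minutes to cross cell (i,j); sources = [(i,j), ...]."""
            t = np.full(cost.shape, np.inf)
            heap = [(0.0, s) for s in sources]
            for _, s in heap:
                t[s] = 0.0
            while heap:
                d, (i, j) = heapq.heappop(heap)
                if d > t[i, j]:
                    continue  # stale heap entry
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < cost.shape[0] and 0 <= nj < cost.shape[1]:
                        nd = d + 0.5 * (cost[i, j] + cost[ni, nj])
                        if nd < t[ni, nj]:
                            t[ni, nj] = nd
                            heapq.heappush(heap, (nd, (ni, nj)))
            return t

        walk = np.full((5, 5), 10.0)   # 10 min/cell on foot
        walk[2, :] = 2.0               # a road crossing the grid
        print(travel_time(walk, [(0, 0)]).round(1))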

  9. Font size matters--emotion and attention in cortical responses to written words.

    Science.gov (United States)

    Bayer, Mareike; Sommer, Werner; Schacht, Annekathrin

    2012-01-01

    For emotional pictures with fear-, disgust-, or sex-related contents, stimulus size has been shown to increase emotion effects in attention-related event-related potentials (ERPs), presumably reflecting the enhanced biological impact of larger emotion-inducing pictures. If this is true, size should not enhance emotion effects for written words with symbolic and acquired meaning. Here, we investigated ERP effects of font size for emotional and neutral words. While P1 and N1 amplitudes were not affected by emotion, the early posterior negativity started earlier and lasted longer for large relative to small words. These results suggest that emotion-driven facilitation of attention is not necessarily based on biological relevance, but might generalize to stimuli with arbitrary perceptual features. This finding points to the high relevance of written language in today's society as an important source of emotional meaning.

  10. Testing tool for software concerning nuclear power plant safety

    International Nuclear Information System (INIS)

    Boulc'h, J.; Le Meur, M.; Collart, J.M.; Segalard, J.; Uberschlag, J.

    1984-11-01

    In the present case, the software to be analyzed is written entirely in assembler language. This paper presents the study and realization of a tool to analyze software which plays an important role in nuclear reactor protection and safeguard systems: the principles of the tool design, its working principle, and the realization and evolution of the dynamic analysis tool [fr]

  11. Poling of UV-written Waveguides

    DEFF Research Database (Denmark)

    Arentoft, Jesper; Kristensen, Martin; Hübner, Jörg

    1999-01-01

    We report poling of UV-written silica waveguides. Thermal poling induces an electro-optic coefficient of 0.05 pm/V. We also demonstrate simultaneous UV-writing and UV-poling. No measurable decay in the induced electro-optic effect was detected after nine months.

  12. 29 CFR 100.610 - Written demand for payment.

    Science.gov (United States)

    2010-07-01

    ... Procedures § 100.610 Written demand for payment. (a) The NLRB will promptly make written demand upon the debtor for payment of money or the return of specific property. The written demand for payment will be... late charges will be 60 days from the date that the demand letter is mailed or hand-delivered. (b) The...

  13. Blind source separation theory and applications

    CERN Document Server

    Yu, Xianchuan; Xu, Jindong

    2013-01-01

    A systematic exploration of both classic and contemporary algorithms in blind source separation with practical case studies    The book presents an overview of Blind Source Separation, a relatively new signal processing method.  Due to the multidisciplinary nature of the subject, the book has been written so as to appeal to an audience from very different backgrounds. Basic mathematical skills (e.g. on matrix algebra and foundations of probability theory) are essential in order to understand the algorithms, although the book is written in an introductory, accessible style. This book offers
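
    A classic blind-source-separation example, of the kind such texts cover, unmixes synthetic signals with FastICA; scikit-learn is assumed available, and the mixing matrix is arbitrary.

        # Cocktail-party demo: recover two independent sources from two mixtures.
        import numpy as np
        from sklearn.decomposition import FastICA

        rng = np.random.default_rng(0)
        t = np.linspace(0, 8, 2000)
        s1, s2 = np.sin(2 * t), np.sign(np.sin(3 * t))      # independent sources
        S = np.c_[s1, s2] + 0.05 * rng.standard_normal((2000, 2))
        A = np.array([[1.0, 0.5], [0.4, 1.0]])              # mixing matrix
        X = S @ A.T                                         # observed mixtures

        ica = FastICA(n_components=2, random_state=0)
        S_est = ica.fit_transform(X)                        # recovered sources
        print("Estimated mixing matrix:\n", ica.mixing_)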

  14. USER FRIENDLY OPEN GIS TOOL FOR LARGE SCALE DATA ASSIMILATION – A CASE STUDY OF HYDROLOGICAL MODELLING

    Directory of Open Access Journals (Sweden)

    P. K. Gupta

    2012-08-01

    Full Text Available Open source software (OSS coding has tremendous advantages over proprietary software. These are primarily fuelled by high level programming languages (JAVA, C++, Python etc... and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools etc.. Quantum GIS (QGIS is a popular open source GIS package, which is licensed under GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises on exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy to learn – Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect, landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In the hydrological modelling calibration and validation are important steps which are repetitively carried out for the same study region. As such the developed tool will be user friendly and used efficiently for these repetitive processes by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large dataset in an organised manner.

  15. Pika: A snow science simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited to snow science problems. Pika, an open-source MOOSE-based application, is capable of simulating both 3D coupled nonlinear continuum heat-transfer and large-deformation mechanics applications (such as settlement) and phase-field-based microstructure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, a graphical user interface, and a documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the

  16. The Influence of Group Formation on Learner Participation, Language Complexity, and Corrective Behaviour in Synchronous Written Chat as Part of Academic German Studies

    Science.gov (United States)

    Fredriksson, Christine

    2015-01-01

    Synchronous written chat and instant messaging are tools which have been used and explored in online language learning settings for at least two decades. Research literature has shown that such tools give second language (L2) learners opportunities for language learning, e.g., interaction in real time with peers and native speakers, the…

  17. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    Full Text Available The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this Review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  18. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
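
    The basic quantity these toolboxes estimate can be sketched with a naive plug-in estimator built from the joint histogram of stimulus and response; real toolboxes add the bias corrections that make such estimates usable on limited numbers of trials. The trial data below are hypothetical.

        # Plug-in mutual information between stimulus and discretized response.
        import numpy as np

        def mutual_information(x, y):
            """I(X;Y) in bits from paired discrete samples x, y."""
            joint = {}
            for xi, yi in zip(x, y):
                joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
            n = len(x)
            px = {a: sum(c for (i, _), c in joint.items() if i == a) / n for a in set(x)}
            py = {b: sum(c for (_, j), c in joint.items() if j == b) / n for b in set(y)}
            return sum((c / n) * np.log2((c / n) / (px[a] * py[b]))
                       for (a, b), c in joint.items())

        stim = [0, 0, 1, 1, 0, 1, 0, 1, 1, 0]      # stimulus identity per trial
        spikes = [1, 0, 3, 2, 0, 3, 1, 2, 3, 0]    # spike count per trial
        print(f"I = {mutual_information(stim, spikes):.3f} bits")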

  19. Like an extended family: Relationships that emerge when older caregivers use written messages to communicate in an ICT-based healthcare service.

    Science.gov (United States)

    Solli, Hilde; Bjørk, Ida Torunn; Hvalvik, Sigrun; Hellesø, Ragnhild

    2018-03-01

    To explore the relationships that emerge amongst caregivers of persons with dementia and stroke when caregivers use written messages as their communication tool in a closed information and communication technology (ICT)-based support group. An explorative design with a qualitative approach was used that applied systematic text condensation (STC) to analyse 173 written messages extracted from a web forum. Empathetic, empowering and familiar relationships emerged amongst peers of older caregivers when the caregivers used written messages as their communication tool. The empathetic relationship was characterised by sincerity and openness when the caregivers shared emotions related to caregiving. The empowering relationship reflected a fellowship based on solidarity influenced by a sense of optimism and a willingness to share knowledge to support one another in overcoming challenges. In the familiar relationship, the caregivers were thoughtful and good-humoured with one another and displayed an attitude of consideration towards one another, as in an extended family. The use of computer-mediated communication in health care service will change the context of establishing and maintaining interpersonal relationships. Therefore, greater knowledge regarding how the peers of caregivers interact with one another is vital so nurses may better support and educate ICT-based support groups.

  20. Uses of the word "macula" in written English, 1400-present.

    Science.gov (United States)

    Schwartz, Stephen G; Leffler, Christopher T

    2014-01-01

    We compiled uses of the word "macula" in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America's Historical Newspapers, the Gale Cengage Collections, and others. "Macula" has been used: as a non-medical "spot" or "stain", literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole ("retina tinta di un color giallo") in 1782, but did not use the word "macula". "Macula lutea" was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, which appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both "macula" and "macula lutea" following the introduction of the ophthalmoscope in 1850. "Macula" has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. TOOLPACK1, Tools for Development and Maintenance of FORTRAN 77 Program

    International Nuclear Information System (INIS)

    Cowell, Wayne R.

    1993-01-01

    1 - Description of program or function: TOOLPACK1 consists of the following categories of software: (1) an integrated collection of tools intended to support the development and maintenance of FORTRAN 77 programs, in particular moderate-sized collections of mathematical software; (2) several user/Toolpack interfaces, one of which is selected for use at any particular installation; and (3) three implementations of the tool/system interface, called TIE (Tool Interface to the Environment). The tools are written in FORTRAN 77 and are portable among TIE installations. The source contains symbolic constants as macro names and must be expanded with a suitable macro expander before being compiled and loaded; a portable macro expander is supplied in TOOLPACK1. The tools may be divided into three functional areas: general, documentation, and processing. One tool, the macro processor, can be used in any of these categories. ISTDC: a data comparison tool, designed mainly for comparing files of numeric values and files with embedded text. ISTET: expands tabs. ISTFI: finds all the include files that a file needs. ISTGP: searches multiple files for occurrences of a regular expression. ISTHP: provides limited help information about tools. ISTMP: the macro processor, which may be used to pre-process a file; it provides macro replacement, inclusion, conditional replacement, and processing capabilities for complex file processing. ISTSP: a TIE-conforming version of the SPLIT utility to split up the concatenated files used on the tape. ISTSV: a save/restore utility to save and restore sub-trees of the Portable File Store (PFS). ISTTD: a text comparison tool. ISTVC: a simple text file version controller. ISTAL: a preprocessor that can be used to generate specific information from intermediate files created by other tools; the information that can be generated includes call-graphs, cross-reference listings, segment execution frequencies, and symbol information. ISTAL can also strip

  2. 12 CFR 704.16 - Contracts/written agreements.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Contracts/written agreements. 704.16 Section 704.16 Banks and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS CORPORATE CREDIT UNIONS § 704.16 Contracts/written agreements. Services, facilities, personnel, or equipment...

  3. Evanescent fields of laser written waveguides

    Science.gov (United States)

    Jukić, Dario; Pohl, Thomas; Götte, Jörg B.

    2015-03-01

    We investigate the evanescent field at the surface of laser written waveguides. The waveguides are written by a direct femtosecond laser writing process into fused silica, which is then sanded down to expose the guiding layer. These waveguides support eigenmodes that have an evanescent field reaching into the vacuum above the waveguide. We study the governing wave equations and present solutions for the fundamental eigenmodes of the modified waveguides.
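
    For orientation, assuming a mode that propagates along the surface with propagation constant β and effective index β/k_0 greater than 1 (the paper itself solves the full modified-waveguide problem), the evanescent tail above the surface takes the standard form

      \[ E(x,z) = E_0 \, e^{i\beta x} \, e^{-\gamma z}, \qquad
         \gamma = \sqrt{\beta^{2} - k_{0}^{2}}, \qquad k_0 = \frac{2\pi}{\lambda}, \]

    so the field decays into the vacuum over a characteristic length 1/γ.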

  4. Programming database tools for the casual user

    International Nuclear Information System (INIS)

    Katz, R.A; Griffiths, C.

    1990-01-01

    The AGS Distributed Control System (AGSDCS) uses a relational database management system (INTERBASE) for the storage of all data associated with the control of the particle accelerator complex. This includes the static data which describes the component devices of the complex, as well as data for application program startup and data records that are used in analysis. Due to licensing restrictions, it was necessary to develop tools to allow programs requiring access to a database to be unconcerned with whether or not they were running on a licensed node. An in-house database server program was written, using Apollo mailbox communication protocols, allowing application programs to access the INTERBASE database via calls to this server. Initially, the tools used by the server to actually access the database were written using the GDML C host language interface. Through an evolutionary learning process these tools have been converted to Dynamic SQL. Additionally, these tools have been extracted from the exclusive province of the database server and placed in their own library. This enables application programs to use these same tools on a licensed node without using the database server and without having to modify the application code. The syntax of the C calls remains the same

  5. P-TRAP: a Panicle TRAit Phenotyping tool.

    Science.gov (United States)

    AL-Tam, Faroq; Adam, Helene; Anjos, António dos; Lorieux, Mathias; Larmande, Pierre; Ghesquière, Alain; Jouannic, Stefan; Shahbazkia, Hamid Reza

    2013-08-29

    In crops, inflorescence complexity and the shape and size of the seed are among the most important characters that influence yield. For example, rice panicles vary considerably in the number and order of branches, elongation of the axis, and the shape and size of the seed. Manual low-throughput phenotyping methods are time-consuming, and the results are unreliable. However, high-throughput image analysis of the qualitative and quantitative traits of rice panicles is essential for understanding the diversity of the panicle as well as for breeding programs. This paper presents P-TRAP software (Panicle TRAit Phenotyping), a free open source application for high-throughput measurements of panicle architecture and seed-related traits. The software is written in Java and runs on different platforms (the user-friendly Graphical User Interface (GUI) uses the NetBeans Platform 7.3). The application offers three main tools: a tool for the analysis of panicle structure, a spikelet/grain counting tool, and a tool for the analysis of seed shape. The three tools can be used independently or simultaneously for analysis of the same image. Results are then reported in the Extensible Markup Language (XML) and Comma Separated Values (CSV) file formats. Images of rice panicles were used to evaluate the efficiency and robustness of the software. Compared to data obtained by manual processing, P-TRAP produced reliable results in a much shorter time. In addition, manual processing is not repeatable because dry panicles are vulnerable to damage. The software is very useful and practical and collects much more data than human operators. P-TRAP is a new open source software that automatically recognizes the structure of a panicle and the seeds on the panicle in numeric images. The software processes and quantifies several traits related to panicle structure, detects and counts the grains, and measures their shape parameters. In short, P-TRAP offers both efficient results and a user

  6. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    Science.gov (United States)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to

  7. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    Full Text Available We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach: suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.
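
    A minimal Python sketch of the resource-agnostic idea (not Forecat's actual API; the resource below is a hypothetical stand-in for, e.g., a Moses server or a dictionary):

      def suggest(source_segment, typed_prefix, bilingual_resource, max_suggestions=5):
          """Rank candidate translations from any black-box bilingual resource
          by compatibility with what the translator has typed so far."""
          candidates = bilingual_resource(source_segment)   # unmodified black box
          compatible = [c for c in candidates if c.startswith(typed_prefix)]
          # Offer only the continuations beyond the typed prefix
          return [c[len(typed_prefix):] for c in compatible[:max_suggestions]]

      resource = lambda s: {"the house is red": ["la casa es roja",
                                                 "la casa está roja"]}.get(s, [])
      print(suggest("the house is red", "la casa es", resource))

    A real ITP system additionally segments the source sentence and matches subsegments, but the contract is the same: the bilingual resource is queried as-is and never modified.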

  8. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of 131I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases

  9. 45 CFR 99.26 - Unsponsored written material.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Unsponsored written material. 99.26 Section 99.26 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION PROCEDURE FOR HEARINGS FOR THE CHILD CARE AND DEVELOPMENT FUND Hearing Procedures § 99.26 Unsponsored written material. Letters...

  10. Perception and Assessment of Verbal and Written Information on Sex and Relationships after Hematopoietic Stem Cell Transplantation.

    Science.gov (United States)

    Wendt, Christel

    2017-12-01

    This study aimed to investigate experiences of verbal and written information about sex and relationships among men and women treated with hematopoietic stem cell transplantation. The study also aimed to investigate the demand for information and assessment of the quality of written patient information material entitled "Sex and relationships in the treatment of blood diseases." Few studies exist that shed any light on the demand for information about sex and relationships on the part of patients with hematological diseases before, during, and after their treatment. A total of 216 patients undergoing treatment for malignant blood diseases between 2000 and 2010 participated in this study. Patients' experiences of information about sex and relationships, and their opinions about the written patient information, were assessed using a questionnaire created specifically for this study. Most patients (81 %) had not received information about sex and relationships from a healthcare professional. Almost 90 % of men felt that verbal information was important, compared with 82 % of women. The majority also held that written information was important. These results indicate that patients, regardless of gender, age, and treatment, consider oral and written information about sex and relationships to be important and that the healthcare system should provide the information. The written patient information was considered to play an important role in creating an opening for a conversation about a sensitive topic such as sexuality, and also as a source of reference and support for the patient and his/her partner.

  11. CellProfiler and KNIME: open source tools for high content screening.

    Science.gov (United States)

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two obstacles to the establishment of HCS in academia are flexibility and cost. Flexibility is important for adapting the HCS setup to the many different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  12. Identifying Sources of Clinical Conflict: A Tool for Practice and Training in Bioethics Mediation.

    Science.gov (United States)

    Bergman, Edward J

    2015-01-01

    Bioethics mediators manage a wide range of clinical conflict emanating from diverse sources. Parties to clinical conflict are often not fully aware of, nor willing to express, the true nature and scope of their conflict. As such, a significant task of the bioethics mediator is to help define that conflict. The ability to assess and apply the tools necessary for an effective mediation process can be facilitated by each mediator's creation of a personal compendium of sources that generate clinical conflict, to provide an orientation for the successful management of complex dilemmatic cases. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.

  13. Total organic carbon, an important tool in an holistic approach to hydrocarbon source fingerprinting

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, P.D.; Burns, W.A.; Page, D.S.; Bence, A.E.; Mankiewicz, P.J.; Brown, J.S.; Douglas, G.S. [Battelle Member Inst., Waltham, MA (United States)

    2002-07-01

    The identification and allocation of multiple hydrocarbon sources in marine sediments is best achieved using an holistic approach. Total organic carbon (TOC) is one important tool that can constrain the contributions of specific sources and rule out incorrect source allocations in cases where inputs are dominated by fossil organic carbon. In a study of the benthic sediments from Prince William Sound (PWS) and the Gulf of Alaska (GOA), we find excellent agreement between measured TOC and TOC calculated from hydrocarbon fingerprint matches of polycyclic aromatic hydrocarbons (PAH) and chemical biomarkers. Confirmation by two such independent source indicators (TOC and fingerprint matches) provides evidence that source allocations determined by the fingerprint matches are robust and that the major TOC sources have been correctly identified. Fingerprint matches quantify the hydrocarbon contributions of various sources to the benthic sediments and the degree of hydrocarbon winnowing by waves and currents. TOC contents are then calculated using source allocation results from fingerprint matches and the TOCs of contributing sources. Comparisons of the actual sediment TOC values and those calculated from source allocation support our earlier published findings that the natural petrogenic hydrocarbon background in sediments in this area comes from eroding Tertiary shales and associated oil seeps along the northern GOA coast and exclude thermally mature area coals from being important contributors to the PWS background due to their high TOC content.
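
    The constraint referred to here is, in essence, a mass balance; assuming the simple linear mixing form, the TOC calculated from a fingerprint match is

      \[ \mathrm{TOC}_{\text{calc}} = \sum_{i=1}^{N} f_i \, \mathrm{TOC}_i,
         \qquad \sum_{i=1}^{N} f_i = 1, \]

    where f_i is the mass fraction allocated to source i by the fingerprint match and TOC_i is that source's organic carbon content. Agreement between TOC_calc and the measured sediment TOC supports the allocation, while a mismatch rules it out, as with the high-TOC coals excluded above.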

  14. Oral and Literate Strategies in Spoken and Written Narratives.

    Science.gov (United States)

    Tannen, Deborah

    1982-01-01

    Discusses comparative analysis of spoken and written versions of a narrative to demonstrate that features which have been identified as characterizing oral discourse are also found in written discourse and that the written short story combines syntactic complexity expected in writing with features which create involvement expected in speaking.…

  15. Automatic differentiation tools in the dynamic simulation of chemical engineering processes

    Directory of Open Access Journals (Sweden)

    Castro M.C.

    2000-01-01

    Full Text Available Automatic differentiation (AD) is a relatively recent technique for computing derivatives of functions; it is applied directly to the source code that evaluates the function, written in a standard programming language. The technique permits the automation of the differentiation step, which is crucial for dynamic simulation and optimization of processes. The derivative values obtained with AD are exact (to roundoff). The theoretical exactness of AD comes from the fact that it uses the same rules of differentiation as differential calculus, but these rules are applied to an algorithmic specification of the function rather than to a formula. The main purpose of this contribution is to discuss the impact of automatic differentiation in the field of dynamic simulation of chemical engineering processes. The influence of the differentiation technique on the behavior of the integration code, the performance of the generated code and the incorporation of AD tools in consistent initialization tools are discussed from the viewpoint of dynamic simulation of typical models in chemical engineering.
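
    A minimal forward-mode AD sketch in Python (dual numbers; illustrative only, as real AD tools transform or overload much richer operator sets):

      class Dual:
          """Carry a value and its derivative through every operation, applying
          the same rules as differential calculus to the running algorithm."""
          def __init__(self, val, der=0.0):
              self.val, self.der = val, der
          def __add__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val + o.val, self.der + o.der)
          __radd__ = __add__
          def __mul__(self, o):
              o = o if isinstance(o, Dual) else Dual(o)
              return Dual(self.val * o.val,
                          self.der * o.val + self.val * o.der)  # product rule
          __rmul__ = __mul__

      def f(x):                     # any function written in the host language
          return 3 * x * x + 2 * x + 1

      y = f(Dual(2.0, 1.0))         # seed dx/dx = 1
      print(y.val, y.der)           # 17.0 and f'(2) = 14.0, exact to roundoff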

  16. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    Science.gov (United States)

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
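
    A toy Python illustration of the splitting step (using Pillow on a generic image; the real NDPI-Splitter is a Java tool that reads Hamamatsu NDPI slides):

      from PIL import Image

      def split_into_tiles(path, tile=2048, skip_empty=True, threshold=245):
          """Cut a large slide image into tiles, dropping near-empty ones."""
          img = Image.open(path)
          w, h = img.size
          for x in range(0, w, tile):
              for y in range(0, h, tile):
                  box = img.crop((x, y, min(x + tile, w), min(y + tile, h)))
                  if skip_empty:
                      gray = box.convert("L")
                      mean = sum(gray.getdata()) / (gray.width * gray.height)
                      if mean > threshold:      # mostly white: no tissue
                          continue
                  box.save(f"tile_{x}_{y}.tiff")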

  17. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    Directory of Open Access Journals (Sweden)

    Khushi Matloob

    2013-02-01

    Full Text Available Abstract Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient’s clinical and treatment information in a customised open source cancer data management software (Caisis in use at the Australian Breast Cancer Tissue Bank (ABCTB and then published on the ABCTB website (http://www.abctb.org.au using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934

  18. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to be displayed locally or remotely in an internet browser and saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system for investigating gut-brain communication, for example in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase the efficiency, transparency, and collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
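
    As a flavor of the notebook workflow described (a generic spike train analysis on hypothetical toy data, not the authors' vagal recordings):

      import numpy as np
      import matplotlib.pyplot as plt

      # Toy spike times in seconds, standing in for one sorted unit
      spike_times = np.sort(np.random.default_rng(0).uniform(0, 10, 200))

      isi = np.diff(spike_times)                  # interspike intervals
      rate = spike_times.size / spike_times[-1]   # mean firing rate (Hz)

      plt.hist(isi * 1000, bins=40)
      plt.xlabel("Interspike interval (ms)")
      plt.ylabel("Count")
      plt.title(f"Mean rate: {rate:.1f} Hz")
      plt.savefig("isi_histogram.png")            # renders inline in a notebook

    Kept in a Jupyter notebook, the code, its narrative documentation, and the resulting figure travel together in a single sharable file.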

  19. OpenDrift - an open source framework for ocean trajectory modeling

    Science.gov (United States)

    Dagestad, Knut-Frode; Breivik, Øyvind; Ådlandsvik, Bjørn

    2016-04-01

    We will present a new, open source tool for modeling the trajectories and fate of particles or substances (Lagrangian elements) drifting in the ocean, or even in the atmosphere. The software is named OpenDrift and has been developed at the Norwegian Meteorological Institute in cooperation with the Institute of Marine Research. OpenDrift is a generic framework written in Python, and is openly available at https://github.com/knutfrode/opendrift/. The framework is modular with respect to three aspects: (1) obtaining input data, (2) the transport/morphological processes, and (3) exporting of results to file. Modularity is achieved through well-defined interfaces between components and the use of a consistent vocabulary (CF conventions) for naming variables. Modular input means that it is not necessary to preprocess input data (e.g. currents, wind and waves from Eulerian models) into a particular file format; instead, "reader modules" can be written or used to obtain data directly from any original source, including files or web-based protocols (e.g. OPeNDAP/Thredds). Modularity of processes means that a model developer may focus on the geophysical processes relevant to the application of interest, without needing to handle technical tasks such as reading, reprojecting and colocating input data, or rotating and scaling vectors and model output. We will show a few example applications of using OpenDrift to predict the drift of surface drifters, oil spills, and search-and-rescue objects.
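
    The core transport step amounts to Lagrangian advection; a minimal Python sketch (not OpenDrift's API; the velocity callables stand in for what its reader modules supply):

      import numpy as np

      def advect(lon, lat, current_u, current_v, dt=600.0):
          """One forward-Euler step for drifting elements; current_u/current_v
          return velocities in m/s at the element positions."""
          R = 6.371e6                               # Earth radius (m)
          u, v = current_u(lon, lat), current_v(lon, lat)
          lat_new = lat + np.degrees(v * dt / R)
          lon_new = lon + np.degrees(u * dt / (R * np.cos(np.radians(lat))))
          return lon_new, lat_new

      # 1000 elements seeded near 60N 4E in a uniform 0.5 m/s eastward current
      rng = np.random.default_rng(0)
      lon = 4.0 + rng.normal(0, 0.01, 1000)
      lat = 60.0 + rng.normal(0, 0.01, 1000)
      for _ in range(144):                          # 24 h in 10-minute steps
          lon, lat = advect(lon, lat, lambda x, y: 0.5, lambda x, y: 0.0)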

  20. Polymorphism and Module-Reuse Mechanisms for Algebraic Petri Nets in CoopnTools

    OpenAIRE

    Buffo, Mathieu; Buchs, Didier; Donatelli, S.; Kleijn, J.

    1999-01-01

    This paper introduces CoopnTools, a tool set allowing the support of object-oriented specifications written by means of the language CO-OPN/2, based on synchronised algebraic Petri nets. In particular, this paper shows how concrete mechanisms dealing with polymorphism and module-reuse are implemented in CoopnTools.

  1. Mobile devices tools and technologies

    CERN Document Server

    Collins, Lauren

    2015-01-01

    Mobile Devices: Tools and Technologies provides readers with an understanding of the mobile landscape available to app developers, system and network engineers, and the avid techie. As the trend of mobile technology has enabled the continuous development of ubiquitous applications, this book offers insights into tools and technologies critical to evaluating and implementing mobile strategies.The book is organized into four parts of 18 contributed chapters written by engineers in the areas of application and database development, mobile enterprise strategy, and networking and security. Througho

  2. Effects of Written and Auditory Language-Processing Skills on Written Passage Comprehension in Middle and High School Students

    Science.gov (United States)

    Caplan, David; Waters, Gloria; Bertram, Julia; Ostrowski, Adam; Michaud, Jennifer

    2016-01-01

    The authors assessed 4,865 middle and high school students for the ability to recognize and understand written and spoken morphologically simple words, morphologically complex words, and the syntactic structure of sentences and for the ability to answer questions about facts presented in a written passage and to make inferences based on those…

  3. Identification of facilitators and barriers to residents' use of a clinical reasoning tool.

    Science.gov (United States)

    DiNardo, Deborah; Tilstra, Sarah; McNeil, Melissa; Follansbee, William; Zimmer, Shanta; Farris, Coreen; Barnato, Amber E

    2018-03-28

    While there is some experimental evidence to support the use of cognitive forcing strategies to reduce diagnostic error in residents, the potential usability of such strategies in the clinical setting has not been explored. We sought to test the effect of a clinical reasoning tool on diagnostic accuracy and to obtain feedback on its usability and acceptability. We conducted a randomized behavioral experiment testing the effect of this tool on diagnostic accuracy on written cases among post-graduate 3 (PGY-3) residents at a single internal medical residency program in 2014. Residents completed written clinical cases in a proctored setting with and without prompts to use the tool. The tool encouraged reflection on concordant and discordant aspects of each case. We used random effects regression to assess the effect of the tool on diagnostic accuracy of the independent case sets, controlling for case complexity. We then conducted audiotaped structured focus group debriefing sessions and reviewed the tapes for facilitators and barriers to use of the tool. Of 51 eligible PGY-3 residents, 34 (67%) participated in the study. The average diagnostic accuracy increased from 52% to 60% with the tool, a difference that just met the test for statistical significance in adjusted analyses (p=0.05). Residents reported that the tool was generally acceptable and understandable but did not recognize its utility for use with simple cases, suggesting the presence of overconfidence bias. A clinical reasoning tool improved residents' diagnostic accuracy on written cases. Overconfidence bias is a potential barrier to its use in the clinical setting.

  4. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
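
    Pixel-wise mapping of this kind reduces to fitting a relaxation model at every pixel; a simplified Python sketch for a multi-echo T2 series (illustrative only, not MRmap's implementation):

      import numpy as np
      from scipy.optimize import curve_fit

      def mono_exp(te, s0, t2):
          return s0 * np.exp(-te / t2)

      def t2_map(images, echo_times):
          """images: (n_echoes, ny, nx) magnitude series; returns a T2 map."""
          n, ny, nx = images.shape
          t2 = np.zeros((ny, nx))
          for j in range(ny):
              for i in range(nx):
                  signal = images[:, j, i]
                  if signal[0] <= 0:
                      continue                      # skip background pixels
                  try:
                      (s0, t2_fit), _ = curve_fit(mono_exp, echo_times, signal,
                                                  p0=(signal[0], 50.0))
                      t2[j, i] = t2_fit
                  except RuntimeError:
                      pass                          # fit failed; leave pixel 0
          return t2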

  5. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  6. Sourcing While Reading Divergent Expert Accounts: Pathways from Views of Knowing to Written Argumentation

    Science.gov (United States)

    Barzilai, Sarit; Tzadok, Eynav; Eshet-Alkalai, Yoram

    2015-01-01

    Sourcing is vital for knowledge construction from online information sources, yet learners may find it difficult to engage in effective sourcing. Sourcing can be particularly challenging when lay readers encounter conflicting expert accounts of controversial topics, a situation which is increasingly common when learning online. The aim of this…

  7. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets. Each widget performs a specific operation, such as reading, multiplying by a constant, sorting, plotting, or writing data. DIT allows the user to select and order the widgets as desired to meet their specific needs, as sketched below. Originally it was written to capture a scientist's personal, iterative data manipulation and quality control process: visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager. DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
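
    The widget/workflow idea can be sketched in a few lines of Python (the names and operations below are hypothetical, not DIT's actual classes):

      class Widget:
          """One named operation in a DIT-style workflow."""
          def __init__(self, name, func):
              self.name, self.func = name, func
          def run(self, data):
              return self.func(data)

      class Workflow:
          """An ordered chain of widgets, selected and ordered by the user."""
          def __init__(self, widgets):
              self.widgets = widgets
          def run(self, data):
              for w in self.widgets:
                  data = w.run(data)
              return data

      # Hypothetical pipeline over (depth, temperature) rows:
      # convert tenths of a degree to degrees, then sort by depth
      pipeline = Workflow([
          Widget("multiply", lambda rows: [(d, t * 0.1) for d, t in rows]),
          Widget("sort",     lambda rows: sorted(rows)),
      ])
      print(pipeline.run([(1.0, -25), (0.5, -31)]))  # [(0.5, -3.1), (1.0, -2.5)]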

  8. Recent negative ion source developments at ORNL

    International Nuclear Information System (INIS)

    Alton, G.D.

    1979-01-01

    According to specifications written for the 25 MV ORNL tandem accelerator, the ion source used during acceptance testing must be capable of producing a negative ion beam of intensity greater than or equal to 7.5 μA within a phase space of less than or equal to 1 π cm-mrad (MeV)^1/2. The specifications were written prior to the development of an ion source with such capabilities, but fortunately Andersen and Tykesson introduced a source in 1975 which could easily meet the specified requirements. The remarkable beam intensity and quality of this source have motivated the development of other sources which utilize sputtering in the presence of a diffuse cesium plasma, some of which will be described in these proceedings. This report describes the results of studies associated with the development of a modified Aarhus-geometry source and an axial-geometry source, both of which utilize sputtering in the presence of a diffuse cesium plasma for the production of negative ion beams

  9. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    Science.gov (United States)

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. Because of this focus, it is more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which specifically simulates the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum, use of an energy cut-off, or low-energy tail corrections. ALPHACAL has been implemented in C++ using the Qt library, so it is available for Windows, macOS and Linux platforms. It is free and can be provided on request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
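
    The geometric part of such an efficiency calculation can be illustrated with a toy Monte Carlo in Python (AlfaMC simulates full alpha transport, including backscattering from the backing, which this sketch ignores):

      import numpy as np

      def detection_efficiency(n=1_000_000, half_angle_deg=60.0, seed=0):
          """Fraction of isotropically emitted alphas entering a detector that
          subtends a cone of the given half-angle above the source plane."""
          rng = np.random.default_rng(seed)
          cos_theta = rng.uniform(0.0, 1.0, n)   # upward hemisphere, isotropic
          detected = cos_theta >= np.cos(np.radians(half_angle_deg))
          return 0.5 * detected.mean()           # downward half never detected

      print(f"efficiency = {detection_efficiency():.4f}")   # analytic: 0.2500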

  10. Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2015-01-01

    Advancement in chemoinformatics research, in parallel with the availability of high performance computing platforms, has made the handling of large-scale, multi-dimensional scientific data for high-throughput drug discovery easier. In this study we have explored publicly available molecular databases with the help of open-source, integrated in-house molecular informatics tools for virtual screening. The virtual screening literature of the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transforming it into truly computable chemical structures; identification of unique fragments and scaffolds from a class of compounds; automatic generation of focused virtual libraries; computation of molecular descriptors for structure-activity relationship studies; application of conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters; and machine learning tools for the design of potential disease-specific inhibitors. A case study on kinase inhibitors is provided as an example.
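
    For instance, descriptor computation and a conventional lead-discovery filter take only a few lines with an open source toolkit (RDKit is an assumption here, used for illustration; the study describes its own in-house tools):

      from rdkit import Chem
      from rdkit.Chem import Descriptors

      def passes_lipinski(smiles):
          """Classic rule-of-five filter applied to one molecule."""
          mol = Chem.MolFromSmiles(smiles)
          if mol is None:
              return False                      # unparsable structure
          return (Descriptors.MolWt(mol) <= 500
                  and Descriptors.MolLogP(mol) <= 5
                  and Descriptors.NumHDonors(mol) <= 5
                  and Descriptors.NumHAcceptors(mol) <= 10)

      library = ["CCO", "c1ccccc1C(=O)O", "CC(C)Cc1ccc(cc1)C(C)C(=O)O"]
      print([s for s in library if passes_lipinski(s)])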

  11. A Quantitative Analysis of Uncertainty in the Grading of Written Exams in Mathematics and Physics

    Science.gov (United States)

    Hammer, Hugo Lewi; Habib, Laurence

    2016-01-01

    The most common way to grade students in courses at university and university college level is to use final written exams. The aim of final exams is generally to provide a reliable and a valid measurement of the extent to which a student has achieved the learning outcomes for the course. A source of uncertainty in grading students based on an exam…

  12. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    Science.gov (United States)

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
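
    The underlying assignment is a two-step mapping from CT numbers to stiffness; a schematic Python version (the calibration constants below are placeholders, not the values used by either package):

      import numpy as np

      def hu_to_modulus(hu, a=0.0007, b=1.0, c=6.95, d=1.49):
          """CT Hounsfield units -> apparent density -> Young's modulus."""
          rho = np.clip(a * hu + b, 0.01, None)  # linear density calibration
          return c * rho ** d                    # power-law density-modulus

      element_hu = np.array([150.0, 620.0, 1180.0])  # mean HU per element
      print(hu_to_modulus(element_hu))               # placeholder moduli (GPa)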

  13. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    Science.gov (United States)

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and an increase in precision. Five source group classifications were used; three formed using a k-means cluster analysis containing 2, 3 and 4 clusters, and two a-priori groups based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in
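
    At the heart of any such tool is an un-mixing model; a compact Python sketch (the tracer values are hypothetical, and SIFT's multiple uncertainty analyses are omitted):

      import numpy as np
      from scipy.optimize import minimize

      def unmix(sediment, sources):
          """Source proportions p (non-negative, summing to 1) that best
          reproduce a sediment sample's tracer signature.
          sediment: (n_tracers,); sources: (n_sources, n_tracers) means."""
          n = sources.shape[0]
          objective = lambda p: np.sum(((sediment - p @ sources) / sediment) ** 2)
          res = minimize(objective, np.full(n, 1.0 / n),
                         bounds=[(0, 1)] * n,
                         constraints=[{"type": "eq",
                                       "fun": lambda p: p.sum() - 1}])
          return res.x

      sources = np.array([[12.0, 3.1, 40.0],   # e.g. cultivated topsoil
                          [25.0, 1.2, 55.0],   # channel banks
                          [18.0, 2.0, 70.0]])  # road runoff (all hypothetical)
      print(unmix(np.array([18.5, 2.1, 55.0]), sources))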

  14. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified: what type of study needs to be carried out, and what phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most suitable product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met: an auditable quality assurance process complying with international development standards shall be developed and maintained, and a process of verification and validation (V and V) shall be implemented. This approach requires writing a report and/or executive summary of the V and V activities and defining a validated domain (the domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). Sufficient documentation shall be available, including a detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment. Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that the computer architecture of the tool does not include errors, that the numerical solver correctly represents the physical mathematical model, and that the equations are solved correctly. The functional verification can be demonstrated through certification or a quality assurance report. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  15. CAMPAIGN: an open-source library of GPU-accelerated data clustering algorithms.

    Science.gov (United States)

    Kohlhoff, Kai J; Sosnick, Marc H; Hsu, William T; Pande, Vijay S; Altman, Russ B

    2011-08-15

    Data clustering techniques are an essential component of a good data analysis toolbox. Many current bioinformatics applications are inherently compute-intense and work with very large datasets. Sequential algorithms are inadequate for providing the necessary performance. For this reason, we have created Clustering Algorithms for Massively Parallel Architectures, Including GPU Nodes (CAMPAIGN), a central resource for data clustering algorithms and tools that are implemented specifically for execution on massively parallel processing architectures. CAMPAIGN is a library of data clustering algorithms and tools, written in 'C for CUDA' for Nvidia GPUs. The library provides up to two orders of magnitude speed-up over respective CPU-based clustering algorithms and is intended as an open-source resource. New modules from the community will be accepted into the library and the layout of it is such that it can easily be extended to promising future platforms such as OpenCL. Releases of the CAMPAIGN library are freely available for download under the LGPL from https://simtk.org/home/campaign. Source code can also be obtained through anonymous subversion access as described on https://simtk.org/scm/?group_id=453. kjk33@cantab.net.
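
    As a CPU reference for the kind of algorithm CAMPAIGN accelerates, here is a plain NumPy k-means (each GPU kernel in the library parallelizes steps like the distance computation below; this sketch is not CAMPAIGN code):

      import numpy as np

      def kmeans(X, k, iters=100, seed=0):
          rng = np.random.default_rng(seed)
          centers = X[rng.choice(len(X), k, replace=False)]
          for _ in range(iters):
              # Assignment: nearest center per point (massively parallel on GPU)
              d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
              labels = d.argmin(1)
              new = np.array([X[labels == j].mean(0) if (labels == j).any()
                              else centers[j] for j in range(k)])
              if np.allclose(new, centers):
                  break
              centers = new
          return labels, centers

      X = np.vstack([np.random.default_rng(1).normal(m, 0.3, (50, 2))
                     for m in (0.0, 3.0, 6.0)])
      labels, centers = kmeans(X, 3)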

  16. Using Plickers as an Assessment Tool in Health and Physical Education Settings

    Science.gov (United States)

    Chng, Lena; Gurvitch, Rachel

    2018-01-01

    Written tests are one of the most common assessment tools classroom teachers use today. Despite its popularity, administering written tests or surveys, especially in health and physical education settings, is time consuming. In addition to the time taken to type and print out the tests or surveys, health and physical education teachers must grade…

  17. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while reducing the time needed to develop new software. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, introduction of how to test via the OGC Testing web site and

  18. A Feynman graph selection tool in GRACE system

    International Nuclear Information System (INIS)

    Yuasa, Fukuko; Ishikawa, Tadashi; Kaneko, Toshiaki

    2001-01-01

    We present a Feynman graph selection tool, grcsel, an interpreter written in C. In the framework of GRACE, it enables us to obtain a subset of Feynman graphs according to given conditions

  19. Transforming Biology Assessment with Machine Learning: Automated Scoring of Written Evolutionary Explanations

    Science.gov (United States)

    Nehm, Ross H.; Ha, Minsu; Mayfield, Elijah

    2012-02-01

    This study explored the use of machine learning to automatically evaluate the accuracy of students' written explanations of evolutionary change. Performance of the Summarization Integrated Development Environment (SIDE) program was compared to human expert scoring using a corpus of 2,260 evolutionary explanations written by 565 undergraduate students in response to two different evolution instruments (the EGALT-F and EGALT-P) that contained prompts that differed in various surface features (such as species and traits). We tested human-SIDE scoring correspondence under a series of different training and testing conditions, using Kappa inter-rater agreement values of greater than 0.80 as a performance benchmark. In addition, we examined the effects of response length on scoring success; that is, whether SIDE scoring models functioned with comparable success on short and long responses. We found that SIDE performance was most effective when scoring models were built and tested at the individual item level and that performance degraded when suites of items or entire instruments were used to build and test scoring models. Overall, SIDE was found to be a powerful and cost-effective tool for assessing student knowledge and performance in a complex science domain.
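
    A generic version of such a scoring pipeline (scikit-learn is an assumption here; SIDE itself is a separate Java tool, and the toy training data below are invented):

      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline
      from sklearn.metrics import cohen_kappa_score

      train_texts = ["variation exists and is heritable",
                     "the giraffe wanted a longer neck",
                     "individuals with the trait survive and reproduce more",
                     "it adapted because it needed to"]
      train_labels = [1, 0, 1, 0]      # 1 = key concept present (expert codes)

      model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                            LogisticRegression())
      model.fit(train_texts, train_labels)

      human = [1, 0]
      machine = model.predict(["heritable variation in the population",
                               "the animal tried to change"])
      # The study's benchmark: Kappa agreement with experts above 0.80
      print(cohen_kappa_score(human, machine))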

  20. Uses of the Word “Macula” in Written English, 1400-Present

    Science.gov (United States)

    Schwartz, Stephen G.; Leffler, Christopher T.

    2014-01-01

    We compiled uses of the word “macula” in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America’s Historical Newspapers, the Gale Cengage Collections, and others. “Macula” has been used: as a non-medical “spot” or “stain”, literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole (“retina tinta di un color giallo”) in 1782, but did not use the word “macula”. “Macula lutea” was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, in what appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both “macula” and “macula lutea” following the introduction of the ophthalmoscope in 1850. “Macula” has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. PMID:24913329

  1. Examining Elementary Students' Development of Oral and Written Argumentation Practices Through Argument-Based Inquiry

    Science.gov (United States)

    Chen, Ying-Chih; Hand, Brian; Park, Soonhye

    2016-05-01

    Argumentation and the production of scientific arguments are critical elements of inquiry that are necessary for helping students become scientifically literate through engaging them in constructing and critiquing ideas. This case study employed a mixed methods research design to examine the development of 5th grade students' practices of oral and written argumentation from one unit to another over 16 weeks utilizing the science writing heuristic approach. Data sources included five rounds of whole-class discussion focused on group presentations of arguments that occurred over eleven class periods; students' group writings; interviews with six target students and the teacher; and the researcher's field notes. The results revealed five salient trends in students' development of oral and written argumentative practices over time: (1) students came to use more critique components as they participated in more rounds of whole-class discussion focused on group presentations of arguments; (2) by challenging each other's arguments, students came to focus on the coherence of the argument and the quality of evidence; (3) students came to use evidence to defend, support, and reject arguments; (4) the quality of students' writing continuously improved over time; and (5) students connected oral argument skills to written argument skills as they had opportunities to revise their writing after debating and developed awareness of the usefulness of critique from peers. Given the development in oral argumentative practices and the quality of written arguments over time, this study indicates that the development of oral and written argumentative practices are positively related to each other. This study suggests that argumentative practices should be framed through both a social and epistemic understanding of argument, utilizing talk and writing as vehicles to create norms of these complex practices.

  2. Open-Source tools: Incidence in the wireless security of the Technical University of Babahoyo

    Directory of Open Access Journals (Sweden)

    Joffre León-Acurio

    2018-02-01

    Full Text Available Computer security is a fundamental part of an organization, especially in Higher Education institutions, where there is very sensitive information, vulnerable to different methods of intrusion, the most common being free access through wireless points. The main objective of this research is to analyze the impact of the open source tools in charge of managing the security information of the wireless network, such as OSSIM, a set of active and passive components used to manage events that generate traffic within the network. This research exposes the use of free software as a viable low-cost option to solve the problems that afflict students and staff, such as lack of access to academic services and problems of wireless interconnectivity, with the purpose of restoring students' confidence in the use of the services offered by the institution for research-related development, guaranteeing free access to the internet. The level of dissatisfaction on the part of the students confirms the problem presented at the Technical University of Babahoyo, thus confirming the positive influence of the Open-Source tools on the institution's wireless security.

  3. SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity

    Science.gov (United States)

    Crema, Stefano; Cavalli, Marco

    2018-02-01

    There is a growing call, within the scientific community, for solid theoretical frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended for a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to the users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only for characterizing sediment dynamics at the catchment scale but also for integration with prediction models and as a tool for aiding geomorphological interpretation.
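    For readers unfamiliar with the index, IC (following Borselli et al. 2008 as adopted in Cavalli et al. 2013) is commonly written as IC = log10(Dup/Ddn), with an upslope component Dup = W·S·sqrt(A) and a downslope component summed along the flow path. A toy single-cell Python sketch under that assumption; a real computation runs cell-by-cell over DEM-derived rasters, which SedInConnect handles.

        import numpy as np

        def index_of_connectivity(w_mean, s_mean, area, d, w, s):
            # Dup: mean weighting factor * mean slope * sqrt(upslope area)
            # Ddn: sum over downslope path cells of d_i / (W_i * S_i)
            d_up = w_mean * s_mean * np.sqrt(area)
            d_dn = np.sum(np.asarray(d) / (np.asarray(w) * np.asarray(s)))
            return np.log10(d_up / d_dn)

        # Toy values for a single cell (invented).
        ic = index_of_connectivity(w_mean=0.6, s_mean=0.2, area=5000.0,
                                   d=[10.0, 10.0, 14.1],
                                   w=[0.6, 0.5, 0.7],
                                   s=[0.15, 0.25, 0.30])
        print(f"IC = {ic:.2f}")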

  4. Oral vs. written evaluation of students

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    In this short paper we discuss the advantages and drawbacks of oral and written evaluation of students. First in more general terms and then followed by details of what we did in our course and our experience. Finally, we outline some topics for further study and discussions.

  5. OSS4EVA: Using Open-Source Tools to Fulfill Digital Preservation Requirements

    Directory of Open Access Journals (Sweden)

    Heidi Dowding

    2016-10-01

    Full Text Available This paper builds on the findings of a workshop held at the 2015 International Conference on Digital Preservation (iPRES), entitled "Using Open-Source Tools to Fulfill Digital Preservation Requirements" (OSS4PRES hereafter). This day-long workshop brought together participants from across the library and archives community, including practitioners, proprietary vendors, and representatives from open-source projects. The resulting conversations were surprisingly revealing: while OSS' significance within the preservation landscape was made clear, participants noted that there are a number of roadblocks that discourage or altogether prevent its use in many organizations. Overcoming these challenges will be necessary to further widespread, sustainable OSS adoption within the digital preservation community. This article will mine the rich discussions that took place at OSS4PRES to (1) summarize the workshop's key themes and major points of debate, (2) provide a comprehensive analysis of the opportunities, gaps, and challenges that using OSS entails at a philosophical, institutional, and individual level, and (3) offer a tangible set of recommendations for future work designed to broaden community engagement and enhance the sustainability of open source initiatives, drawing on both participants' experience as well as additional research.

  6. Recommendations for reducing ambiguity in written procedures.

    Energy Technology Data Exchange (ETDEWEB)

    Matzen, Laura E.

    2009-11-01

    Previous studies in the nuclear weapons complex have shown that ambiguous work instructions (WIs) and operating procedures (OPs) can lead to human error, which is a major cause for concern. This report outlines some of the sources of ambiguity in written English and describes three recommendations for reducing ambiguity in WIs and OPs. The recommendations are based on commonly used research techniques in the fields of linguistics and cognitive psychology. The first recommendation is to gather empirical data that can be used to improve the recommended word lists that are provided to technical writers. The second recommendation is to hold a review in which new WIs and OPs are checked for ambiguity and clarity. The third recommendation is to use self-paced reading time studies to identify any remaining ambiguities before the new WIs and OPs are put into use. If these three steps are followed for new WIs and OPs, the likelihood of human errors related to ambiguity could be greatly reduced.

  7. doit – Automation Tool

    Directory of Open Access Journals (Sweden)

    2010-09-01

    Full Text Available

    This article describes how traditional build-tools work, what the shortcomings of this model are for modern software development, and finally how doit solves these problems. doit is written in Python; it comes from the idea of bringing the power of build-tools to the execution of any kind of task.
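    A minimal dodo.py sketch of doit's task convention (the file names are hypothetical): doit discovers functions whose names start with task_ and runs their actions, re-running only when file dependencies change.

        # dodo.py -- run with: doit

        def task_compile():
            # Recompile a (hypothetical) C file only when it changes.
            return {
                "actions": ["gcc -c main.c -o main.o"],
                "file_dep": ["main.c"],      # re-run only if this changed
                "targets": ["main.o"],
            }

        def task_report():
            # Actions need not be shell commands; Python callables work too.
            def write_report():
                with open("report.txt", "w") as fh:
                    fh.write("build finished\n")
            return {"actions": [write_report], "targets": ["report.txt"]}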

  8. Enhancing Formal Modelling Tool Support with Increased Automation

    DEFF Research Database (Denmark)

    Lausdahl, Kenneth

    Progress report for the qualification exam of PhD student Kenneth Lausdahl. Initial work on enhancing tool support for the formal method VDM and the concept of unifying an abstract syntax tree with the ability for isolated extensions is described. The tool support includes a connection to UML and a test automation principle based on traces written as a kind of regular expressions.

  9. The challenge of giving written thesis feedback to nursing students.

    Science.gov (United States)

    Tuvesson, Hanna; Borglin, Gunilla

    2014-11-01

    Providing effective written feedback on nursing students' assignments can be a challenging task for any assessor. Additionally, as the student groups tend to become larger, written feedback is likely to gain an overall more prominent position than verbal feedback. Lack of formal training or regular discussion in the teaching faculty about the skill set needed to provide written feedback could negatively affect the students' learning abilities. In this brief paper, we discuss written feedback practices, whilst using the Bachelor of Science in Nursing thesis as an example. Our aim is to highlight the importance of an informed understanding of the impact written feedback can have on students. Creating awareness about this can facilitate the development of more strategic and successful written feedback strategies. We end by offering examples of some relatively simple strategies for improving this practice. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other

  11. 5 CFR 179.306 - Written agreement for repayment.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Written agreement for repayment. 179.306 Section 179.306 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS CLAIMS COLLECTION STANDARDS Administrative Offset § 179.306 Written agreement for repayment. A debtor who admits...

  12. Improving Written Language Performance of Adolescents with Asperger Syndrome

    Science.gov (United States)

    Delano, Monica E

    2007-01-01

    The effects of a multicomponent intervention involving self-regulated strategy development delivered via video self-modeling on the written language performance of 3 students with Asperger syndrome were examined. During intervention sessions, each student watched a video of himself performing strategies for increasing the number of words written and the number of functional essay elements. He then wrote a persuasive essay. The number of words written and number of functional essay elements included in each essay were measured. Each student demonstrated gains in the number of words written and number of functional essay elements. Maintenance of treatment effects at follow-up varied across targets and participants. Implications for future research are suggested. PMID:17624076

  13. Accurate modeling of UV written waveguide components

    DEFF Research Database (Denmark)

    Svalgaard, Mikael

    BPM simulation results of UV written waveguide components that are indistinguishable from measurements can be achieved on the basis of trajectory scan data and an equivalent step index profile that is very easy to measure.

  14. Accurate modelling of UV written waveguide components

    DEFF Research Database (Denmark)

    Svalgaard, Mikael

    BPM simulation results of UV written waveguide components that are indistinguishable from measurements can be achieved on the basis of trajectory scan data and an equivalent step index profile that is very easy to measure.

  15. [Written and pictorial content in magazines and their possible relationship to eating disorders].

    Science.gov (United States)

    Szabó, Kornélia; Túry, Ferenc

    2012-02-01

    In the current study we reviewed the literature on studies exploring magazine reading frequency, the written and pictorial contents appearing in magazines, and their connection to eating disorders. Reading different fashion and fitness magazines affects readers through several indirect and direct factors and through reliable and false information. They affect readers' body satisfaction, self-esteem, eating habits and, more generally, their health behavior. Different theories have been put forward to account for these associations, and several other studies have empirically examined the connection between the frequency of magazine reading and eating disorders, as well as the symptoms leading to eating disorders. We analyzed and summarized articles between 1975 and 2009 from online databases. We used the following sources: Science Direct (http://www.sciencedirect.com/), Springer-Verlag GmbH (http://www.springerlink.com/) and SAGE Publications Ltd (http://online.sagepub.com/). The pictorial and written magazine contents were associated with the development and maintenance of eating disorders or with symptoms that might lead to eating disorders. Compared to previous years, the publications featured an increased number of advertisements for unhealthy foods and for unhealthy radical diet plans and exercise programs. Furthermore, the magazines contained conflicting messages about nutrition, body functions and eating disorders. Written and pictorial magazine contents and messages might increase the risk for the development of eating disorders, especially in vulnerable individuals.

  16. Written Teacher Feedback: Aspects of Quality, Benefits and Challenges

    DEFF Research Database (Denmark)

    Holmeier, Monika; Grob, Regula; Nielsen, Jan Alexis

    2018-01-01

    was provided based on rubrics and templates for open comments. For this purpose, written teacher feedback itself, student artefacts and data from questionnaires were analysed. Furthermore, the benefits and challenges that teachers noticed in using written feedback will be examined. Finally, it will be discussed which means of support for teachers seem necessary in order to foster the implementation of written teacher feedback as part of formative assessment in inquiry-based science education.

  17. Interpretation of Written Contracts in England

    Directory of Open Access Journals (Sweden)

    Neil Andrews

    2014-01-01

    Full Text Available This article examines the leading principles governing interpretation of written contracts under English law. This is a comprehensive and incisive analysis of the current law and of the relevant doctrines, including the equitable principles of rectification, as well as the powers of appeal courts or of the High Court when hearing an appeal from an arbitral award. The topic of interpretation of written contracts is fast-moving. It is of fundamental importance because this is the most significant commercial focus for dispute and because of the number of cross-border transactions to which English law is expressly applied by businesses.

  18. Software Engineering Tools for Scientific Models

    Science.gov (United States)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere-ocean model called ModelE, written in fixed format Fortran.

  19. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and IKE of the University of Stuttgart developed, during 1990 and 1991 in the frame of the Shared Cost Action Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). Due to this work, tools became available which allow both code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal hydraulic conditions. Therefore, for the development of ESTER it was important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. Due to the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.)

  20. 42 CFR 2.16 - Security for written records.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Security for written records. 2.16 Section 2.16 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PROVISIONS CONFIDENTIALITY OF ALCOHOL AND DRUG ABUSE PATIENT RECORDS General Provisions § 2.16 Security for written records...

  1. Age of acquisition and word frequency in written picture naming.

    Science.gov (United States)

    Bonin, P; Fayol, M; Chalard, M

    2001-05-01

    This study investigates age of acquisition (AoA) and word frequency effects in both spoken and written picture naming. In the first two experiments, reliable AoA effects on object naming speed, with objective word frequency controlled for, were found in both spoken (Experiment 1) and written picture naming (Experiment 2). In contrast, no reliable objective word frequency effects were observed on naming speed, with AoA controlled for, in either spoken (Experiment 3) or written (Experiment 4) picture naming. The implications of the findings for written picture naming are briefly discussed.

  2. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this goal the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources on import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, among other things: a) the search through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) the selection of items of interest to specific verifications; and c) the mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a "Nuclear Security Media Monitor" (NSMM), which is a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. In the first part, the paper will recall the trade data sources relevant for non-proliferation and will then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. In the second part it will present the main aspects of the NSMM, also by illustrating some of the uses made of it at JRC. (author)

  3. GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom

    Science.gov (United States)

    Rector, Travis A.; Vogt, Nicole P.

    2018-06-01

    Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes).Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database.We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
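    As an illustration of one measurement the tools expose, the standard equivalent-width calculation is EW = the integral of (1 - F/Fc) over wavelength. A minimal Python sketch on synthetic data follows; this is not the GEAS implementation, whose internals are not described here.

        import numpy as np

        # Synthetic absorption line on a flat continuum (toy data).
        wav = np.linspace(6540.0, 6590.0, 500)        # wavelength, Angstroms
        continuum = np.ones_like(wav)
        flux = continuum - 0.6 * np.exp(-0.5 * ((wav - 6563.0) / 2.0) ** 2)

        # Equivalent width: integral of (1 - F/Fc) d(lambda).
        ew = np.trapz(1.0 - flux / continuum, wav)
        print(f"EW = {ew:.2f} Angstroms")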

  4. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    Science.gov (United States)

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789

  5. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields.

    Science.gov (United States)

    Sapozhnikov, Oleg A; Tsysar, Sergey A; Khokhlova, Vera A; Kreider, Wayne

    2015-09-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors.
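    The Rayleigh integral underlying such holography formulations can be sketched numerically. The following Python toy propagates a sampled normal-velocity distribution on a small square source to one field point; all parameters are invented, and sign/time conventions vary between formulations.

        import numpy as np

        rho0, c0, f = 1000.0, 1500.0, 1.0e6        # water-like medium, 1 MHz
        omega = 2 * np.pi * f
        k = omega / c0                              # wavenumber

        # 2 cm square source sampled on a 40 x 40 grid, uniform 1 m/s velocity.
        x = y = np.linspace(-0.01, 0.01, 40)
        xs, ys = np.meshgrid(x, y)
        dS = (x[1] - x[0]) ** 2                     # element area
        v_n = np.ones_like(xs)

        # Rayleigh integral: p(r) ~ (i*omega*rho0 / 2*pi) * sum v_n e^{ikR}/R dS
        fp = np.array([0.0, 0.0, 0.05])             # field point 5 cm on axis
        R = np.sqrt((fp[0] - xs) ** 2 + (fp[1] - ys) ** 2 + fp[2] ** 2)
        p = (1j * omega * rho0 / (2 * np.pi)) * np.sum(v_n * np.exp(1j * k * R) / R) * dS
        print(f"|p| = {abs(p):.0f} Pa")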

  6. 49 CFR 1018.20 - Written demand for payment.

    Science.gov (United States)

    2010-10-01

    ... Collection of Claims § 1018.20 Written demand for payment. (a) The Board shall make appropriate written demand upon the debtor for payment of money in terms which specify: (1) The basis for the indebtedness... the debtor has explicitly refused to pay, or that sending a further demand is futile. Depending upon...

  7. Written Cultural Heritage in the Context of Adopted Legal Regulations

    Directory of Open Access Journals (Sweden)

    Eva Kodrič-Dačić

    2013-09-01

    Full Text Available ABSTRACT Purpose: Libraries collect written cultural heritage, which is not only the most valuable part of their collections but also a part of library materials which is, due to digitization projects in the last decade, becoming more and more interesting to librarians and library users. The main goal of the study is theoretical research into library materials acknowledged as Slovenian heritage. By defining the basic terms, it highlights the attributes which are immanent to library materials, derived from the context of their origin or later destiny. Slovenian library legislation concerning the protection of written cultural heritage is also critically analysed. Methodology/approach: Comparative analysis of European and Slovenian legislation concerning librarianship and written cultural heritage. Research limitation: Research was mainly limited to professional literature and resources dealing with written cultural heritage. Originality/practical implications: Results of the research serve as formal criteria for the definition of library materials as written heritage and suggest how to improve legislation in the field of the protection of written heritage in libraries.

  8. Using bio.tools to generate and annotate workbench tool descriptions

    DEFF Research Database (Denmark)

    Hillion, Kenzo-Hugo; Kuzmin, Ivan; Khodak, Anton

    2017-01-01

    - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins, and generates...

  9. WMT: The CSDMS Web Modeling Tool

    Science.gov (United States)

    Piper, M.; Hutton, E. W. H.; Overeem, I.; Syvitski, J. P.

    2015-12-01

    The Community Surface Dynamics Modeling System (CSDMS) has a mission to enable model use and development for research in earth surface processes. CSDMS strives to expand the use of quantitative modeling techniques, promotes best practices in coding, and advocates for the use of open-source software. To streamline and standardize access to models, CSDMS has developed the Web Modeling Tool (WMT), a RESTful web application with a client-side graphical interface and a server-side database and API that allows users to build coupled surface dynamics models in a web browser on a personal computer or a mobile device, and run them in a high-performance computing (HPC) environment. With WMT, users can: design a model from a set of components; edit component parameters; save models to a web-accessible server; share saved models with the community; submit runs to an HPC system; and download simulation results. The WMT client is an Ajax application written in Java with GWT, which allows developers to employ object-oriented design principles and development tools such as Ant, Eclipse and JUnit. For deployment on the web, the GWT compiler translates Java code to optimized and obfuscated JavaScript. The WMT client is supported on Firefox, Chrome, Safari, and Internet Explorer. The WMT server, written in Python and SQLite, is a layered system, with each layer exposing a web service API: wmt-db, a database of component, model, and simulation metadata and output; wmt-api, which configures and connects components; and wmt-exe, which launches simulations on remote execution servers. The database server provides, as JSON-encoded messages, the metadata for users to couple model components, including descriptions of component exchange items, uses and provides ports, and input parameters. Execution servers are network-accessible computational resources, ranging from HPC systems to desktop computers, containing the CSDMS software stack for running a simulation. Once a simulation completes, its output, in NetCDF, is packaged
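    The abstract does not give WMT's exact routes, but the client/server interaction it describes (JSON metadata over REST) looks roughly like the following Python sketch; the base URL and field names are hypothetical.

        import requests

        BASE = "https://csdms.example.edu/wmt-api"     # hypothetical endpoint

        # Fetch JSON-encoded component metadata, as the client does before coupling.
        resp = requests.get(f"{BASE}/components", timeout=30)
        resp.raise_for_status()
        for component in resp.json():
            print(component["name"], component.get("provides", []))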

  10. TRANSIT--A Software Tool for Himar1 TnSeq Analysis.

    Directory of Open Access Journals (Sweden)

    Michael A DeJesus

    2015-10-01

    Full Text Available TnSeq has become a popular technique for determining the essentiality of genomic regions in bacterial organisms. Several methods have been developed to analyze the wealth of data that has been obtained through TnSeq experiments. We developed a tool for analyzing Himar1 TnSeq data called TRANSIT. TRANSIT provides a graphical interface to three different statistical methods for analyzing TnSeq data. These methods cover a variety of approaches capable of identifying essential genes in individual datasets as well as comparative analysis between conditions. We demonstrate the utility of this software by analyzing TnSeq datasets of M. tuberculosis grown on glycerol and cholesterol. We show that TRANSIT can be used to discover genes which have been previously implicated for growth on these carbon sources. TRANSIT is written in Python, and thus can be run on Windows, OSX and Linux platforms. The source code is distributed under the GNU GPL v3 license and can be obtained from the following GitHub repository: https://github.com/mad-lab/transit.
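    TRANSIT's own statistical methods are more elaborate, but the comparative idea can be sketched with a toy permutation ("resampling") test on per-site insertion counts for one gene under two conditions; the counts below are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy read counts at the TA sites of one gene, in two conditions.
        glycerol = np.array([12, 0, 7, 30, 5, 0, 18, 9])
        cholesterol = np.array([0, 0, 1, 2, 0, 0, 3, 0])

        observed = glycerol.mean() - cholesterol.mean()
        pooled = np.concatenate([glycerol, cholesterol])
        n = len(glycerol)

        # Shuffle condition labels and recompute the difference many times.
        diffs = []
        for _ in range(10000):
            s = rng.permutation(pooled)
            diffs.append(s[:n].mean() - s[n:].mean())
        p_value = np.mean(np.abs(diffs) >= abs(observed))
        print(f"observed diff = {observed:.2f}, p = {p_value:.4f}")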

  11. Running the source term code package in Elebra MX-850

    International Nuclear Information System (INIS)

    Guimaraes, A.C.F.; Goes, A.G.A.

    1988-01-01

    The source term code package (STCP) is one of the main tools applied in calculations of the behavior of fission products from nuclear power plants. It is a set of computer codes to assist the calculation of the radioactive materials leaving the metallic containment of power reactors for the environment during a severe reactor accident. The original version of STCP runs on SDC computer systems, but as it has been written in FORTRAN 77, it is possible to run it on other systems such as IBM, Burroughs, Elebra, etc. The Elebra MX-8500 version of STCP contains five codes: MARCH3, TRAPMELT, TCCA, VANESSA and NAVA. The example presented in this report takes into consideration a small LOCA accident in a PWR-type reactor. (M.I.)

  12. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    Science.gov (United States)

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads. Tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and their role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defect of the reference mapping was the introduction of artificial indels into contigs through lower than 100% consensus and distracting gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference-mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not a high priority and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize

  13. A portable software tool for computing digitally reconstructed radiographs

    International Nuclear Information System (INIS)

    Chaney, Edward L.; Thorn, Jesse S.; Tracton, Gregg; Cullip, Timothy; Rosenman, Julian G.; Tepper, Joel E.

    1995-01-01

    Purpose: To develop a portable software tool for fast computation of digitally reconstructed radiographs (DRR) with a friendly user interface and versatile image format and display options. To provide a means for interfacing with commercial and custom three-dimensional (3D) treatment planning systems. To make the tool freely available to the Radiation Oncology community. Methods and Materials: A computer program for computing DRRs was enhanced with new features and rewritten to increase computational efficiency. A graphical user interface was added to improve ease of data input and DRR display. Installer, programmer, and user manuals were written, and installation test data sets were developed. The code conforms to the specifications of the Cooperative Working Group (CWG) of the National Cancer Institute (NCI) Contract on Radiotherapy Treatment Planning Tools. Results: The interface allows the user to select DRR input data and image formats primarily by point-and-click mouse operations. Digitally reconstructed radiograph formats are predefined by configuration files that specify 19 calculation parameters. Enhancements include improved contrast resolution for visualizing surgical clips, an extended source model to simulate the penumbra region in a computed port film, and the ability to easily modify the CT numbers of objects contoured on the planning computed tomography (CT) scans. Conclusions: The DRR tool can be used with 3D planning systems that lack this functionality, or perhaps improve the quality and functionality of existing DRR software. The tool can be interfaced to 3D planning systems that run on most modern graphics workstations, and can also function as a stand-alone program
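    The core of any DRR computation is a set of line integrals of attenuation through the CT volume. A deliberately simplified Python sketch follows (parallel rays along one axis, random toy volume); the tool itself goes well beyond this, with divergent beams, an extended source model, and 19 configurable calculation parameters.

        import numpy as np

        rng = np.random.default_rng(1)
        volume = rng.uniform(0.0, 0.02, size=(64, 64, 64))  # toy mu per voxel, 1/mm
        voxel_mm = 2.0

        # Parallel-beam DRR: line integral of mu, then exponential attenuation.
        path_integral = volume.sum(axis=0) * voxel_mm       # integral of mu dl
        drr = np.exp(-path_integral)                        # transmitted fraction
        print(drr.shape, float(drr.min()), float(drr.max()))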

  14. iPhone Open Application Development Write Native Applications Using the Open Source Tool Chain

    CERN Document Server

    Zdziarski, Jonathan

    2008-01-01

    Developers everywhere are eager to create applications for the iPhone, and many of them prefer the open source, community-developed tool chain to Apple's own toolkit. This new edition of iPhone Open Application Development covers the latest version of the open toolkit -- now updated for Apple's iPhone 2.x software and iPhone 3G -- and explains in clear language how to create applications using Objective-C and the iPhone API.

  15. Beam diagnostic tools for the negative hydrogen ion source test facility ELISE

    International Nuclear Information System (INIS)

    Nocentini, Riccardo; Fantz, Ursel; Franzen, Peter; Froeschle, Markus; Heinemann, Bernd; Riedl, Rudolf; Ruf, Benjamin; Wuenderlich, Dirk

    2013-01-01

    Highlights: ► We present an overview of beam diagnostic tools foreseen for the new testbed ELISE. ► A sophisticated diagnostic calorimeter allows beam profile measurement. ► A tungsten wire mesh in the beam path provides a qualitative picture of the beam. ► Stripping losses and beam divergence are measured by Hα Doppler shift spectroscopy. -- Abstract: The test facility ELISE, presently being commissioned at IPP, is a first step in the R and D roadmap for the RF driven ion source and extraction system of the ITER NBI system. The "half-size" ITER-like test facility includes a negative hydrogen ion source that can be operated for 1 h. ELISE is expected to extract an ion beam of 20 A at 60 kV for 10 s every 3 min, therefore delivering a total power of 1.2 MW. The extraction area has a geometry that closely reproduces the ITER design, with the same width and half the height, i.e. 1 m × 1 m. This paper presents an overview of beam diagnostic tools foreseen for ELISE. For the commissioning phase, a simple beam dump with basic diagnostic capabilities has been installed. In the second phase, the beam dump will be replaced by a more sophisticated diagnostic calorimeter to allow beam profile measurement. Additionally, a tungsten wire mesh will be introduced in the beam path to provide a qualitative picture of beam size and position. Stripping losses and beam divergence will be measured by means of Hα Doppler shift spectroscopy. An absolute calibration is foreseen in order to measure beam intensity

  16. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    KAUST Repository

    Neumann, Philipp; Tchipev, Nikola

    2012-01-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular-continuum simulations in terms of both realisation of the respective coupling algorithm

  17. Failure to Follow Written Procedures

    Science.gov (United States)

    2017-12-01

    Most tasks in aviation have a mandated written procedure to be followed specifically, under Title 14 of the Code of Federal Regulations (CFR), Section 43.13(a). However, the incidence of Failure to Follow Procedure (FFP) events continues to be a major iss...

  18. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    Science.gov (United States)

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-04-01

    tools for better producing feasibility and management plans; (ii) a set of activities devoted to fixing bugs and to providing a well-integrated interface for the different tools implemented. Further capabilities to be integrated are:
    - a dedicated module for water management and planning that will help to manage and aggregate all the distributed data coming from the simulation scenarios;
    - a whole module for calibration, uncertainty and sensitivity analysis;
    - a module for solute transport in the unsaturated zone;
    - a module for crop growth and water requirements in agriculture;
    - tools for dealing with groundwater quality issues;
    - tools for the analysis, interpretation and visualization of hydrogeological data.
    Through creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be on enhancing a science-based and participatory approach and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. The Consortium is constituted by partners from various water sectors from 10 EU countries, plus Turkey and Ukraine. Synergies with the UNESCO HOPE initiative on free and open source software in water management greatly boost the value of the project. Broad stakeholder involvement is intended to guarantee the dissemination and exploitation of results. Acknowledgements: This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement n. 642224.

  19. Written Expression Performance in Adolescents with Attention-Deficit/Hyperactivity Disorder (ADHD)

    Science.gov (United States)

    DeBono, Tony; Hosseini, Armita; Cairo, Cassandra; Ghelani, Karen; Tannock, Rosemary; Toplak, Maggie E.

    2012-01-01

    We examined written expression performance in a sample of adolescents with ADHD and subthreshold ADHD using two different strategies: examining performance on standardized measures of written expression and using other indicators of written expression developed in this study. We examined associations between standardized measures of written…

  20. Introduction of CLIL approach in Sociological Doctoral Programmes: the Ethnolinguistic Focus on Theses Written in Russian or in English

    Directory of Open Access Journals (Sweden)

    Maria Pavenkova

    2016-06-01

    Full Text Available The paper examines some limits of the introduction of the SFL-based CLIL approach in non-western sociological doctoral programmes. The author focuses specifically on pragmatic markers as tools for the structuring of written scientific discourse, and uses this approach to identify the differences between Russian and English academic genres. Data was collected from doctoral theses in Russian and in English in the field of sociology of management. It is shown that the average number of pragmatic markers per 1000 words is 3.81 in Russian theses and 2.27 in doctoral theses written in English. The author suggests that these variations are associated with the structure and goals of a scholarly paper. English academic genres are more empirical, whereas Russian ones focus on the development of theory.

  1. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier level of displayed search results, and a database data integrity validation and reporting mechanism.
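    The IRMIS schema is not reproduced here, but the one-stop search idea can be sketched with a hypothetical miniature schema in Python/SQLite: tier one reports hit counts per category, and a second tier would then list the matching rows.

        import sqlite3

        db = sqlite3.connect(":memory:")
        db.executescript("""
            CREATE TABLE pv(name TEXT);
            CREATE TABLE component(name TEXT);
            INSERT INTO pv VALUES ('S35:ID:gapSet'), ('S35:BPM:x');
            INSERT INTO component VALUES ('S35 BPM crate');
        """)

        def global_search(term):
            # Tier 1 of a two-tier result: hit counts per category.
            for table in ("pv", "component"):
                n = db.execute(
                    f"SELECT COUNT(*) FROM {table} WHERE name LIKE ?",
                    (f"%{term}%",)).fetchone()[0]
                print(f"{table}: {n} match(es)")

        global_search("S35")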

  2. The Efficacy of Social Media as a Research Tool and Information Source for Safeguards Verification

    International Nuclear Information System (INIS)

    Skoeld, T.; Feldman, Y.

    2015-01-01

    The IAEA Department of Safeguards aims to provide credible assurances to the international community that States are fulfilling their safeguards obligations in that all nuclear material remains in peaceful use. In order to draw a soundly-based safeguards conclusion for a State that has a safeguards agreement in force with the IAEA, the Department establishes a knowledge base of the State's nuclear-related infrastructure and activities against which a State's declarations are evaluated for correctness and completeness. Open source information is one stream of data that is used in the evaluation of nuclear fuel cycle activities in the State. The Department is continuously working to ensure that it has access to the most up-to-date, accurate, relevant and credible open source information available, and has begun to examine the use of social media as a new source of information. The use of social networking sites has increased exponentially in the last decade. In fact, social media has emerged as the key vehicle for delivering and acquiring information in near real-time. Therefore, it has become necessary for the open source analyst to consider social media as an essential element in the broader concept of open source information. Characteristics such as "immediacy", "recency" and "interactiveness", which set social networks apart from the "traditional media", are also the same attributes that present a challenge for using social media as an efficient information-delivery platform and a credible source of information. New tools and technologies for social media analytics have begun to emerge to help systematically monitor and mine this large body of data. The paper will survey the social media landscape in an effort to identify platforms that could be of value for safeguards verification purposes. It will explore how a number of social networking sites, such as Twitter

  3. Teaching Written Communication Strategies: A Training to Improve Writing

    Directory of Open Access Journals (Sweden)

    Hanane Benali Taouis

    2018-03-01

    Full Text Available This research can be described as an experimental quantitative one including: a strategy training; two homogeneous experimental groups with different levels of proficiency; and two homogeneous control groups. The subjects are 60 Spanish high school students, who were selected after taking the Oxford Quick Placement Test. The study aims at investigating the possible relationship between the effect of the strategy training and the subjects' level of proficiency. It is also designed to analyze the effect of the training on the use of communication strategies in the written medium, and to study the effect of the strategy training on the subjects' writing skill in English. The results show that the students' level of proficiency exerts a strong effect on the subjects' use of written communication strategies (CSs) and on their strategy preference in written production. They also demonstrate how strategy training improves the subjects' written communication ability.

  4. PLOTLIB: a computerized nuclear waste source-term library storage and retrieval system

    International Nuclear Information System (INIS)

    Marshall, J.R.; Nowicki, J.A.

    1978-01-01

    The PLOTLIB code was written to provide computer access to the Nuclear Waste Source-Term Library for those users with little previous computer programming experience. The principles of user orientation, quick accessibility, and versatility were extensively employed in the development of the PLOTLIB code to accomplish this goal. The Nuclear Waste Source-Term Library consists of 16 ORIGEN computer runs incorporating a wide variety of differing light water reactor (LWR) fuel cycles and waste streams. The typical isotopic source-term data consist of information on watts, curies, grams, etc., all of which are compiled as a function of time after reactor discharge and unitized on a per metric ton heavy metal basis. The information retrieval code, PLOTLIB, is used to process source-term information requests into computer plots and/or user-specified output tables. This report will serve both as documentation of the current data library and as an operations manual for the PLOTLIB computer code. The accompanying input description, program listing, and sample problems make this code package an easily understood tool for the various nuclear waste studies under way at the Office of Waste Isolation
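    PLOTLIB itself is FORTRAN-era software, but the retrieval-and-plot idea translates directly. A Python sketch over an invented toy table of activity versus decay time (real values would come from the library's ORIGEN runs):

        import matplotlib.pyplot as plt

        # Toy source-term data: activity (Ci per metric ton heavy metal)
        # versus time after reactor discharge. Values are invented.
        years = [1, 5, 10, 50, 100, 500]
        curies = [5.2e5, 2.4e5, 1.1e5, 1.3e4, 4.0e3, 2.1e2]

        plt.semilogy(years, curies, marker="o")
        plt.xlabel("Time after discharge (years)")
        plt.ylabel("Activity (Ci / MTHM)")
        plt.title("Toy source-term retrieval, PLOTLIB-style")
        plt.savefig("source_term.png")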

  5. THE ALL-SOURCE GREEN’S FUNCTION AND ITS APPLICATIONS TO TSUNAMI PROBLEMS

    Directory of Open Access Journals (Sweden)

    ZHIGANG XU

    2007-01-01

    Full Text Available The classical Green's function provides the global linear response to impulse forcing at a particular source location. It is a type of one-source-all-receiver Green's function. This paper presents a new type of Green's function, referred to as the all-source-one-receiver Green's function or, for short, the all-source Green's function (ASGF), in which the solution at a point of interest (POI) can be written in terms of global forcing without requiring the solution at other locations. The ASGF is particularly applicable to tsunami problems. The response to forcing anywhere in the global ocean can be determined within a few seconds on an ordinary personal computer or on a web server. The ASGF also brings in two new types of tsunami charts, one for the arrival time and the second for the gain, without assuming the location of the epicenter or the reversibility of the tsunami travel path. Thus it provides a useful tool for tsunami hazard preparedness and for rapidly calculating the real-time responses at selected POIs for a tsunami generated anywhere in the world's oceans.
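    In symbols (a sketch under the linearity assumption stated in the abstract; the notation is ours, not the paper's): the classical Green's function fixes one source and yields the response at all receivers, whereas the ASGF fixes one receiver and accepts forcing everywhere, e.g.

        \eta(\mathbf{r}_{\mathrm{POI}}, t) \;=\; \sum_{j} \int_0^{t} g_j(t-\tau)\, f_j(\tau)\, d\tau,

    where f_j is the forcing at global source cell j and g_j is the precomputed ASGF kernel connecting cell j to the point of interest.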

  6. PRAGMATIC AND RHETORICAL STRATEGIES IN THE ENGLISH-WRITTEN JOKES

    Directory of Open Access Journals (Sweden)

    Dyah Rochmawati

    2017-05-01

    Full Text Available Understanding verbal jokes in English is problematic for English as Foreign Language (EFL) readers, since understanding the jokes requires understanding their linguistic, cultural and social elements. Since a joke constitutes a complex and paradoxical phenomenon, it needs multiple approaches of analysis—such as pragmatic and rhetorical analyses—in order to investigate the multiple layers of meanings it carries. Recently there has been a shift in humor studies, emphasizing linguistic humor and involving the field of rhetoric. These studies, however, have mostly addressed the connection between rhetoric and spoken jokes in persuasion. The present study therefore applied Austin's Speech Act Theory (1975), Grice's Cooperative Principles (1957), and Berger's rhetorical techniques (1993) to crack the funniness of written jokes. Specifically, the study aims at describing: (1) how the rhetorical and (2) pragmatic strategies are used in the jokes, and (3) how the pragmatic and rhetorical strategies complement each other to create humor. The study employed a qualitative research method. Some jokes were purposively selected from the Reader's Digest and two online sources: http://jokes.cc.com/ and http://www.ajokeaday.com/. Document studies were the means of data collection. The collected data were then analyzed using qualitative content analysis. The results showed that there was a relationship between the two pragmatic theories, i.e., Speech Act Theory and Cooperative Principles, and Berger's rhetorical techniques. The results offered an alternative reading and richer understanding of how written jokes employ pragmatic and rhetorical strategies to advance their rhetorical objectives and humor functions.

  7. 19 CFR 148.111 - Written declaration for unaccompanied articles.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Written declaration for unaccompanied articles... of the United States § 148.111 Written declaration for unaccompanied articles. The baggage... covers articles which do not accompany him and: (a) The articles are entitled to free entry under the $1...

  8. Comparative implementation of Handwritten and Machine written Gurmukhi text utilizing appropriate parameters

    Science.gov (United States)

    Kaur, Jaswinder; Jagdev, Gagandeep, Dr.

    2018-01-01

    Optical character recognition is concerned with the recognition of optically processed characters. The recognition is done offline after the writing or printing has been completed, unlike online recognition, where the computer has to recognize the characters instantly as they are drawn. The performance of character recognition depends upon the quality of the scanned documents. Preprocessing steps are used for removing low-frequency background noise and normalizing the intensity of individual scanned documents. Several filters are used for reducing certain image details and enabling an easier or faster evaluation. The primary aim of the research work is to recognize handwritten and machine-written characters and differentiate them. The language chosen for the research work is Punjabi Gurmukhi, and the tool utilized is Matlab.
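
    As a minimal sketch of the preprocessing pipeline the abstract describes (noise removal and intensity normalization before recognition), here in Python rather than the paper’s Matlab; the filter size and threshold are assumptions for illustration, not the authors’ parameters:

      import numpy as np
      from scipy.ndimage import median_filter

      def preprocess(scan: np.ndarray) -> np.ndarray:
          """Binarize a 2-D grayscale scan for character recognition."""
          smoothed = median_filter(scan, size=3)       # suppress background speckle
          lo, hi = smoothed.min(), smoothed.max()
          norm = (smoothed - lo) / max(hi - lo, 1e-9)  # normalize intensity to [0, 1]
          return (norm < 0.5).astype(np.uint8)         # dark ink -> 1, background -> 0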

  9. Source SDK development essentials

    CERN Document Server

    Bernier, Brett

    2014-01-01

    The Source Authoring Tools are the pieces of software used to create custom content for games made with Valve's Source engine. Creating mods and maps for your games without any programming knowledge can be time consuming. These tools allow you to create your own maps and levels without the need for any coding knowledge. All the tools that you need to start creating your own levels are built-in and ready to go! This book will teach you how to use the Authoring Tools provided with Source games and will guide you in creating your first maps and mods (modifications) using Source. You will learn ho

  10. SAFEPAQ-II, a new tool for the production of activation data libraries

    Energy Technology Data Exchange (ETDEWEB)

    Forrest, R.A. E-mail: robin.forrest@ukaea.org.uk

    2001-04-01

    Activation data and inventory codes are a major input to much of the safety related work carried out on fusion devices. The inventory code recommended for European activation calculations is FISPACT-99; this requires a large amount of nuclear data, which is available in the European Activation File (EAF-99). The production of an EAF library uses new sources of data, both evaluated and calculated, differential measurements and integral data. In order to store, evaluate, and use all the various data sources an efficient software tool is required. Earlier versions of EAF have been produced using the tools SYMPAL and SAFEPAQ, which enabled a large degree of automation as compared with the original construction 'by hand'. However, these relied on the direct manipulation of the ENDF formatted text files using FORTRAN-77. This is not an efficient approach, as editing of the text files is inconvenient and liable to errors. It was decided to use relational databases to store the data, with data extraction carried out by standard queries written in SQL. Other objectives were the provision of a user-friendly graphical interface to allow data to be viewed and manipulated and a high level of QA by logging all data changes. These objectives have been realised by the SAFEPAQ-II application; this uses the ideas of the previous tools, but has been designed from scratch using new methods. Visual Basic is used to build the application running under Windows NT 4, which is linked to a series of ACCESS databases.
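
    The design choice described here, relational storage with extraction via standard SQL instead of hand-editing ENDF text files, can be illustrated with a minimal sketch (the table and column names are invented for the example and are not SAFEPAQ-II’s actual schema):

      import sqlite3

      con = sqlite3.connect(":memory:")
      con.execute("CREATE TABLE reactions (nuclide TEXT, mt INTEGER, xs REAL)")
      con.execute("INSERT INTO reactions VALUES ('Tb159', 102, 23.4)")
      # a logged, repeatable query replaces error-prone editing of formatted text files
      for nuclide, xs in con.execute("SELECT nuclide, xs FROM reactions WHERE mt = 102"):
          print(nuclide, xs)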

  11. SAFEPAQ-II, a new tool for the production of activation data libraries

    International Nuclear Information System (INIS)

    Forrest, R.A.

    2001-01-01

    Activation data and inventory codes are a major input to much of the safety related work carried out on fusion devices. The inventory code recommended for European activation calculations is FISPACT-99; this requires a large amount of nuclear data, which is available in the European Activation File (EAF-99). The production of an EAF library uses new sources of data, both evaluated and calculated, differential measurements and integral data. In order to store, evaluate, and use all the various data sources an efficient software tool is required. Earlier versions of EAF have been produced using the tools SYMPAL and SAFEPAQ, which enabled a large degree of automation as compared with the original construction 'by hand'. However, these relied on the direct manipulation of the ENDF formatted text files using FORTRAN-77. This is not an efficient approach, as editing of the text files is inconvenient and liable to errors. It was decided to use relational databases to store the data, with data extraction carried out by standard queries written in SQL. Other objectives were the provision of a user-friendly graphical interface to allow data to be viewed and manipulated and a high level of QA by logging all data changes. These objectives have been realised by the SAFEPAQ-II application; this uses the ideas of the previous tools, but has been designed from scratch using new methods. Visual Basic is used to build the application running under Windows NT 4, which is linked to a series of ACCESS databases

  12. Written pain neuroscience education in fibromyalgia: a multicenter randomized controlled trial.

    Science.gov (United States)

    van Ittersum, Miriam W; van Wilgen, C Paul; van der Schans, Cees P; Lambrecht, Luc; Groothoff, Johan W; Nijs, Jo

    2014-11-01

    Mounting evidence supports the use of face-to-face pain neuroscience education for the treatment of chronic pain patients. This study aimed at examining whether written education about pain neuroscience improves illness perceptions, catastrophizing, and health status in patients with fibromyalgia (FM). A double-blind, multicenter randomized controlled clinical trial with 6-month follow-up was conducted. Patients with FM (n = 114) who consented to participate were randomly allocated to receive either written pain neuroscience education or written relaxation training. Written pain neuroscience education consisted of a booklet with pain neuroscience education plus a telephone call to clarify any difficulties; the relaxation group received a booklet with relaxation education and a telephone call. The revised illness perception questionnaire, the Pain Catastrophizing Scale, and the fibromyalgia impact questionnaire were used as outcome measures. Both patients and assessors were blinded. Repeated-measures analyses with the last-observation-carried-forward principle were performed. Cohen's d effect sizes (ES) were calculated for all within-group changes and between-group differences. The results reveal that written pain neuroscience education does not change the impact of FM on daily life, catastrophizing, or perceived symptoms of patients with FM. Compared with written relaxation training, written pain neuroscience education improved beliefs in a chronic timeline of FM (P = 0.03; ES = 0.50), but it did not affect other domains of illness perceptions. Compared with written relaxation training, written pain neuroscience education slightly improved illness perceptions of patients with FM, but it did not impart clinically meaningful effects on pain, catastrophizing, or the impact of FM on daily life. Face-to-face sessions of pain neuroscience education are required to change inappropriate cognitions and perceived health in patients with FM. © 2013 World Institute of Pain.

  13. The influence of the pregroove on the shape of thermomagnetically written domains

    International Nuclear Information System (INIS)

    Ichihara, K.

    1990-01-01

    In order to clarify the influence of pregrooved substrates on the shape of thermomagnetically written domains, the difference between the shape of domains written on a pregrooved area and that of domains written on a mirror area has been examined. Trilayered magneto-optical media, which had rare-earth- (RE-) rich TbFeCo films, transition-metal-rich TbFeCo films, and RE-rich GdTbFeCo films as a recording layer, were sputtered on disk substrates. The substrates had both a pregrooved area and a mirror area in a recording track. The domains were written in each medium by varying the recording power and the external field, and were observed by an Ar+-laser scanning polarized microscope. In the case of TbFeCo media written under lower recording power conditions, the shape of the domains on a pregrooved area was almost the same as that of domains written on a mirror area. On the other hand, the widths of the domains written on a mirror area became larger than those of domains written on a pregrooved area when the recording power was increased. In the case of a GdTbFeCo medium, the widths of the domains written on a mirror area were much larger than those of domains written on a pregrooved area, independent of the recording conditions. The lengths of the domains written on both areas were almost the same in all cases. These experimental results are believed to arise because thermal diffusion in the film plane is suppressed at the step of a pregroove. The difference between the TbFeCo and GdTbFeCo results is believed to come from differences in the contracting forces on the domain walls during the writing process.

  14. On written expression of primary school pupils

    Directory of Open Access Journals (Sweden)

    Stevanović Jelena

    2009-01-01

    Full Text Available Normative rules of standard Serbian are acquired during primary and secondary education through the curriculum demands of Serbian language instruction, which takes place in three fields: grammar, orthography, and culture of expression. The topic of interest in this paper is the quality of written expression of 6th and 7th grade pupils, in the context of all three fields specified to be mastered by the Serbian language curriculum. The research comprised 148 primary school pupils from Belgrade. Linguistic analysis of spontaneously created written text was performed, under conditions in which the pupils were not explicitly asked to write correctly. The results indicate that the majority of pupils make spelling and grammatical errors, while meeting the condition for the basic level of mastery of Serbian according to the standards specified for the end of compulsory education. In addition, a considerable majority of pupils have a satisfactory level of culture of written expression. Pupils more often make spelling errors than grammatical ones. Seventh grade pupils perform better than sixth grade pupils with respect to adhering to grammar rules and to the culture of written expression, while the mark in Serbian language and the general school achievement of pupils correlate only with the degree of adherence to orthographic rules. It was concluded that not only are individual support programs necessary for pupils who make more errors, but also national projects for the development of the linguistic competence of young people in Serbia.

  15. The Written Communication Skills That Matter Most for Accountants

    Science.gov (United States)

    Riley, Tracey J.; Simons, Kathleen A.

    2016-01-01

    Given the importance of effective written communication skills to the discipline of accounting, faculty must emphasize these skills in their classroom in order to adequately prepare students for successful careers in the field. Since 2000, only two studies in the accounting literature have examined which written communication skills are needed by…

  16. Exploring TechQuests Through Open Source and Tools That Inspire Digital Natives

    Science.gov (United States)

    Hayden, K.; Ouyang, Y.; Kilb, D.; Taylor, N.; Krey, B.

    2008-12-01

    "There is little doubt that K-12 students need to understand and appreciate the Earth on which they live. They can achieve this understanding only if their teachers are well prepared". Dan Barstow, Director of Center for Earth and Space Science Education at TERC. The approach of San Diego County's Cyberinfrastructure Training, Education, Advancement, and Mentoring (SD Cyber-TEAM) project is to build understandings of Earth systems for middle school teachers and students through a collaborative that has engaged the scientific community in the use of cyber-based tools and environments for learning. The SD Cyber-TEAM has used Moodle, an open source management system with social networking tools, that engage digital native students and their teachers in collaboration and sharing of ideas and research related to Earth science. Teachers participate in on-line professional dialog through chat, wikis, blogs, forums, journals and other tools and choose the tools that will best fit their classroom. The use of Moodle during the Summer Cyber Academy developed a cyber-collaboratory environment where teaching strategies were discussed, supported and actualized by participants. These experiences supported digital immigrants (teachers) in adapting teaching strategies using technologies that are most attractive and familiar to students (digital natives). A new study by the National School Boards Association and Grunwald Associates LLC indicated that "the online behaviors of U.S. teens and 'tweens shows that 96 percent of students with online access use social networking technologies, such as chatting, text messaging, blogging, and visiting online communities such as Facebook, MySpace, and Webkinz". While SD Cyber-TEAM teachers are implementing TechQuests in classrooms they use these social networking elements to capture student interest and address the needs of digital natives. Through the Moodle environment, teachers have explored a variety of learning objects called Tech

  17. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    Science.gov (United States)

    Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie

    2016-06-01

    Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).

  18. 10 CFR 2.813 - Written communications.

    Science.gov (United States)

    2010-01-01

    ... other written communications under the regulations of this chapter is requested but not required to cite whenever practical, in the upper right corner of the first page of the submission, the specific regulation...

  19. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    Science.gov (United States)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  20. The MATH--Open Source Application for Easier Learning of Numerical Mathematics

    Science.gov (United States)

    Glaser-Opitz, Henrich; Budajová, Kristina

    2016-01-01

    The article introduces a software application (MATH) supporting the education of Applied Mathematics, with a focus on Numerical Mathematics. The MATH is an easy-to-use tool supporting calculations with various numerical methods, with a graphical user interface and an integrated plotting tool for graphical representation, written in Qt with extensive use of Qwt…

  1. Oral and written language in late adulthood: findings from the Nun Study.

    Science.gov (United States)

    Mitzner, Tracy L; Kemper, Susan

    2003-01-01

    As a part of the Nun Study, a longitudinal investigation of aging and Alzheimer's disease, oral and written autobiographies from 118 older women were analyzed to examine the relationship between spoken and written language. The written language samples were more complex than the oral samples, both conceptually and grammatically. The relationship between the linguistic measures and participant characteristics was also examined. The results suggest that the grammatical and conceptual characteristics of oral and written language are affected by participant differences in education, cognitive status, and physical function and that written language samples have greater power than oral language samples to differentiate between high- and low-ability older adults.

  2. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization, and evaluation of KMtool. (author)

  3. Mesoscale brain explorer, a flexible python-based image analysis and visualization tool.

    Science.gov (United States)

    Haupt, Dirk; Vanni, Matthieu P; Bolanos, Federico; Mitelut, Catalin; LeDue, Jeffrey M; Murphy, Tim H

    2017-07-01

    Imaging of mesoscale brain activity is used to map interactions between brain regions. This work has benefited from the pioneering studies of Grinvald et al., who employed optical methods to image brain function by exploiting the properties of intrinsic optical signals and small molecule voltage-sensitive dyes. Mesoscale interareal brain imaging techniques have been advanced by cell targeted and selective recombinant indicators of neuronal activity. Spontaneous resting state activity is often collected during mesoscale imaging to provide the basis for mapping of connectivity relationships using correlation. However, the information content of mesoscale datasets is vast and is only superficially presented in manuscripts given the need to constrain measurements to a fixed set of frequencies, regions of interest, and other parameters. We describe a new open source tool written in python, termed mesoscale brain explorer (MBE), which provides an interface to process and explore these large datasets. The platform supports automated image processing pipelines with the ability to assess multiple trials and combine data from different animals. The tool provides functions for temporal filtering, averaging, and visualization of functional connectivity relations using time-dependent correlation. Here, we describe the tool and show applications, where previously published datasets were reanalyzed using MBE.
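
    The core computation behind such connectivity mapping, correlating every pixel’s time course against a seed pixel, is simple enough to sketch; the following is a hedged restatement of the idea in plain numpy, not MBE’s actual API:

      import numpy as np

      def correlation_map(stack: np.ndarray, seed_yx: tuple) -> np.ndarray:
          """stack: (time, y, x) imaging data; returns Pearson r per pixel."""
          t, h, w = stack.shape
          flat = stack.reshape(t, -1).astype(float)
          flat = (flat - flat.mean(0)) / (flat.std(0) + 1e-12)  # z-score each pixel
          seed_ts = flat[:, seed_yx[0] * w + seed_yx[1]]
          return (seed_ts @ flat / t).reshape(h, w)             # correlation image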

  4. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), which focuses on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and will use them to estimate both single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
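
    The single-hazard building block of such an analysis, combining a hazard probability, a fragility (vulnerability) curve, and exposure into an expected loss, can be sketched as follows; the numbers and discretization are invented for illustration and are not BYMUR output:

      import numpy as np

      hazard_prob = np.array([0.90, 0.08, 0.02])  # P(intensity level) in the time window
      fragility   = np.array([0.00, 0.30, 0.80])  # P(damage | intensity level)
      exposure    = 1_000.0                       # value exposed in the target area

      expected_loss = float(hazard_prob @ fragility) * exposure
      print(expected_loss)                        # (0.08*0.3 + 0.02*0.8) * 1000 = 40.0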

  5. Teaching Computation in Primary School without Traditional Written Algorithms

    Science.gov (United States)

    Hartnett, Judy

    2015-01-01

    Concerns regarding the dominance of the traditional written algorithms in schools have been raised by many mathematics educators, yet the teaching of these procedures remains a dominant focus in primary schools. This paper reports on a project in one school where the staff agreed to put the teaching of the traditional written algorithm aside,…

  6. Written cohesion in children with and without language learning disabilities.

    Science.gov (United States)

    Koutsoftas, Anthony D; Petersen, Victoria

    2017-09-01

    Cohesion refers to the linguistic elements of discourse that contribute to its continuity and is an important element to consider as part of written language intervention, especially in children with language learning disabilities (LLD). There is substantial evidence that children with LLD perform more poorly than typically developing (TD) peers on measures of cohesion in spoken language and on written transcription measures; however, there is far less research comparing groups on cohesion as a measure of written language across genres. The current study addresses this gap through the following two aims: first, to describe and compare cohesion in narrative and expository writing samples of children with and without language learning disabilities; second, to relate measures of cohesion to written transcription and translation measures, oral language, and writing quality. Fifty intermediate-grade children produced one narrative and one expository writing sample, from which measures of written cohesion were obtained. These included the frequency, adequacy, and complexity of referential and conjunctive ties. Expository samples resulted in more complex cohesive ties, and children with TD used more complex ties than peers with LLD. Different relationships among cohesion measures and writing were observed for narrative versus expository samples. Findings from this study demonstrate cohesion as a discourse-level measure of written transcription and show how the use of cohesion can vary by genre and group (LLD, TD). Clinical implications for assessment, intervention, and future research are provided. © 2016 Royal College of Speech and Language Therapists.

  7. MODEL WRITTEN TEXTS IN THE RECOMMENDED SENIOR HIGH SCHOOL ENGLISH TEXTBOOKS

    Directory of Open Access Journals (Sweden)

    Dwi Rukmini

    2009-01-01

    Full Text Available Abstract: This article is based on a study of the model written texts provided in Senior High School English textbooks. It is aimed at finding out whether those models are written with consideration of the two English contexts, cultural and situational, which encircle them. The data are all written texts provided in the six recommended English textbooks published by six different publishers. The results reveal that only eleven out of 115 model written texts tend to be incompatible with the two contexts encircling them; this implies that 104 of them (93.43%) are likely to be compatible and can be used as model texts.

  8. An open source automatic quality assurance (OSAQA) tool for the ACR MRI phantom.

    Science.gov (United States)

    Sun, Jidi; Barnes, Michael; Dowling, Jason; Menk, Fred; Stanwell, Peter; Greer, Peter B

    2015-03-01

    Routine quality assurance (QA) is necessary and essential to ensure MR scanner performance. This includes geometric distortion, slice positioning and thickness accuracy, high contrast spatial resolution, intensity uniformity, ghosting artefact and low contrast object detectability. However, this manual process can be very time consuming. This paper describes the development and validation of an open source tool to automate the MR QA process, which aims to increase physicist efficiency, and improve the consistency of QA results by reducing human error. The OSAQA software was developed in Matlab and the source code is available for download from http://jidisun.wix.com/osaqa-project/. During program execution QA results are logged for immediate review and are also exported to a spreadsheet for long-term machine performance reporting. For the automatic contrast QA test, a user specific contrast evaluation was designed to improve accuracy for individuals on different display monitors. American College of Radiology QA images were acquired over a period of 2 months to compare manual QA and the results from the proposed OSAQA software. OSAQA was found to significantly reduce the QA time from approximately 45 to 2 min. Both the manual and OSAQA results were found to agree with regard to the recommended criteria and the differences were insignificant compared to the criteria. The intensity homogeneity filter is necessary to obtain an image with acceptable quality and at the same time keeps the high contrast spatial resolution within the recommended criterion. The OSAQA tool has been validated on scanners with different field strengths and manufacturers. A number of suggestions have been made to improve both the phantom design and QA protocol in the future.
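
    As a toy version of one automated check (geometric distortion), one can threshold a slice, measure the phantom’s widest extent, and compare it with the nominal diameter; the pixel size and tolerance below are assumptions for illustration, not OSAQA’s code:

      import numpy as np

      def measure_diameter_mm(slice_img: np.ndarray, pixel_mm: float) -> float:
          mask = slice_img > slice_img.mean()          # crude foreground segmentation
          cols = np.where(mask.any(axis=0))[0]
          return (cols[-1] - cols[0] + 1) * pixel_mm   # widest horizontal extent

      # with a real ACR phantom slice (nominal diameter and tolerance assumed):
      # ok = abs(measure_diameter_mm(slice_img, pixel_mm=0.98) - 190.0) <= 2.0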

  9. Developmental screening and parents' written comments: an added dimension to the parents' evaluation of developmental status questionnaire.

    Science.gov (United States)

    Cox, Joanne E; Huntington, Noelle; Saada, Adrianna; Epee-Bounya, Alexandra; Schonwald, Alison D

    2010-12-01

    The aim of this study was to better understand the utility of using the Parents' Evaluation of Developmental Status (PEDS) in well-child visits by analyzing themes and patterns in parents' written responses on the PEDS form. We reviewed a consecutive sample of medical records with PEDS forms for children aged 6 months to 9 years (site 1) and 3 to 5 years (site 2). We recorded the concerns that parents identified in response to the 10 PEDS questions along with demographic information. We then categorized parents' written comments about those concerns according to comment content. We used qualitative and quantitative methods for analysis. We collected 752 PEDS forms. Ninety percent of the parents endorsed at least 1 concern (94.6% on the English forms versus 69.7% on the Spanish forms; P Parents qualified 27.5% of their concerns with a written comment. In 23.9% of cases in which parents identified a concern and provided a written comment, the content of the comment did not match the question's intent; rates of mismatch were similar for the English and Spanish forms. Among comments regarding behavioral concerns, 12% reflected a misunderstanding of age-appropriate behavior. Medical concerns accounted for 14.1% of the comments; these concerns were more common on English forms (61.3%) than on Spanish forms (1.7%) (P Parents frequently used the PEDS forms to communicate additional concerns regarding their child or provide positive feedback on their child's progress. The inappropriate developmental expectations, limited health literacy, and culturally distinct comments on the PEDS forms reinforce the importance of using screening tools to enhance the care provided during visits but not to replace patient-provider communication.

  10. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) it provides descriptive, locational, and operational status information about intrusion detectors and assessment devices (i.e., "sensors" and "cameras") upon request; (2) it provides for the storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows that information to be changed as desired; (3) it provides a "search" mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen "intruder detection" probability and elapsed-time criteria (i.e., the program finds the "weakest paths" from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge-base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult to duplicate in a numeric and more algorithmically deterministic language such as Fortran. 4 refs
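
    The third, path-searching function is the interesting one; below is a compact re-creation of the idea in Python (the original is Prolog, and the graph, probabilities, and times here are invented example data). Detection anywhere along a route accumulates as 1 - prod(1 - p_i):

      edges = {  # node -> [(next_node, detection_prob, traversal_seconds)]
          "fence":  [("yard", 0.6, 30)],
          "yard":   [("door", 0.4, 60), ("window", 0.2, 45)],
          "door":   [("vault", 0.9, 20)],
          "window": [("vault", 0.5, 25)],
      }

      def paths(src, dst, p=0.0, t=0, trail=()):
          """Yield (cumulative detection probability, elapsed time, route)."""
          if src == dst:
              yield p, t, trail + (dst,)
              return
          for nxt, pd, dt in edges.get(src, []):
              yield from paths(nxt, dst, 1 - (1 - p) * (1 - pd), t + dt, trail + (src,))

      weakest = min(paths("fence", "vault"), key=lambda r: r[0])
      print(weakest)  # the route least likely to detect an intruder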

  11. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain

    Science.gov (United States)

    Carrasco, José Miguel; García, Miriam; Navas, Alejandro; Olza, Inés; Gómez-Baceiredo, Beatriz; Pujol, Francesc; Garralda, Eduardo; Centeno, Carlos

    2017-01-01

    Introduction The goal of palliative care (PC) is to improve the quality of life of terminal-stage patients and their families. The subject frequently appears in the mass media, and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC-related news items that appeared in the Spanish written media. Methodology A descriptive cross-sectional study was designed. Considering diffusion, scope, and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content repository, and the search tool of each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified, and a descriptive analysis was undertaken. Results A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions (PE) and 16% in their online editions (OE)) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%), and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing the end of life (74%) and patient quality of life (70%). Conclusions The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, the messages circulating about PC do not describe professional practice or its contribution to patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine. PMID:28968433

  12. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain.

    Science.gov (United States)

    Carrasco, José Miguel; García, Miriam; Navas, Alejandro; Olza, Inés; Gómez-Baceiredo, Beatriz; Pujol, Francesc; Garralda, Eduardo; Centeno, Carlos

    2017-01-01

    The goal of palliative care (PC) is to improve the quality of life of terminal-stage patients and their families. The subject frequently appears in the mass media, and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC-related news items that appeared in the Spanish written media. A descriptive cross-sectional study was designed. Considering diffusion, scope, and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content repository, and the search tool of each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified, and a descriptive analysis was undertaken. A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions (PE) and 16% in their online editions (OE)) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%), and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing the end of life (74%) and patient quality of life (70%). The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, the messages circulating about PC do not describe professional practice or its contribution to patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine.

  13. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain.

    Directory of Open Access Journals (Sweden)

    José Miguel Carrasco

    Full Text Available The goal of palliative care (PC) is to improve the quality of life of terminal-stage patients and their families. The subject frequently appears in the mass media, and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC-related news items that appeared in the Spanish written media. A descriptive cross-sectional study was designed. Considering diffusion, scope, and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content repository, and the search tool of each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified, and a descriptive analysis was undertaken. A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions (PE) and 16% in their online editions (OE)) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%), and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing the end of life (74%) and patient quality of life (70%). The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, the messages circulating about PC do not describe professional practice or its contribution to patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine.

  14. Providing written language services in the schools: the time is now.

    Science.gov (United States)

    Fallon, Karen A; Katz, Lauren A

    2011-01-01

    The current study was conducted to investigate the provision of written language services by school-based speech-language pathologists (SLPs). Specifically, the study examined SLPs' knowledge, attitudes, and collaborative practices in the area of written language services as well as the variables that impact provision of these services. Public school-based SLPs from across the country were solicited for participation in an online, Web-based survey. Data from 645 full-time SLPs from 49 states were evaluated using descriptive statistics and logistic regression. Many school-based SLPs reported not providing any services in the area of written language to students with written language weaknesses. Knowledge, attitudes, and collaborative practices were mixed. A logistic regression revealed three variables likely to predict high levels of service provision in the area of written language. Data from the current study revealed that many struggling readers and writers on school-based SLPs' caseloads are not receiving services from their SLPs. Implications for SLPs' preservice preparation, continuing education, and doctoral preparation are discussed.

  15. OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.

    Science.gov (United States)

    Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio

    2017-10-01

    The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, like the added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. We here present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. Secondly, we also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
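
    Since the OLS itself is a REST service, the lookups the Java client wraps can also be sketched as a plain HTTP call; the /api/search path and the response fields below follow the public OLS documentation but are stated here as assumptions, not guarantees from the record:

      import requests

      resp = requests.get(
          "https://www.ebi.ac.uk/ols/api/search",
          params={"q": "fibroblast", "ontology": "go"},
          timeout=30,
      )
      resp.raise_for_status()
      # print the first few matching ontology terms (label + IRI)
      for doc in resp.json()["response"]["docs"][:5]:
          print(doc["label"], doc["iri"])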

  16. Exploring the Written Dialogues of Two First-Year Secondary Science Teachers in an Online Mentoring Program

    Science.gov (United States)

    Bang, EunJin; Luft, Julie A.

    2014-02-01

    This study explored the yearlong learning processes of two first-year secondary science teachers participating in an online mentoring program, through examination of their written dialogues within the program and other data. Using a case study method, this study (a) explored the patterns of written dialogues between the two new teachers and their mentors over the course of a year, (b) documented pertinent topics of importance, and finally (c) illustrated the new realities created in the mentees' classrooms as a result of the online mentoring process. Penelope and Bradley, who taught at an urban school and at a suburban school respectively, were selected as subjects. Our analysis revealed that the two pairs of mentee-mentors showed different participation patterns that affected the intensity of the creation of new realities, and affected whether the mentees tried/vetted new teaching practices suggested by their mentors. Yet, analysis also revealed that certain elements in the written dialogues between pairs were found to be similar, in that construction of knowledge was evident between both pairs when friction developed and appropriate teamwork emerged to deal with it. The topics of greatest interest and importance within the dialogues were those related to the logistics of the school system and the processes and methodologies of teaching. These results suggest that online mentoring programs are an effective dialogical tool for transferring the knowledge of experts to novices, and for thus expediting the professional induction and growth of new science teachers.

  17. Source reliability in auditory health persuasion : Its antecedents and consequences

    NARCIS (Netherlands)

    Elbert, Sarah P.; Dijkstra, Arie

    2015-01-01

    Persuasive health messages can be presented through an auditory channel, thereby enhancing the salience of the source, making it fundamentally different from written or pictorial information. We focused on the determinants of perceived source reliability in auditory health persuasion by

  18. The Bristol Radiology Report Assessment Tool (BRRAT): Developing a workplace-based assessment tool for radiology reporting skills

    International Nuclear Information System (INIS)

    Wallis, A.; Edey, A.; Prothero, D.; McCoubrie, P.

    2013-01-01

    Aim: To review the development of a workplace-based assessment tool to assess the quality of written radiology reports and to assess its reliability, feasibility, and validity. Materials and methods: A comprehensive literature review and a rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. Results: The reliability coefficient for the 19 questions was 0.79, and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments, due to assessor subjectivity. Conclusion: The study methodology gives good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments.

  19. Development of a tool dedicated to the evaluation of hydrogen term source for technological Wastes: assumptions, physical models, and validation

    Energy Technology Data Exchange (ETDEWEB)

    Lamouroux, C. [CEA Saclay, Nuclear Energy Division/DANS, Department of Physico-Chemistry, 91191 Gif-sur-Yvette (France); Esnouf, S. [CEA Saclay, DSM/IRAMIS/SIS2M/Radiolysis Laboratory, 91191 Gif-sur-Yvette (France); Cochin, F. [Areva NC, Recycling BU, DIRP/RDP Tour Areva, 92084 Paris La Défense (France)

    2013-07-01

    In radioactive waste packages, hydrogen is generated on the one hand from the radiolysis of the wastes (mainly organic materials) and on the other hand from the radiolysis of the water content of the cement matrix. In order to assess hydrogen generation, two tools based on operational models have been developed. One is dedicated to the determination of the hydrogen source term arising from the radiolysis of the wastes: the STORAGE tool (Simulation Tool Of Emission Radiolysis Gas); the other deals with the hydrogen source term produced by radiolysis of the cement matrices (the Damar tool). The approach used by the STORAGE tool for assessing the production rate of radiolysis gases is divided into five steps: 1) specification of the package data, in particular the material and radiological inventories defined for a package medium; 2) determination of the radiochemical yields for the different constituents and the associated behavior laws; this determination is made from the PRELOG database, in which radiochemical yields under different irradiation conditions have been compiled; 3) definition of hypotheses concerning the composition and the distribution of contamination inside the package, to allow assessment of the power absorbed by the constituents; 4) summation of all the contributions; and finally, 5) validation calculations by comparison with a reduced sampling of packages. Comparisons with measured values confirm the conservative character of the methodology and give confidence in the safety margins used for the safety analysis report.
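
    The arithmetic at the heart of such a source-term estimate is compact: a radiolytic yield (G-value) times the absorbed dose power gives a production rate. The G-value and power below are example numbers for illustration, not values from the STORAGE tool or the PRELOG database:

      EV_PER_J = 1.0 / 1.602e-19      # electron-volts per joule
      AVOGADRO = 6.022e23

      g_h2       = 0.45               # molecules H2 per 100 eV absorbed (assumed)
      p_absorbed = 2.0                # W of radiation power absorbed by the waste (assumed)

      molecules_per_s = (g_h2 / 100.0) * p_absorbed * EV_PER_J
      mol_per_year = molecules_per_s / AVOGADRO * 3600 * 24 * 365
      print(f"{mol_per_year:.1f} mol H2 per year")   # ~2.9 for these inputs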

  20. Development of a traceability analysis method based on case grammar for NPP requirement documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Seong, Poong Hyun; Kim, Man Cheol

    2004-01-01

    Software inspection is widely believed to be an effective method for software verification and validation (V and V). However, software inspection is labor-intensive and, since it uses little technology, it is often viewed as unsuitable for a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The KAIST Nuclear I and C and Information Engineering Laboratory (NICIEL) has developed software management and inspection support tools, collectively named 'SIS-RT.' SIS-RT is designed to partially automate the software inspection processes. SIS-RT supports the analysis of traceability between a given set of specification documents. To make SIS-RT compatible with documents written in Korean, certain techniques in natural language processing have been studied. Among the techniques considered, case grammar is the most suitable for analysis of the Korean language. In this paper, we propose a methodology that uses a case grammar approach to analyze the traceability between documents written in Korean. A discussion of some examples of such an analysis follows.

  1. A methodology for improving the SIS-RT in analyzing the traceability of the documents written in Korean language

    International Nuclear Information System (INIS)

    Yoo, Yeong Jae; Kim, Man Cheol; Seong, Poong Hyun

    2002-01-01

    Inspection is widely believed to be an effective software verification and validation (V and V) method. However, software inspection is labor-intensive, and this is compounded by a view that, since software inspection uses little technology, it does not fit in well with a more technology-oriented development environment. Nevertheless, software inspection is gaining in popularity. The researchers of the KAIST I and C laboratory developed a software tool for managing and supporting inspection tasks, named SIS-RT. SIS-RT is designed to partially automate the software inspection processes. SIS-RT supports the analysis of traceability between spec documents. To prepare SIS-RT for spec documents written in Korean, certain techniques in natural language processing have been reviewed. Among those, case grammar is the most suitable for the analysis of Korean. In this paper, a methodology for analyzing the traceability between spec documents written in Korean is proposed, based on case grammar.

  2. Simulations of a spectral gamma-ray logging tool response to a surface source distribution on the borehole wall

    International Nuclear Information System (INIS)

    Wilson, R.D.; Conaway, J.G.

    1991-01-01

    We have developed Monte Carlo and discrete ordinates simulation models for the large-detector spectral gamma-ray (SGR) logging tool in use at the Nevada Test Site. Application of the simulation models produced spectra for source layers on the borehole wall, either from potassium-bearing mudcakes or from plate-out of radon daughter products. Simulations show that the shape and magnitude of gamma-ray spectra from sources distributed on the borehole wall depend on radial position within the air-filled borehole as well as on hole diameter. No such dependence is observed for sources uniformly distributed in the formation. In addition, sources on the borehole wall produce anisotropic angular fluxes at the higher scattered energies and at the source energy. These differences in borehole effects and in angular flux are important to the process of correcting SGR logs for the presence of potassium mudcakes; they also suggest a technique for distinguishing between spectral contributions from formation sources and sources on the borehole wall. These results imply the existence of a standoff effect not present for spectra measured in air-filled boreholes from formation sources. 5 refs., 11 figs

  3. The Influence of Process Drama on Elementary Students' Written Language

    Science.gov (United States)

    Anderson, Alida

    2012-01-01

    This article describes the influence of process drama on fourth grade students' written language productivity and specificity. Participants included 16 students with learning and/or behavioral challenges at an urban public charter school. The influence of process drama on students' written language was compared across contextualized and…

  4. Free and Open Source Tools (FOSTs): An Empirical Investigation of Pre-Service Teachers' Competencies, Attitudes, and Pedagogical Intentions

    Science.gov (United States)

    Asing-Cashman, Joyce G.; Gurung, Binod; Limbu, Yam B.; Rutledge, David

    2014-01-01

    This study examines the digital native pre-service teachers' (DNPSTs) perceptions of their competency, attitude, and pedagogical intention to use free and open source tools (FOSTs) in their future teaching. Participants were 294 PSTs who responded to pre-course surveys at the beginning of an educational technology course. Using the structural…

  5. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    International Nuclear Information System (INIS)

    Minelli, Annalisa; Marchesini, Ivan; Taylor, Faith E.; De Rosa, Pierluigi; Casagrande, Luca; Cenci, Michele

    2014-01-01

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods
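
    One geometric ingredient of such line-of-sight scoring, the shrinking of an object’s apparent size with observer distance, reduces to elementary trigonometry; the following is generic geometry for illustration, not r.wind.sun source code:

      import math

      def apparent_angle_deg(height_m: float, distance_m: float) -> float:
          """Vertical angle (degrees) subtended by an object of a given height."""
          return math.degrees(2 * math.atan(height_m / (2 * distance_m)))

      print(apparent_angle_deg(120, 1000))  # ~6.9 degrees for a 120 m turbine at 1 km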

  6. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr [Institut Universitaire Européen de la Mer, Université de Bretagne Occidentale, Rue Dumont D'Urville, 29280 Plouzané (France); Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it [National Research Council (CNR), Research Institute for Geo-hydrological Protection (IRPI), Strada della Madonna Alta 126, 06125 Perugia (Italy); Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk [Earth and Environmental Dynamics Research Group, Department of Geography, King's College London, Strand, London WC2R 2LS (United Kingdom); De Rosa, Pierluigi, E-mail: Pierluigi.Derosa@unipg.it [Physics and Geology Department, University of Perugia, Via Zefferino Faina 4, 06123 Perugia (Italy); Casagrande, Luca, E-mail: Luca.Casagrande@gfosservices.it [Gfosservices S.A., Open Source GIS-WebGIS Solutions, Spatial Data Infrastructures, Planning and Counseling, Via F.lli Cairoli 24, 06127 Perugia (Italy); Cenci, Michele, E-mail: mcenci@regione.umbria.it [Servizio Energia qualità dell'ambiente, rifiuti, attività estrattive, Regione Umbria, Corso Vannucci 96, 06121 Perugia (Italy)

    2014-11-15

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy-producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.

  7. Written narrative practices in elementary school students.

    Science.gov (United States)

    Romano-Soares, Soraia; Soares, Aparecido José Couto; Cárnio, Maria Silvia

    2010-01-01

    To promote a written narrative production program in the third grade of an elementary school, and to analyze two written narrative practice proposals in order to verify which resources are more efficient in benefitting the textual productions of third-grade students. Sixty students were selected from two third-grade groups of a public elementary school in São Paulo (Brazil). For the analysis, students were divided into two groups (Group A and Group B). Fourteen children's storybooks were used. In Group A, the story was orally told by the researchers in a colloquial manner, keeping the narrator role and the original structure proposed by the author. In Group B, the story was read in full. The book was projected onto a screen and read aloud so the students could follow the reading and observe the corresponding illustrations. Voice-changing resources were used in the characters' dialogues. In the overall comparison, statistically significant results were found for moment (initial and final assessments) and for the interaction between groups. Both groups presented substantial development from the initial to the final assessment. The written narratives promotion program based on the shared reading of children's storybooks constituted a more effective strategy than telling the stories using a single reader.

  8. Concreteness and Imagery Effects in the Written Composition of Definitions.

    Science.gov (United States)

    Sadoski, Mark; Kealy, William A.; Goetz, Ernest T.; Paivio, Allan

    1997-01-01

    In two experiments, undergraduates (n=48 and n=50) composed written definitions of concrete and abstract nouns that were matched for frequency of use and meaningfulness. Results support previous research suggesting that common cognitive mechanisms underlie production of spoken and written language as explained by dual coding theory. (SLD)

  9. Quantity and quality of written feedback, action plans, and student ...

    African Journals Online (AJOL)

    Background. Mini-clinical-evaluation exercise (mini-CEX) assessment forms that have been modified with the addition of specific spaces on separate sheets are expected to improve the quantity and quality of written feedback and the action plan for further learning which is agreed upon, and to encourage written reflection.

  10. Classifying Written Texts Through Rhythmic Features

    NARCIS (Netherlands)

    Balint, Mihaela; Dascalu, Mihai; Trausan-Matu, Stefan

    2016-01-01

    Rhythm analysis of written texts focuses on literary analysis and mainly considers poetry. In this paper we investigate the relevance of rhythmic features for categorizing texts in prosaic form pertaining to different genres. Our contribution is threefold. First, we define a set of rhythmic features…

  11. Written mathematical traditions in Ancient Mesopotamia

    DEFF Research Database (Denmark)

    Høyrup, Jens

    2015-01-01

    Writing, as well as various mathematical techniques, was created in proto-literate Uruk in order to serve accounting, and Mesopotamian mathematics as we know it was always expressed in writing. To that extent, mathematics generically regarded was always part of the generic written tradition….

  12. The Quality of Written Feedback by Attendings of Internal Medicine Residents.

    Science.gov (United States)

    Jackson, Jeffrey L; Kay, Cynthia; Jackson, Wilkins C; Frank, Michael

    2015-07-01

    Attending evaluations are commonly used to evaluate residents. Evaluate the quality of written feedback of internal medicine residents. Retrospective. Internal medicine residents and faculty at the Medical College of Wisconsin from 2004 to 2012. From monthly evaluations of residents by attendings, a randomly selected sample of 500 written comments by attendings was qualitatively coded and rated as high-, moderate-, or low-quality feedback by two independent coders with good inter-rater reliability (kappa: 0.94). Small group exercises with residents and attendings also coded the utterances as high, moderate, or low quality and developed criteria for this categorization. In-service examination scores were correlated with written feedback. There were 228 internal medicine residents who had 6,603 evaluations by 334 attendings. Among 500 randomly selected written comments, there were 2,056 unique utterances: 29% were coded as nonspecific statements, 20% were comments about resident personality, 16% about patient care, 14% interpersonal communication, 7% medical knowledge, 6% professionalism, and 4% each on practice-based learning and systems-based practice. Based on criteria developed by group exercises, the majority of written comments were rated as moderate quality (65%); 22% were rated as high quality and 13% as low quality. Attendings who provided high-quality feedback rated residents significantly lower in all six of the Accreditation Council for Graduate Medical Education (ACGME) competencies, and their negative written comments on medical knowledge correlated with lower in-service examination scores. Most attending written evaluation was of moderate or low quality. Attendings who provided high-quality feedback appeared to be more discriminating, providing significantly lower ratings of residents in all six ACGME core competencies, and across a greater range. Attendings' negative written comments on medical knowledge correlated with lower in-service training scores.

  13. Handbook of bibliometric indicators quantitative tools for studying and evaluating research

    CERN Document Server

    Todeschini, Roberto

    2016-01-01

    At last, the first systematic guide to the growing jungle of citation indices and other bibliometric indicators. Written with the aim of providing a complete and unbiased overview of all available statistical measures for scientific productivity, the core of this reference is an alphabetical dictionary of indices and other algorithms used to evaluate the importance and impact of researchers and their institutions. In 150 major articles, the authors describe all indices in strictly mathematical terms without passing judgement on their relative merit. From widely used measures, such as the journal impact factor or the h-index, to highly specialized indices, all indicators currently in use in the sciences and humanities are described, and their application explained. The introductory section and the appendix contain a wealth of valuable supporting information on data sources, tools and techniques for bibliometric and scientometric analysis - for individual researchers as well as their funders and publishers.

  14. [Alcohol advertising in written mass media in Spain].

    Science.gov (United States)

    Montes-Santiago, J; Alvarez Muñiz, M L; Baz Lomba, A

    2007-03-01

    Alcohol advertising is a powerful factor of incitement to consumption. We analyzed alcohol advertising, especially that focused on youth, in written mass media in Spain during the period 2002-2006. Annual cross-sectional study of advertisements in 41 widely circulated written mass media (average readership: 10.1 million). Overall, 29% of the media accepted alcohol advertising (2.9 million readers on average, 29% of total readers). Alcohol advertising constituted 3.8% of global advertising and 8.6% of the advertising in media accepting alcohol advertisements. In this period only 4% of the media (2.4% of total readers) inserted antidrug campaigns. In brief, three out of 10 total readers and one out of 12 people older than 15 years were exposed to alcohol advertising. Young people appeared in 33% of alcohol advertisements, and 3 out of 6 youth-oriented magazines permitted such advertising. Alcohol advertising remains high in written mass media in Spain. By contrast, few people received informative antidrug campaigns. Advertising was preferentially directed at young people.

  15. Promoting Strong Written Communication Skills

    Science.gov (United States)

    Narayanan, M.

    2015-12-01

    The reason that an improvement in the quality of technical writing is still needed in the classroom is that universities are facing challenging problems not only on the technological front but also on the socio-economic front. The universities are actively responding to the changes that are taking place in the global consumer marketplace. There are numerous benefits to promoting strong written communication skills. They can be summarized into the following six categories. First, and perhaps most important, the university achieves learner satisfaction: the learner has documented, verbally, that the necessary knowledge has been successfully acquired. This results in learner loyalty that in turn will attract more qualified learners. Second, quality communication lowers the cost per pupil, consequently resulting in increased productivity backed by a stronger economic structure and forecast. Third, quality communications help to improve the cash flow and cash reserves of the university. Fourth, having high-quality communication enables the university to justify the high costs of tuition and fees. Fifth, better quality in written communication skills results in attracting top-quality learners. This will lead to happier and satisfied learners, not to mention greater prosperity for the university as a whole. Sixth, quality written communication skills result in reduced complaints, thus meaning fewer hours spent on answering or correcting the situation. The university faculty and staff are thus able to devote more time to scholarly activities, meaningful research and productive community service. References Boyer, Ernest L. (1990). Scholarship reconsidered: Priorities of the professoriate. Princeton, NJ: Carnegie Foundation for the Advancement of Teaching. Hawkins, P., & Winter, J. (1997). Mastering change: Learning the lessons of the enterprise. London: Department for Education and Employment. Buzzel, Robert D., and Bradley T. Gale. (1987

  16. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    Directory of Open Access Journals (Sweden)

    Jennifer L. Docktor

    2016-05-01

    Full Text Available Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).
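
    Since the rubric scores five named processes, a scored solution maps naturally onto a small data structure. The sketch below is a minimal illustration assuming a 0-5 scale per category; the actual instrument's scale and criteria are defined in the paper, not here.

        # Minimal sketch: representing one student's scored solution under the
        # five-process rubric. The 0-5 scale is an assumption for illustration.
        RUBRIC_CATEGORIES = [
            "Useful Description",
            "Physics Approach",
            "Specific Application of Physics",
            "Mathematical Procedures",
            "Logical Progression",
        ]

        def summarize(scores: dict) -> float:
            """Return the mean rubric score across the five categories."""
            assert set(scores) == set(RUBRIC_CATEGORIES)
            return sum(scores.values()) / len(scores)

        example = dict(zip(RUBRIC_CATEGORIES, [4, 3, 3, 5, 4]))
        print(summarize(example))  # 3.8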

  17. The Bristol Radiology Report Assessment Tool (BRRAT): developing a workplace-based assessment tool for radiology reporting skills.

    Science.gov (United States)

    Wallis, A; Edey, A; Prothero, D; McCoubrie, P

    2013-11-01

    To review the development of a workplace-based assessment tool to assess the quality of written radiology reports and assess its reliability, feasibility, and validity. A comprehensive literature review and rigorous Delphi study enabled the development of the Bristol Radiology Report Assessment Tool (BRRAT), which consists of 19 questions and a global assessment score. Three assessors applied the assessment tool to 240 radiology reports provided by 24 radiology trainees. The reliability coefficient for the 19 questions was 0.79 and the equivalent coefficient for the global assessment scores was 0.67. Generalizability coefficients demonstrate that higher numbers of assessors and assessments are needed to reach acceptable levels of reliability for summative assessments, due to assessor subjectivity. The study methodology gives the tool good validity and a strong foundation in best practice. The assessment tool developed for radiology reporting is reliable and most suited to formative assessments. Copyright © 2013 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.

  18. 12 CFR 516.110 - Who may submit a written comment?

    Science.gov (United States)

    2010-01-01

    12 CFR § 516.110 (Banks and Banking; Office of Thrift Supervision, Department of the Treasury; Application Processing Procedures; Comment Procedures): Who may submit a written comment? Any person may submit a...

  19. 21 CFR 14.35 - Written submissions to an advisory committee.

    Science.gov (United States)

    2010-04-01

    21 CFR § 14.35 (Food and Drugs; Food and Drug Administration, Department of Health and Human Services): Written submissions to an advisory committee. ... of the written summary along with a proposed agenda outlining the topics to be covered and...

  20. Appropriating Written French: Literacy Practices in a Parisian Elementary Classroom

    Science.gov (United States)

    Rockwell, Elsie

    2012-01-01

    In this article, I examine French language instruction in an elementary classroom serving primarily children of Afro-French immigrants in Paris. I show that a prevalent French language ideology privileges written over oral expression and associates full mastery of written French with rational thought and full inclusion in the French polity. This…

  1. Building Authenticity in Social Media Tools to Recruit Postsecondary Students

    Science.gov (United States)

    Sandlin, Jean Kelso; Peña, Edlyn Vallejo

    2014-01-01

    An increasing number of institutions utilize social media tools, including student-written blogs, on their admission websites in an effort to enhance authenticity in their recruitment marketing materials. This study offers a framework for understanding what contributes to prospective college students' perceptions of social media authenticity…

  2. HITCal: a software tool for analysis of video head impulse test responses.

    Science.gov (United States)

    Rey-Martinez, Jorge; Batuecas-Caletrio, Angel; Matiño, Eusebi; Perez Fernandez, Nicolás

    2015-09-01

    The developed software (HITCal) may be a useful tool in the analysis and measurement of the saccadic video head impulse test (vHIT) responses and, with the experience obtained during its use, the authors suggest that HITCal is an excellent method for enhanced exploration of vHIT outputs. To develop a (software) method to analyze and explore the vHIT responses, mainly saccades. HITCal was written using a computational development program; the function to access a vHIT file was programmed; extended head impulse exploration and measurement tools were created; and an automated saccade analysis was developed using an experimental algorithm. For pre-release HITCal laboratory tests, a database of head impulse tests (HITs) was created with data collected retrospectively in three reference centers. This HITs database was evaluated by humans and was also computed with HITCal. The authors have successfully built HITCal and it has been released as open source software; the developed software was fully operative and all the proposed characteristics were incorporated in the released version. The automated saccade algorithm implemented in HITCal has good concordance with the assessment by human observers (Cohen's kappa coefficient = 0.7).
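
    The concordance figure quoted above is Cohen's kappa, κ = (p_o − p_e)/(1 − p_e), which corrects the observed agreement p_o for the agreement p_e expected by chance. A minimal sketch with invented toy labels (not vHIT data):

        # Minimal sketch of Cohen's kappa, the agreement statistic the
        # automated saccade detection was compared against human raters with.
        from collections import Counter

        def cohen_kappa(a, b):
            n = len(a)
            p_o = sum(x == y for x, y in zip(a, b)) / n       # observed agreement
            ca, cb = Counter(a), Counter(b)
            # chance agreement: sum over labels of the two marginal probabilities
            p_e = sum(ca[k] * cb[k] for k in ca.keys() | cb.keys()) / n ** 2
            return (p_o - p_e) / (1 - p_e)

        rater_1 = ["saccade", "saccade", "none", "saccade", "none", "none"]
        rater_2 = ["saccade", "none",    "none", "saccade", "none", "saccade"]
        print(round(cohen_kappa(rater_1, rater_2), 3))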

  3. Assessment of Written Expression Skills of University Students in Terms of Text Completion Technique

    Directory of Open Access Journals (Sweden)

    Abdulkadir KIRBAŞ

    2017-12-01

    Full Text Available Writing is to transfer the visualised ideas on the paper. Writing, one of the language skills, is a significant tool of communication which provides the permanency of information conveying emotions and thoughts. Since writing has both cognitive and physical aspects, it makes writing the hardest and the latest language skill to improve. The studies show that writing activity is the most difficult skill students have difficulty. In higher education, in order to improve writing skills of students and give basic information and skills about writing skills written expression, composition and writing education lessons are taught both in the department of Turkish Language and Literature and in the departments of Turkish Language in the Faculties of Education. One of the aims of these lessons is to teach students written expression techniques together with the purposes and practices. One of the written expression techniques is text completion skill that improves student’s creativity and enhances her/his imaginary world. The purpose of this study is to assess students’ skills of using text completion technique with reference to the writing studies of students in higher education. the sample of the study consists of 85 college students studying in the department of Turkish Language and Literature in Gümüşhane University in 2016-2017 academic year. The data of the study were obtained from the written expression studies of the students. The introduction part of the article ‘On Reading’ by F. Bacon was given to the students and they were required to complete the text. ‘Text Completion Rating Scale in Writing Expression’ was developed to assess the data of the study by taking opinions of lecturers and Turkish education experts. The data of the study were presented with percentage and frequency rates. At the end of the study, it was concluded that students had weakness in some skills such as writing an effective body part about the topic given

  4. Relating genes to function: identifying enriched transcription factors using the ENCODE ChIP-Seq significance tool.

    Science.gov (United States)

    Auerbach, Raymond K; Chen, Bin; Butte, Atul J

    2013-08-01

    Biological analysis has shifted from identifying genes and transcripts to mapping these genes and transcripts to biological functions. The ENCODE Project has generated hundreds of ChIP-Seq experiments spanning multiple transcription factors and cell lines for public use, but tools for a biomedical scientist to analyze these data are either non-existent or tailored to narrow biological questions. We present the ENCODE ChIP-Seq Significance Tool, a flexible web application leveraging public ENCODE data to identify enriched transcription factors in a gene or transcript list for comparative analyses. The ENCODE ChIP-Seq Significance Tool is written in JavaScript on the client side and has been tested on Google Chrome, Apple Safari and Mozilla Firefox browsers. Server-side scripts are written in PHP and leverage R and a MySQL database. The tool is available at http://encodeqt.stanford.edu. abutte@stanford.edu Supplementary material is available at Bioinformatics online.
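
    A standard way to score whether a transcription factor's ChIP-Seq targets are over-represented in a gene list is a hypergeometric test; the sketch below illustrates that general idea and is not necessarily the exact statistic this tool implements. All counts are invented.

        # Minimal sketch of a hypergeometric enrichment test for a TF's targets
        # in a gene list. Illustrative only; counts are toy values.
        from scipy.stats import hypergeom

        M = 20000   # genes in the background universe (assumed)
        K = 1500    # genes bound by the TF in ChIP-Seq data (assumed)
        N = 300     # genes in the user's input list (assumed)
        k = 60      # input genes that are also TF targets (assumed)

        # P(X >= k) is the survival function evaluated at k - 1.
        p_value = hypergeom.sf(k - 1, M, K, N)
        print(f"enrichment p-value: {p_value:.3g}")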

  5. Rotorcraft Optimization Tools: Incorporating Rotorcraft Design Codes into Multi-Disciplinary Design, Analysis, and Optimization

    Science.gov (United States)

    Meyn, Larry A.

    2018-01-01

    One of the goals of NASA's Revolutionary Vertical Lift Technology Project (RVLT) is to provide validated tools for multidisciplinary design, analysis and optimization (MDAO) of vertical lift vehicles. As part of this effort, the software package, RotorCraft Optimization Tools (RCOTOOLS), is being developed to facilitate incorporating key rotorcraft conceptual design codes into optimizations using the OpenMDAO multi-disciplinary optimization framework written in Python. RCOTOOLS, also written in Python, currently supports the incorporation of the NASA Design and Analysis of RotorCraft (NDARC) vehicle sizing tool and the Comprehensive Analytical Model of Rotorcraft Aerodynamics and Dynamics II (CAMRAD II) analysis tool into OpenMDAO-driven optimizations. Both of these tools use detailed, file-based inputs and outputs, so RCOTOOLS provides software wrappers to update input files with new design variable values, execute these codes and then extract specific response variable values from the file outputs. These wrappers are designed to be flexible and easy to use. RCOTOOLS also provides several utilities to aid in optimization model development, including Graphical User Interface (GUI) tools for browsing input and output files in order to identify text strings that are used to identify specific variables as optimization input and response variables. This paper provides an overview of RCOTOOLS and its use
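
    The wrapper pattern described above (substitute design-variable values into a text input file, run the external code, then scrape a response value from its output) can be sketched generically with the standard library. This is an illustration of the pattern only, not the actual RCOTOOLS implementation or the NDARC/CAMRAD II file formats; the placeholder token style and the "GrossWeight" output line are assumptions.

        # Minimal sketch of a file-based wrapper around an external analysis code.
        import re
        import subprocess
        from pathlib import Path

        def run_case(template: str, design_vars: dict, exe: list) -> float:
            text = Path(template).read_text()
            for name, value in design_vars.items():
                # replace placeholder tokens like {rotor_radius} in the template
                text = text.replace("{" + name + "}", str(value))
            Path("case.in").write_text(text)

            subprocess.run(exe + ["case.in"], check=True)

            out = Path("case.out").read_text()
            # assumed output line format: "GrossWeight = 12345.6"
            match = re.search(r"GrossWeight\s*=\s*([-\d.Ee+]+)", out)
            return float(match.group(1))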

  6. Evidence for a Limited-Cascading Account of Written Word Naming

    Science.gov (United States)

    Bonin, Patrick; Roux, Sebastien; Barry, Christopher; Canell, Laura

    2012-01-01

    We address the issue of how information flows within the written word production system by examining written object-naming latencies. We report 4 experiments in which we manipulate variables assumed to have their primary impact at the level of object recognition (e.g., quality of visual presentation of pictured objects), at the level of semantic…

  7. The connectome viewer toolkit: an open source framework to manage, analyze, and visualize connectomes.

    Science.gov (United States)

    Gerhard, Stephan; Daducci, Alessandro; Lemkaddem, Alia; Meuli, Reto; Thiran, Jean-Philippe; Hagmann, Patric

    2011-01-01

    Advanced neuroinformatics tools are required for methods of connectome mapping, analysis, and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration, and sharing. We have designed and implemented the Connectome Viewer Toolkit - a set of free and extensible open source neuroimaging tools written in Python. The key components of the toolkit are as follows: (1) The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. (2) The Connectome File Format Library enables management and sharing of connectome files. (3) The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration, and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/
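
    A connectome file of the kind described is an XML-annotated container; the sketch below shows how such a container could be opened with the Python standard library. The archive layout and element names here are assumptions for illustration; consult the Connectome File Format specification at cmtk.org for the real schema.

        # Minimal sketch: reading XML metadata out of a zip-based container.
        # File and element names are hypothetical.
        import xml.etree.ElementTree as ET
        import zipfile

        with zipfile.ZipFile("dataset.cff") as archive:      # assumed container name
            with archive.open("meta.cml") as handle:         # assumed metadata entry
                tree = ET.parse(handle)

        for network in tree.getroot().iter("CNetwork"):      # assumed element name
            print(network.get("name"), network.get("src"))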

  8. The Connectome Viewer Toolkit: an open source framework to manage, analyze and visualize connectomes

    Directory of Open Access Journals (Sweden)

    Stephan eGerhard

    2011-06-01

    Full Text Available Abstract Advanced neuroinformatics tools are required for methods of connectome mapping, analysis and visualization. The inherent multi-modality of connectome datasets poses new challenges for data organization, integration and sharing. We have designed and implemented the Connectome Viewer Toolkit --- a set of free and extensible open-source neuroimaging tools written in Python. The key components of the toolkit are as follows: 1. The Connectome File Format is an XML-based container format to standardize multi-modal data integration and structured metadata annotation. 2. The Connectome File Format Library enables management and sharing of connectome files. 3. The Connectome Viewer is an integrated research and development environment for visualization and analysis of multi-modal connectome data. The Connectome Viewer's plugin architecture supports extensions with network analysis packages and an interactive scripting shell, to enable easy development and community contributions. Integration with tools from the scientific Python community allows the leveraging of numerous existing libraries for powerful connectome data mining, exploration and comparison. We demonstrate the applicability of the Connectome Viewer Toolkit using Diffusion MRI datasets processed by the Connectome Mapper. The Connectome Viewer Toolkit is available from http://www.cmtk.org/.

  9. Modelling of contaminant transfers in a ventilated room in the near-field of an accidental emission source; Modelisation du transfert d'un aerocontaminant dans un local ventile en champ proche d'une source d'emission accidentelle

    Energy Technology Data Exchange (ETDEWEB)

    Guerra, D.

    2004-11-15

    Nowadays, predicting the space-time evolution of a pollutant released in a ventilated room housing a process operation remains hard to achieve. However, this prediction is imperative in hazardous activities, such as nuclear ones. The study consists of predicting the space-time evolution of airborne contaminant dispersion in the near field of the emission source around a workplace, following an accidental rupture of a containment enclosure. The whole work is based on gas tracing experiments and on multidimensional simulations using CFD tools. The proposed model is written as a correlated function of various parameters: leak geometry (slot or circular opening), emission type (continuous or puff), initial velocity and emission duration. The influence of ventilation and obstructions (room walls) has also been studied in the case of continuous leaks. All final models, for gaseous pollutants, are written as correlations inspired by the theory of free turbulent jet flows. These models are easy to use within the framework of safety evaluations dealing with radioactive material containment and radiological protection inside nuclear facilities. (author)
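
    The classical round-jet result such correlations build on is that, far from the orifice, the centerline concentration decays inversely with axial distance, C(x)/C0 ≈ k·d/x with k of order 5. A minimal sketch of that textbook relation (not the thesis's fitted correlations):

        # Classical free turbulent round-jet centerline dilution, C/C0 ~ k*d/x.
        # The constant k ≈ 5 is the usual textbook order of magnitude.
        def centerline_concentration(c0, d, x, k=5.0):
            """Centerline concentration of a round free turbulent jet.

            c0: source concentration, d: orifice diameter (m),
            x: axial distance (m); the relation holds for x >> d.
            """
            return c0 * min(1.0, k * d / x)

        # Example: 10 mm circular leak, read 0.5 m downstream.
        print(centerline_concentration(c0=1.0, d=0.01, x=0.5))  # -> 0.1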

  10. 37 CFR 251.43 - Written cases.

    Science.gov (United States)

    2010-07-01

    37 CFR § 251.43 (Copyright Arbitration Royalty Panel Rules and Procedures; Copyright Arbitration Royalty Panel Rules of Procedure; Procedures of Copyright Arbitration Royalty Panels): Written cases. (a) All parties who have filed a notice of... (d) In the case of a royalty fee distribution proceeding, each party... (cross and redirect) must be referenced.

  11. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: (a) redesign tools into robust, open source, reusable software for online and offline usage/enhancement; (b) share large datasets with remote collaborators and other users seamlessly and with security; (c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphic processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration and validation. Among the software engineering practices we follow are open source development (facilitating community contributions), modularity and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories that are privately shared by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  12. The association of patients' oral health literacy and dental school communication tools: a pilot study.

    Science.gov (United States)

    Tam, Amy; Yue, Olivia; Atchison, Kathryn A; Richards, Jessica K; Holtzman, Jennifer S

    2015-05-01

    The aim of this pilot study was to assess adult patients' ability to read and understand two communication tools at the University of California, Los Angeles, School of Dentistry: the dental school clinic website and a patient education brochure pertaining to sedation in children that was written by dental school personnel. A convenience sample of 100 adults seeking treatment at the school's general dental clinic during 2012-13 completed a health literacy screening instrument. They were then asked to read clinic educational and informational materials and complete a survey. Analyses were conducted to determine the association between the subjects' oral health literacy and sociodemographics and their ability to locate and interpret information in written oral health information materials. SMOG and Flesch-Kincaid formulas were used to assess the readability level of the electronic and written communication tools. The results demonstrated an association between these adults' oral health literacy and their dental knowledge and ability to navigate health information website resources and understand health education materials. Health literacy was not associated with age or gender, but was associated with education and race/ethnicity. The SMOG Readability Index determined that the website and the sedation form were written at a ninth grade reading level. These results suggest that dental schools and other health care organizations should incorporate a health-literate approach for their digital and written materials to enhance patients' ability to navigate and understand health information, regardless of their health literacy.
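
    The SMOG index used in the study is computed as grade = 1.0430 · sqrt(polysyllables · 30 / sentences) + 3.1291, where "polysyllables" counts words of three or more syllables. A minimal sketch follows; the vowel-group syllable counter is a naive heuristic that is good enough to illustrate the formula but is not a validated implementation.

        # Minimal sketch of the SMOG readability grade.
        import math
        import re

        def count_syllables(word: str) -> int:
            # naive heuristic: count groups of consecutive vowels
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def smog_grade(text: str) -> float:
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            polysyllables = sum(1 for w in words if count_syllables(w) >= 3)
            return 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

        print(round(smog_grade("Sedation requires careful preoperative "
                               "evaluation. Bring the consent form."), 1))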

  13. 34 CFR 32.9 - Written decision.

    Science.gov (United States)

    2010-07-01

    ... the employee has submitted the financial statement and written explanation required under § 32.4(c... stating the facts supporting the nature and origin of the debt and the hearing official's analysis... determination of the existence and the amount of the overpayment or the extreme financial hardship caused by the...

  14. GEMMER: GEnome-wide tool for Multi-scale Modeling data Extraction and Representation for Saccharomyces cerevisiae.

    Science.gov (United States)

    Mondeel, Thierry D G A; Crémazy, Frédéric; Barberis, Matteo

    2018-02-01

    Multi-scale modeling of biological systems requires integration of various information about genes and proteins that are connected together in networks. Spatial, temporal and functional information is available; however, it is still a challenge to retrieve and explore this knowledge in an integrated, quick and user-friendly manner. We present GEMMER (GEnome-wide tool for Multi-scale Modelling data Extraction and Representation), a web-based data-integration tool that facilitates high quality visualization of physical, regulatory and genetic interactions between proteins/genes in Saccharomyces cerevisiae. GEMMER creates network visualizations that integrate information on function, temporal expression, localization and abundance from various existing databases. GEMMER supports modeling efforts by effortlessly gathering this information and providing convenient export options for images and their underlying data. GEMMER is freely available at http://gemmer.barberislab.com. Source code, written in Python, JavaScript library D3js, PHP and JSON, is freely available at https://github.com/barberislab/GEMMER. M.Barberis@uva.nl. Supplementary data are available at Bioinformatics online. © The Author(s) 2018. Published by Oxford University Press.
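
    The annotated interaction network GEMMER renders can be pictured as a graph whose nodes carry gene annotations and whose edges carry the interaction type. A toy sketch with networkx follows; the two yeast genes and their annotations are minimal invented data, and GEMMER itself aggregates curated databases and renders with D3js.

        # Toy annotated interaction graph in the spirit of GEMMER's networks.
        import networkx as nx

        g = nx.Graph()
        g.add_node("CDC28", localization="nucleus", peak_phase="G1")  # assumed annotations
        g.add_node("CLN2", localization="nucleus", peak_phase="G1")
        g.add_edge("CDC28", "CLN2", interaction="physical")

        for u, v, data in g.edges(data=True):
            print(u, "-", v, data["interaction"])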

  15. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell el-Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for coding the points' names) are discussed here as well. Possibilities of future development are suggested (e.g. generalization of the points' name coding or more complex attribute table creation).
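
    The idea behind coded point names is that each surveyed point carries a code routing it to a vector layer. The "LAYER.NUMBER" convention in the sketch below is invented for illustration; the add-on's actual coding rules are documented with v.in.survey.

        # Minimal sketch: routing coded survey points to per-layer groups.
        from collections import defaultdict

        raw_points = [
            ("WALL.001", 612.40, 871.22, 101.95),   # invented sample data
            ("WALL.002", 613.10, 871.30, 101.97),
            ("OVEN.001", 609.75, 868.01, 101.80),
        ]

        layers = defaultdict(list)
        for name, x, y, z in raw_points:
            layer, _, _number = name.partition(".")  # assumed "LAYER.NUMBER" code
            layers[layer].append((x, y, z))

        for layer, pts in layers.items():
            print(layer, len(pts), "points")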

  16. The neutron porosity tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-01-01

    The report contains a review of available information on neutron porosity tools with the emphasis on dual thermal-neutron-detector porosity tools and epithermal-neutron-detector porosity tools. The general principle of such tools is discussed and theoretical models are very briefly reviewed. Available data on tool designs are summarized with special regard to the source-detector distance. Tool operational data, porosity determination and correction of measurements are briefly discussed. (author) 15 refs
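
    In a dual-detector tool, porosity is read off a calibration curve as a function of the near/far count-rate ratio. A minimal sketch with invented calibration points (real tools are calibrated in laboratory formations):

        # Minimal sketch of the dual-detector ratio-to-porosity lookup.
        import numpy as np

        # assumed calibration: (near/far count-rate ratio, porosity in p.u.)
        ratio_cal = np.array([1.5, 2.0, 2.5, 3.0, 3.5])
        poro_cal = np.array([5.0, 12.0, 21.0, 30.0, 40.0])

        def porosity_from_ratio(near_counts: float, far_counts: float) -> float:
            ratio = near_counts / far_counts
            return float(np.interp(ratio, ratio_cal, poro_cal))

        # ratio 2.6 falls between the 2.5 and 3.0 calibration points
        print(porosity_from_ratio(near_counts=5200.0, far_counts=2000.0))  # ~22.8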

  17. Financial analysis as a marketing tool in the process of awareness increase in the area of renewable energy sources

    Directory of Open Access Journals (Sweden)

    Marcela Taušová

    2007-04-01

    Full Text Available Alternative sources of energy represent a great area of progress nowadays. The trend of the 21. century is energetically demanding with an increaming tendency to use fossil fuels, sources of which are however limited. The article will deal with an inevitability of the use of marketing tools with the aim to increase the share of these energetical resources on the Slovak market. The result will be obtaining of some financial advantage for future users on one side and the increase of volume of sales for vendors on the other side.

  18. Open source tools for standardized privacy protection of medical images

    Science.gov (United States)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHIs should be completely removed from the images according to the respective privacy regulations, but some basic and alleviated data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit) utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
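
    The de-identification step itself (replacing PHI attributes with values that remain usable for interpretation and indexing) can be sketched with pydicom, a different open source library from the DCMTK framework the paper builds on. The tag selection below is illustrative only, not the full confidentiality profile of DICOM PS3.15.

        # Minimal sketch of DICOM de-identification with pydicom (not DCMTK).
        import pydicom

        ds = pydicom.dcmread("ct_slice.dcm")

        replacements = {
            "PatientName": "ANON",
            "PatientID": "SUBJ-0001",         # assumed study-specific index
            "PatientBirthDate": "19000101",   # placeholder kept plausible
        }
        for keyword, value in replacements.items():
            if keyword in ds:
                setattr(ds, keyword, value)

        ds.remove_private_tags()              # drop vendor-private elements
        ds.save_as("ct_slice_deid.dcm")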

  19. TH-C-12A-12: Veritas: An Open Source Tool to Facilitate User Interaction with TrueBeam Developer Mode

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, P [Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States); Varian Medical Systems, Palo Alto, CA (United States)]; Lewis, J [Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States)]; Etmektzoglou, T; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)]

    2014-06-15

    Purpose: To address the challenges of creating delivery trajectories and imaging sequences with TrueBeam Developer Mode, a new open-source graphical XML builder, Veritas, has been developed, tested and made freely available. Veritas eliminates most of the need to understand the underlying schema and write XML scripts, by providing a graphical menu for each control point specifying the state of 30 mechanical/dose axes. All capabilities of Developer Mode are accessible in Veritas. Methods: Veritas was designed using Qt Designer, a ‘what-you-see-is-what-you-get’ (WYSIWYG) tool for building graphical user interfaces (GUI). Different components of the GUI are integrated using Qt's signals and slots mechanism. Functionalities are added using PySide, an open source, cross platform Python binding for the Qt framework. The XML code generated is immediately visible, making it an interactive learning tool. A user starts from an anonymized DICOM file or XML example and introduces delivery modifications, or begins their experiment from scratch, then uses the GUI to modify control points as desired. The software automatically generates XML plans following the appropriate schema. Results: Veritas was tested by generating and delivering two XML plans at Brigham and Women's Hospital. The first example was created to irradiate the letter ‘B’ with a narrow MV beam using dynamic couch movements. The second was created to acquire 4D CBCT projections for four minutes. The delivery of the letter ‘B’ was observed using a 2D array of ionization chambers. Both deliveries were generated quickly in Veritas by non-expert Developer Mode users. Conclusion: We introduced a new open source tool Veritas for generating XML plans (delivery trajectories and imaging sequences). Veritas makes Developer Mode more accessible by reducing the learning curve for quick translation of research ideas into XML plans. Veritas is an open source initiative, creating the possibility for future
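
    Programmatic XML-plan generation of the kind Veritas automates can be sketched with the standard library; the element and attribute names below are invented for illustration and do not follow the actual TrueBeam Developer Mode schema.

        # Toy XML plan builder: two control points on hypothetical axes.
        import xml.etree.ElementTree as ET

        plan = ET.Element("Plan", name="demo")
        beam = ET.SubElement(plan, "Beam")
        for i, (gantry, couch, mu) in enumerate([(180, 0, 0), (180, 10, 50)]):
            cp = ET.SubElement(beam, "ControlPoint", index=str(i))
            ET.SubElement(cp, "GantryRtn").text = str(gantry)   # assumed axis tag
            ET.SubElement(cp, "CouchLat").text = str(couch)     # assumed axis tag
            ET.SubElement(cp, "MU").text = str(mu)              # assumed dose tag

        ET.ElementTree(plan).write("demo_plan.xml", xml_declaration=True,
                                   encoding="utf-8")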

  20. TH-C-12A-12: Veritas: An Open Source Tool to Facilitate User Interaction with TrueBeam Developer Mode

    International Nuclear Information System (INIS)

    Mishra, P; Lewis, J; Etmektzoglou, T; Svatos, M

    2014-01-01

    Purpose: To address the challenges of creating delivery trajectories and imaging sequences with TrueBeam Developer Mode, a new open-source graphical XML builder, Veritas, has been developed, tested and made freely available. Veritas eliminates most of the need to understand the underlying schema and write XML scripts, by providing a graphical menu for each control point specifying the state of 30 mechanical/dose axes. All capabilities of Developer Mode are accessible in Veritas. Methods: Veritas was designed using Qt Designer, a ‘what-you-see-is-what-you-get’ (WYSIWYG) tool for building graphical user interfaces (GUI). Different components of the GUI are integrated using Qt's signals and slots mechanism. Functionalities are added using PySide, an open source, cross platform Python binding for the Qt framework. The XML code generated is immediately visible, making it an interactive learning tool. A user starts from an anonymized DICOM file or XML example and introduces delivery modifications, or begins their experiment from scratch, then uses the GUI to modify control points as desired. The software automatically generates XML plans following the appropriate schema. Results: Veritas was tested by generating and delivering two XML plans at Brigham and Women's Hospital. The first example was created to irradiate the letter ‘B’ with a narrow MV beam using dynamic couch movements. The second was created to acquire 4D CBCT projections for four minutes. The delivery of the letter ‘B’ was observed using a 2D array of ionization chambers. Both deliveries were generated quickly in Veritas by non-expert Developer Mode users. Conclusion: We introduced a new open source tool Veritas for generating XML plans (delivery trajectories and imaging sequences). Veritas makes Developer Mode more accessible by reducing the learning curve for quick translation of research ideas into XML plans. Veritas is an open source initiative, creating the possibility for future

  1. Comprehension and Writing Strategy Training Improves Performance on Content-Specific Source-Based Writing Tasks

    Science.gov (United States)

    Weston-Sementelli, Jennifer L.; Allen, Laura K.; McNamara, Danielle S.

    2018-01-01

    Source-based essays are evaluated both on the quality of the writing and on the appropriate interpretation and use of source material. Hence, composing a high-quality source-based essay (an essay written from source material) relies on both reading skills (for the sources) and writing skills (for the essay). As such, source-based…

  2. Prosodic Parallelism – comparing spoken and written language

    Directory of Open Access Journals (Sweden)

    Richard Wiese

    2016-10-01

    Full Text Available The Prosodic Parallelism hypothesis claims that adjacent prosodic categories prefer identical branching of internal adjacent constituents. According to Wiese and Speyer (2015), this preference implies that feet contained in the same phonological phrase display either binary or unary branching, but not different types of branching. The seemingly free schwa-zero alternations at the end of some words in German make it possible to test this hypothesis. The hypothesis was successfully tested by conducting a corpus study which used large-scale bodies of written German. As some open questions remain, and as it is unclear whether Prosodic Parallelism is valid for the spoken modality as well, the present study extends this inquiry to spoken German. As in the previous study, the results of a corpus analysis recruiting a variety of linguistic constructions are presented. The Prosodic Parallelism hypothesis can be demonstrated to be valid for spoken German as well as for written German. The paper thus contributes to the question of whether prosodic preferences are similar between the spoken and written modes of a language. Some consequences of the results for the production of language are discussed.

  3. DISCOURSE AND PARTICIPATION IN ESL FACE-TO-FACE AND WRITTEN ELECTRONIC CONFERENCES

    Directory of Open Access Journals (Sweden)

    Michael Fitze

    2006-01-01

    Full Text Available This study was a comparative investigation of face-to-face and written electronic conferences. The participants were advanced English as a second language (hereafter: ESL students. The two types of conferences were compared in terms of textual features and participation. There was no statistically significant difference in the total number of words that students produced in an equivalent amount of time in the two types of conferences. The discourse in written electronic conferences displayed greater lexical range, and students in these conferences produced more discourse demonstrating interactive competence. The statistically significant finding of increased lexical range in written electronic conferences persisted even when the interactive discourse was eliminated from the conference transcripts and the transcripts were reanalyzed. This finding suggests that, during written electronic conferences, students were better able to use and practice a wider range of vocabulary related to the topics. For one of the groups, participation in written electronic conferences was more balanced among students, while for the other group participation was about equally balanced regardless of the conference setting. This last finding came as a surprise and points to a need for further research into variables that might mediate balanced participation in face-to-face and written electronic conferences.

  4. Towards a Theory of Vernacularisation: Insights from Written Chinese Vernaculars

    Science.gov (United States)

    Snow, Don

    2013-01-01

    This paper examines the history of four Chinese vernaculars which have developed written forms, and argues that five of the patterns Hanan identifies in the early development of Bai Hua can also be found in the early development of written Wu, Cantonese, and Minnan. In each of the cases studied, there is a clear pattern of early use of the…

  5. Word frequencies in written and spoken English based on the British National Corpus

    CERN Document Server

    Leech, Geoffrey; Wilson, Andrew (All Of Lancaster University)

    2014-01-01

    Word Frequencies in Written and Spoken English is a landmark volume in the development of vocabulary frequency studies. Whereas previous books have in general given frequency information about the written language only, this book provides information on both speech and writing. It not only gives information about the language as a whole, but also about the differences between spoken and written English, and between different spoken and written varieties of the language. The frequencies are derived from a wide-ranging and up-to-date corpus of English: the British National Corpus.

  6. Sharing clinical decisions for multimorbidity case management using social network and open-source tools.

    Science.gov (United States)

    Martínez-García, Alicia; Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Leal, Sandra; Parra, Carlos

    2013-12-01

    … The professionals valued positively all the items in the questionnaire. As part of the SCP, open source tools for CDS will be incorporated to provide recommendations for medication and problem interactions, as well as to calculate indexes or scales from validated questionnaires. They will receive the patient summary information provided by the regional Electronic Health Record system through a web service, with the information defined according to the virtual Medical Record specification. Clinical Wall has been developed to allow communication and coordination between the healthcare professionals involved in multimorbidity patient care. Agreed decisions concerned coordination of appointment changes, patient conditions, diagnostic tests, and prescription changes and renewals. The application of interoperability standards and open source software can bridge the gap between knowledge and clinical practice, while enabling interoperability and scalability. Open source combined with the social network encourages adoption and facilitates collaboration. Although the results obtained for the use indicators are still not as high as expected, based on the promising results obtained in the acceptance questionnaire of SMP, we expect that the new CDS tools will increase use by the health professionals. Copyright © 2013 Elsevier Inc. All rights reserved.

  7. OSeMOSYS: The Open Source Energy Modeling System

    International Nuclear Information System (INIS)

    Howells, Mark; Rogner, Holger; Strachan, Neil; Heaps, Charles; Huntington, Hillard; Kypreos, Socrates; Hughes, Alison; Silveira, Semida; DeCarolis, Joe; Bazillian, Morgan; Roehrl, Alexander

    2011-01-01

    This paper discusses the design and development of the Open Source Energy Modeling System (OSeMOSYS). It describes the model's formulation in terms of a 'plain English' description, algebraic formulation, and implementation, in terms of its full source code, as well as a detailed description of the model inputs, parameters, and outputs. A key feature of the OSeMOSYS implementation is that it is contained in less than five pages of documented, easily accessible code. Other existing energy system models without this emphasis on compactness and openness make the barrier to entry for new users much higher, as well as making the addition of innovative new functionality very difficult. The paper begins by describing the rationale for the development of OSeMOSYS and its structure. The current preliminary implementation of the model is then demonstrated for a discrete example. Next, we explain how new development efforts will build on the existing OSeMOSYS codebase. The paper closes with thoughts regarding the organization of the OSeMOSYS community, associated capacity development efforts, and linkages to other open source efforts including adding functionality to the LEAP model. - Highlights: → OSeMOSYS is a new free and open source energy systems model. → The model is written in a simple, open, flexible and transparent manner to support teaching. → OSeMOSYS is based on free software and optimizes using a free solver. → The model replicates the results of many popular tools, such as MARKAL. → A link between OSeMOSYS and LEAP has been developed.
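
    At its core, a model of this kind is a linear program: choose technology activity to meet demand at least cost subject to capacity limits. A toy two-variable sketch (not the actual OSeMOSYS formulation; all numbers invented):

        # Toy energy-dispatch LP in the spirit of open energy system models.
        from scipy.optimize import linprog

        # variables: x = [coal_GWh, solar_GWh]
        cost = [50.0, 70.0]            # assumed cost per GWh
        # meet 100 GWh of demand: coal + solar >= 100  ->  -x1 - x2 <= -100
        A_ub = [[-1.0, -1.0]]
        b_ub = [-100.0]
        bounds = [(0, 80),             # assumed coal capacity limit
                  (0, None)]           # solar unconstrained in this toy

        res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
        print(res.x, res.fun)          # -> [80. 20.] and 5400.0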

  8. 7 CFR 900.40 - Written testimony and USDA data request requirements.

    Science.gov (United States)

    2010-01-01

    ... and Nut Marketing Agreements and Marketing Orders § 900.40 Written testimony and USDA data request... 7 Agriculture 8 2010-01-01 2010-01-01 false Written testimony and USDA data request requirements...) AGRICULTURAL MARKETING SERVICE (Marketing Agreements and Orders; Fruits, Vegetables, Nuts), DEPARTMENT OF...

  9. Calculation of beam source geometry of electron accelerator for radiation technologies

    International Nuclear Information System (INIS)

    Balalykin, N.I.; Derendyaev, Yu.S.; Dolbilov, G.V.; Karlov, A.A.; Korenev, S.A.; Petrov, V.A.; Smolyakova, T.F.

    1994-01-01

    The ELLIPT and GRAFOR programmes, written in the FORTRAN language, were developed to calculate the geometry of an electron source. The programmes enable calculation of the electromagnetic field of the source and of the electron trajectories in the source under preset boundary and initial conditions. The GRAFOR programme allows electric field curves and the calculated trajectories of large particles to be displayed. 4 refs., 1 fig.
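
    Trajectory calculation of the kind these programmes perform can be sketched, in its simplest form, as stepwise integration of the electron equation of motion. The sketch below uses a uniform field and explicit Euler steps, whereas the real programmes solve the field and the trajectories self-consistently; the field strength and step sizes are assumed values.

        # Minimal sketch: electron trajectory in a uniform accelerating field.
        E_FIELD = 1.0e5                       # V/m, assumed uniform field
        Q_OVER_M = -1.602e-19 / 9.109e-31     # electron charge-to-mass ratio

        def trajectory(z0=0.0, v0=0.0, dt=1e-12, steps=1000):
            z, v = z0, v0
            path = [(0.0, z)]
            for i in range(1, steps + 1):
                v += Q_OVER_M * (-E_FIELD) * dt   # field oriented to accelerate
                z += v * dt
                path.append((i * dt, z))
            return path

        t, z = trajectory()[-1]
        print(f"after {t:.2e} s the electron has travelled {z:.4f} m")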

  10. Isotope ratio mass spectrometry as a tool for source inference in forensic science: A critical review.

    Science.gov (United States)

    Gentile, Natacha; Siegwolf, Rolf T W; Esseiva, Pierre; Doyle, Sean; Zollinger, Kurt; Delémont, Olivier

    2015-06-01

    Isotope ratio mass spectrometry (IRMS) has been used in numerous fields of forensic science in a source inference perspective. This review compiles the studies published on the application of isotope ratio mass spectrometry (IRMS) to the traditional fields of forensic science so far. It completes the review of Benson et al. [1] and synthesises the extent of knowledge already gathered in the following fields: illicit drugs, flammable liquids, human provenancing, microtraces, explosives and other specific materials (packaging tapes, safety matches, plastics, etc.). For each field, a discussion assesses the state of science and highlights the relevance of the information in a forensic context. Through the different discussions which mark out the review, the potential and limitations of IRMS, as well as the needs and challenges of future studies are emphasized. The paper elicits the various dimensions of the source which can be obtained from the isotope information and demonstrates the transversal nature of IRMS as a tool for source inference. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  11. Student Perception of Social Media as a Course Tool

    Science.gov (United States)

    McCarthy, Richard V.; McCarthy, Mary M.

    2014-01-01

    If a technology provides features that are useful then it will have a positive impact on performance. Social media has morphed into one of the preferred methods of communication for many people; much has been written to proclaim its benefits including its usefulness as a tool to help students achieve success within the classroom. But is it…

  12. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  13. Written argument underlying the Brokdorf verdict

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    In December 1979, the Schleswig administrative court delivered its judgment (AZ.: 10 A 512/76) against the plaintiffs (four neighbouring communities and nine individuals), who had brought an action against the first part-construction permit for the Brokdorf nuclear power plant, issued on October 25, 1976. In mid-March 1980, the written argument underlying this court ruling (58 pages) was sent out. The written argument conscientiously explains the reasoning of the court, which delivered its verdict after several days of oral proceedings in October and November 1979, and clearly states the position of the court with regard to the limits of control by administrative jurisdiction as well as to the controversial legal problem of whether there is a lawful connection between licensing in accordance with section 7, sub-section 2 of the AtG (Atomic Energy Act) and sufficient nuclear waste management provisions according to section 9a AtG. The court ruling declared the action to be substantially admissible but not well-founded. (orig./HP) [de

  14. The determinants of spoken and written picture naming latencies.

    Science.gov (United States)

    Bonin, Patrick; Chalard, Marylène; Méot, Alain; Fayol, Michel

    2002-02-01

    The influence of nine variables on the latencies to write down or to speak aloud the names of pictures taken from Snodgrass and Vanderwart (1980) was investigated in French adults. The major determinants of both written and spoken picture naming latencies were image variability, image agreement and age of acquisition. To a lesser extent, name agreement was also found to have an impact in both production modes. The implications of the findings for theoretical views of both spoken and written picture naming are discussed.

  15. Enhancing the Benefits of Written Emotional Disclosure through Response Training

    OpenAIRE

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R.; Vrana, Scott R.

    2013-01-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n = 113) or a neutral writing condit...

  16. Multi-stage ranking of emergency technology alternatives for water source pollution accidents using a fuzzy group decision making tool.

    Science.gov (United States)

    Qu, Jianhua; Meng, Xianlin; You, Hong

    2016-06-05

    Due to the increasing number of unexpected water source pollution events, selection of the most appropriate disposal technology for a specific pollution scenario is of crucial importance to the security of urban water supplies. However, the formulation of the optimum option is considerably difficult owing to the substantial uncertainty of such accidents. In this research, a multi-stage technical screening and evaluation tool is proposed to determine the optimal technique scheme, considering the areas of pollutant elimination both in drinking water sources and water treatment plants. In stage 1, a CBR-based group decision tool was developed to screen available technologies for different scenarios. Then, the threat degree caused by the pollution was estimated in stage 2 using a threat evaluation system and was partitioned into four levels. For each threat level, a corresponding set of technique evaluation criteria weights was obtained using Group-G1. To identify the optimization alternatives corresponding to the different threat levels, an extension of TOPSIS, a multi-criteria interval-valued trapezoidal fuzzy decision making technique containing the four arrays of criteria weights, to a group decision environment was investigated in stage 3. The effectiveness of the developed tool was elaborated by two actual thallium-contaminated scenarios associated with different threat levels. Copyright © 2016 Elsevier B.V. All rights reserved.
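
    The method extended in stage 3 builds on classic crisp TOPSIS: normalize the decision matrix, weight it, locate the ideal and anti-ideal points, and rank by relative closeness. A minimal sketch with invented scores and weights, treating all criteria as benefits (the paper's interval-valued trapezoidal fuzzy extension is considerably more involved):

        # Minimal sketch of classic (crisp) TOPSIS.
        import numpy as np

        X = np.array([[0.7, 0.5, 0.9],     # alternative 1 scored on 3 criteria
                      [0.6, 0.8, 0.4],
                      [0.9, 0.4, 0.6]])
        w = np.array([0.5, 0.3, 0.2])      # assumed criteria weights

        R = X / np.linalg.norm(X, axis=0)  # vector-normalize each criterion
        V = R * w                          # weighted normalized matrix
        ideal, anti = V.max(axis=0), V.min(axis=0)

        d_plus = np.linalg.norm(V - ideal, axis=1)    # distance to ideal
        d_minus = np.linalg.norm(V - anti, axis=1)    # distance to anti-ideal
        closeness = d_minus / (d_plus + d_minus)

        print("ranking (best first):", np.argsort(-closeness) + 1)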

  17. A collection of open source applications for mass spectrometry data mining.

    Science.gov (United States)

    Gallardo, Óscar; Ovelleiro, David; Gay, Marina; Carrascal, Montserrat; Abian, Joaquin

    2014-10-01

    We present several bioinformatics applications for the identification and quantification of phosphoproteome components by MS. These applications include a front-end graphical user interface that combines several Thermo RAW formats to MASCOT™ Generic Format extractors (EasierMgf), two graphical user interfaces for search engines OMSSA and SEQUEST (OmssaGui and SequestGui), and three applications, one for the management of databases in FASTA format (FastaTools), another for the integration of search results from up to three search engines (Integrator), and another one for the visualization of mass spectra and their corresponding database search results (JsonVisor). These applications were developed to solve some of the common problems found in proteomic and phosphoproteomic data analysis and were integrated in the workflow for data processing and feeding on our LymPHOS database. Applications were designed modularly and can be used standalone. These tools are written in Perl and Python programming languages and are supported on Windows platforms. They are all released under an Open Source Software license and can be freely downloaded from our software repository hosted at GoogleCode. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Development of a Matlab/Simulink tool to facilitate system analysis and simulation via the adjoint and covariance methods

    NARCIS (Netherlands)

    Bucco, D.; Weiss, M.

    2007-01-01

    The COVariance and ADjoint Analysis Tool (COVAD) is a specially designed software tool, written for the Matlab/Simulink environment, which allows the user the capability to carry out system analysis and simulation using the adjoint, covariance or Monte Carlo methods. This paper describes phase one

  19. Open Source Approach to Project Management Tools

    Directory of Open Access Journals (Sweden)

    Romeo MARGEA

    2011-01-01

    Full Text Available Managing large projects involving different groups of people and complex tasks can be challenging. The solution is to use project management software, which allows more efficient management of projects. However, well-known project management systems can be costly and may require expensive custom servers. Even if free software is not as complex as Microsoft Project, it is worth noting that not all projects need all the features, amenities and power of such systems. There are free and open source software alternatives that meet the needs of most projects, and that allow Web access based on different platforms and locations. A starting stage in adopting an OSS in-house is finding and identifying existing open source solutions. In this paper we present an overview of Open Source Project Management Software (OSPMS) based on articles, reviews, books and developers’ web sites, about those that seem to be the most popular software in this category.

  20. Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ R.; Gorjian, Varoujan; Rebull, Luisa M.; Masci, Frank J.; Fowler, John W.; Helou, George; Kulkarni, Shrinivas R.; Law, Nicholas M.

    2012-07-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It is a graphical user interface (GUI) designed to allow the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. The finely tuned layout of the GUI, along with judicious use of color-coding and alerting, is intended to give maximal user utility and convenience. Simply mouse-clicking on a source in the displayed image will instantly draw a circular or elliptical aperture and sky annulus around the source and will compute the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs with just the push of a button, including image histogram, x and y aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has many functions for customizing the calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source
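
    The core calculation the abstract describes (an aperture sum minus a local sky estimate taken from an annulus) can be sketched in a few lines. This is a hedged Python illustration, not APT's actual algorithm, and the uncertainty model is deliberately simplistic.

```python
# Sketch of basic circular-aperture photometry with an annulus sky
# estimate, in the spirit of what APT computes on a mouse click.
import numpy as np

def aperture_photometry(image, x0, y0, r_ap, r_in, r_out):
    yy, xx = np.indices(image.shape)
    r = np.hypot(xx - x0, yy - y0)
    in_aperture = r <= r_ap
    in_annulus = (r >= r_in) & (r <= r_out)
    sky_per_pixel = np.median(image[in_annulus])   # local background
    sky_sigma = np.std(image[in_annulus])          # its variability
    n_pix = in_aperture.sum()
    net = image[in_aperture].sum() - n_pix * sky_per_pixel
    # Simple uncertainty: sky scatter propagated over the aperture area.
    err = sky_sigma * np.sqrt(n_pix)
    return net, err

rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (64, 64))
img[30:34, 30:34] += 250.0                         # fake point source
print(aperture_photometry(img, 31.5, 31.5, r_ap=5, r_in=8, r_out=12))
```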

  1. SNP_tools: A compact tool package for analysis and conversion of genotype data for MS-Excel.

    Science.gov (United States)

    Chen, Bowang; Wilkening, Stefan; Drechsel, Marion; Hemminki, Kari

    2009-10-23

    Single nucleotide polymorphism (SNP) genotyping is a major activity in biomedical research. Scientists prefer to have easy access to the results, which may require conversions between data formats. First-hand SNP data is often entered in or saved in the MS-Excel format, but this software lacks genetic and epidemiological functions. A general tool to do basic genetic and epidemiological analysis and data conversion for MS-Excel is needed. The SNP_tools package is prepared as an add-in for MS-Excel. The code is written in Visual Basic for Applications, embedded in the Microsoft Office package. This add-in is an easy-to-use tool for users with basic computer knowledge (and requirements for basic statistical analysis). Our implementation for Microsoft Excel 2000-2007 in Microsoft Windows 2000, XP, Vista and Windows 7 beta can handle files in different formats and convert them into other formats. It is free software.
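
    The add-in itself is VBA, but the flavour of the "basic genetic analysis" described can be illustrated in Python: allele frequencies and a Hardy-Weinberg equilibrium chi-square computed from genotype counts (the counts below are made up).

```python
# Illustration (in Python rather than the add-in's VBA) of a standard
# basic genetic analysis: allele frequency and a Hardy-Weinberg
# equilibrium chi-square from observed genotype counts.
def hardy_weinberg_chi2(n_AA, n_Aa, n_aa):
    n = n_AA + n_Aa + n_aa
    p = (2 * n_AA + n_Aa) / (2 * n)          # frequency of allele A
    q = 1.0 - p
    expected = (p * p * n, 2 * p * q * n, q * q * n)
    observed = (n_AA, n_Aa, n_aa)
    chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
    return p, chi2                            # chi2 has 1 degree of freedom

p, chi2 = hardy_weinberg_chi2(298, 489, 213)  # invented counts
print(f"p(A) = {p:.3f}, chi-square = {chi2:.2f}")
```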

  2. Californium source transfer

    International Nuclear Information System (INIS)

    Wallace, C.R.

    1995-01-01

    In early 1995, the receipt of four sealed californium-252 sources from Oak Ridge National Lab was successfully accomplished by a team comprised of Radiological Engineering, Radiological Operations and Health Physics Instrumentation personnel. A procedure was developed and walked-down by the participants during a Dry Run Evolution. Several special tools were developed during the pre-planning phases of the project which reduced individual and job dose to minimal levels. These included a mobile lifting device for attachment of a transfer ball valve assembly to the undercarriage of the Cannonball Carrier, a transfer tube elbow to ensure proper angle of the source transfer tube, and several tools used during emergency response for remote retrieval and handling of an unshielded source. Lessons were learned in the areas of contamination control, emergency preparedness, and benefits of thorough pre-planning, effectiveness of locally creating and designing special tools to reduce worker dose, and methods of successfully accomplishing source receipt evolutions during extreme or inclement weather

  3. Increasing advertising power via written scent references

    NARCIS (Netherlands)

    Fenko, Anna; Breulmann, Svenja; Bialkova, Svetlana; Bialkova, Svetlana

    2014-01-01

    Olfactory cues in advertisements can evoke positive consumer emotions and product attitudes, yet including real scent in advertising is not always feasible. This study aimed at investigating whether written scent references could produce effects similar to real scents. Participants in online

  4. The GNAT: A new tool for processing NMR data.

    Science.gov (United States)

    Castañar, Laura; Poggetto, Guilherme Dal; Colbourne, Adam A; Morris, Gareth A; Nilsson, Mathias

    2018-06-01

    The GNAT (General NMR Analysis Toolbox) is a free and open-source software package for processing, visualising, and analysing NMR data. It supersedes the popular DOSY Toolbox, which has a narrower focus on diffusion NMR. Data import of most common formats from the major NMR platforms is supported, as well as a GNAT generic format. Key basic processing of NMR data (e.g., Fourier transformation, baseline correction, and phasing) is catered for within the program, as well as more advanced techniques (e.g., reference deconvolution and pure shift FID reconstruction). Analysis tools include DOSY and SCORE for diffusion data, ROSY T1/T2 estimation for relaxation data, and PARAFAC for multilinear analysis. The GNAT is written for the MATLAB® language and comes with a user-friendly graphical user interface. The standard version is intended to run with a MATLAB installation, but completely free-standing compiled versions for Windows, Mac, and Linux are also freely available. © 2018 The Authors Magnetic Resonance in Chemistry Published by John Wiley & Sons Ltd.
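
    GNAT is a MATLAB package; as a rough sketch of the basic processing chain mentioned (apodisation, Fourier transformation, phasing), here is a Python/NumPy version run on a synthetic FID. All parameters are illustrative.

```python
# Rough sketch of a minimal NMR processing chain: exponential
# apodisation, Fourier transform, zero-order phase correction.
import numpy as np

sw = 5000.0                       # spectral width in Hz (assumed)
n = 4096
t = np.arange(n) / sw
# Synthetic FID: two decaying resonances at +500 Hz and -750 Hz.
fid = (np.exp(1j * 2 * np.pi * 500 * t) +
       0.5 * np.exp(1j * 2 * np.pi * -750 * t)) * np.exp(-t / 0.3)

lb = 1.0                                    # line broadening, Hz
fid *= np.exp(-np.pi * lb * t)              # exponential window
spec = np.fft.fftshift(np.fft.fft(fid))
freq = np.fft.fftshift(np.fft.fftfreq(n, d=1.0 / sw))

phi0 = 0.0                                  # zero-order phase, rad
spec = spec * np.exp(-1j * phi0)
print(freq[np.argmax(spec.real)])           # tallest peak, ~500 Hz
```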

  5. A Comparison of Written, Vocal, and Video Feedback When Training Teachers

    Science.gov (United States)

    Luck, Kally M.; Lerman, Dorothea C.; Wu, Wai-Ling; Dupuis, Danielle L.; Hussein, Louisa A.

    2018-01-01

    We compared the effectiveness of and preference for different feedback strategies when training six special education teachers during a 5-day summer training program. In Experiment 1, teachers received written or vocal feedback while learning to implement two different types of preference assessments. In Experiment 2, we compared either written or…

  6. On Verification of PLC-Programs Written in the LD-Language

    Directory of Open Access Journals (Sweden)

    E. V. Kuzmin

    2012-01-01

    Full Text Available We discuss some questions connected with the construction of a technology of analysing correctness of Programmable Logic Controller programs. We consider an example of modeling and automated verification of PLC-programs written in the Ladder Diagram language (including timed function blocks of the IEC 61131-3 standard. We use the Cadence SMV for symbolic model checking. Program properties are written in the linear-time temporal logic LTL.
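
    As an illustration of the kind of LTL property checked against such programs (these formulas are generic examples, not taken from the paper):

```latex
% G = always, F = eventually.
% Safety: the motor output and an open interlock never hold together.
\mathbf{G}\,\neg(\mathit{motor\_on} \land \mathit{interlock\_open})
% Response: every start request is eventually followed by the timer output.
\mathbf{G}\,(\mathit{start} \rightarrow \mathbf{F}\,\mathit{timer\_done})
```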

  7. How Do Surgery Students Use Written Language to Say What They See? A Framework to Understand Medical Students' Written Evaluations of Their Teachers.

    Science.gov (United States)

    Lim, David W; White, Jonathan S

    2015-11-01

    There remains debate regarding the value of the written comments that medical students are traditionally asked to provide to evaluate the teaching they receive. The purpose of this study was to examine written teaching evaluations to understand how medical students conceptualize teachers' behaviors and performance. All written comments collected from medical students about teachers in the two surgery clerkships at the University of Alberta in 2009-2010 and 2010-2011 were collated and anonymized. A grounded theory approach was used for analysis, with iterative reading and open coding to identify recurring themes. A framework capturing variations observed in the data was generated until data saturation was achieved. Domains and subdomains were named using an in situ coding approach. The conceptual framework contained three main domains: "Physician as Teacher," "Physician as Person," and "Physician as Physician." Under "Physician as Teacher," students commented on specific acts of teaching and subjective perceptions of an educator's teaching values. Under the "Physician as Physician" domain, students commented on elements of their educator's physicianship, including communication and collaborative skills, medical expertise, professionalism, and role modeling. Under "Physician as Person," students commented on how both positive and negative personality traits impacted their learning. This framework describes how medical students perceive their teachers and how they use written language to attach meaning to the behaviors they observe. Such a framework can be used to help students provide more constructive feedback to teachers and to assist in faculty development efforts aimed at improving teaching performance.

  8. Comparisons between written and computerised patient histories

    NARCIS (Netherlands)

    Quaak, Martien; Westerman, R. Frans; van Bemmel, Jan H.

    1987-01-01

    Patient histories were obtained from 99 patients in three different ways: by a computerised patient interview (patient record), by the usual written interview (medical record), and by the transcribed record, which was a computerised version of the medical record. Patient complaints, diagnostic

  9. 14 CFR 302.207 - Cases to be decided on written submissions.

    Science.gov (United States)

    2010-01-01

    ... administrative law judge is otherwise required by the public interest. (b) The standards employed in deciding... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Cases to be decided on written submissions....207 Cases to be decided on written submissions. (a) Applications under this subpart will be decided on...

  10. Assessing the suitability of written stroke materials: an evaluation of the interrater reliability of the suitability assessment of materials (SAM) checklist.

    Science.gov (United States)

    Hoffmann, Tammy; Ladner, Yvette

    2012-01-01

    Written materials are frequently used to provide education to stroke patients and their carers. However, poor quality materials are a barrier to effective information provision. A quick and reliable method of evaluating material quality is needed. This study evaluated the interrater reliability of the Suitability Assessment of Materials (SAM) checklist in a sample of written stroke education materials. Two independent raters evaluated the materials (n = 25) using the SAM, and ratings were analyzed to reveal total percentage agreements and weighted kappa values for individual items and overall SAM rating. The majority of the individual SAM items had high interrater reliability, with 17 of the 22 items achieving substantial, almost perfect, or perfect weighted kappa value scores. The overall SAM rating achieved a weighted kappa value of 0.60, with a percentage total agreement of 96%. Health care professionals should evaluate the content and design characteristics of written education materials before using them with patients. A tool such as the SAM checklist can be used; however, raters should exercise caution when interpreting results from items with more subjective scoring criteria. Refinements to the scoring criteria for these items are recommended. The value of the SAM is that it can be used to identify specific elements that should be modified before education materials are provided to patients.
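
    The reliability statistic used here is easy to reproduce. Below is a minimal sketch using scikit-learn's cohen_kappa_score, assuming two raters scoring SAM items on its 0/1/2 (not suitable/adequate/superior) scale; the ratings are invented.

```python
# Weighted kappa of the kind reported for the SAM interrater analysis.
from sklearn.metrics import cohen_kappa_score

rater_1 = [2, 1, 2, 0, 1, 2, 2, 1, 0, 2]   # SAM item scores, rater 1
rater_2 = [2, 1, 1, 0, 1, 2, 2, 2, 0, 2]   # same items, rater 2
kappa = cohen_kappa_score(rater_1, rater_2, weights="linear")
print(f"linearly weighted kappa = {kappa:.2f}")
```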

  11. AssesSeg—A Command Line Tool to Quantify Image Segmentation Quality: A Test Carried Out in Southern Spain from Satellite Imagery

    Directory of Open Access Journals (Sweden)

    Antonio Novelli

    2017-01-01

    Full Text Available This letter presents the capabilities of a command line tool created to assess the quality of segmented digital images. The executable source code, called AssesSeg, was written in Python 2.7 using open source libraries. AssesSeg (University of Almeria, Almeria, Spain; Politecnico di Bari, Bari, Italy implements a modified version of the supervised discrepancy measure named Euclidean Distance 2 (ED2 and was tested on different satellite images (Sentinel-2, Landsat 8, and WorldView-2. The segmentation was applied to plastic covered greenhouse detection in the south of Spain (Almería. AssesSeg outputs were utilized to find the best band combinations for the performed segmentations of the images and showed a clear positive correlation between segmentation accuracy and the quantity of available reference data. This demonstrates the importance of a high number of reference data in supervised segmentation accuracy assessment problems.

  12. APT: Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including image histogram, and aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.

  13. Optical Tooling and its Uses at the Spallation Neutron Source (SNS)

    CERN Document Server

    Helus, Scott; Error, Joseph; Fazekas, Julius; Maines, James

    2005-01-01

    Optical tooling has been a mainstay of the accelerator alignment community for decades. Even now in the age of electronic survey equipment, optical tooling remains a viable alternative, and at times the only alternative. At SNS, we combine traditional optical tooling alignment methods, instrumentation, and techniques, with the more modern electronic techniques. This paper deals with the integration of optical tooling into the electronic survey world.

  14. Written Mathematical Traditions in Ancient Mesopotamia: Knowledge, ignorance, and reasonable guesses

    DEFF Research Database (Denmark)

    Høyrup, Jens

    Writing, as well as various mathematical techniques, were created in proto-literate Uruk in order to serve accounting, and Mesopotamian mathematics as we know it was always expressed in writing. In so far, mathematics generically regarded was always part of the generic written tradition. However, once we move away from the generic perspective, things become much less easy. If we look at basic numeracy from Uruk IV until Ur III, it is possible to point to continuity and thus to a “tradition”, and also if we look at place-value practical computation from Ur III onward – but already the relation of the latter tradition to type of writing after the Old Babylonian period is not well elucidated by the sources. Much worse, however, is the situation if we consider the sophisticated mathematics created during the Old Babylonian period. Its connection to the school institution and the new literate style...

  15. A Tool for Longitudinal Beam Dynamics in Synchrotrons

    Energy Technology Data Exchange (ETDEWEB)

    Ostiguy, J.-F. [Fermilab; Lebedev, V. A. [Fermilab

    2017-05-01

    A number of codes are available to simulate longitudinal dynamics in synchrotrons. Some established ones include TIBETAN, LONG1D, ESME and ORBIT. While they embody a wealth of accumulated wisdom and experience, most of these codes were written decades ago and to some extent they reflect the constraints of their time. As a result, there is an interest for updated tools taking better advantage of modern software and hardware capabilities. At Fermilab, the PIP-II project has provided the impetus for development of such a tool. In this contribution, we discuss design decisions and code architecture. A selection of test cases based on an initial prototype are also presented.
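
    The underlying physics such codes integrate can be shown with the textbook single-harmonic longitudinal map (one RF kick plus one phase drift per turn). This Python sketch is not the tool described in the paper, and the parameters are invented, not PIP-II values.

```python
# Textbook single-harmonic longitudinal tracking map: phase phi and
# fractional momentum deviation delta, updated once per turn.
import numpy as np

h, eta = 588, -0.002   # harmonic number and slip factor (assumed;
                       # sign chosen so phi_s = 0 is the stable point)
eV_over_E = 1e-6       # eV_rf / (beta^2 * E_s) per turn (assumed)
phi_s = 0.0            # synchronous phase, stationary bucket

def track(phi, delta, n_turns):
    for _ in range(n_turns):
        delta += eV_over_E * (np.sin(phi) - np.sin(phi_s))  # RF kick
        phi += 2 * np.pi * h * eta * delta                  # drift
    return phi, delta

# A particle started off-phase executes synchrotron oscillations.
print(track(0.3, 0.0, 1000))
```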

  16. Short message service (SMS language and written language skills: educators' perspectives

    Directory of Open Access Journals (Sweden)

    Salomé Geertsema

    2011-01-01

    Full Text Available SMS language is English language slang, used as a means of mobile phone text messaging. This practice may impact on the written language skills of learners at school. The main aim of this study was to determine the perspectives of Grade 8 and 9 English (as Home Language) educators in Gauteng regarding the possible influence of SMS language on certain aspects of learners' written language skills. If an influence was perceived by the educators, their perceptions regarding the degree and nature of the influence were also explored. A quantitative research design, utilising a questionnaire, was employed. The sample of participants comprised 22 educators employed at independent secondary schools within Gauteng, South Africa. The results indicated that the majority of educators viewed SMS language as having a negative influence on the written language skills of Grade 8 and 9 learners. The influence was perceived as occurring in the learners' spelling, punctuation, and sentence length. A further finding was that the majority of educators address the negative influences of SMS language when encountered in written tasks.

  17. Open Source and Proprietary Project Management Tools for SMEs.

    OpenAIRE

    Veronika Abramova; Francisco Pires; Jorge Bernardino

    2017-01-01

    The dimensional growth and increasing difficulty in project management promoted the development of different tools that serve to facilitate project management and track project schedule, resources and overall progress. These tools offer a variety of features, from task and time management, up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management software is available, to assist project team during t...

  18. Visualization tool for three-dimensional plasma velocity distributions (ISEE_3D) as a plug-in for SPEDAS

    Science.gov (United States)

    Keika, Kunihiro; Miyoshi, Yoshizumi; Machida, Shinobu; Ieda, Akimasa; Seki, Kanako; Hori, Tomoaki; Miyashita, Yukinaga; Shoji, Masafumi; Shinohara, Iku; Angelopoulos, Vassilis; Lewis, Jim W.; Flores, Aaron

    2017-12-01

    This paper introduces ISEE_3D, an interactive visualization tool for three-dimensional plasma velocity distribution functions, developed by the Institute for Space-Earth Environmental Research, Nagoya University, Japan. The tool provides a variety of methods to visualize the distribution function of space plasma: scatter, volume, and isosurface modes. The tool also has a wide range of functions, such as displaying magnetic field vectors and two-dimensional slices of distributions to facilitate extensive analysis. The coordinate transformation to magnetic field coordinates is also implemented in the tool. The source code of the tool is written as scripts in Interactive Data Language, a data analysis language widely used in the fields of space physics and solar physics. The current version of the tool can be used for data files of the plasma distribution function from the Geotail satellite mission, which are publicly accessible through the Data Archives and Transmission System of the Institute of Space and Astronautical Science (ISAS)/Japan Aerospace Exploration Agency (JAXA). The tool is also available in the Space Physics Environment Data Analysis Software to visualize plasma data from the Magnetospheric Multiscale and the Time History of Events and Macroscale Interactions during Substorms missions. The tool is planned to be applied to data from other missions, such as Arase (ERG) and Van Allen Probes, after replacing or adding data loading plug-ins. This visualization tool helps scientists understand the dynamics of space plasma better, particularly in the regions where the magnetohydrodynamic approximation is not valid, for example, the Earth's inner magnetosphere, magnetopause, bow shock, and plasma sheet.
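
    The coordinate transformation mentioned (rotating velocities into a magnetic-field-aligned basis) reduces to building an orthonormal triad from the measured B vector. A hedged Python sketch of that step, not the ISEE_3D/IDL source:

```python
# Rotate velocity vectors into a field-aligned coordinate (FAC) basis
# with e_par along B; illustrative only.
import numpy as np

def to_field_aligned(velocities, b_vec):
    """velocities: (n, 3) array; b_vec: 3-vector of the local B field."""
    e_par = b_vec / np.linalg.norm(b_vec)
    # Any reference direction not parallel to B works to seed the basis.
    ref = np.array([1.0, 0.0, 0.0])
    if abs(e_par @ ref) > 0.99:
        ref = np.array([0.0, 1.0, 0.0])
    e_perp1 = np.cross(e_par, ref)
    e_perp1 /= np.linalg.norm(e_perp1)
    e_perp2 = np.cross(e_par, e_perp1)
    R = np.vstack([e_perp1, e_perp2, e_par])   # rows = new basis vectors
    return velocities @ R.T                    # (n, 3) in the FAC frame

v = np.array([[400.0, -50.0, 30.0]])           # km/s, invented
print(to_field_aligned(v, np.array([0.0, 0.0, -5.0])))  # B in nT, invented
```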

  19. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    Science.gov (United States)

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection revealed 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
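
    The quantities reported (lag time, maximum specific growth rate, doubling time) are typically obtained by regression on log-transformed OD readings. Here is a minimal sketch of that idea in Python, assuming a sliding-window linear fit; it is an illustration, not the published package.

```python
# Sliding-window regression on ln(OD): the maximum slope gives mu_max,
# ln(2)/mu_max the doubling time, and the tangent-line intercept with
# the initial ln(OD) a simple lag-time estimate.
import numpy as np

def growth_parameters(time_h, od, window=5):
    ln_od = np.log(od)
    best_mu, best_i = -np.inf, 0
    for i in range(len(od) - window + 1):
        mu = np.polyfit(time_h[i:i + window], ln_od[i:i + window], 1)[0]
        if mu > best_mu:
            best_mu, best_i = mu, i
    doubling = np.log(2) / best_mu
    mid = best_i + window // 2
    lag = time_h[mid] - (ln_od[mid] - ln_od[0]) / best_mu
    return best_mu, doubling, lag

t = np.linspace(0, 10, 41)                        # hours
od = 0.05 + 0.5 / (1 + np.exp(-(t - 5) / 0.8))    # synthetic growth curve
print(growth_parameters(t, od))
```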

  20. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    KAUST Repository

    Neumann, Philipp

    2012-06-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular - continuum simulations in terms of both realisation of the respective coupling algorithm as well as parallel execution of the hybrid simulation. We describe the implementational concept of the tool and its parallel extensions. We particularly focus on the parallel execution of particle insertions into dense molecular systems and propose a respective parallel algorithm. Our implementations are validated for serial and parallel setups in two and three dimensions. © 2012 IEEE.
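
    The particle-insertion problem the abstract mentions can be illustrated naively: when mass enters the MD region from the continuum side, find a position whose potential energy is low enough that inserting a particle does not destabilise the dynamics. The sketch below uses random trials against a Lennard-Jones energy in reduced units; production schemes use guided searches instead, and the paper's parallel algorithm is more elaborate.

```python
# Toy energy-criterion insertion trial (random search, LJ potential,
# periodic box); illustrative only, not the tool's algorithm.
import numpy as np

def try_insert(positions, box, e_max, rng, n_trials=1000):
    """Return a point whose LJ energy is below e_max, or None."""
    def lj_energy(p):
        d = positions - p
        d -= box * np.round(d / box)              # minimum-image wrap
        r2 = np.maximum((d * d).sum(axis=1), 1e-12)
        inv6 = 1.0 / r2**3
        return 4.0 * np.sum(inv6 * inv6 - inv6)   # epsilon = sigma = 1
    for _ in range(n_trials):
        p = rng.uniform(0.0, box, size=3)
        if lj_energy(p) < e_max:
            return p
    return None

rng = np.random.default_rng(1)
pos = rng.uniform(0.0, 8.0, size=(200, 3))        # moderately dense box
print(try_insert(pos, 8.0, e_max=1.0, rng=rng))
```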

  1. Tools for open geospatial science

    Science.gov (United States)

    Petras, V.; Petrasova, A.; Mitasova, H.

    2017-12-01

    Open science uses open source to deal with reproducibility challenges in data and computational sciences. However, just using open source software or making the code public does not make the research reproducible. Moreover, the scientists face the challenge of learning new unfamiliar tools and workflows. In this contribution, we will look at a graduate-level course syllabus covering several software tools which make validation and reuse by a wider professional community possible. For the novices in the open science arena, we will look at how scripting languages such as Python and Bash help us reproduce research (starting with our own work). Jupyter Notebook will be introduced as a code editor, data exploration tool, and a lab notebook. We will see how Git helps us not to get lost in revisions and how Docker is used to wrap all the parts together using a single text file so that figures for a scientific paper or a technical report can be generated with a single command. We will look at examples of software and publications in the geospatial domain which use these tools and principles. Scientific contributions to GRASS GIS, a powerful open source desktop GIS and geoprocessing backend, will serve as an example of why and how to publish new algorithms and tools as part of a bigger open source project.

  2. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    Science.gov (United States)

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  3. English tsotsitaals? − an analysis of two written texts in Surfspeak ...

    African Journals Online (AJOL)

    ... medium of English; (b) give an appreciation of the humour, wit and style associated with English tsotsitaals, via the analysis of two written texts; and (c) show the limitations of tsotsitaals in extended written usage, for which they have to co-exist with more mainstream forms of the dialect of English they utilise for their base.

  4. Developmental perspectives in written language and literacy: In honor of Ludo Verhoeven

    NARCIS (Netherlands)

    Segers, P.C.J.; Broek, P.W. van den

    2017-01-01

    Research on the development on written language and literacy is inherently multidisciplinary. In this book, leading researchers studying brain, cognition and behavior, come together in revealing how children develop written language and literacy, why they may experience difficulties, and which

  5. Clinical presentation of women with pelvic source varicose veins in the perineum as a first step in the development of a disease-specific patient assessment tool.

    Science.gov (United States)

    Gibson, Kathleen; Minjarez, Renee; Ferris, Brian; Neradilek, Moni; Wise, Matthew; Stoughton, Julianne; Meissner, Mark

    2017-07-01

    Pelvic venous incompetence can cause symptomatic varicose veins in the perineum, buttock, and thigh. Presentation, symptom severity, and response to treatment of pelvic source varicose veins are not well defined. Currently available tools to measure the severity of lower extremity venous disease and its effects on quality of life may be inadequate to assess disease severity in these patients. The purpose of this study was to evaluate the histories, demographics, and clinical presentations of women with pelvic source varicose veins and to compare these data to a population of women with nonpelvic source varicose veins. A total of 72 female patients with symptomatic pelvic source varicose veins were prospectively followed up. Age, weight, height, parity, and birth weights of offspring were recorded. Both pelvic source varicose veins and saphenous incompetence were identified by duplex ultrasound. Patients were queried as to their primary symptoms, activities that made their symptoms worse, and time when their symptoms were most prominent. Severity of disease was objectively evaluated using the revised Venous Clinical Severity Score (rVCSS) and 10-point numeric pain rating scale (NPRS). Compared with women without a pelvic source of varicose veins (N = 1163), patients with pelvic source varicose veins were younger (mean, 44.6 ± 8.6 vs 52.6 ± 12.9 years; P source varicose veins are a unique subset of patients. They are younger and thinner than those with nonpelvic source varicose veins, have larger infants than the general U.S. population, and have an inverse correlation between age and pain. As the majority of premenopausal patients have increased symptoms during menses, this may be due to hormonal influence. As it is poorly associated with patient-reported discomfort, the rVCSS is a poor tool for evaluating pelvic source varicose veins. A disease-specific tool for the evaluation of pelvic source varicose veins is critically needed, and this study is a first

  6. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and the TCP/IP communications protocol, using CPS (Cooperative Processes Software), the SPICE program and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for larger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or on Exabyte tape for large amounts of data. HEP tools also help circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)
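
    The statistical idea is Monte Carlo over component tolerances: draw component values, simulate the circuit for each draw, and analyse the distribution of the response. In miniature, with a plain RC low-pass standing in for a SPICE deck (all values and tolerances invented):

```python
# Monte Carlo tolerance analysis in miniature: scatter R and C within
# tolerance and histogram the resulting cutoff frequency.
import numpy as np

rng = np.random.default_rng(42)
n_runs = 10_000
R = rng.normal(10e3, 0.01 * 10e3 / 3, n_runs)       # 10 kohm, ~1% tol.
C = rng.normal(100e-9, 0.05 * 100e-9 / 3, n_runs)   # 100 nF, ~5% tol.
f_c = 1.0 / (2 * np.pi * R * C)                     # cutoff per sample

print(f"f_c = {f_c.mean():.1f} Hz +/- {f_c.std():.1f} Hz")
print(f"fraction outside 150-170 Hz: "
      f"{np.mean((f_c < 150) | (f_c > 170)):.3f}")
```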

  7. Learners' right to freedom of written expression

    African Journals Online (AJOL)

    Erna Kinsey

    Learners' right to freedom of written expression. W.J. van Vollenhoven. Department of Education Management and Policy Studies, University of Pretoria, Pretoria, 0002 South Africa wvvollen@postino.up.ac.za. Charles I. Glenn. Training and Policy Studies of the University Professors' Program, University of Boston. Although ...

  8. Low literacy and written drug information: information-seeking, leaflet evaluation and preferences, and roles for images.

    Science.gov (United States)

    van Beusekom, Mara M; Grootens-Wiegers, Petronella; Bos, Mark J W; Guchelaar, Henk-Jan; van den Broek, Jos M

    2016-12-01

    Background Low-literate patients are at risk to misinterpret written drug information. For the (co-) design of targeted patient information, it is key to involve this group in determining their communication barriers and information needs. Objective To gain insight into how people with low literacy use and evaluate written drug information, and to identify ways in which they feel the patient leaflet can be improved, and in particular how images could be used. Setting Food banks and an education institution for Dutch language training in the Netherlands. Method Semi-structured focus groups and individual interviews were held with low-literate participants (n = 45). The thematic framework approach was used for analysis to identify themes in the data. Main outcome measure Low-literate people's experience with patient information leaflets, ideas for improvements, and perceptions on possible uses for visuals. Results Patient information leaflets were considered discouraging to use, and information difficult to find and understand. Many rely on alternative information sources. The leaflet should be shorter, and improved in terms of organisation, legibility and readability. Participants thought images could increase the leaflet's appeal, help ask questions, provide an overview, help understand textual information, aid recall, reassure, and even lead to increased confidence, empowerment and feeling of safety. Conclusion Already at the stages of paying attention to the leaflet and maintaining interest in the message, low-literate patients experience barriers in the communication process through written drug information. Short, structured, visual/textual explanations can lower the motivational threshold to use the leaflet, improve understanding, and empower the low-literate target group.

  9. Tools for Authentication

    Energy Technology Data Exchange (ETDEWEB)

    White, G

    2008-07-09

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work.

  10. Tools for Authentication

    International Nuclear Information System (INIS)

    White, G.

    2008-01-01

    Many recent Non-proliferation and Arms Control software projects include a software authentication component. In this context, 'authentication' is defined as determining that a software package performs only its intended purpose and performs that purpose correctly and reliably over many years. In addition to visual inspection by knowledgeable computer scientists, automated tools are needed to highlight suspicious code constructs both to aid the visual inspection and to guide program development. While many commercial tools are available for portions of the authentication task, they are proprietary, and have limited extensibility. An open-source, extensible tool can be customized to the unique needs of each project (projects can have both common and custom rules to detect flaws and security holes). Any such extensible tool must be based on a complete language compiler infrastructure, that is, one that can parse and digest the full language through its standard grammar. ROSE is precisely such a compiler infrastructure developed within DOE. ROSE is a robust source-to-source analysis and optimization infrastructure currently addressing large, million-line DOE applications in C, C++, and FORTRAN. This year, it has been extended to support the automated analysis of binaries. We continue to extend ROSE to address a number of security-specific requirements and apply it to software authentication for Non-proliferation and Arms Control projects. We will give an update on the status of our work

  11. Written culture: reading pratices and printed book

    Directory of Open Access Journals (Sweden)

    Lidia Eugenia Cavalcante

    2009-07-01

    Full Text Available The history of written culture and reading practices is the subject of this article. It aims to understand the trajectory of the printed book in its materiality, as well as the processes arising from the undisputed cultural and political presence of this medium in modern society. It seeks to highlight the reading practices, phenomena and mutations that have strengthened this medium over the centuries, addressing the “book crisis” and its causes and effects. It then deals with the particularities of written culture as they took shape in the Siècle des Lumières and were consecrated in the workings of the spirit of the authors and readers of that time, whose influence spread throughout the Western world. It analyzes the sociological and historical conditions of the place of the modern reader between Science, Philosophy and the Novel, continuously transformed by the renewal of thought and culture.

  12. Consumer Preferences for Written and Oral Information about Allergens When Eating Out.

    Science.gov (United States)

    Begen, Fiona M; Barnett, Julie; Payne, Ros; Roy, Debbie; Gowland, M Hazel; Lucas, Jane S

    2016-01-01

    Avoiding food allergens when eating outside the home presents particular difficulties for food allergic (FA) and intolerant (FI) consumers and a lack of allergen information in restaurants and takeaways causes unnecessary restrictions. Across Europe, legislation effective from December 2014, aims to improve allergen information by requiring providers of non-prepacked foods to supply information related to allergen content within their foods. Using in-depth interviews with 60 FA/FI adults and 15 parents/carers of FA/FI children, we aimed to identify FA/FI consumers' preferences for written and/or verbal allergen information when eating out or ordering takeaway food. A complex and dynamic set of preferences and practices for written and verbal allergen information was identified. Overwhelmingly, written information was favoured in the first instance, but credible personal/verbal communication was highly valued and essential to a good eating out experience. Adequate written information facilitated implicit trust in subsequent verbal information. Where written information was limited, FA/FIs depended on social cues to assess the reliability of verbal information resources, and defaulted to tried and tested allergen avoidance strategies when these were deemed unreliable. Understanding the subtle negotiations and difficulties encountered by FA/FIs when eating out can serve as a guide for legislators and food providers; by encouraging provision of clear written and verbal allergen information, and training of proactive, allergen-aware staff. This, in tandem with legal requirements for allergen information provision, paves the way for FA/FIs to feel more confident in eating out choices; and to experience improved eating out experiences.

  13. The Written-Pole{sup TM} motor: high efficiency - low start current

    Energy Technology Data Exchange (ETDEWEB)

    Beck, B. [C.Eng. Precise Power Corp., Bradenton, FL (United States); Friesen, D. [P.E. Manitoba Hydro, Winnipeg (Canada)

    2000-07-01

    Written-Pole{sup TM} technology is a patented machine technology, which changes the magnetic polarity of the rotor field in a rotating machine, while the machine is operating. The number of poles is thereby changed, resulting in a constant frequency - variable speed machine. When operating as a motor, a Written-Pole machine has inherently low starting current and high operating efficiency. (orig.)

  14. 17 CFR 230.437a - Written consents.

    Science.gov (United States)

    2010-04-01

    ...) Are filing a registration statement containing financial statements in which Arthur Andersen LLP (or a foreign affiliate of Arthur Andersen LLP) had been acting as the independent public accountant. (b... dispense with the requirement for the registrant to file the written consent of Arthur Andersen LLP (or a...

  15. Synergistic relationships between Analytical Chemistry and written standards

    International Nuclear Information System (INIS)

    Valcárcel, Miguel; Lucena, Rafael

    2013-01-01

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived

  16. Synergistic relationships between Analytical Chemistry and written standards

    Energy Technology Data Exchange (ETDEWEB)

    Valcárcel, Miguel, E-mail: qa1vacam@uco.es; Lucena, Rafael

    2013-07-25

    Graphical abstract: -- Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. -- Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  17. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have focused intensively on web-based listening and speaking; many more focus on reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  18. Transforming Biology Assessment with Machine Learning: Automated Scoring of Written Evolutionary Explanations

    Science.gov (United States)

    Nehm, Ross H.; Ha, Minsu; Mayfield, Elijah

    2012-01-01

    This study explored the use of machine learning to automatically evaluate the accuracy of students' written explanations of evolutionary change. Performance of the Summarization Integrated Development Environment (SIDE) program was compared to human expert scoring using a corpus of 2,260 evolutionary explanations written by 565 undergraduate…
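
    The abstract's approach (training a text classifier against human expert scores) can be sketched generically with scikit-learn; SIDE's actual features and model are not reproduced here, and the tiny training set below is invented.

```python
# Generic shape of an automated short-answer scorer: bag-of-words
# features plus a supervised classifier fit to expert-scored text.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

explanations = [
    "variation exists and beaks that crack seeds better are selected",
    "the birds needed bigger beaks so they grew them",
    "heritable variation plus differential survival changes frequencies",
    "individuals change their traits by trying harder",
]
expert_scores = [1, 0, 1, 0]   # 1 = scientifically accurate, 0 = not

scorer = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                       LogisticRegression())
scorer.fit(explanations, expert_scores)
print(scorer.predict(["selection acts on existing heritable variation"]))
```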

  19. Can written information material help to increase treatment motivation in patients with erectile dysfunction? A survey of 1188 men.

    Science.gov (United States)

    Günzler, C; Kriston, L; Stodden, V; Leiber, C; Berner, M M

    2007-01-01

    Although erectile dysfunction (ED) prevalence is high, patients and physicians often have problems discussing this issue. This study examines whether written information material increases motivation to seek treatment in patients with ED. For the study, persons were able to order information material about sexual problems within the context of a public campaign. From a total of 70,000 responders, 8000 persons were asked to fill out an epidemiological questionnaire. The response rate yielded 18.4%, the data of 1188 men with ED were analyzed. As a result of the information material, 28.3% of the untreated men intended to seek treatment and 38.5% of the men who had not spoken with their physician about their problem, planned to do so now. Nearly all responders were satisfied with the information material. These data reflect the usefulness of written information for men with ED. It not only serves as an informational source for patients but may also encourage them to seek treatment.

  20. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm 2 field size and dose profiles for a 40 × 40 cm 2 field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm 2 to 30 × 30 cm 2 . The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for 6 MV and 10 MV models respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for 6 MV and 10 MV models respectively. Phantom plan comparisons were evaluated using ±3%/2 mm gamma criterion, and averaged passing rates between Monte Carlo and measurements were 87.4% and 89.9% for 6 MV and 10 MV models respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
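
    The gamma criterion quoted (±3%/2 mm) combines a dose-difference test with a distance-to-agreement test. A simplified 1-D version in Python with global normalisation follows; clinical audits evaluate 3-D dose grids with careful search and interpolation.

```python
# Simplified 1-D gamma-index computation (3%/2 mm, global norm).
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=2.0):
    """Return gamma for each reference point; a point passes if gamma <= 1."""
    gammas = []
    for xr, dr in zip(x_ref, d_ref):
        dose_term = (d_eval - dr) / (dd * d_ref.max())
        dist_term = (x_eval - xr) / dta
        gammas.append(np.sqrt(dose_term**2 + dist_term**2).min())
    return np.array(gammas)

x = np.linspace(0, 100, 201)                 # positions in mm
ref = np.exp(-((x - 50) / 20) ** 2)          # reference dose profile
ev = np.exp(-((x - 51) / 20) ** 2) * 1.01    # evaluated: shifted, scaled
g = gamma_1d(x, ref, x, ev)
print(f"pass rate: {100 * np.mean(g <= 1):.1f}%")
```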

  1. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  2. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase of water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This circumstance can be supported by tools that facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources in line with standard regulatory guidelines. Compliance with the established standard regulatory guidelines (with a special focus on requirements deriving from the GWD) calls for effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These large datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extent; and (iii) integrating groundwater quality information with other relevant information such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools, akvaGIS Tools, have been implemented within the H2020 FREEWAT project, which aims to support water resource management through modelling in an open source GIS platform (QGIS desktop). The main goal of akvaGIS Tools is to improve water quality analysis through capabilities that improve the case-study conceptual model, managing all related data in its geospatial database (implemented in SpatiaLite), and through a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of hydrochemical data. To achieve this, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for

  3. AceTree: a tool for visual analysis of Caenorhabditis elegans embryogenesis

    Directory of Open Access Journals (Sweden)

    Araya Carlos L

    2006-06-01

    Full Text Available Abstract Background The invariant lineage of the nematode Caenorhabditis elegans has potential as a powerful tool for the description of mutant phenotypes and gene expression patterns. We previously described procedures for the imaging and automatic extraction of the cell lineage from C. elegans embryos. That method uses time-lapse confocal imaging of a strain expressing histone-GFP fusions and a software package, StarryNite, processes the thousands of images and produces output files that describe the location and lineage relationship of each nucleus at each time point. Results We have developed a companion software package, AceTree, which links the images and the annotations using tree representations of the lineage. This facilitates curation and editing of the lineage. AceTree also contains powerful visualization and interpretive tools, such as space filling models and tree-based expression patterning, that can be used to extract biological significance from the data. Conclusion By pairing a fast lineaging program written in C with a user interface program written in Java we have produced a powerful software suite for exploring embryonic development.

  4. Documents written by the heads of the Catechetical School in Alexandria: From Mark to Clement

    Directory of Open Access Journals (Sweden)

    Willem H. Oliver

    2017-01-01

    Full Text Available The Catechetical School in Alexandria delivered a number of prolific scholars and writers during the first centuries of the Common Era, up to its demise by the end of the 4th century. These scholars produced an extensive collection of documents, of which not many are extant. Fortunately, there are many references to these documents, supplying us with an idea of their content. As the author could not find one single source containing all the documents written by the heads of the School, he deemed it necessary to list these documents, together with a short discussion where possible. This article only discusses the writings of the following heads: Mark the Evangelist, Athenagoras, Pantaenus and Clement, covering the period between approximately 40 CE and the end of the 2nd century. The follow-up article discusses the documents of the heads who succeeded them. Intradisciplinary and/or interdisciplinary implications: The potential result of the proposed research is a full, detailed list of all the documents written by the heads of the School in Alexandria. The disciplines involved are (Church) History, Theology and Antiquity. These results will make it easier for future researchers to work on these writers.

  5. Computing tools for implementing standards for single-case designs.

    Science.gov (United States)

    Chen, Li-Ting; Peng, Chao-Ying Joanne; Chen, Ming-E

    2015-11-01

    In the single-case design (SCD) literature, five sets of standards have been formulated and distinguished: design standards, assessment standards, analysis standards, reporting standards, and research synthesis standards. This article reviews computing tools that can assist researchers and practitioners in meeting the analysis standards recommended by the What Works Clearinghouse: Procedures and Standards Handbook (the WWC standards). These tools consist of specialized web-based calculators or downloadable software for SCD data, and algorithms or programs written in Excel, SAS procedures, SPSS commands/macros, or the R programming language. We aligned these tools with the WWC standards and evaluated them for accuracy and treatment of missing data, using two published data sets. All tools were found to be accurate. When missing data were present, most tools either gave an error message or conducted the analysis based on the available data. Only one program used a single imputation method. This article concludes with suggestions for an inclusive computing tool or environment, additional research on the treatment of missing data, and reasonable and flexible interpretations of the WWC standards. © The Author(s) 2015.
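
    As an illustration of the analysis standards these tools implement, the sketch below computes one widely used single-case effect size, Nonoverlap of All Pairs (NAP), and mirrors the available-data treatment of missing values reported for most tools; the data are invented.

        from itertools import product

        def nap(baseline, treatment):
            """Nonoverlap of All Pairs; None marks a missing observation,
            which is simply dropped (available-data analysis)."""
            a = [x for x in baseline if x is not None]
            b = [x for x in treatment if x is not None]
            pairs = list(product(a, b))
            wins = sum(1.0 for x, y in pairs if y > x)
            ties = sum(0.5 for x, y in pairs if y == x)
            return (wins + ties) / len(pairs)

        print(nap([3, 4, None, 5], [6, 7, 5, 8]))  # -> 0.958...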

  6. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    Science.gov (United States)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

    As the amount of solar data available to scientists continues to increase at faster and faster rates, it is important that there exist simple tools for navigating this data quickly with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event data from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst catalogs. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO; dynamic movie generation; a navigable timeline of recorded solar events; social annotation; and basic client-side image processing.
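
    A minimal sketch of fetching a single image through such an API is shown below; the endpoint and parameter names follow the pattern of the public Helioviewer API but should be treated as assumptions and checked against the current documentation.

        from urllib.request import urlopen
        from urllib.parse import urlencode

        # Assumed endpoint and parameters, modeled on the public Helioviewer API.
        params = urlencode({
            "date": "2009-05-01T12:00:00Z",   # requested observation time
            "sourceId": 14,                   # hypothetical instrument/channel ID
        })
        url = "https://api.helioviewer.org/v2/getJP2Image/?" + params
        with urlopen(url) as resp, open("sun.jp2", "wb") as out:
            out.write(resp.read())            # save the JPEG 2000 image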

  7. Streamlined sign-out of capillary protein electrophoresis using middleware and an open-source macro application.

    Science.gov (United States)

    Mathur, Gagan; Haugen, Thomas H; Davis, Scott L; Krasowski, Matthew D

    2014-01-01

    Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia® Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and the possibility for error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Scripts were written to create macros that automated mouse and key strokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation in the correct patient record in the desired format. The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotkey scripts automated repetitive key strokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Using the open-source AutoHotkey software, we successfully improved the transfer of text data between capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as tools to improve interfacing of laboratory instruments.

  8. Does Use of Text-to-Speech and Related Read-Aloud Tools Improve Reading Comprehension for Students with Reading Disabilities? A Meta-Analysis

    Science.gov (United States)

    Wood, Sarah G.; Moxley, Jerad H.; Tighe, Elizabeth L.; Wagner, Richard K.

    2018-01-01

    Text-to-speech and related read-aloud tools are being widely implemented in an attempt to assist students' reading comprehension skills. Read-aloud software, including text-to-speech, is used to translate written text into spoken text, enabling one to listen to written text while reading along. It is not clear how effective text-to-speech is at…

  9. Comparison of an enhanced versus a written feedback model on the management of Medicare inpatients with venous thrombosis.

    Science.gov (United States)

    Hayes, R; Bratzler, D; Armour, B; Moore, L; Murray, C; Stevens, B R; Radford, M; Fitzgerald, D; Elward, K; Ballard, D J

    2001-03-01

    A multistate randomized study conducted under the Health Care Financing Administration's (HCFA's) Health Care Quality Improvement Program (HCQIP) offered the opportunity to compare the effect of a written feedback intervention (WFI) with that of an enhanced feedback intervention (EFI) on improving the anticoagulant management of Medicare beneficiaries who present to the hospital with venous thromboembolic disease. Twenty-nine hospitals in five states were randomly assigned to receive written hospital-specific feedback (WFI) or feedback enhanced by the participation of a trained physician, quality improvement tools, and an Anticoagulant Management of Venous Thrombosis (AMVT) project liaison (EFI). Differences in the performance of five quality indicators between baseline and remeasurement were assessed. Quality managers were interviewed to determine perceptions of project implementation. No significant differences in the change from baseline to remeasurement were found between the two intervention groups. Significant improvement in one indicator and significant decline in two indicators were found for one or both groups. Yet 59% of all quality managers perceived the AMVT project as being successful to very successful, and more EFI quality managers perceived success than did WFI managers (71% versus 40%). In the majority of EFI hospitals, physician liaisons played an important role in project implementation. Study results indicated that the addition of a physician liaison, quality improvement tools, and a project liaison did not provide incremental value to hospital-specific feedback for improving quality of care. Future studies with larger sample sizes, lengthier follow-up periods, and interventions that include more of the elements shown to affect practice behavior change are needed to identify an optimal feedback model for use by external quality management organizations.

  10. Cue Reliance in L2 Written Production

    Science.gov (United States)

    Wiechmann, Daniel; Kerz, Elma

    2014-01-01

    Second language learners reach expert levels in relative cue weighting only gradually. On the basis of ensemble machine learning models fit to naturalistic written productions of German advanced learners of English and expert writers, we set out to reverse engineer differences in the weighting of multiple cues in a clause linearization problem. We…

  11. Modeling statistical properties of written text.

    Directory of Open Access Journals (Sweden)

    M Angeles Serrano

    Full Text Available Written text is one of the fundamental manifestations of human language, and the study of its universal regularities can give clues about how our brains process information and how we, as a society, organize and share it. Among these regularities, only Zipf's law has been explored in depth. Other basic properties, such as the existence of bursts of rare words in specific documents, have only been studied independently of each other and mainly by descriptive models. As a consequence, there is a lack of understanding of linguistic processes as complex emergent phenomena. Beyond Zipf's law for word frequencies, here we focus on burstiness, Heaps' law describing the sublinear growth of vocabulary size with the length of a document, and the topicality of document collections, which encode correlations within and across documents absent in random null models. We introduce and validate a generative model that explains the simultaneous emergence of all these patterns from simple rules. As a result, we find a connection between the bursty nature of rare words and the topical organization of texts and identify dynamic word ranking and memory across documents as key mechanisms explaining the non-trivial organization of written text. Our research can have broad implications and practical applications in computer science, cognitive science and linguistics.
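
    The two regularities are easy to observe empirically. The following self-contained sketch prints a Zipf-style rank/frequency table and tracks Heaps-style vocabulary growth; a short built-in sample replaces the large corpus a real analysis would use.

        from collections import Counter

        text = ("the model explains the emergence of zipf law heaps law and the "
                "bursty use of rare words in written text the model uses memory "
                "and dynamic word ranking across documents") * 3
        words = text.split()

        # Zipf's law: frequency of the r-th most common word decays roughly as 1/r.
        for rank, (word, freq) in enumerate(Counter(words).most_common(5), start=1):
            print(f"rank {rank}: {word!r} occurs {freq} times")

        # Heaps' law: vocabulary size V(n) grows sublinearly with text length n.
        seen = set()
        for n, w in enumerate(words, start=1):
            seen.add(w)
            if n % 20 == 0:
                print(f"n = {n:3d} tokens -> V = {len(seen)} distinct words")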

  12. Language and Ageing--Exploring Propositional Density in Written Language--Stability over Time

    Science.gov (United States)

    Spencer, Elizabeth; Craig, Hugh; Ferguson, Alison; Colyvas, Kim

    2012-01-01

    This study investigated the stability of propositional density (PD) in written texts, as this aspect of language shows promise as an indicator and as a predictor of language decline with ageing. This descriptive longitudinal study analysed written texts obtained from the Australian Longitudinal Study of Women's Health in which participants were…

  13. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Science.gov (United States)

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., Chemistry Development Kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model-building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of SMILES Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) descriptors and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score, we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.

  14. Integrating Philips' extreme UV source in the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Derra, Guenther; Janssen, Maurice; Jonkers, Jeroen; Klein, Jurgen; Kruecken, Thomas; List, Andreas; Loeken, Michael; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prummer, Ralph; Rosier, Oliver; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2005-05-01

    The paper describes recent progress in the development of the Philips EUV source. Progress has been realized on many frontiers: Integration studies of the source into a scanner have primarily been carried out on the Xe source because it has a high degree of maturity. We report on integration with a collector, associated collector lifetime and optical characteristics. Collector lifetime in excess of 1 billion shots could be demonstrated. Next, an active dose control system was developed and tested on the Xe lamp. The resulting dose stability is better than 0.2% for an exposure window of 100 pulses. The second part of the paper reports on progress in the development of the Philips Sn source. First, the details of the concept are described. It is based on a laser-triggered vacuum arc, which is an extension of previous designs. The source is furnished with rotating electrodes that are covered with a Sn film that is constantly regenerated. Hence, by the very design of the source, it is scalable to very high power levels, and moreover it has fundamentally solved the notorious problem of electrode erosion. Power values of 260 W in 2π sr are reported, along with stable, long-life operation of the lamp. The paper also addresses the problem of debris generation and mitigation for the Sn source. The problem is attacked by a combined strategy of protecting the collector by traditional means (e.g. fields, foil traps, ...) and designing the gas atmosphere according to the principles of the well-known halogen cycles in incandescent lamps. These principles have been studied in the lighting industry for decades and rely on the excessively high vapor pressures of metal halides. Transferred to the Sn source, this allows pumping away tin residues that would otherwise irreversibly deposit on the collector.

  15. A framework for air quality monitoring based on free public data and open source tools

    Science.gov (United States)

    Nikolov, Hristo; Borisova, Denitsa

    2014-10-01

    In recent years, space agencies (e.g. NASA, ESA) have increasingly adopted a policy of providing Earth observation (EO) data and end products concerning air quality, especially for large urban areas, at no cost to researchers and SMEs. Those EO data are complemented by an increasing amount of in-situ data, also provided at no cost, either from national authorities or of crowdsourced origin. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air quality modeling in support of decision making at medium and large scales. An essential part of this framework is a web-based GIS mapping tool responsible for dissemination of the output generated. In this research an attempt is made to establish a running framework based solely on openly accessible data on air quality and on a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data, especially for bigger urban areas and for different types of gases and dust particles, should be noted the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide data on the concentration of several gases, such as CO, CO2, NO2 and SO2, and fine suspended dust (PM10, PM2.5) on a monthly (for some data, daily) basis. In the proposed framework these data will complement data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument payload of the future ESA Sentinel-5P mission. An integral part of the framework is the up-to-date land use/land cover map provided by the EEA through the GIO Land CORINE initiative. This map is also a product of EO data distributed at the European level. First and above all, our effort is focused on provision to the

  16. Factors affecting written distance-learning feedback: the tutor’s perspective

    Directory of Open Access Journals (Sweden)

    Christine Calfoglou

    2011-02-01

    Full Text Available Launching the distance-learning student-tutor interaction process, tutors of the first module of the M.Ed in English course at the HOU lay the foundations of academic student autonomy by means of providing -- inter alia -- appropriate written feedback on written assignments. In doing so, they need to gauge the content and form of their written comments systematically with regard to both output-related and student-related, that is, human-factor-related, issues (cf. Goldstein, 2004), the latter being particularly relevant to the distance-learning context. In this article we discuss tutor policy as well as tutor perceptions (cf. Lee, 2004, 2009, among others) regarding written feedback on students' academic assignments in terms of the aspects of deviance treated and the relative gravity of 'global' and 'local' errors (e.g. Ferris, 2002), the directness of the correction, the punitive or facilitative nature of the comments provided, as well as the relative balance of student strengths and weaknesses on the tutor's comment agenda (cf. Hyland & Hyland, 2006). The role of the tutor as an assessor and/or counsellor is explored, and the importance of striking a delicate balance between the two, especially in a context where face-to-face feedback opportunities are severely restricted, is underscored. We suggest that distance-learning feedback practices may need to be at least partially individualized to maximize student response and meet the goal of 'informed autonomy'.

  17. MATH: A Scientific Tool for Numerical Methods Calculation and Visualization

    Directory of Open Access Journals (Sweden)

    Henrich Glaser-Opitz

    2016-02-01

    Full Text Available MATH is an easy-to-use application for various numerical methods calculations with a graphical user interface and an integrated plotting tool, written in Qt with extensive use of the Qwt library for plotting and of the GSL and MuParser libraries as numerical and parser helper libraries. It can be found at http://sourceforge.net/projects/nummath. MATH is a convenient tool for use in the education process because of its capability of showing every important step in the solution process, leading to a better understanding of how it is done. MATH also enables fast comparison of the speed and precision of similar methods.
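
    The pedagogical idea of showing every step of a numerical method is easy to reproduce; the sketch below does so for the bisection method in plain Python, without MATH's Qt interface, and is purely illustrative.

        def bisect(f, a, b, tol=1e-6):
            """Bisection on [a, b], printing each intermediate step."""
            fa = f(a)
            step = 0
            while b - a > tol:
                step += 1
                m = (a + b) / 2.0
                fm = f(m)
                print(f"step {step:2d}: [{a:.6f}, {b:.6f}], f(m) = {fm:+.2e}")
                if fa * fm <= 0:
                    b = m                 # root lies in the left half
                else:
                    a, fa = m, fm         # root lies in the right half
            return (a + b) / 2.0

        print("root of x^2 - 2:", bisect(lambda x: x * x - 2, 1.0, 2.0))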

  18. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an engine condition monitoring system. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussion with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
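
    The project's C++ sources are not part of this record, so the following generic sketch only illustrates a GA searching a noisy objective over bit strings, the kind of demonstration described above.

        import random

        random.seed(1)

        def fitness(bits):                    # noisy "count the ones" objective
            return sum(bits) + random.gauss(0, 0.5)

        def evolve(n_bits=20, pop_size=30, generations=40, p_mut=0.05):
            pop = [[random.randint(0, 1) for _ in range(n_bits)]
                   for _ in range(pop_size)]
            for _ in range(generations):
                parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
                pop = []
                while len(pop) < pop_size:
                    a, b = random.sample(parents, 2)
                    cut = random.randrange(1, n_bits)      # one-point crossover
                    child = a[:cut] + b[cut:]
                    pop.append([1 - g if random.random() < p_mut else g
                                for g in child])           # bit-flip mutation
            return max(pop, key=sum)

        best = evolve()
        print(sum(best), "of 20 bits set:", best)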

  19. Speech language therapy bilingual clinic, a written language therapeutical proposal to deaf people: case report.

    Science.gov (United States)

    Guarinello, Ana Cristina; Massi, Giselle; Berberian, Ana Paula; Tonocchi, Rita; Lustosa, Sandra Silva

    2015-01-01

    This study aimed to analyze the written production of a deaf person who was in the process of written language acquisition. One person with a hearing disability, called R., participated in this study together with his speech-language pathologist. The therapist, proficient in sign language, acted as an interlocutor and interpreter, prioritizing the interactive nature of language and interfering in the written production only when requested. During the 3 years of work with R., a change in stance toward written language was observed. In addition, he began to reflect on his texts and to use written Portuguese in a way that allowed his texts to be more coherent. Writing became an opportunity to show his singularity and to begin reconstructing his relationship with language. Speech-language pathology and audiology therapy, at a bilingual clinic, can allow people with hearing disabilities early access to sign language and, consequently, enable the development of the written form of Portuguese.

  20. SpecSatisfiabilityTool: A tool for testing the satisfiability of specifications on XML documents

    Directory of Open Access Journals (Sweden)

    Javier Albors

    2015-01-01

    Full Text Available We present a prototype that implements a set of logical rules to prove satisfiability for a class of specifications on XML documents. Specifications are given by means of constraints built on Boolean XPath patterns. The main goal of this tool is to test whether a given specification is satisfiable or not, and to justify the decision by showing the execution history. It can also be used to test whether a given document is a model of a given specification and, as a by-product, it permits searching for all the relations (monomorphisms) between two patterns and combining patterns in different ways. The results of these operations are shown visually, and therefore the tool makes these operations more understandable. The implementation of the algorithm has been written in Prolog, but the prototype has a Java interface for easy and friendly use. In this paper we show how to use this interface in order to test all the desired properties.
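
    The tool itself is written in Prolog, but the model-checking half of its functionality (testing whether a document satisfies Boolean XPath constraints) can be sketched with the third-party lxml library; the document and constraints below are invented.

        from lxml import etree   # third-party: pip install lxml

        doc = etree.fromstring(
            "<library>"
            "<book id='b1'><title>Logic</title></book>"
            "<book id='b2'><title>Prolog</title></book>"
            "</library>")

        specification = [
            "boolean(//book)",              # at least one book exists
            "not(//book[not(@id)])",        # every book carries an id
            "not(//book[not(title)])",      # every book carries a title
        ]

        # The document is a model of the specification iff all constraints hold.
        for constraint in specification:
            print(constraint, "->", doc.xpath(constraint))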

  1. Interplay among Technical, Socio-Emotional and Personal Factors in Written Feedback Research

    Science.gov (United States)

    Chong, Ivan

    2018-01-01

    The centrality of written feedback is clearly seen from the proliferation of research in the context of higher education. As an increasingly expanding field in research, the majority of written feedback studies have been interested in investigating the technical aspect of how feedback should be given in order to promote student learning. More…

  2. Methods and tools used at the IPSN for the safety assessment of critical software

    International Nuclear Information System (INIS)

    Regnier, P.; Henry, J.Y.

    1998-01-01

    A significant feature of EDF's latest 1400 MWe ''N4'' generation of pressurized water reactor (PWR) is the extensive use of computerized instrumentation and control, including a fully digital system for the reactor protection function. For the safety assessment of the software driving the operation of this digital reactor protection system, called SPIN, IPSN has developed and implemented a set of methods and tools. Using the lessons learned from this experience, IPSN has worked at improving those methods and tools, mainly trying to make them more automated, and has participated in an international assessment exercise to test some other methods and tools, either new products on the market or self-developed products. As a result of this work, this paper presents an up-to-date overview of the IPSN methods and tools used for the assessment of safety-critical software. This assessment, which consists of an analysis of all the documentation associated with the technical specifications and of a representative set of functions, is usually carried out in five steps: (1) critical examination of the documents, (2) evaluation of the quality of the code, (3) determination of the critical software components, (4) development of test cases and choice of testing strategy, and (5) dynamic analysis (consistency and robustness). This paper also presents methods and tools developed or implemented by IPSN in order to: evaluate the completeness and consistency of specification and design documents written in natural language; build a model and simulate specification or design items; evaluate the quality of the source code; carry out FMEA analysis; run the binary code and perform tests (CLAIRE); and perform random or mutational tests. (author)

  3. DensToolKit: A comprehensive open-source package for analyzing the electron density and its derivative scalar and vector fields

    Science.gov (United States)

    Solano-Altamirano, J. M.; Hernández-Pérez, Julio M.

    2015-11-01

    DensToolKit is a suite of cross-platform, optionally parallelized, programs for analyzing the molecular electron density (ρ) and several fields derived from it. Scalar and vector fields, such as the gradient of the electron density (∇ρ), electron localization function (ELF) and its gradient, localized orbital locator (LOL), region of slow electrons (RoSE), reduced density gradient, localized electrons detector (LED), information entropy, molecular electrostatic potential, kinetic energy densities K and G, among others, can be evaluated on zero, one, two, and three dimensional grids. The suite includes a program for searching critical points and bond paths of the electron density, under the framework of the Quantum Theory of Atoms in Molecules. DensToolKit also evaluates the momentum space electron density on spatial grids, and the reduced density matrix of order one along lines joining two arbitrary atoms of a molecule. The source code is distributed under the GNU-GPLv3 license, and we release the code with the intent of establishing an open-source collaborative project. The style of DensToolKit's code follows some of the guidelines of an object-oriented program. This allows us to supply the user with a simple means of easily implementing new scalar or vector fields, provided they are derived from any of the fields already implemented in the code. In this paper, we present some of the most salient features of the programs contained in the suite, some examples of how to run them, and the mathematical definitions of the implemented fields along with hints of how we optimized their evaluation. We benchmarked our suite against both a freely-available program and a commercial package. Speed-ups of ~2×, and up to 12×, were obtained using a non-parallel compilation of DensToolKit for the evaluation of fields. DensToolKit takes similar times for finding critical points, compared to a commercial package. Finally, we present some perspectives for the future development and
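
    As a flavor of what evaluating such a field on a grid involves, the sketch below computes the reduced density gradient, s = |∇ρ| / (2 (3π²)^(1/3) ρ^(4/3)), with NumPy, using a Gaussian as a stand-in for a real molecular density; it is not DensToolKit code.

        import numpy as np

        x = y = z = np.linspace(-3.0, 3.0, 61)
        X, Y, Z = np.meshgrid(x, y, z, indexing="ij")
        rho = np.exp(-(X**2 + Y**2 + Z**2))         # toy electron density

        h = x[1] - x[0]
        gx, gy, gz = np.gradient(rho, h, h, h)      # numerical gradient of rho
        grad_norm = np.sqrt(gx**2 + gy**2 + gz**2)

        c = 2.0 * (3.0 * np.pi**2) ** (1.0 / 3.0)
        s = grad_norm / (c * rho ** (4.0 / 3.0))    # reduced density gradient
        print("s at the grid centre:", float(s[30, 30, 30]))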

  4. Application of micro-attenuated total reflectance Fourier transform infrared spectroscopy to ink examination in signatures written with ballpoint pen on questioned documents.

    Science.gov (United States)

    Nam, Yun Sik; Park, Jin Sook; Lee, Yeonhee; Lee, Kang-Bong

    2014-05-01

    Questioned documents examined in a forensic laboratory sometimes contain signatures written with ballpoint pen inks; these signatures were examined to assess the feasibility of micro-attenuated total reflectance (ATR) Fourier transform infrared (FTIR) spectroscopy as a forensic tool. Micro-ATR FTIR spectra for signatures written with 63 ballpoint pens available commercially in Korea were obtained and used to construct an FTIR spectral database. A library-searching program was utilized to identify the manufacturer, blend, and model of each black ballpoint pen ink based upon their FTIR peak intensities, positions, and patterns in the spectral database. This FTIR technique was also successfully used in determining the sequence of homogeneous line intersections from the crossing lines of two ballpoint pen signatures. We have demonstrated with a set of sample documents that micro-ATR FTIR is a viable nondestructive analytical method that can be used to identify the origin of the ballpoint pen ink used to mark signatures. © 2014 American Academy of Forensic Sciences.
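
    A library search of this kind reduces, at its core, to ranking reference spectra by similarity to the questioned spectrum. The sketch below uses cosine similarity over a handful of invented intensity channels; real ATR FTIR spectra contain hundreds of wavenumber channels.

        import numpy as np

        library = {
            "pen_A": np.array([0.10, 0.80, 0.30, 0.05, 0.40]),
            "pen_B": np.array([0.60, 0.20, 0.10, 0.70, 0.15]),
            "pen_C": np.array([0.12, 0.75, 0.35, 0.08, 0.42]),
        }
        questioned = np.array([0.11, 0.78, 0.33, 0.06, 0.41])

        def cosine(a, b):
            return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

        # Rank reference inks by similarity to the questioned signature's spectrum.
        for name, ref in sorted(library.items(),
                                key=lambda kv: cosine(questioned, kv[1]),
                                reverse=True):
            print(f"{name}: similarity {cosine(questioned, ref):.4f}")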

  5. CoCoa: a software tool for estimating the coefficient of coancestry from multilocus genotype data.

    Science.gov (United States)

    Maenhout, Steven; De Baets, Bernard; Haesaert, Geert

    2009-10-15

    Phenotypic data collected in breeding programs and marker-trait association studies are often analyzed by means of linear mixed models. In these models, the covariance between the genetic background effects of all genotypes under study is modeled by means of pairwise coefficients of coancestry. Several marker-based coancestry estimation procedures allow estimation of this covariance matrix, but generally introduce a certain amount of bias when the examined genotypes are part of a breeding program. CoCoa implements the most commonly used marker-based coancestry estimation procedures and, as such, allows selection of the best-fitting covariance structure for the phenotypic data at hand. This better model fit translates into increased power and improved type I error control in association studies and improved accuracy in phenotypic prediction studies. The presented software package also provides an implementation of the new Weighted Alikeness in State (WAIS) estimator for use in hybrid breeding programs. Besides several matrix manipulation tools, CoCoa implements two different bending heuristics, in case the inverse of an ill-conditioned coancestry matrix estimate is needed. The software package CoCoa is freely available at http://webs.hogent.be/cocoa. Source code, manual, binaries for 32 and 64-bit Linux systems and an installer for Microsoft Windows are provided. The core components of CoCoa are written in C++, while the graphical user interface is written in Java.
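
    CoCoa's estimators (including WAIS) are more elaborate than can be shown here; the sketch below computes only the simplest alikeness-in-state similarity matrix from biallelic marker dosages, to make the idea of a marker-based coancestry-like matrix concrete.

        import numpy as np

        # Rows are genotypes, columns are biallelic markers coded 0/1/2
        # (dosage of the reference allele); the numbers are invented.
        G = np.array([
            [0, 1, 2, 2, 1],
            [0, 1, 2, 1, 1],
            [2, 1, 0, 0, 2],
        ])

        n = G.shape[0]
        S = np.zeros((n, n))
        for i in range(n):
            for j in range(n):
                # per-locus sharing: 1, 0.5, 0 for dosage differences 0, 1, 2
                S[i, j] = np.mean(1.0 - np.abs(G[i] - G[j]) / 2.0)
        print(np.round(S, 3))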

  6. Tools for remote computing in accelerator control

    International Nuclear Information System (INIS)

    Anderssen, P.S.; Frammery, V.; Wilcke, R.

    1990-01-01

    In modern accelerator control systems, the intelligence of the equipment is distributed in the geographical and the logical sense. Control processes for a large variety of tasks reside in both the equipment and the control computers. Hence successful operation hinges on the availability and reliability of the communication infrastructure. The computers are interconnected by a communication system and use remote procedure calls and message passing for information exchange. These communication mechanisms need a well-defined convention, i.e. a protocol. They also require flexibility in both the setup and changes to the protocol specification. The network compiler is a tool which provides the programmer with a means of establishing such a protocol for his application. Input to the network compiler is a single interface description file provided by the programmer. This file is written according to a grammar, and completely specifies the interprocess communication interfaces. Passed through the network compiler, the interface description file automatically produces the additional source code needed for the protocol. Hence the programmer does not have to be concerned about the details of the communication calls. Any further additions and modifications are made easy, because all the information about the interface is kept in a single file. (orig.)

  7. Individual Differences in Strategy Use on Division Problems: Mental versus Written Computation

    Science.gov (United States)

    Hickendorff, Marian; van Putten, Cornelis M.; Verhelst, Norman D.; Heiser, Willem J.

    2010-01-01

    Individual differences in strategy use (choice and accuracy) were analyzed. A sample of 362 Grade 6 students solved complex division problems under 2 different conditions. In the choice condition students were allowed to use either a mental or a written strategy. In the subsequent no-choice condition, they were required to use a written strategy.…

  8. PLOT3D Export Tool for Tecplot

    Science.gov (United States)

    Alter, Stephen

    2010-01-01

    The PLOT3D export tool for Tecplot solves the problem that data modified in Tecplot could previously not be output in a standard format for use by other computational science solvers. The PLOT3D Exporter add-on gives engineers a way to output a standard format from one of the most widely available visualization tools. The exportation of PLOT3D data from Tecplot has far-reaching effects because it allows for grid and solution manipulation within a graphical user interface (GUI) that is easily customized with macro language-based and user-developed GUIs. The add-on also enables the use of Tecplot as an interpolation tool for solution conversion between different grids of different types. This one add-on enhances the functionality of Tecplot so significantly that it offers the ability to incorporate Tecplot into a general suite of tools for computational science applications as a 3D graphics engine for visualization of all data. Within the PLOT3D Export Add-on are several functions that enhance the operations and effectiveness of the add-on. Unlike Tecplot output functions, the PLOT3D Export Add-on enables the use of the zone selection dialog in Tecplot to choose which zones are to be written by offering three distinct options: output of active, inactive, or all zones (grid blocks). As the user modifies the zones to output with the zone selection dialog, the zones to be written are similarly updated. This enables the use of Tecplot to create multiple configurations of a geometry being analyzed. For example, if an aircraft is loaded with multiple deflections of flaps, by activating and deactivating different zones for a specific flap setting, new specific configurations of that aircraft can be easily generated by only writing out specific zones. Thus, if ten flap settings are loaded into Tecplot, the PLOT3D Export software can output ten different configurations, one for each flap setting.

  9. Relations between scripted online peer feedback processes and quality of written argumentative essay

    NARCIS (Netherlands)

    Noroozi, Omid; Biemans, Harm; Mulder, Martin

    2016-01-01

    Teachers often complain about the quality of students' written essays in higher education. This study explores the relations between scripted online peer feedback processes and quality of written argumentative essay as they occur in an authentic learning situation with direct practical relevance.

  10. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
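
    The core computation can be sketched compactly: score each pair of events by the peak of the normalized cross-correlation of their waveforms and group high-scoring pairs. Synthetic waveforms and the 0.7 threshold below are illustrative, not Dendro Tool internals.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.linspace(0, 1, 200)
        base = np.sin(2 * np.pi * 5 * t) * np.exp(-3 * t)
        events = {
            "ev1": base + 0.05 * rng.standard_normal(t.size),
            "ev2": np.roll(base, 7) + 0.05 * rng.standard_normal(t.size),  # shifted twin
            "ev3": rng.standard_normal(t.size),                            # unrelated
        }

        def xcorr_max(a, b):
            a = (a - a.mean()) / a.std()
            b = (b - b.mean()) / b.std()
            return float(np.max(np.correlate(a, b, mode="full")) / a.size)

        names = list(events)
        for i, p in enumerate(names):
            for q in names[i + 1:]:
                c = xcorr_max(events[p], events[q])
                print(f"{p} vs {q}: {c:+.2f}", "(same cluster)" if c > 0.7 else "")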

  11. 42 CFR 456.180 - Individual written plan of care.

    Science.gov (United States)

    2010-10-01

    ... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Mental Hospitals Plan of Care § 456.180 Individual written plan of care. (a) Before admission to a mental hospital or...

  12. The written mathematical communication profile of prospective math teacher in mathematical proving

    Science.gov (United States)

    Pantaleon, K. V.; Juniati, D.; Lukito, A.; Mandur, K.

    2018-01-01

    Written mathematical communication is the process of expressing mathematical ideas and understanding in writing. It is one of the important aspects that must be mastered by the prospective math teacher as tool of knowledge transfer. This research was a qualitative research that aimed to describe the mathematical communication profile of the prospective mathematics teacher in mathematical proving. This research involved 48 students of Mathematics Education Study Program; one of them with moderate math skills was chosen as the main subject. Data were collected through tests, assignments, and task-based interviews. The results of this study point out that in the proof of geometry, the subject explains what is understood, presents the idea in the form of drawing and symbols, and explains the content/meaning of a representation accurately and clearly, but the subject can not convey the argument systematically and logically. Whereas in the proof of algebra, the subject describes what is understood, explains the method used, and describes the content/meaning of a symbolic representation accurately, systematically, logically, but the argument presented is not clear because it is insufficient detailed and complete.

  13. 19 CFR 210.4 - Written submissions; representations; sanctions.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Written submissions; representations; sanctions. 210.4 Section 210.4 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT Rules of General Applicability § 210.4...

  14. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases.

    Science.gov (United States)

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.
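
    The kind of check the tool automates can be sketched with SimpleITK itself: isolate one label, split it into connected components, and flag suspiciously small islands. The synthetic image, label value and size threshold below are all illustrative.

        import numpy as np
        import SimpleITK as sitk   # third-party: pip install SimpleITK

        arr = np.zeros((30, 30, 30), dtype=np.uint8)
        arr[5:15, 5:15, 5:15] = 17          # main body of hypothetical region 17
        arr[25, 25, 25:27] = 17             # stray 2-voxel island of the same label
        atlas = sitk.GetImageFromArray(arr)

        binary = sitk.BinaryThreshold(atlas, 17, 17, 1, 0)
        components = sitk.ConnectedComponent(binary)
        stats = sitk.LabelShapeStatisticsImageFilter()
        stats.Execute(components)

        for comp in stats.GetLabels():
            n = stats.GetNumberOfPixels(comp)
            note = "  <- suspicious island" if n < 50 else ""
            print(f"component {comp}: {n} voxels{note}")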

  15. One-Dimensional Signal Extraction Of Paper-Written ECG Image And Its Archiving

    Science.gov (United States)

    Zhang, Zhi-ni; Zhang, Hong; Zhuang, Tian-ge

    1987-10-01

    A method for converting paper-written electrocardiograms to one-dimensional (1-D) signals for archival storage on floppy disk is presented here. Appropriate image processing techniques were employed to remove the background noise inherent to ECG recorder charts and to reconstruct the ECG waveform. The entire process consists of (1) digitization of paper-written ECGs with an image processing system via a TV camera; (2) image preprocessing, including histogram filtering and binary image generation; (3) ECG feature extraction and ECG wave tracing; and (4) transmission of the processed ECG data to IBM-PC compatible floppy disks for storage and retrieval. The algorithms employed here may also be used in the recognition of paper-written EEG or EMG and may be useful in robotic vision.
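
    Steps (2) and (3) can be illustrated in a few lines: binarize the chart image and trace the waveform column by column, taking the mean row index of dark pixels as the 1-D sample. A synthetic chart replaces the camera image here.

        import numpy as np

        h, w = 100, 300
        img = np.full((h, w), 255, dtype=np.uint8)          # white background
        rows = (50 + 30 * np.sin(np.linspace(0, 6 * np.pi, w))).astype(int)
        for col, row in enumerate(rows):
            img[row - 1:row + 2, col] = 0                   # draw a dark trace

        binary = img < 128                                  # thresholding step
        signal = np.array([
            np.mean(np.nonzero(binary[:, c])[0]) if binary[:, c].any() else np.nan
            for c in range(w)
        ])
        signal = h - signal            # flip: image row indices grow downward
        print("extracted", np.count_nonzero(~np.isnan(signal)), "of", w, "samples")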

  16. CORBA (Common Object Request Broker Architecture) compliant tools for high energy physics data analysis

    International Nuclear Information System (INIS)

    Rouse, Forest R.; Patel, Mayank B.

    1996-01-01

    We describe a system that, given an IDL specification for a subroutine, can generate the code required for autonomous execution of the routine in a distributed heterogeneous computing environment using the CORBA standard. This allows users to interactively modify the network of tools, either to change the functionality or to redistribute the tools among the available processors, without modifying any code. We show that the additional system overhead is small in comparison to the data movement time. Additionally, a prototype graphical user interface (GUI) to configure and control the network of tools has been written. (author)
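
    CORBA itself needs an ORB, but the underlying pattern (a routine registered once and then invoked remotely through a generated interface) can be sketched with Python's standard-library XML-RPC, named here explicitly as a stand-in for CORBA.

        import threading
        from xmlrpc.server import SimpleXMLRPCServer
        from xmlrpc.client import ServerProxy

        def histogram_mean(values):            # stand-in for an analysis routine
            return sum(values) / len(values)

        server = SimpleXMLRPCServer(("localhost", 8000), logRequests=False)
        server.register_function(histogram_mean)
        threading.Thread(target=server.serve_forever, daemon=True).start()

        proxy = ServerProxy("http://localhost:8000")
        print(proxy.histogram_mean([1.0, 2.0, 3.0]))   # remote call -> 2.0
        server.shutdown()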

  17. The Contribution of Verbal Working Memory to Deaf Children’s Oral and Written Production

    Science.gov (United States)

    Arfé, Barbara; Rossi, Cristina; Sicoli, Silvia

    2015-01-01

    This study investigated the contribution of verbal working memory to the oral and written story production of deaf children. Participants were 29 severely to profoundly deaf children aged 8–13 years and 29 hearing controls, matched for grade level. The children narrated a picture story orally and in writing and performed a reading comprehension test, the Wechsler Intelligence Scale for Children-Fourth Edition forward digit span task, and a reading span task. Oral and written stories were analyzed at the microstructural (i.e., clause) and macrostructural (discourse) levels. Hearing children’s stories scored higher than deaf children’s at both levels. Verbal working memory skills contributed to deaf children’s oral and written production over and above age and reading comprehension skills. Verbal rehearsal skills (forward digit span) contributed significantly to deaf children’s ability to organize oral and written stories at the microstructural level; they also accounted for unique variance at the macrostructural level in writing. Written story production appeared to involve greater verbal working memory resources than oral story production. PMID:25802319

  18. Youth Participatory Action Research and Educational Transformation: The Potential of Intertextuality as a Methodological Tool

    Science.gov (United States)

    Bertrand, Melanie

    2016-01-01

    In this article, Melanie Bertrand explores the potential of using the concept of intertextuality--which captures the way snippets of written or spoken text from one source become incorporated into other sources--in the study and practice of youth participatory action research (YPAR). Though this collective and youth-centered form of research…

  19. Open source engineering and sustainability tools for the built environment

    NARCIS (Netherlands)

    Coenders, J.L.

    2013-01-01

    This paper presents two novel open source software developments for design and engineering in the built environment. The first development, called "sustainability-open" [1], aims at providing open source design, analysis and assessment software source code for (environmental) performance of

  20. Written object naming, spelling to dictation, and immediate copying: Different tasks, different pathways?

    Science.gov (United States)

    Bonin, Patrick; Méot, Alain; Lagarrigue, Aurélie; Roux, Sébastien

    2015-01-01

    We report an investigation of cross-task comparisons of handwritten latencies in written object naming, spelling to dictation, and immediate copying. In three separate sessions, adults had to write down a list of concrete nouns from their corresponding pictures (written naming), from their spoken (spelling to dictation) and from their visual presentation (immediate copying). Linear mixed models without random slopes were performed on the latencies in order to study and compare within-task fixed effects. By-participants random slopes were then included to investigate individual differences within and across tasks. Overall, the findings suggest that written naming, spelling to dictation, and copying all involve a lexical pathway, but that written naming relies on this pathway more than the other two tasks do. Only spelling to dictation strongly involves a nonlexical pathway. Finally, the analyses performed at the level of participants indicate that, depending on the type of task, the slower participants are more or less influenced by certain psycholinguistic variables.

  1. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    Science.gov (United States)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a big issue, and not only since the introduction of the INSPIRE spatial data infrastructure. Extracting and combining spatial data from heterogeneous source formats, transforming that data to obtain the required quality for particular purposes, and loading it into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called the ETL process. Geographic Information Systems (GIS) can take over many of these tasks, but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance because of a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for the identification of errors, analysis of processing performance, and management of the execution of ETL processes. Another benefit of ETL tools is that for most tasks little or no scripting skill is required, so that researchers without a programming background can also work with them easily. Investigations of ETL tools for business applications have been available for a long time. However, little work has been published on the capabilities of those tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For the evaluation, ETL processes are performed with both software packages based on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis
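
    For readers unfamiliar with the term, a complete ETL pass can be tiny; the sketch below extracts rows from a CSV source, transforms them (type conversion plus a quality filter), and loads them into SQLite. Column names and values are invented stand-ins for the campaign data.

        import csv
        import io
        import sqlite3

        raw = io.StringIO(                      # in-memory stand-in for a CSV export
            "station,timestamp,pm10_ug_m3\n"
            "MC042,2014-06-01T12:00,31.5\n"
            "MC042,2014-06-01T13:00,n/a\n"      # invalid row, dropped below
            "MC174,2014-06-01T12:00,18.2\n"
        )

        def extract(handle):                    # Extraction
            yield from csv.DictReader(handle)

        def transform(rows):                    # Transformation: types + quality check
            for row in rows:
                try:
                    yield (row["station"], row["timestamp"],
                           float(row["pm10_ug_m3"]))
                except ValueError:
                    continue

        conn = sqlite3.connect(":memory:")      # Loading
        conn.execute("CREATE TABLE pm10 (station TEXT, ts TEXT, value REAL)")
        conn.executemany("INSERT INTO pm10 VALUES (?, ?, ?)",
                         transform(extract(raw)))
        print(conn.execute("SELECT COUNT(*) FROM pm10").fetchone()[0], "rows loaded")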

  2. The Written Literacy Forum: Combining Research and Practice.

    Science.gov (United States)

    Clark, Christopher M.; Florio, Susan

    1983-01-01

    Writing teachers and researchers collaborate in the Written Literacy Forum at Michigan State University to: (1) heighten teachers' awareness of the complexity of writing; (2) stimulate discussion across grade levels; and (3) focus research on areas concerning teachers. Discussion formats and inservice activities are described, and materials…

  3. Optimizing the efficiency of femtosecond-laser-written holograms

    DEFF Research Database (Denmark)

    Wædegaard, Kristian Juncher; Hansen, Henrik Dueholm; Balling, Peter

    2013-01-01

    Computer-generated binary holograms are written on a polished copper surface using single 800-nm, 120-fs pulses from a 1-kHz-repetition-rate laser system. The hologram efficiency (i.e. the power in the holographically reconstructed image relative to the incoming laser power) is investigated...

  4. a Free and Open Source Tool to Assess the Accuracy of Land Cover Maps: Implementation and Application to Lombardy Region (italy)

    Science.gov (United States)

    Bratic, G.; Brovelli, M. A.; Molinari, M. E.

    2018-04-01

    The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The evaluation of the accuracy of classified data is carried out through a comparison with a reference dataset and the generation of a confusion matrix, from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool allowed computing and investigating less known indexes such as the Ground Truth and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user community.
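
    Two of the indexes the tool computes, overall accuracy and Cohen's kappa, follow directly from the confusion matrix; the sketch below shows the standard formulas on an invented matrix in which rows are mapped classes and columns are reference classes.

        import numpy as np

        def overall_accuracy(M):
            return np.trace(M) / M.sum()

        def cohens_kappa(M):
            n = M.sum()
            po = np.trace(M) / n                               # observed agreement
            pe = np.sum(M.sum(axis=0) * M.sum(axis=1)) / n**2  # chance agreement
            return (po - pe) / (1.0 - pe)

        M = np.array([[50,  3,  2],
                      [ 4, 40,  6],
                      [ 1,  5, 39]])
        print(f"OA = {overall_accuracy(M):.3f}, kappa = {cohens_kappa(M):.3f}")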

  5. Video processing project

    CSIR Research Space (South Africa)

    Globisch, R

    2009-03-01

    Full Text Available Video processing source code for algorithms and tools used in software media pipelines (e.g. image scalers, colour converters, etc.). The currently available source code is written in C++ with its associated libraries and DirectShow filters.

  6. Modality differences between written and spoken story retelling in healthy older adults

    Directory of Open Access Journals (Sweden)

    Jessica Ann Obermeyer

    2015-04-01

    Methods: Ten native English-speaking healthy elderly participants between the ages of 50 and 80 were recruited. Exclusionary criteria included neurological disease/injury, history of learning disability, uncorrected hearing or vision impairment, history of drug/alcohol abuse and presence of cognitive decline (based on the Cognitive Linguistic Quick Test). Spoken and written discourse was analyzed for microlinguistic measures including total words, percent correct information units (CIUs; Nicholas & Brookshire, 1993) and percent complete utterances (CUs; Edmonds et al., 2009). CIUs measure relevant and informative words, while CUs focus at the sentence level and measure whether a relevant subject and verb and object (if appropriate) are present. Results: Analysis was completed using the Wilcoxon Rank Sum Test due to small sample size. Preliminary results revealed that healthy elderly people produced significantly more words in spoken retellings than written retellings (p = .000); however, this measure contrasted with %CIUs and %CUs, with participants producing significantly higher %CIUs (p = .000) and %CUs (p = .000) in written story retellings than in spoken story retellings. Conclusion: These findings indicate that written retellings, while shorter, contained higher accuracy at both a word (CIU) and sentence (CU) level. This observation could be related to the ability to revise written text and therefore make it more concise, whereas the nature of speech results in more embellishment and "thinking out loud," such as comments about the task, associated observations about the story, etc. We plan to run more participants and conduct a main concepts analysis (before conference time) to gain more insight into modality differences and implications.

  7. Middle-aged women's decisions about body weight management: needs assessment and testing of a knowledge translation tool.

    Science.gov (United States)

    Stacey, Dawn; Jull, Janet; Beach, Sarah; Dumas, Alex; Strychar, Irene; Adamo, Kristi; Brochu, Martin; Prud'homme, Denis

    2015-04-01

    This study aims to assess middle-aged women's needs when making body weight management decisions and to evaluate a knowledge translation tool for addressing their needs. A mixed-methods study used an interview-guided theory-based survey of professional women aged 40 to 65 years. The tool summarized evidence to address their needs and enabled women to monitor actions taken. Acceptability and usability were reported descriptively. Sixty female participants had a mean body mass index of 28.0 kg/m(2) (range, 17.0-44.9 kg/m(2)), and half were premenopausal. Common options for losing (82%) or maintaining (18%) weight included increasing physical activity (60%), eating healthier (57%), and getting support (40%). Decision-making involved getting information on options (52%), soliciting others' decisions/advice (20%), and being self-motivated (20%). Preferred information sources included written information (97%), counseling (90%), and social networking websites (43%). Five professionals (dietitian, personal trainer, occupational therapist, and two physicians) had similar responses. Of 53 women sent the tool, 27 provided acceptability feedback. They rated it as good to excellent for information on menopause (96%), body weight changes (85%), and managing body weight (85%). Most would tell others about it (81%). After 4 weeks of use, 25 women reported that the wording made sense (96%) and that the tool had clear instructions (92%) and was easy to use across time (88%). The amount of information was rated as just right (64%), but the tool had limited space for responding (72%). When making decisions about body weight management, women's needs were "getting information" and "getting support." The knowledge translation tool was acceptable and usable, but further evaluation is required.

  8. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    Science.gov (United States)

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods. Physical dosimetric reconstruction can be achieved using experimental or numerical techniques. This article presents the laboratory-developed SESAME tool--Simulation of External Source Accident with MEdical images--specific to the dosimetric reconstruction of radiological accidents through numerical simulations that combine voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  9. A precompiler written in SPITBOL applied to programs to analyze nuclear data

    International Nuclear Information System (INIS)

    Winkelmann, K.; Croome, D.

    1985-01-01

    For an interactive data acquisition and analysis system for nuclear physics experiments, a precompiler is provided to expand system-specific macros in user-written analysis programs. It is written with the help of the string-processing language SPITBOL and generates PL/I or PL-11 code. It is shown that SPITBOL is a suitable precompiler language for medium-sized precompilation problems of this kind. (orig.)

  10. Concreteness Effects and Syntactic Modification in Written Composition.

    Science.gov (United States)

    Sadoski, Mark; Goetz, Ernest T.

    1998-01-01

    Investigates whether concreteness was related to a key characteristic of written composition--the cumulative sentence with a final modifier--which has been consistently associated with higher quality writing. Supports the conceptual-peg hypothesis of dual coding theory, with concrete verbs providing the pegs on which cumulative sentences are…

  11. Streamlined sign-out of capillary protein electrophoresis using middleware and an open-source macro application

    Directory of Open Access Journals (Sweden)

    Gagan Mathur

    2014-01-01

    Full Text Available Background: Interfacing of clinical laboratory instruments with the laboratory information system (LIS) via "middleware" software is increasingly common. Our clinical laboratory implemented capillary electrophoresis using a Sebia Capillarys-2™ (Norcross, GA, USA) instrument for serum and urine protein electrophoresis. Using Data Innovations Instrument Manager, an interface was established with the LIS (Cerner) that allowed for bi-directional transmission of numeric data. However, the text of the interpretive pathology report was not properly transferred. To reduce manual effort and the possibility for error in text data transfer, we developed scripts in AutoHotkey, a free, open-source macro-creation and automation software utility. Materials and Methods: Scripts were written to create macros that automated mouse and key strokes. The scripts retrieve the specimen accession number, capture user input text, and insert the text interpretation in the correct patient record in the desired format. Results: The scripts accurately and precisely transfer narrative interpretation into the LIS. Combined with bar-code reading by the electrophoresis instrument, the scripts transfer data efficiently to the correct patient record. In addition, the AutoHotkey scripts automated repetitive key strokes required for manual entry into the LIS, making protein electrophoresis sign-out easier to learn and faster to use by the pathology residents. Scripts allow for either preliminary verification by residents or final sign-out by the attending pathologist. Conclusions: Using the open-source AutoHotkey software, we successfully improved the transfer of text data between the capillary electrophoresis software and the LIS. The use of open-source software tools should not be overlooked as a way to improve the interfacing of laboratory instruments.
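
    The published macros are AutoHotkey scripts; as a rough Python analogue of the same keystroke-automation pattern (look up a record by accession number, then paste formatted text), here is a hypothetical sketch using the pyautogui and pyperclip libraries. The field layout and timings are assumptions, not the authors' scripts:

        import time
        import pyautogui   # simulates keyboard input in the focused application
        import pyperclip   # clipboard access

        def transfer_interpretation(accession: str, interpretation: str) -> None:
            """Paste an interpretive comment into whichever LIS field currently has focus."""
            # Pull up the patient record by accession number.
            pyautogui.typewrite(accession, interval=0.02)
            pyautogui.press("enter")
            time.sleep(0.5)   # allow the record screen to load (assumed delay)

            # Paste the narrative in one operation rather than keystroke-by-keystroke.
            pyperclip.copy(interpretation)
            pyautogui.hotkey("ctrl", "v")
            pyautogui.press("tab")   # move to the verification field (assumed layout)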

  12. Physics analysis tools for beauty physics in ATLAS

    International Nuclear Information System (INIS)

    Anastopoulos, C; Bouhova-Thacker, E; Catmore, J; Mora, L de; Dallison, S; Derue, F; Epp, B; Jussel, P; Kaczmarska, A; Radziewski, H v; Stahl, T; Reznicek, P

    2008-01-01

    The Large Hadron Collider experiments will search for physics phenomena beyond the Standard Model. Highly sensitive tests of beauty hadrons represent an alternative approach to this research. The analysis of the complex decay chains of beauty hadrons has to efficiently extract the detector tracks made by these reactions and reject other events in order to make sufficiently precise measurements. This places severe demands on the software used to analyze the B-physics data. The ATLAS B-physics group has written a series of tools and algorithms for performing these tasks, to be run within the ATLAS offline software framework Athena. This paper describes this analysis suite, paying particular attention to mechanisms for handling combinatorics, interfaces to secondary vertex fitting packages, B-flavor tagging tools and, finally, the association of Monte Carlo truth information with simulated data, an important part of validating the physics analysis tools

  13. The Relevance of Second Language Acquisition Theory to the Written Error Correction Debate

    Science.gov (United States)

    Polio, Charlene

    2012-01-01

    The controversies surrounding written error correction can be traced to Truscott (1996) in his polemic against written error correction. He claimed that empirical studies showed that error correction was ineffective and that this was to be expected "given the nature of the correction process and "the nature of language learning" (p. 328, emphasis…

  14. Responding Effectively to Composition Students: Comparing Student Perceptions of Written and Audio Feedback

    Science.gov (United States)

    Bilbro, J.; Iluzada, C.; Clark, D. E.

    2013-01-01

    The authors compared student perceptions of audio and written feedback in order to assess what types of students may benefit from receiving audio feedback on their essays rather than written feedback. Many instructors previously have reported the advantages they see in audio feedback, but little quantitative research has been done on how the…

  15. Marginalia as the beginning of written culture: The Glosas Emilianensis

    Directory of Open Access Journals (Sweden)

    Maja Šabec

    2010-12-01

    Full Text Available The Glosas emilianenses are notes in Latin and in a Romance language dating from the eleventh century, written by an anonymous monk between the lines and in the margins of a Latin manuscript known as Codex Aemilianensis 60 to explicate syntactic, morphological, and semantic difficulties in understanding the original. The document was named after its place of origin, a monastery in the village of San Millán de la Cogolla, known as “the cradle of Castilian.” The non-Latin Romance glosses are believed to be the first written accounts of the language that later evolved into present-day Castilian or Spanish; they are therefore invaluable historical, linguistic, literary, and cultural material. The place and time of the origin of the glosses are not a coincidence, but a consequence of particular historical circumstances in the Iberian Peninsula. The Moorish invasion in 711 AD destroyed the Visigothic Kingdom and constrained the development of Christian culture, confining it to two independent cores in the north. The ninth century therefore saw the establishment of the County of Castile emerging from the two cores as the predecessor of the Kingdom of Castile (1065). Due to turbulent historical events, the place was populated by people from various adjacent and rather distant countries, thus making the spoken language a mixture of several varieties of Vulgar Latin, Mozarabic, and Navarrian (Basque) elements. All of these features are reflected in the glosses in the San Millán manuscript. Therefore, it is difficult for linguists to name the variant of the Romance language the glosses were written in: “the Riojan dialect,” “a vernacular Castilian-Riojan dialect of the second half of the eleventh century displaying tendencies towards learned Latin,” or “a Riojan dialect with elements more common to neighboring dialects (Aragon, Navarrian, Léon, and Mozarabic) than to Castilian.” However, because the San Millán glosses also include elements

  16. 9 CFR 202.113 - Rule 13: Written hearing.

    Science.gov (United States)

    2010-01-01

    9 CFR 202.113 (Animals and Animal Products; Grain Inspection, Packers and Stockyards Administration), Rule 13: Written hearing. ... waiver of the right to file such evidence. (g) Extension of time for depositions. If any party timely...

  17. Written Emotional Expression as an Intervention for Asthma

    Science.gov (United States)

    Bray, Melissa A.; Theodore, Lea A.; Patwa, Shamim S.; Margiano, Suzanne G.; Alric, Jolie M.; Peck, Heather L.

    2003-01-01

    This investigation employed a multiple baseline design across five participants to examine written emotional expression as an intervention to improve lung function in high school-aged students, college students, and adults with asthma. The predicted forced expiratory volume in 1 second (FEV₁, a measure of large airway functioning) and…

  18. Open Health Tools: Tooling for Interoperable Healthcare

    Directory of Open Access Journals (Sweden)

    Skip McGaughey

    2008-11-01

    Full Text Available The Open Health Tools initiative is creating an ecosystem focused on the production of software tooling that promotes the exchange of medical information across political, geographic, cultural, product, and technology lines. At its core, OHT believes that the availability of high-quality tooling that interoperates will propel the industry forward, enabling organizations and vendors to build products and systems that effectively work together. This will "raise the interoperability bar" as a result of having tools that just work. To achieve these lofty goals, careful consideration must be given to the constituencies that will be most affected by an OHT-influenced world. This document outlines a vision of OHT's impact on these stakeholders. It does not explain the OHT process itself or how the OHT community operates. Instead, we place emphasis on the impact of that process within the health industry. The catchphrase "code is king" underpins this document, meaning that the manifestation of any open source community lies in the products and technology it produces.

  19. Evaluating the quality and readability of Internet information sources regarding the treatment of swallowing disorders.

    Science.gov (United States)

    O'Connell Ferster, Ashley P; Hu, Amanda

    2017-03-01

    The Internet has become a popular resource for patient education. The information it provides, however, is rarely peer-reviewed, and its quality may be a concern. Since the average American reads at an 8th grade level, the American Medical Association and the National Institutes of Health have recommended that health information be written at a 4th to 6th grade level. We performed a study to assess the quality and readability of online information regarding the treatment of swallowing disorders. A Google search for "swallowing treatment" was conducted. We studied the first 50 websites that appeared on the search engine's results with the use of the DISCERN quality index tool, the Flesch Ease of Reading Score (FRES), and the Flesch-Kincaid Grade Level (FKGL) readability test. DISCERN is a validated 16-item questionnaire used to assess the quality of written health information; FRES and FKGL are used to assess readability. We classified the websites as either patient-targeted or professional-targeted sites, as well as either major or minor. The overall DISCERN score was 1.61 ± 0.61 (range: 1 to 5), the overall FRES was 39.1 ± 19.0 (range: 1 to 100), and the overall FKGL was 11.8 ± 3.4 (range: 3 to 12). As would be expected, patient-targeted websites had significantly higher FRES and significantly lower FKGL scores than did the professional-targeted websites (p = 0.01 and p = 0.04, respectively); there was no significant difference between the two in DISCERN scores. The major websites had significantly higher DISCERN scores than did the minor sites (p = 0.002); there were no significant differences in FRES and FKGL scores. We conclude that online information sources regarding the treatment of swallowing disorders were of suboptimal quality in that information was written at a level too difficult for the average American to easily understand. Also, the patient-targeted websites were written at a lower reading level, and the major websites contained a higher quality
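
    Both readability measures cited above are simple closed-form formulas over word, sentence, and syllable counts. A minimal Python sketch (the vowel-group syllable counter is a rough heuristic assumed for illustration; production tools use dictionaries or better rules):

        import re

        def count_syllables(word: str) -> int:
            # Rough heuristic: count groups of consecutive vowels.
            return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

        def readability(text: str):
            sentences = max(1, len(re.findall(r"[.!?]+", text)))
            words = re.findall(r"[A-Za-z']+", text)
            syllables = sum(count_syllables(w) for w in words)
            w, s = len(words), sentences
            fres = 206.835 - 1.015 * (w / s) - 84.6 * (syllables / w)   # Flesch Reading Ease
            fkgl = 0.39 * (w / s) + 11.8 * (syllables / w) - 15.59      # Flesch-Kincaid Grade
            return fres, fkgl

        fres, fkgl = readability("Swallowing therapy retrains the muscles that move food safely.")
        print(f"FRES = {fres:.1f}, FKGL = {fkgl:.1f}")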

  20. Penetration Tester's Open Source Toolkit

    CERN Document Server

    Faircloth, Jeremy

    2011-01-01

    Great commercial penetration testing tools can be very expensive and sometimes hard to use or of questionable accuracy. This book helps solve both of these problems. The open source, no-cost penetration testing tools presented do a great job and can be modified by the user for each situation. Many tools, even ones that cost thousands of dollars, do not come with any type of instruction on how and in which situations the penetration tester can best use them. Penetration Tester's Open Source Toolkit, Third Edition, expands upon existing instructions so that a professional can get the most accura

  1. Evaluation of written patient educational materials in the field of diagnostic imaging

    International Nuclear Information System (INIS)

    Ryhaenen, A.M.; Johansson, K.; Virtanen, H.; Salo, S.; Salanterae, S.; Leino-Kilpi, H.

    2009-01-01

    Aim: To evaluate the quality of written educational materials for diagnostic imaging (radiological and nuclear medicine) patients. Materials and methods: Written educational materials (n = 70) for diagnostic imaging patients were analysed. The materials were evaluated based on their external appearance (9 criteria), instructiveness (7), content (7), language and structure (8) and readability (1). Deductive content analysis was used. Quantified parts of the analyses were analysed by SAS for Windows. Dependence between criteria (32) was tested by Pearson correlation coefficients. Results: The external appearance fulfilled almost completely the criteria of good written education materials. The instructiveness was addressed clearly, except for the purpose of the material. The contents of materials dealt with bio-physiological, functional and cognitive dimensions of knowledge, while financial dimensions of knowledge were hardly dealt with at all. The language and the structure were reasonably good, but the language was partly in passive voice and the text contained strange words. Most of the education material was moderately easy to read. Conclusions: The results show that the quality of material was quite good in all dimensions. Only a small number of criteria were unsatisfactory. The results can be used to further improve written patient education materials and patient education in the imaging unit.

  2. How Does Dissociation between Written and Oral Forms Affect Reading: Evidence from Auxiliary Verbs in Arabic

    Science.gov (United States)

    Ibrahim, Raphiq

    2011-01-01

    In Arabic, auxiliary verbs are necessary in the written language, but absent from the oral language. This is contrary to languages such as English and French in which auxiliary verbs are mandatory in both written and oral languages. This fact was exploited to examine if dissociation between written and oral forms affects reading measures like…

  3. A Theory of Developing Competence with Written Mathematical Symbols.

    Science.gov (United States)

    Hiebert, James

    1988-01-01

    Presented is a theory of how competence with written mathematical symbols develops, tracing a succession of cognitive processes that cumulate to yield competence. Arguments supporting the theory are drawn from the history, philosophy, and psychology of mathematics. (MNS)

  4. Oscillation Baselining and Analysis Tool

    Energy Technology Data Exchange (ETDEWEB)

    2017-03-27

    PNNL developed a new tool for oscillation analysis and baselining. This tool has been developed under a new DOE Grid Modernization Laboratory Consortium (GMLC) Project (GM0072 - “Suite of open-source applications and models for advanced synchrophasor analysis”) and it is based on the open platform for PMU analysis. The Oscillation Baselining and Analysis Tool (OBAT) performs oscillation analysis and identifies modes of oscillation (frequency, damping, energy, and shape). The tool also does oscillation event baselining (finding correlations between oscillation characteristics and system operating conditions).
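
    The abstract does not disclose OBAT's algorithms; purely as an illustration of what identifying a mode's frequency and damping involves, one can fit a damped sinusoid to a ringdown-like signal (synthetic data, invented parameters):

        import numpy as np
        from scipy.optimize import curve_fit

        def damped_sine(t, A, sigma, f, phi):
            # A * exp(-sigma*t) * cos(2*pi*f*t + phi); sigma > 0 means a decaying oscillation.
            return A * np.exp(-sigma * t) * np.cos(2 * np.pi * f * t + phi)

        # Synthetic "PMU" signal: a 0.3 Hz inter-area-like mode at roughly 5% damping.
        t = np.linspace(0, 30, 900)
        y = damped_sine(t, 1.0, 0.094, 0.3, 0.2) + 0.02 * np.random.randn(t.size)

        (A, sigma, f, phi), _ = curve_fit(damped_sine, t, y, p0=[1.0, 0.1, 0.3, 0.0])
        zeta = sigma / np.sqrt(sigma**2 + (2 * np.pi * f) ** 2)   # damping ratio
        print(f"mode: {f:.3f} Hz, damping ratio: {100 * zeta:.1f} %")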

  5. Your Personal Analysis Toolkit - An Open Source Solution

    Science.gov (United States)

    Mitchell, T.

    2009-12-01

    Open source software is commonly known for its web browsers, word processors and programming languages. However, there is a vast array of open source software focused on geographic information management and geospatial application building in general. As geo-professionals, having easy access to tools for our jobs is crucial. Open source software provides the opportunity to add a tool to your tool belt and carry it with you for your entire career - with no license fees, a supportive community and the opportunity to test, adopt and upgrade at your own pace. OSGeo is a US registered non-profit representing more than a dozen mature geospatial data management applications and programming resources. Tools cover areas such as desktop GIS, web-based mapping frameworks, metadata cataloging, spatial database analysis, image processing and more. Learn about some of these tools as they apply to AGU members, as well as how you can join OSGeo and its members in getting the job done with powerful open source tools. If you haven't heard of OSSIM, MapServer, OpenLayers, PostGIS, GRASS GIS or the many other projects under our umbrella - then you need to hear this talk. Invest in yourself - use open source!

  6. Written Formative Assessment and Silence in the Classroom

    Science.gov (United States)

    Lee Hang, Desmond Mene; Bell, Beverley

    2015-01-01

    In this commentary, we build on Xinying Yin and Gayle Buck's discussion by exploring the cultural practices which are integral to formative assessment, when it is viewed as a sociocultural practice. First we discuss the role of assessment and in particular oral and written formative assessments in both western and Samoan cultures, building on the…

  7. Shortcomings of the written survey questionnaire for discovering ...

    African Journals Online (AJOL)

    In this article I describe my reflections on using a written survey questionnaire to investigate, on a large-scale, students' perceptions of studying Xhosa as a first language in high schools. I describe the aims of the project, how the questionnaire was designed, and the problems I encountered with the analysis of the data.

  8. THE PHONOLOGICAL BASIS OF MISSPELLINGS IN THE WRITTEN ...

    African Journals Online (AJOL)

    Misspellings have been a common error in the written English of non-native speakers. ... The study was done with a view to investigating whether the phonology of Kikuyu, as the learners' first language, influences their pronunciation of words in English as the second language, and whether this in turn affects ...

  9. Fabrication of self-written waveguide in photosensitive polyimide resin by controlling photochemical reaction of photosensitizer

    International Nuclear Information System (INIS)

    Yamashita, K.; Kuro, T.; Oe, K.; Mune, K.; Tagawa, K.; Naitou, R.; Mochizuki, A.

    2004-01-01

    We have investigated the optical properties of a photosensitive polyimide appropriate for the fabrication of long self-written waveguides. Systematic measurements of the absorption properties showed that the photochemical reaction of the photosensitizer dissolved in the photosensitive polyimide resins governs the transparency after exposure, which limits the length of the fabricated self-written waveguide. By controlling the photochemical reaction so that the photosensitive polyimide resin retains sufficient transparency during exposure, a self-written waveguide core four times longer was fabricated

  10. Coastal On-line Assessment and Synthesis Tool 2.0

    Science.gov (United States)

    Brown, Richard; Navard, Andrew; Nguyen, Beth

    2011-01-01

    COAST (Coastal On-line Assessment and Synthesis Tool) is a 3D, open-source Earth data browser developed by leveraging and enhancing previous NASA open-source tools. These tools use satellite imagery and elevation data in a way that allows any user to zoom from orbit view down into any place on Earth, and enables the user to experience Earth terrain in a visually rich 3D view. The benefits associated with taking advantage of an open-source geo-browser are that it is free, extensible, and offers a worldwide developer community that is available to provide additional development and improvement potential. What makes COAST unique is that it simplifies the process of locating and accessing data sources, and allows a user to combine them into a multi-layered and/or multi-temporal visual analytical look into possible data interrelationships and coeffectors for coastal environment phenomenology. COAST provides users with new data visual analytic capabilities. COAST has been upgraded to maximize use of open-source data access, viewing, and data manipulation software tools. The COAST 2.0 toolset has been developed to increase access to a larger realm of the most commonly implemented data formats used by the coastal science community. New and enhanced functionalities that upgrade COAST to COAST 2.0 include the development of the Temporal Visualization Tool (TVT) plug-in, the Recursive Online Remote Data-Data Mapper (RECORD-DM) utility, the Import Data Tool (IDT), and the Add Points Tool (APT). With these improvements, users can integrate their own data with other data sources, and visualize the resulting layers of different data types (such as spatial and spectral, for simultaneous visual analysis), and visualize temporal changes in areas of interest.

  11. An open-source LabVIEW application toolkit for phasic heart rate analysis in psychophysiological research.

    Science.gov (United States)

    Duley, Aaron R; Janelle, Christopher M; Coombes, Stephen A

    2004-11-01

    The cardiovascular system has been extensively measured in a variety of research and clinical domains. Despite technological and methodological advances in cardiovascular science, the analysis and evaluation of phasic changes in heart rate persist as a way to assess numerous psychological concomitants. Some researchers, however, have pointed to constraints on data analysis when evaluating cardiac activity indexed by heart rate or heart period. Thus, an off-line application toolkit for heart rate analysis is presented. The program, written with National Instruments' LabVIEW, incorporates a variety of tools for off-line extraction and analysis of heart rate data. Current methods and issues concerning heart rate analysis are highlighted, and how the toolkit provides a flexible environment to ameliorate common problems that typically lead to trial rejection is discussed. Source code for this program may be downloaded from the Psychonomic Society Web archive at www.psychonomic.org/archive/.
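
    A common first step in phasic heart rate analysis of the kind this toolkit supports is converting detected R-peak times into an evenly sampled heart-rate series; a minimal Python sketch of that step (invented peak times; not the LabVIEW implementation):

        import numpy as np

        # Hypothetical R-peak times in seconds, e.g., from an ECG peak detector.
        r_peaks = np.array([0.00, 0.82, 1.66, 2.47, 3.31, 4.18, 5.02, 5.85])

        ibi = np.diff(r_peaks)      # inter-beat intervals (heart period) in seconds
        ibi_t = r_peaks[1:]         # stamp each interval at the beat that closes it

        # Resample onto an even 4 Hz grid by linear interpolation, then convert to bpm.
        t = np.arange(ibi_t[0], ibi_t[-1], 0.25)
        hr_bpm = 60.0 / np.interp(t, ibi_t, ibi)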

  12. BBAT: Bunch and bucket analysis tool

    International Nuclear Information System (INIS)

    Deng, D.P.

    1995-01-01

    BBAT is written to meet the need for an interactive graphical tool to explore the longitudinal phase space. It is designed for quickly testing new ideas or new tricks. It is especially suitable for machine physicists and operations staff alike, both in the control room during machine studies and off-line for data analysis. The heart of the package contains a set of C routines to do the number crunching. The graphics part is wired with the scripting language Tcl/Tk and BLT. The C routines are general enough that one can write new applications, such as an animation of the bucket as a machine parameter is varied via a sliding scale. BBAT deals with a single rf system. For a double rf system, one can use Dr. BBAT, which stands for Double rf Bunch and Bucket Analysis Tool. One use of Dr. BBAT is to visualize the process of bunch coalescing and flat bunch creation
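
    As a generic illustration of the longitudinal phase space BBAT explores (not BBAT's own numerics), one can iterate the standard single-rf tracking map for phase phi and relative energy deviation delta; all machine parameters below are placeholders:

        import numpy as np

        # Placeholder single-rf machine parameters (illustrative only).
        h, eta, beta2E = 84, -0.03, 2.0e9   # harmonic number, slip factor (<0: below transition), beta^2 * E [eV]
        eV, phi_s = 100e3, 0.0              # rf voltage * charge [eV], synchronous phase

        def track(phi, delta, n_turns=2000):
            """Turn-by-turn map for one particle; returns the (phi, delta) trajectory."""
            out = np.empty((n_turns, 2))
            for n in range(n_turns):
                delta += eV * (np.sin(phi) - np.sin(phi_s)) / beta2E   # energy kick from the rf cavity
                phi += 2 * np.pi * h * eta * delta                     # phase slip over one turn
                out[n] = phi, delta
            return out

        # A particle launched inside the bucket traces a closed curve in (phi, delta).
        orbit = track(phi=0.5, delta=0.0)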

  13. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advancements in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes of hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using WebGL and the Canvas2D API. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Our other open-source tool, Minerva (https://github.com/kitware/minerva), is a geospatial application that is built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https

  14. Tracking PACS usage with open source tools.

    Science.gov (United States)

    French, Todd L; Langer, Steve G

    2011-08-01

    A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to assess accurately the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.

  15. RdTools: An Open Source Python Library for PV Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Deceglie, Michael G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nag, Ambarish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Shinn, Adam [kWh Analytics

    2018-05-04

    RdTools is a set of Python tools for the analysis of photovoltaic data. In particular, PV production data is evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher frequency data.
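
    The core calculation behind such a tool is a long-term trend estimate on a normalized performance series. A generic sketch of the idea using an ordinary least-squares fit on synthetic monthly data (this is not the RdTools API, just the underlying concept):

        import numpy as np
        import pandas as pd

        # Synthetic normalized performance index: -0.5 %/year trend plus seasonality and noise.
        idx = pd.date_range("2012-01-01", periods=72, freq="MS")   # 6 years of monthly data
        years = np.arange(72) / 12.0
        perf = (1.0 - 0.005 * years + 0.01 * np.sin(2 * np.pi * years)
                + 0.003 * np.random.randn(72))
        series = pd.Series(perf, index=idx)

        # OLS slope in fraction/year, expressed as %/year of the fitted initial value.
        slope, intercept = np.polyfit(years, series.values, 1)
        print(f"degradation rate: {100 * slope / intercept:.2f} %/year")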

  16. Processes, Performance Drivers and ICT Tools in Human Resources Management

    Directory of Open Access Journals (Sweden)

    Oškrdal Václav

    2011-06-01

    Full Text Available This article presents an insight into processes, performance drivers and ICT tools in the human resources (HR) management area. On the basis of a modern approach to HR management, a set of business processes that are handled by today’s HR managers is defined. Consequently, the concept of ICT-supported performance drivers and their relevance in the area of HR management, as well as the relationship between HR business processes, performance drivers and ICT tools, are defined. The theoretical outcomes are further enhanced with results obtained from a survey among Czech companies. This article was written with the kind support of funding provided by VŠE IGA grant „IGA – 32/2010“.

  17. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  18. Designing student peer assessment in higher education: Analysis of written and oral peer feedback

    NARCIS (Netherlands)

    van den Berg, I.; Admiraal, W.; Pilot, A.

    2006-01-01

    Relating it to design features, the present article describes the nature of written and oral peer feedback as it occurred in seven writing courses, each with a different PA design. Results indicate that

  19. THE ORTHOGRAPHIC NORM IN SECONDARY SCHOOL STUDENTS’ WRITTEN ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Ivana Đorđev

    2016-06-01

    Full Text Available This paper presents the results of research conducted with the primary objective of determining the areas in which secondary school students most often make orthographic mistakes in their (official) written assignments. Starting from the hypothesis that punctuation and the writing of whole and split words are the areas in which secondary school students (regardless of age and school orientation) achieve the weakest results, an exploratory study was conducted on a corpus of 3,135 written assignments from the school year 2010/11. The research sample was intentional; descriptive and analytical methods were used for the description and analysis of the results. The results showed the following: (1) secondary school students most often make punctuation mistakes in written assignments - we recorded 4,487 errors in the use of signs denoting the intonation and meaning of a text (errors of this type make up 53.93% of the total number of spelling errors recorded in the research corpus); second by frequency are errors related to the writing of whole and split words (11.02%), and third are errors in the use of capital letters (9.34%); (2) second-grade students have the most problems with orthography; the number of mistakes is almost the same for first graders and seniors, but in all grades the most frequent errors are in punctuation, the writing of whole and split words and the use of capital letters; (3) although school orientation affects the spelling skills of pupils, the weakest orthographic achievements are likewise recorded in punctuation, the writing of whole and split words and capitalization, so these are areas that need to be thoroughly addressed in teaching and in the methodology literature. The results are, on the one hand, a picture of the current status of teaching orthography and of the grammar knowledge of secondary school students. On the other hand, the research results can be applied in all phases of methodical practical work in teaching orthography, the upgrading the

  20. IllinoisGRMHD: an open-source, user-friendly GRMHD code for dynamical spacetimes

    International Nuclear Information System (INIS)

    Etienne, Zachariah B; Paschalidis, Vasileios; Haas, Roland; Mösta, Philipp; Shapiro, Stuart L

    2015-01-01

    In the extreme violence of merger and mass accretion, compact objects like black holes and neutron stars are thought to launch some of the most luminous outbursts of electromagnetic and gravitational wave energy in the Universe. Modeling these systems realistically is a central problem in theoretical astrophysics, but has proven extremely challenging, requiring the development of numerical relativity codes that solve Einstein's equations for the spacetime, coupled to the equations of general relativistic (ideal) magnetohydrodynamics (GRMHD) for the magnetized fluids. Over the past decade, the Illinois numerical relativity (ILNR) group's dynamical spacetime GRMHD code has proven itself as a robust and reliable tool for theoretical modeling of such GRMHD phenomena. However, the code was written ‘by experts and for experts’ of the code, with a steep learning curve that would severely hinder community adoption if it were open-sourced. Here we present IllinoisGRMHD, which is an open-source, highly extensible rewrite of the original closed-source GRMHD code of the ILNR group. Reducing the learning curve was the primary focus of this rewrite, with the goal of facilitating community involvement in the code's use and development, as well as the minimization of human effort in generating new science. IllinoisGRMHD also saves computer time, generating roundoff-precision identical output to the original code on adaptive-mesh grids, but nearly twice as fast at scales of hundreds to thousands of cores. (paper)

  1. An Artificial Intelligence Classification Tool and Its Application to Gamma-Ray Bursts

    Science.gov (United States)

    Hakkila, Jon; Haglin, David J.; Roiger, Richard J.; Giblin, Timothy; Paciesas, William S.; Pendleton, Geoffrey N.; Mallozzi, Robert S.

    2004-01-01

    Despite being the most energetic phenomenon in the known universe, the astrophysics of gamma-ray bursts (GRBs) has still proven difficult to understand. It has only been within the past five years that the GRB distance scale has been firmly established, on the basis of a few dozen bursts with x-ray, optical, and radio afterglows. The afterglows indicate source redshifts of z=1 to z=5, total energy outputs of roughly 10^52 ergs, and energy confined to the far x-ray to near gamma-ray regime of the electromagnetic spectrum. The multi-wavelength afterglow observations have thus far provided more insight on the nature of the GRB mechanism than the GRB observations; far more papers have been written about the few observed gamma-ray burst afterglows in the past few years than about the thousands of detected gamma-ray bursts. One reason the GRB central engine is still so poorly understood is that GRBs have complex, overlapping characteristics that do not appear to be produced by one homogeneous process. At least two subclasses have been found on the basis of duration, spectral hardness, and fluence (time-integrated flux); Class 1 bursts are softer, longer, and brighter than Class 2 bursts (with two-second durations indicating a rough division). A third GRB subclass, overlapping the other two, has been identified using statistical clustering techniques; Class 3 bursts are intermediate between Class 1 and Class 2 bursts in brightness and duration, but are softer than Class 1 bursts. We are developing a tool to aid scientists in the study of GRB properties. In the process of developing this tool, we are building a large gamma-ray burst classification database. We are also scientifically analyzing some GRB data as we develop the tool. Tool development thus proceeds in tandem with the dataset for which it is being designed. The tool invokes a modified KDD (Knowledge Discovery in Databases) process, which is described as follows.
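
    The subclasses described above come from statistical clustering on duration, hardness, and fluence. A toy version of such a clustering (scikit-learn k-means on log-scaled features; the burst values are invented for illustration):

        import numpy as np
        from sklearn.cluster import KMeans

        # Invented features per burst: [T90 duration (s), hardness ratio, fluence (erg/cm^2)].
        bursts = np.array([
            [35.0, 0.9, 3e-5], [58.0, 1.1, 6e-5], [22.0, 0.8, 2e-5],   # long/soft/bright
            [0.4, 1.8, 4e-7], [0.9, 2.1, 8e-7], [0.2, 1.6, 3e-7],      # short/hard/faint
            [6.0, 0.7, 4e-6], [9.0, 0.6, 6e-6],                        # intermediate
        ])

        # Cluster in log space, since durations and fluences span several decades.
        labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(np.log10(bursts))
        print(labels)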

  2. Development of a written assessment for a national interprofessional cardiotocography education program.

    Science.gov (United States)

    Thellesen, Line; Bergholt, Thomas; Hedegaard, Morten; Colov, Nina Palmgren; Christensen, Karl Bang; Andersen, Kristine Sylvan; Sorensen, Jette Led

    2017-05-18

    To reduce the incidence of hypoxic brain injuries among newborns, a national cardiotocography (CTG) education program was implemented in Denmark. A multiple-choice question test was integrated as part of the program. The aim of this article was to describe and discuss the test development process and to introduce a feasible method for written test development in general. The test development was based on the unitary approach to validity. The process involved national consensus on learning objectives, standardized item writing, pilot testing, sensitivity analyses, standard setting and evaluation of psychometric properties using Item Response Theory models. Test responses and feedback from midwives, specialists and residents in obstetrics and gynecology, and medical and midwifery students were used in the process (proofreaders n = 6, pilot test participants n = 118, CTG course participants n = 1679). The final test included 30 items and the passing score was established at 25 correct answers. All items fitted a loglinear Rasch model and the test was able to discriminate levels of competence. Seven items revealed differential item functioning in relation to profession and geographical region, which means the test is not suitable for measuring differences between midwives and physicians or differences across regions. In the setting of pilot testing Cronbach's alpha equaled 0.79, whereas Cronbach's alpha equaled 0.63 in the setting of the CTG education program. This indicates a need for more items, and items with a higher degree of difficulty, in the test, and illuminates the importance of context when discussing validity. Test development is a complex and time-consuming process. The unitary approach to validity was a useful and applicable tool for the development of a CTG written assessment. The process and findings supported our proposed interpretation of the assessment as measuring CTG knowledge and interpretive skills. However, for the test to function as a
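
    Cronbach's alpha, the internal-consistency statistic reported above, is easy to compute from a respondents-by-items score matrix: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with invented 0/1 MCQ scores:

        import numpy as np

        def cronbach_alpha(scores: np.ndarray) -> float:
            """scores: respondents x items matrix (e.g., 0/1 correctness on MCQ items)."""
            k = scores.shape[1]
            item_vars = scores.var(axis=0, ddof=1).sum()
            total_var = scores.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        # Invented data: 6 respondents x 5 items, for illustration only.
        scores = np.array([[1, 1, 1, 0, 1],
                           [1, 0, 1, 1, 1],
                           [0, 0, 1, 0, 0],
                           [1, 1, 1, 1, 1],
                           [0, 1, 0, 0, 1],
                           [1, 1, 1, 1, 0]])
        print(f"alpha = {cronbach_alpha(scores):.2f}")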

  3. Cracking the code: residents' interpretations of written assessment comments

    NARCIS (Netherlands)

    Ginsburg, S.; Vleuten, C.P.M. van der; Eva, K.W.; Lingard, L.

    2017-01-01

    CONTEXT: Interest is growing in the use of qualitative data for assessment. Written comments on residents' in-training evaluation reports (ITERs) can be reliably rank-ordered by faculty attendings, who are adept at interpreting these narratives. However, if residents do not interpret assessment

  4. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    tremendously successful in certain applications such as the Mozilla Firefox web browser and the Apache web server [10]. Open source software is often...source versions (such as Internet Explorer compared to Mozilla Firefox), which typically conclude that vulnerabilities are, in fact, much more... [Record truncated; the remainder is a fragment of a table listing ACS software components (ROS, PX4 Firmware, PX4 NuttX real-time OS) and their licenses (BSD 3-clause, BSD).]

  5. A Structure Analysis of English Argumentative Writings Written by Chinese and Korean EFL Learners

    Science.gov (United States)

    Zheng, Cui

    2013-01-01

    This study employed Kamimura and Oi's (1996) classification of the organizational patterns of the argumentative essay structure: Thesis Statement (TS), Background Information (BI), Reservation (R), Hesitation (H), Rational Appeals (RA), Affective Appeals (AA) and Conclusion (C). 178 essays, 84 written by Chinese EFL learners, 84 written by Korean…

  6. Discourse Features of Written Mexican Spanish: Current Research in Contrastive Rhetoric and Its Implications.

    Science.gov (United States)

    Montano-Harmon, Maria Rosario

    1991-01-01

    Analyzes discourse features of compositions written in Spanish by secondary school students in Mexico, draws comparisons with those written in English by Anglo-American students in the United States, and discusses the implications of the results for teaching and evaluating composition skills in Spanish language programs. (29 references) (GLR)

  7. Advance Planning of Form Properties in the Written Production of Single and Multiple Words

    Science.gov (United States)

    Damian, Markus F.; Stadthagen-Gonzalez, Hans

    2009-01-01

    Three experiments investigated the scope of advance planning in written production. Experiment 1 manipulated phonological factors in single word written production, and Experiments 2 and 3 did the same in the production of adjective-noun utterances. In all three experiments, effects on latencies were found which mirrored those previously…

  8. Open-Source Colorimeter

    OpenAIRE

    Anzalone, Gerald C.; Glover, Alexandra G.; Pearce, Joshua M.

    2013-01-01

    The high cost of what have historically been sophisticated research-related sensors and tools has limited their adoption to a relatively small group of well-funded researchers. This paper provides a methodology for applying an open-source approach to design and development of a colorimeter. A 3-D printable, open-source colorimeter utilizing only open-source hardware and software solutions and readily available discrete components is discussed and its performance compared to a commercial porta...

  9. Teachers' Accounts of Their Perceptions and Practices of Providing Written Feedback to Nursing Students on Their Assignments

    Science.gov (United States)

    Iqbal, Sajid; Gul, Raisa; Lakhani, Arusa; Rizvi, Nusrat Fatima

    2014-01-01

    Written feedback can facilitate students' learning in several ways. However, the teachers' practices of written feedback may be affected by various factors. This study aimed to explore the nurse teachers' accounts of their perceptions and practices of providing written feedback. A descriptive exploratory design was employed in the study. A…

  10. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.

    2015-12-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and to make it available for science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. To meet the stated objectives, to get a quick overview of the seismicity of a particular region, and to compare the situation with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. Further, this web service integrates historical and current earthquake information from the USGS earthquake database, and more historical events from various other catalogues such as Pacheco, the International Seismological Centre (ISC) and more. This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is the ability to constrain the displayed time range via a time-shifting tool. Users can interactively vary the visualization by moving the time slider. Furthermore, the application was realized using the newest JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.

  11. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second program, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
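
    Internal calibration of this kind amounts to comparing the measured m/z of confidently identified peptides with their theoretical values and fitting a correction applied to the whole spectrum. A schematic Python sketch (a simple linear model with invented masses; recal2's actual calibration function may differ):

        import numpy as np

        # Theoretical vs. measured m/z for identified calibrant peptides (invented values).
        theoretical = np.array([842.5094, 1045.5642, 1296.6853, 1672.9175, 2211.1046])
        measured = np.array([842.5121, 1045.5675, 1296.6895, 1672.9230, 2211.1120])

        # Fit measured -> theoretical with a straight line, then apply it to all peaks.
        a, b = np.polyfit(measured, theoretical, 1)

        ppm_before = 1e6 * (measured - theoretical) / theoretical
        ppm_after = 1e6 * ((a * measured + b) - theoretical) / theoretical
        print(ppm_before.round(1), ppm_after.round(2))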

  12. Oral and Written Picture Description in Individuals with Aphasia

    Science.gov (United States)

    Vandenborre, Dorien; Visch-Brink, Evy; van Dun, Kim; Verhoeven, Jo; Mariën, Peter

    2018-01-01

    Background: Aphasia is characterized by difficulties in connected speech/writing. Aims: To explore the differences between the oral and written description of a picture in individuals with chronic aphasia (IWA) and healthy controls. Descriptions were controlled for productivity, efficiency, grammatical organization, substitution behaviour and…

  13. 42 CFR 456.80 - Individual written plan of care.

    Science.gov (United States)

    2010-10-01

    42 CFR 456.80 (Medical Assistance Programs; Utilization Control: Hospitals; Plan of Care): Individual written plan of care. (a) Before admission to a hospital or before authorization for... and rehabilitative services; (iv) Activities; (v) Social services; (vi) Diet; (4) Plans for continuing...

  14. Attitudes of second language students towards self-editing their own written texts

    Directory of Open Access Journals (Sweden)

    Daniel Kasule

    2010-05-01

    Full Text Available Recognizing students’ deliberate efforts to minimize errors in their written texts is valuable in seeing them as responsible active agents in text creation. This paper reports on a brief survey of the attitudes towards self-editing of seventy university students using a questionnaire and class discussion. The context of the study is characterized by its emphasis on evaluating the finished written product. Findings show that students appreciate the role of self-editing in minimizing errors in their texts and that it helps in eventually producing well-written texts. Conceptualizing writing as discourse and therefore as social practice leads to an understanding of writers as socially-situated actors; repositions the student writer as an active agent in text creation; and is central to student-centred pedagogy. We recommend the recognition of self-editing as a vital element in the writing process and that additional error-detection mechanisms, namely peers, the lecturer, and the computer, increase student autonomy.

  15. PySE: Python Source Extractor for radio astronomical images

    Science.gov (United States)

    Spreeuw, Hanno; Swinbank, John; Molenaar, Gijs; Staley, Tim; Rol, Evert; Sanders, John; Scheers, Bart; Kuiack, Mark

    2018-05-01

    PySE finds and measures sources in radio telescope images. It is run with several options, such as the detection threshold (a multiple of the local noise), grid size, and the forced clean beam fit, followed by a list of input image files in standard FITS or CASA format. From these, PySE provides a list of found sources; information such as the calculated background image, source lists in different formats (e.g. text, region files importable in DS9), and other data may be saved. PySE can be integrated into a pipeline; it was originally written as part of the LOFAR Transient Detection Pipeline (TraP, ascl:1412.011).
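
    The detection threshold mentioned above ("a multiple of the local noise") is the standard idea in source finding: estimate the noise robustly, then keep pixels that exceed the background by n sigma. A toy version with sigma clipping (PySE additionally fits the background on a grid, which is not shown here):

        import numpy as np

        def detect(image: np.ndarray, nsigma: float = 5.0) -> np.ndarray:
            """Return a boolean mask of pixels more than nsigma above the estimated noise."""
            data = image.copy()
            # Crude sigma clipping: iteratively drop >3-sigma outliers to estimate the noise.
            for _ in range(5):
                mu, sd = data.mean(), data.std()
                data = data[np.abs(data - mu) < 3 * sd]
            return image > data.mean() + nsigma * data.std()

        rng = np.random.default_rng(0)
        img = rng.normal(0.0, 1.0, (128, 128))
        img[60:63, 60:63] += 25.0   # inject a bright "source"
        print(detect(img).sum(), "pixels above threshold")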

  16. Blogs Written by Families During Their Child's Hospitalization: A Thematic Narrative Analysis.

    Science.gov (United States)

    Jones, Carolyn W; Lynn, Mary R

    2018-04-04

    To identify stressors experienced by parents whose child is hospitalized in an intensive care unit, and identify coping mechanisms utilized to ameliorate those stressors. Using Lazarus and Folkman's Transactional Model of Stress and Coping as a framework, 20 publicly available blogs written by parents while their child was a patient in intensive care were analyzed using thematic analysis techniques. Stressors and coping techniques were identified, and grouped by theme for further analysis. The most frequently noted types of stressors were related to information; both knowing and not knowing information related to their child's condition was reported as stressful, as well as waiting for information and when the information was not what was expected. Reframing was the emotion-focused technique most often identified by the parents, and seeking support was the most frequently noted problem-focused coping mechanism. Illness blogs represent a rich source of information regarding the experiences of families with a child in the hospital. Parents transitioned from more emotion-focused coping strategies to problem-focused strategies during their child's hospital stay. When nurses give information to parents, they should be aware that knowing information can be stressful as well as not knowing, and care should be taken to provide support for parents after information is given. Nurses can also help parents identify sources of support. Writing about their experiences, either online or in a journal, may help parents cope in stressful situations. Copyright © 2017 Elsevier Inc. All rights reserved.

  17. Open Source GIS based integrated watershed management

    Science.gov (United States)

    Byrne, J. M.; Lindsay, J.; Berg, A. A.

    2013-12-01

    Optimal land and water management to address future and current resource stresses and allocation challenges requires the development of state-of-the-art geomatics and hydrological modelling tools. Future hydrological modelling tools should be high-resolution and process-based, with real-time capability to assess changing resource issues critical to short-, medium- and long-term environmental management. The objective here is to merge two renowned, well-published resource modeling programs to create an open-source toolbox for integrated land and water management applications. This work will facilitate a much increased efficiency in land and water resource security, management and planning. Following an 'open-source' philosophy, the tools will be computer platform independent with source code freely available, maximizing knowledge transfer and the global value of the proposed research. The envisioned set of water resource management tools will be housed within 'Whitebox Geospatial Analysis Tools'. Whitebox is an open-source geographical information system (GIS) developed by Dr. John Lindsay at the University of Guelph. The emphasis of the Whitebox project has been to develop a user-friendly interface for advanced spatial analysis in environmental applications. The plugin architecture of the software is ideal for the tight integration of spatially distributed models and spatial analysis algorithms such as those contained within the GENESYS suite. Open-source development extends knowledge and technology transfer to a broad range of end-users and builds Canadian capability to address complex resource management problems with better tools and expertise for managers in Canada and around the world. GENESYS (Generate Earth Systems Science input) is an innovative, efficient, high-resolution hydro- and agro-meteorological model for complex terrain watersheds developed under the direction of Dr. James Byrne. GENESYS is an outstanding research and applications tool to address

  18. Storytelling as an Active Learning Tool to Engage Students in a Genetics Classroom

    Directory of Open Access Journals (Sweden)

    Karobi Moitra

    2014-08-01

    Full Text Available Storytelling is an ancient art that originated long before the written word. Storytelling interspersed with traditional lectures has been used as a teaching tool in an Introductory Genetics course. Students have eagerly responded to the storytelling pedagogy, suggesting that it can be leveraged to help students grasp complicated theories, engage students, and help improve student retention in STEM fields.

  19. Data processing with Pymicra, the Python tool for Micrometeorological Analyses

    Science.gov (United States)

    Chor, T. L.; Dias, N. L.

    2017-12-01

    With the ever-increasing capability of instrumentation to collect high-frequency turbulence data, micrometeorological experiments are now generating significant amounts of data. Clearly, data processing -- and not data collection anymore -- has become the limiting factor for those very large data sets. The ability to extract useful scientific information from those experiments, therefore, hinges on tools that (i) are able to process those data effectively and accurately, (ii) are flexible enough to be adapted to the specific requirements of each investigation, and (iii) are robust enough to make data analysis easily reproducible over different sets of large data sets. We have developed a framework for micrometeorological data analysis called Pymicra which does deliver such capabilities while maintaining proximity of the investigator with the data. It is fully written in an open-source, very high level language, Python, which has been gaining widespread acceptance as a scientific tool. It follows the philosophy of "not reinventing the wheel" and, as a result, relies on existing well-established open-source Python packages such as Numpy and Pandas. Thus, minimum effort is needed to program statistics, array processing, Fourier analysis, etc. Among the things that Pymicra does are reading and organizing data from virtually any format, applying common quality control procedures, extracting fluctuations in a number of ways, correcting for sensor drift, automatic calculation of fluid properties (such as air and dry air density), handling of units, calculation of cross-spectra, calculation of turbulent fluxes and scales, and all other features already provided by Pandas (interpolation, statistical tests, handling of missing data, etc.). Pymicra is freely available on GitHub, and the fact that it makes heavy use of high-level programming makes adding and modifying code considerably easy for any scientific programmer, making it straightforward for other scientists to
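
    The central micrometeorological computation described here -- extracting fluctuations and forming turbulent fluxes -- reduces to Reynolds decomposition. A minimal pandas sketch of a kinematic heat flux w'T' over one averaging block (synthetic data; not the Pymicra API):

        import numpy as np
        import pandas as pd

        # Synthetic 20 Hz sonic-anemometer block: vertical wind w [m/s] and temperature T [K].
        n = 20 * 60 * 30   # one 30-minute averaging block at 20 Hz
        t = pd.date_range("2017-01-01", periods=n, freq="50ms")
        rng = np.random.default_rng(1)
        w = pd.Series(0.3 * rng.standard_normal(n), index=t)
        T = pd.Series(300.0 + 0.5 * rng.standard_normal(n) + 0.4 * w, index=t)

        # Reynolds decomposition: subtract the block mean to get the fluctuations.
        w_p, T_p = w - w.mean(), T - T.mean()
        print(f"w'T' = {(w_p * T_p).mean():.4f} K m/s")   # kinematic heat flux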

  20. Enhancing the benefits of written emotional disclosure through response training.

    Science.gov (United States)

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R; Vrana, Scott R

    2014-05-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n=113) or a neutral writing condition (n=133). Participants in each condition wrote for 20 minutes on 3 occasions and received response training (n=79), stimulus training (n=84) or no training (n=83). Heart rate and skin conductance were recorded throughout a 10-minute baseline, 20-minute writing, and a 10-minute recovery period. Self-reported emotion was assessed in each session. One month after completing the sessions, participants completed follow-up assessments of psychological and physical health outcomes. Emotional disclosure elicited greater physiological reactivity and self-reported emotion than neutral writing. Response training amplified physiological reactivity to emotional disclosure. Greater heart rate during emotional disclosure was associated with the greatest reductions in event-related distress, depression, and physical illness symptoms at follow-up, especially among response trained participants. Results support an exposure explanation of emotional disclosure effects and are the first to demonstrate that response training facilitates emotional processing and may be a beneficial adjunct to written emotional disclosure. Copyright © 2014. Published by Elsevier Ltd.

  1. Enhancing the Benefits of Written Emotional Disclosure through Response Training

    Science.gov (United States)

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R.; Vrana, Scott R.

    2014-01-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n = 113) or a neutral writing condition (n = 133). Participants in each condition wrote for 20 minutes on three occasions and received response training (n = 79), stimulus training (n = 84) or no training (n = 83). Heart rate and skin conductance were recorded throughout a 10-minute baseline, 20-minute writing, and a 10-minute recovery period. Self-reported emotion was assessed in each session. One month after completing the sessions, participants completed follow-up assessments of psychological and physical health outcomes. Emotional disclosure elicited greater physiological reactivity and self-reported emotion than neutral writing. Response training amplified physiological reactivity to emotional disclosure. Greater heart rate during emotional disclosure was associated with the greatest reductions in event-related distress, depression, and physical illness symptoms at follow-up, especially among response trained participants. Results support an exposure explanation of emotional disclosure effects and are the first to demonstrate that response training facilitates emotional processing and may be a beneficial adjunct to written emotional disclosure. PMID:24680230

  2. Open source tools for large-scale neuroscience.

    Science.gov (United States)

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  3. The emotional importance of key: do Beatles songs written in different keys convey different emotional tones?

    Science.gov (United States)

    Whissel, R; Whissel, C

    2000-12-01

    Lyrics from 155 songs written by the Lennon-McCartney team were scored using the Dictionary of Affect in Language. Resultant scores (pleasantness, activation, and imagery of words) were compared across key signatures using one-way analyses of variance. Words from songs written in minor keys were less pleasant and less active than those from songs written in major keys. Words from songs written in the key of F scored extremely low on all three measures. Lyrics from the keys of C, D, and G were relatively active in tone. Results from Dictionary scoring were compared with assignments of character to keys made more than one century ago and with current musicians' opinions.

  4. Politeness strategies in written communications: the issue of Iranian EFL learners

    Directory of Open Access Journals (Sweden)

    Karimkhanlooei Giti

    2017-09-01

    Full Text Available The approximation of the pragmatic knowledge of English language learners to that of native speakers has been a realm of concern for scholars and researchers in applied linguistics. This research was thus an endeavor to figure out the association between proficiency level and the politeness strategies and external/internal modifications used in the speech act of requests by Iranian English language learners. To this end, a written Discourse Completion Test (DCT), adapted from Rose (1994) and including 8 situations, was administered to elicit data from 120 female and male EFL learners at the Iran Language Institute, 60 upper-intermediate and 60 intermediate. The data were sorted out using Brown and Levinson's politeness strategies taxonomy (Brown and Levinson 1987) and the external/internal modifications developed by Faerch and Kasper (1989). The written request utterances provided by each participant were analyzed in terms of frequency and types of politeness strategies, namely positive, negative, bald on record, and off-record, as well as the external/internal modifications utilized in requests. The Pearson Chi-Square test results revealed a statistically significant difference between upper-intermediate and intermediate learners' types of politeness strategies and external/internal modifications.

  5. 2 CFR 182.100 - How is this part written?

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false How is this part written? 182.100 Section 182.100 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET GOVERNMENTWIDE GUIDANCE FOR GRANTS AND AGREEMENTS Reserved GOVERNMENTWIDE REQUIREMENTS FOR DRUG-FREE WORKPLACE (FINANCIAL ASSISTANCE) Purpose and...

  6. MCM generator: a Java-based tool for generating medical metadata.

    Science.gov (United States)

    Munoz, F; Hersh, W

    1998-01-01

    In a previous paper we introduced the need to implement a mechanism to facilitate the discovery of relevant Web medical documents. We maintained that the use of META tags, specifically ones that define the medical subject and resource type of a document, helps towards this goal. We have now developed a tool to facilitate the generation of these tags for the authors of medical documents. Written entirely in Java, this tool makes use of the SAPHIRE server, and helps the author identify the Medical Subject Heading terms that most appropriately describe the subject of the document. Furthermore, it allows the author to generate metadata tags for the 15 elements that the Dublin Core considers as core elements in the description of a document. This paper describes the use of this tool in the cataloguing of Web and non-Web medical documents, such as image, movie, and sound files.
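
    As a loose illustration of the output such a tool produces, the Python sketch below renders a dictionary of Dublin Core elements as HTML META tags; the element values and the MeSH heading are invented for the example, and this is not the MCM generator's own (Java) code.

        # Sketch: rendering Dublin Core elements as HTML META tags.
        from html import escape

        def dublin_core_meta(elements):
            """Render a dict of Dublin Core elements as META tags."""
            return "\n".join(
                f'<meta name="DC.{name}" content="{escape(value)}">'
                for name, value in elements.items()
            )

        print(dublin_core_meta({
            "Title": "Management of Type 2 Diabetes",
            "Subject": "Diabetes Mellitus, Type 2",   # a MeSH heading
            "Type": "Text.Clinical-Guideline",
            "Creator": "Example Author",
        }))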

  7. The "SignOn"-Model for Teaching Written Language to Deaf People

    Directory of Open Access Journals (Sweden)

    Marlene Hilzensauer

    2012-08-01

    Full Text Available This paper shows a method of teaching written language to deaf people using sign language as the language of instruction. Written texts in the target language are combined with sign language videos which provide the users with various modes of translation (words/phrases/sentences). As examples, two EU projects for English for the Deaf are presented which feature English texts and translations into the national sign languages of all the partner countries, plus signed grammar explanations and interactive exercises. Both courses are web-based; the programs may be accessed free of charge via the respective homepages (without any download or log-in).

  8. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    Science.gov (United States)

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by the addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.
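
    The core idea of extending a 2D force-directed layout to 3D can be sketched in a few lines of Python with networkx and matplotlib; this is illustrative only and is not iCAVE's code or API (iCAVE provides its own layouts, clustering and edge bundling).

        # Sketch: a 3D force-directed network layout, the idea behind 3D views.
        import networkx as nx
        import matplotlib.pyplot as plt

        G = nx.karate_club_graph()                 # stand-in for a biomolecular network
        pos = nx.spring_layout(G, dim=3, seed=42)  # 3D force-directed coordinates

        fig = plt.figure()
        ax = fig.add_subplot(projection="3d")
        xs, ys, zs = zip(*(pos[n] for n in G.nodes))
        ax.scatter(xs, ys, zs, s=20)
        for u, v in G.edges:                       # draw edges between endpoints
            ax.plot(*zip(pos[u], pos[v]), color="gray", linewidth=0.5)
        plt.show()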

  9. Rf power sources

    International Nuclear Information System (INIS)

    Allen, M.A.

    1988-05-01

    This paper covers RF power sources for accelerator applications. The approach has been with particular customers in mind. These customers are high energy physicists who use accelerators as experimental tools in the study of the nucleus of the atom, and synchrotron light sources derived from electron or positron storage rings. This paper is confined to electron-positron linear accelerators since the RF sources have always defined what is possible to achieve with these accelerators. 11 refs., 13 figs

  10. DEVELOPMENT OF COMPLEXITY, ACCURACY, AND FLUENCY IN HIGH SCHOOL STUDENTS’ WRITTEN FOREIGN LANGUAGE PRODUCTION

    Directory of Open Access Journals (Sweden)

    Bouchaib Benzehaf

    2016-11-01

    Full Text Available The present study aims to longitudinally depict the dynamic and interactive development of Complexity, Accuracy, and Fluency (CAF) in multilingual learners' L2 and L3 writing. The data sources include free writing tasks written in L2 French and L3 English by 45 high school participants over a period of four semesters. CAF dimensions are measured using a variation of Hunt's T-units (1964). Analysis of the quantitative data obtained suggests that CAF measures develop differently for learners' L2 French and L3 English: they increase more persistently in L3 English, and they display the characteristics of a dynamic, non-linear system characterized by ups and downs, particularly in L2 French. In light of the results, we suggest that more and denser longitudinal data are needed to explore the nature of the interactions between these dimensions in foreign language development, particularly at the individual level.

  11. Using bio.tools to generate and annotate workbench tool descriptions [version 1; referees: 2 approved

    Directory of Open Access Journals (Sweden)

    Kenzo-Hugo Hillion

    2017-11-01

    Full Text Available Workbench and workflow systems such as Galaxy, Taverna, Chipster, or Common Workflow Language (CWL)-based frameworks facilitate access to bioinformatics tools in a user-friendly, scalable and reproducible way. Still, the integration of tools in such environments remains a cumbersome, time-consuming and error-prone process. A major consequence is the incomplete or outdated description of tools, which are often missing important information, including parameters and metadata such as publication or links to documentation. ToolDog (Tool DescriptiOn Generator) facilitates the integration of tools - which have been registered in the ELIXIR tools registry (https://bio.tools) - into workbench environments by generating tool description templates. ToolDog includes two modules. The first module analyses the source code of the bioinformatics software with language-specific plugins and generates a skeleton for a Galaxy XML or CWL tool description. The second module is dedicated to the enrichment of the generated tool description, using metadata provided by bio.tools. This last module can also be used on its own to complete or correct existing tool descriptions with missing metadata.
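
    For readers unfamiliar with the target format, the sketch below builds the skeleton of a Galaxy tool XML description in Python; the tool id, parameters and formats are invented placeholders, not ToolDog output.

        # Sketch: the shape of a Galaxy tool description (values are placeholders).
        import xml.etree.ElementTree as ET

        tool = ET.Element("tool", id="example_tool", name="Example Tool", version="1.0")
        ET.SubElement(tool, "description").text = "Short description from bio.tools"
        ET.SubElement(tool, "command").text = "example_tool --in '$input' --out '$output'"
        inputs = ET.SubElement(tool, "inputs")
        ET.SubElement(inputs, "param", name="input", type="data", format="fasta")
        outputs = ET.SubElement(tool, "outputs")
        ET.SubElement(outputs, "data", name="output", format="tabular")
        ET.SubElement(tool, "help").text = "Documentation links pulled from bio.tools."

        ET.indent(tool)                            # Python 3.9+
        print(ET.tostring(tool, encoding="unicode"))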

  12. S&T converging trends in dealing with disaster: A review on AI tools

    International Nuclear Information System (INIS)

    Hasan, Abu Bakar; Isa, Mohd Hafez Mohd.

    2016-01-01

    Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools have been developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% use artificial intelligence (AI) as a tool. This paper also provides a review that will help young researchers decide which types of AI tools to select, thus pointing to future trends in S&T

  13. S&T converging trends in dealing with disaster: A review on AI tools

    Science.gov (United States)

    Hasan, Abu Bakar; Isa, Mohd. Hafez Mohd.

    2016-01-01

    Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools have been developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% use artificial intelligence (AI) as a tool. This paper also provides a review that will help young researchers decide which types of AI tools to select, thus pointing to future trends in S&T.

  14. S&T converging trends in dealing with disaster: A review on AI tools

    Energy Technology Data Exchange (ETDEWEB)

    Hasan, Abu Bakar, E-mail: abakarh@usim.edu.my; Isa, Mohd Hafez Mohd. [Faculty of Science and Technology, Universiti Sains Islam Malaysia, 71800 Nilai, Negeri Sembilan (Malaysia)

    2016-01-22

    Science and Technology (S&T) has been able to help mankind solve or minimize problems when they arise. Different methodologies, techniques and tools have been developed or used for specific cases by researchers, engineers and scientists throughout the world, and numerous papers and articles have been written by them. Nine selected cases, such as flash flood, earthquakes, workplace accidents, faults in the aircraft industry, seismic vulnerability, disaster mitigation and management, and early fault detection in the nuclear industry, have been studied. This paper looked at those cases, and their results showed that nearly 60% use artificial intelligence (AI) as a tool. This paper also provides a review that will help young researchers decide which types of AI tools to select, thus pointing to future trends in S&T.

  15. Survey of Non-Rigid Registration Tools in Medicine.

    Science.gov (United States)

    Keszei, András P; Berkels, Benjamin; Deserno, Thomas M

    2017-02-01

    We catalogue available software solutions for non-rigid image registration to support scientists in selecting suitable tools for specific medical registration purposes. Registration tools were identified using non-systematic search in Pubmed, Web of Science, IEEE Xplore® Digital Library, Google Scholar, and through references in identified sources (n = 22). Exclusions are due to unavailability or inappropriateness. The remaining (n = 18) tools were classified by (i) access and technology, (ii) interfaces and application, (iii) living community, (iv) supported file formats, and (v) types of registration methodologies, emphasizing the similarity measures implemented. Out of the 18 tools, (i) 12 are open source, 8 are released under a permissive free license, which imposes the least restrictions on the use and further development of the tool, and 8 provide graphical processing unit (GPU) support; (ii) 7 are built on software platforms and 5 were developed for brain image registration; (iii) 6 are under active development but only 3 have had their last update in 2015 or 2016; (iv) 16 support the Analyze format, while 7 file formats can be read with only one of the tools; and (v) 6 provide multiple registration methods and 6 provide landmark-based registration methods. Based on open source, licensing, GPU support, active community, several file formats, algorithms, and similarity measures, the tools Elastix and Plastimatch are chosen for the ITK platform and for use without platform requirements, respectively. Researchers in medical image analysis already have a large choice of registration tools freely available. However, the most recently published algorithms may not yet be included in the tools.

  16. The Scythe Statistical Library: An Open Source C++ Library for Statistical Computation

    Directory of Open Access Journals (Sweden)

    Daniel Pemstein

    2011-08-01

    Full Text Available The Scythe Statistical Library is an open source C++ library for statistical computation. It includes a suite of matrix manipulation functions, a suite of pseudo-random number generators, and a suite of numerical optimization routines. Programs written using Scythe are generally much faster than those written in commonly used interpreted languages, such as R and MATLAB, and can be compiled on any system with the GNU GCC compiler (and perhaps with other C++ compilers). One of the primary design goals of the Scythe developers has been ease of use for non-expert C++ programmers. Ease of use is provided through three primary mechanisms: (1) operator and function overloading, (2) numerous pre-fabricated utility functions, and (3) clear documentation and example programs. Additionally, Scythe is quite flexible and entirely extensible because the source code is available to all users under the GNU General Public License.

  17. 22 CFR 208.50 - How is this part written?

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false How is this part written? 208.50 Section 208.50 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT GOVERNMENTWIDE DEBARMENT AND SUSPENSION... for the general public and business community to use. The section headings and text, often in the form...

  18. Dynamic Written Corrective Feedback in Developmental Multilingual Writing Classes

    Science.gov (United States)

    Kurzer, Kendon

    2018-01-01

    This study investigated the role of dynamic written corrective feedback (DWCF; Evans, Hartshorn, McCollum, & Wolfersberger, 2010; Hartshorn & Evans, 2015; Hartshorn et al., 2010), a mode of providing specific, targeted, and individualized grammar feedback in developmental English as a second language (ESL) writing classes (pre-first year…

  19. Argumentation Schema and the Myside Bias in Written Argumentation

    Science.gov (United States)

    Wolfe, Christopher R.; Britt, M. Anne; Butler, Jodie A.

    2009-01-01

    This article describes a cognitive argumentation schema for written arguments and presents three empirical studies on the "myside" bias--the tendency to ignore or exclude evidence against one's position. Study 1 examined the consequences of conceding, rebutting, and denying other-side information. Rebuttal led to higher ratings of…

  20. Morphosyntactic correctness of written language production in adults with moderate to severe congenital hearing loss

    NARCIS (Netherlands)

    Huysmans, Elke; de Jong, Jan; Festen, Joost M.; Coene, Martine M.R.; Goverts, S. Theo

    2017-01-01

    Objective: To examine whether moderate to severe congenital hearing loss (MSCHL) leads to persistent morphosyntactic problems in the written language production of adults, as it does in their spoken language production. Design: Samples of written language in Dutch were analysed for morphosyntactic correctness.

  1. Biological data integration: wrapping data and tools.

    Science.gov (United States)

    Lacroix, Zoé

    2002-06-01

    Nowadays scientific data are inevitably digital and stored in a wide variety of formats in heterogeneous systems. Scientists need to access an integrated view of remote or local heterogeneous data sources with advanced data accessing, analysis, and visualization tools. Building a digital library for scientific data requires accessing and manipulating data extracted from flat files or databases, documents retrieved from the Web, as well as data generated by software. We present an approach to wrapping web data sources, databases, flat files, or data generated by tools through a database view mechanism. Generally, a wrapper has two tasks: it first sends a query to the source to retrieve data and, second, builds the expected output with respect to the virtual structure. Our wrappers are composed of a retrieval component, based on an intermediate object view mechanism called search views that maps the source capabilities to attributes, and an eXtensible Markup Language (XML) engine, which respectively perform these two tasks. The originality of the approach consists of: 1) a generic view mechanism to seamlessly access data sources with limited capabilities and 2) the ability to wrap data sources as well as the useful specific tools they may provide. Our approach has been developed and demonstrated as part of the multidatabase system supporting queries via uniform object protocol model (OPM) interfaces.
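
    The two-step wrapper pattern described above (retrieve from the source, then build the integrated view) can be sketched generically in Python; the class, file and field names below are invented for illustration and are not the paper's implementation.

        # Sketch: a wrapper that exposes a flat file as a source of XML records.
        import xml.etree.ElementTree as ET

        class FlatFileWrapper:
            def __init__(self, path, fields):
                self.path, self.fields = path, fields

            def retrieve(self, **criteria):
                # Task 1: query the source using its limited native capabilities.
                with open(self.path) as fh:
                    for line in fh:
                        record = dict(zip(self.fields, line.rstrip("\n").split("\t")))
                        if all(record.get(k) == v for k, v in criteria.items()):
                            yield record

            def query(self, **criteria):
                # Task 2: build the expected output in the integrated XML view.
                root = ET.Element("records")
                for record in self.retrieve(**criteria):
                    entry = ET.SubElement(root, "record")
                    for name, value in record.items():
                        ET.SubElement(entry, name).text = value
                return ET.tostring(root, encoding="unicode")

        wrapper = FlatFileWrapper("genes.tsv", ["id", "symbol", "organism"])
        print(wrapper.query(organism="human"))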

  2. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    OpenAIRE

    Evviva Weinraub Lajoie; Trey Terrell; Susan McEvoy; Eva Kaplan; Ariel Schwartz; Esther Ajambo

    2014-01-01

    In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in Rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multipl...

  3. Characterization of UV written waveguides with luminescence microscopy

    DEFF Research Database (Denmark)

    Svalgaard, Mikael; Harpøth, Anders; Rosbirk, Tue

    2005-01-01

    Luminescence microscopy is used to measure the refractive index profile and molecular defect distribution of UV written waveguides with a spatial resolution of ~0.4 mm and high signal-to-noise ratio. The measurements reveal complex waveguide formation dynamics with significant topological changes in the core profile. In addition, it is observed that the waveguide formation process requires several milliseconds of UV exposure before starting.

  4. Beam simulation tools for GEANT4 (and neutrino source applications)

    International Nuclear Information System (INIS)

    Elvira, V. Daniel; Lebrun, Paul; Spentzouris, Panagiotis (daniel@fnal.gov)

    2002-01-01

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the High Energy Physics field for simulating the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal for modeling a beam going through material or a system with a beam line integrated with a complex detector. There are many examples in the current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider

  5. The design of instructional tools affects secondary school students' learning of cardiopulmonary resuscitation (CPR) in reciprocal peer learning: a randomized controlled trial.

    Science.gov (United States)

    Iserbyt, Peter; Byra, Mark

    2013-11-01

    Research investigating design effects of instructional tools for learning Basic Life Support (BLS) is almost non-existent. The aim was to demonstrate that the design of instructional tools matters. The effect of spatial contiguity, a design principle stating that people learn more deeply when words and corresponding pictures are placed close to (i.e., integrated with) rather than far from each other on a page, was investigated on task cards for learning Cardiopulmonary Resuscitation (CPR) during reciprocal peer learning. A randomized controlled trial. A total of 111 students (mean age: 13 years) constituting six intact classes learned BLS through reciprocal learning with task cards. Task cards combine a picture of the skill with written instructions about how to perform it. In each class, students were randomly assigned to the experimental group or the control. In the control, written instructions were placed under the picture on the task cards. In the experimental group, written instructions were placed close to the corresponding part of the picture on the task cards, reflecting application of the spatial contiguity principle. One-way analysis of variance found significantly better performances in the experimental group for ventilation volumes (P=.03, ηp2=.10) and flow rates (P=.02, ηp2=.10). For chest compression depth, compression frequency, compressions with correct hand placement, and duty cycles no significant differences were found. This study shows that the design of instructional tools (i.e., task cards) affects student learning. Research-based design of learning tools can enhance BLS and CPR education. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  6. CaFE: a tool for binding affinity prediction using end-point free energy methods.

    Science.gov (United States)

    Liu, Hui; Hou, Tingjun

    2016-07-15

    Accurate prediction of binding free energy is of particular importance to computational biology and structure-based drug design. Among the methods for binding affinity prediction, the end-point approaches, such as MM/PBSA and LIE, have been widely used because they achieve a good balance between prediction accuracy and computational cost. Here we present an easy-to-use pipeline tool named Calculation of Free Energy (CaFE) to conduct MM/PBSA and LIE calculations. Powered by the VMD and NAMD programs, CaFE is able to handle numerous static coordinate and molecular dynamics trajectory file formats generated by different molecular simulation packages and supports various force field parameters. CaFE source code and documentation are freely available under the GNU General Public License via GitHub at https://github.com/huiliucode/cafe_plugin. It is a VMD plugin written in Tcl and its usage is platform-independent. tingjunhou@zju.edu.cn. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
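
    For orientation, the end-point idea behind MM/PBSA can be written in its standard textbook form (general notation, not CaFE-specific):

        ΔG_bind = ⟨G_complex⟩ − ⟨G_receptor⟩ − ⟨G_ligand⟩
        G ≈ E_MM + G_polar + G_nonpolar − T·S

    where E_MM is the gas-phase molecular mechanics energy, G_polar and G_nonpolar are the polar (PB or GB) and nonpolar (surface-area) solvation terms, −T·S is an optional entropy estimate, and the averages ⟨·⟩ run over snapshots of the end-state trajectories only, with no intermediate alchemical states required.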

  7. Probabilistic source term predictions for use with decision support systems

    International Nuclear Information System (INIS)

    Grindon, E.; Kinniburgh, C.G.

    2003-01-01

    Probabilistic Inference of Nuclear Power plant Transients (SPRINT). The SPRINT tool takes as input observations and trends in key instrument readings from the Nuclear Power Plant (NPP). It uses these observations to interrogate a database of precalculated source terms, typically compiled from existing NPP analyses performed as part of a Level 2 PSA study. The basis for this interrogation is a probabilistic logic model, or belief network, of plant behaviour developed by plant experts (using the Netica Bayesian network software as a platform); this is included in the application as a data file. The end points of this logic model are then mapped onto the database of pre-calculated source terms. This process is very rapid (taking only as long as is required to input the responses to a series of questions about the NPP status) and can overcome 'don't know' responses in the question set by resorting to prior probabilities determined by the plant experts who set up the model. The SPRINT application is the combination of the main SPRINT interface, containing the user input/output screens, and the logic model (.dne file) loaded in the Netica application. The conditional probability tables: besides the network structure itself, the key data in the model are the conditional probabilities that define the strength of influence of a parent node on a daughter node. These values are stored in the model file as network tables, one table per node, called Conditional Probability Tables (CPTs). However, as detailed documentation of the CPT values is vital during development of reactor-specific models, the Net2SS program was written to take any .dne file and output the CPT data as an Excel spreadsheet, which can be modified, annotated and loaded back into the .dne file via the SPRINT interface. The repository file: the set of parameters defining the source term data stored in the Excel repository file is determined by the requirement that SPRINT be capable of generating input files for Decision Support
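
    The belief-network step can be illustrated with a toy Bayes' rule update in Python; the states, observation and probabilities below are invented for illustration, whereas a real SPRINT model is a Netica .dne file built by plant experts.

        # Sketch: updating belief in plant states from one instrument observation.
        prior = {"core_damage": 0.3, "no_core_damage": 0.7}       # P(state)
        cpt = {                                                    # P(obs | state)
            ("high_radiation", "core_damage"): 0.9,
            ("high_radiation", "no_core_damage"): 0.05,
        }

        def posterior(observation, prior, cpt):
            joint = {s: cpt.get((observation, s), 0.0) * p for s, p in prior.items()}
            z = sum(joint.values())
            return {s: p / z for s, p in joint.items()}

        # The posterior then selects among pre-calculated source-term categories.
        print(posterior("high_radiation", prior, cpt))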

  8. A History of Oral and Written Storytelling in Nigeria

    Science.gov (United States)

    Edosomwan, Simeon; Peterson, Claudette M.

    2016-01-01

    Storytelling is a powerful process in adult education as a useful instructional approach in facilitating adult instruction and learning, especially during preliterate eras. What began as oral tradition has evolved to include written literature. A popular Eurocentric perspective in the early 19th century was that before the arrival of Europeans…

  9. Students' Written Arguments in General Chemistry Laboratory Investigations

    Science.gov (United States)

    Choi, Aeran; Hand, Brian; Greenbowe, Thomas

    2013-01-01

    This study aimed to examine the written arguments developed by college freshman students using the Science Writing Heuristic approach in inquiry-based general chemistry laboratory classrooms and its relationships with students' achievement in chemistry courses. Fourteen freshman students participated in the first year of the study while 19…

  10. Assessing written communication during interhospital transfers of emergency general surgery patients.

    Science.gov (United States)

    Harl, Felicity N R; Saucke, Megan C; Greenberg, Caprice C; Ingraham, Angela M

    2017-06-15

    Poor communication causes fragmented care. Studies of transitions of care within a hospital and on discharge suggest significant communication deficits. Communication during transfers between hospitals has not been well studied. We assessed the written communication provided during interhospital transfers of emergency general surgery patients. We hypothesized that patients are transferred with incomplete documentation from referring facilities. We performed a retrospective review of written communication provided during interhospital transfers to our emergency department (ED) from referring EDs for emergency general surgical evaluation between January 1, 2014 and January 1, 2016. Elements of written communication were abstracted from referring facility documents scanned into the medical record using a standardized abstraction protocol. Descriptive statistics summarized the information communicated. A total of 129 patients met inclusion criteria. 87.6% (n = 113) of charts contained referring hospital documents. 42.5% (n = 48) were missing history and physicals. Diagnoses were missing in 9.7% (n = 11). Ninety-one computed tomography scans were performed; among 70 with reads, final reads were absent for 70.0% (n = 49). 45 ultrasounds and x-rays were performed; among 27 with reads, final reads were missing for 80.0% (n = 36). Reasons for transfer were missing in 18.6% (n = 21). Referring hospital physicians outside the ED were consulted in 32.7% (n = 37); consultants' notes were absent in 89.2% (n = 33). In 12.4% (n = 14), referring documents arrived after the patient's ED arrival and were not part of the original documentation provided. This study documents that information important to patient care is often missing in the written communication provided during interhospital transfers. This gap affords a foundation for standardizing provider communication during interhospital transfers. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. [Jan Fryderyk Wolfgang's autobiography (1850) in the light of hand-written and printed sources].

    Science.gov (United States)

    Kuźnicka, B

    2001-01-01

    The archival collection of the Lithuanian Academy of Sciences in Vilnius (Wilno) contains many manuscripts relating to the scientific work of Jan Fryderyk Wolfgang (1776-1859), professor of pharmacy and pharmacology of the Wilno University in the years 1807-1831, the founder and main figure of the Wilno pharmacognostic school, a botanist with substantial achievements in wide-ranging research on the flora of the Wilno region, as well as a historian of pharmacy. The most interesting of the manuscripts include Wolfgang's Autobiografia [Autobiography], written in 1850, and a list of his publications covering a total of 57 items (including some that have hitherto remained unknown), a work entitled Historya Farmakologii i Farmacyi [History of pharmacology and pharmacy], and a particularly valuable manuscript (666 + 12 sheets) entitled Farmakologiia [Pharmacology]. Worth mentioning are also two catalogues of books from Wolfgang's library: one compiled by Wolfgang himself (37 sheets) and the other by Adam Ferdynand Adamowicz. The content of the autobiography manuscript is contained on five sheets. The author of the present article analyzes the document, comparing the information contained in it with the biographies of J. F. Wolfgang that have been published so far (these being primarily the biography by Dominik Cezary Chodźko, published in 1863, and that by Witold Włodzimierz Głowacki of 1960). The text of the autobiography is quoted in full, together with numerous comments. The analysis of the manuscript as well as the biographical data contained in the above-mentioned biographies indicate that Wolfgang had great achievements as a scientist (in both research and organizational work), as a champion of public causes and as an educator of a generation of botanists-pharmacognostics. It also transpires from the autobiography, as well as from the research by historians, that he was a very good and trustful person, who readily granted access to his research to his collaborators

  12. RADSHI: shielding calculation program for different geometries sources

    International Nuclear Information System (INIS)

    Gelen, A.; Alvarez, I.; Lopez, H.; Manso, M.

    1996-01-01

    A computer code written in Pascal for the IBM PC is described. The program calculates the optimum thickness of a slab shield for sources of different geometries. The Point Kernel Method is employed, which enables the ionizing radiation flux density to be obtained. The calculation takes into account the possibility of self-absorption in the source. The air kerma rate for gamma radiation is determined and, through the concept of attenuation length and the equivalent attenuation length, the shield thickness is obtained. Scattering and exponential attenuation inside the shield material are considered in the program. The shield materials can be concrete, water, iron or lead. The program also calculates shields for a point isotropic neutron source, using paraffin, concrete or water as shield materials. (authors). 13 refs.
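
    The Point Kernel Method mentioned above reduces, for an isotropic point source behind a slab, to an exponentially attenuated inverse-square law; the Python sketch below shows this textbook estimate with a simple buildup factor (the numbers are illustrative, not RADSHI's data).

        # Sketch: point-kernel flux behind a slab shield (illustrative values).
        import math

        def point_kernel_flux(S, r, mu, t, buildup=1.0):
            """Flux (1/cm^2/s) at distance r (cm) from a point source of
            strength S (photons/s) behind a slab of thickness t (cm) with
            linear attenuation coefficient mu (1/cm)."""
            return buildup * S / (4.0 * math.pi * r**2) * math.exp(-mu * t)

        S = 1e9      # source strength, photons/s
        mu = 0.8     # roughly lead at ~1 MeV, 1/cm (approximate)
        for t in (0, 2, 5, 10):
            print(t, "cm:", point_kernel_flux(S, r=100.0, mu=mu, t=t, buildup=2.0))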

  13. Reconsidering Written Language

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2015-07-01

    Full Text Available A number of elite thinkers in Europe during the 16th and 17th centuries pursued an agenda which historian Paolo Rossi calls the “quest for a universal language,” a quest which was deeply interwoven with the emergence of the scientific method. From a modern perspective, one of the many surprising aspects of these efforts is that they relied on a diverse array of memorization techniques as foundational elements. In the case of Leibniz’s universal calculus, the ultimate vision was to create a pictorial language that could be learned by anyone in a matter of weeks and which would contain within it a symbolic representation of all domains of contemporary thought, ranging from the natural sciences, to theology, to law. In this brief article, I explore why this agenda might have been appealing to thinkers of this era by examining ancient and modern memory feats. As a thought experiment, I suggest that a society built entirely upon memorization might be less limited than we might otherwise imagine, and furthermore, that cultural norms discouraging the use of written language might have had implications for the development of scientific methodology. Viewed in this light, the efforts of Leibniz and others seem significantly less surprising. I close with some general observations about cross-cultural origins of scientific thought.

  14. L2 writing and L2 written feedback in upper secondary schools as experienced by teachers

    OpenAIRE

    Manousou, Angeliki

    2015-01-01

    L2 written feedback is a multi-faceted issue, and this is the reason behind the large number of studies that have been conducted on it. However, the majority of studies deal with learners' opinions of teachers' feedback, or with particular types of feedback and their advantages and disadvantages. No studies have addressed teachers' opinions of their own L2 written feedback. This study attempts to describe how L2 teachers view their written feedback on learners' essays.

  15. ToTem: a tool for variant calling pipeline optimization.

    Science.gov (United States)

    Tom, Nikola; Tom, Ondrej; Malcikova, Jitka; Pavlova, Sarka; Kubesova, Blanka; Rausch, Tobias; Kolarik, Miroslav; Benes, Vladimir; Bystry, Vojtech; Pospisilova, Sarka

    2018-06-26

    High-throughput bioinformatics analyses of next generation sequencing (NGS) data often require challenging pipeline optimization. The key problem is choosing appropriate tools and selecting the best parameters for optimal precision and recall. Here we introduce ToTem, a tool for automated pipeline optimization. ToTem is a stand-alone web application with a comprehensive graphical user interface (GUI). ToTem is written in Java and PHP with an underlying connection to a MySQL database. Its primary role is to automatically generate, execute and benchmark different variant calling pipeline settings. Our tool allows an analysis to be started from any level of the process and offers the possibility of plugging in almost any tool or code. To prevent over-fitting of pipeline parameters, ToTem ensures their reproducibility by using cross-validation techniques that penalize the final precision, recall and F-measure. The results are interpreted as interactive graphs and tables, allowing an optimal pipeline to be selected based on the user's priorities. Using ToTem, we were able to optimize somatic variant calling from ultra-deep targeted gene sequencing (TGS) data and germline variant detection in whole genome sequencing (WGS) data. ToTem is a tool for automated pipeline optimization which is freely available as a web application at https://totem.software.
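
    The cross-validated benchmarking loop at ToTem's core can be sketched in plain Python; the stub pipeline and parameter grid below are invented for illustration and stand in for real variant callers and truth sets.

        # Sketch: choosing pipeline parameters by mean cross-validated F-measure.
        from itertools import product
        from statistics import mean

        def f_measure(tp, fp, fn):
            precision = tp / (tp + fp) if tp + fp else 0.0
            recall = tp / (tp + fn) if tp + fn else 0.0
            return (2 * precision * recall / (precision + recall)
                    if precision + recall else 0.0)

        def run_pipeline(min_qual, min_depth, fold):
            # Stand-in for running a variant caller on one fold and comparing
            # its calls against a truth set; returns (tp, fp, fn).
            tp = 90 - 2 * min_qual + fold
            fp = max(0, 40 - 3 * min_qual - min_depth)
            fn = 10 + min_depth + fold
            return tp, fp, fn

        scores = {
            params: mean(f_measure(*run_pipeline(*params, k)) for k in range(3))
            for params in product([10, 20, 30], [5, 10])   # min_qual x min_depth
        }
        best = max(scores, key=scores.get)
        print("best (min_qual, min_depth):", best, "mean F =", round(scores[best], 3))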

  16. A Descriptive Study of Registers Found in Spoken and Written Communication (A Semantic Analysis

    Directory of Open Access Journals (Sweden)

    Nurul Hidayah

    2016-07-01

    Full Text Available This research is a descriptive study of registers found in spoken and written communication. The type of this research is descriptive qualitative research. The data of the study are registers in spoken and written communication found in a book entitled "Communicating! Theory and Practice" and on the internet. The data can take the forms of words, phrases and abbreviations. With regard to the method of data collection, the writer uses the library method as her instrument, relating it to the study of registers in spoken and written communication. The data were analyzed using a descriptive method. Registers are separated into formal and informal registers, and the meaning of each register is identified.

  17. Written Corrective Feedback: The Perception of Korean EFL Learners

    Science.gov (United States)

    Chung, Bohyon

    2015-01-01

    This paper reports on the perception of Korean EFL learners toward feedback types on their written errors. The survey was administered using an adopted questionnaire from previous studies (Ishii 2011; Leki, 1991). This further allows a comparison of Korean EFL learners' attitudes with the responses to an identical questionnaire by Japanese EFL…

  18. Students' Problem Solving as Mediated by Their Cognitive Tool Use: A Study of Tool Use Patterns

    Science.gov (United States)

    Liu, M.; Horton, L. R.; Corliss, S. B.; Svinicki, M. D.; Bogard, T.; Kim, J.; Chang, M.

    2009-01-01

    The purpose of this study was to use multiple data sources, both objective and subjective, to capture students' thinking processes as they were engaged in problem solving, examine the cognitive tool use patterns, and understand what tools were used and why they were used. The findings of this study confirmed previous research and provided clear…

  19. FASTBUS simulation tools

    International Nuclear Information System (INIS)

    Dean, T.D.; Haney, M.J.

    1991-10-01

    A generalized model of a FASTBUS master is presented. The model is used with simulation tools to aid in the specification, design, and production of FASTBUS slave modules. The model provides a mechanism to interact with the electrical schematics and software models to predict performance. The model is written in the IEEE std 1076-1987 hardware description language VHDL. A model of the ATC logic is also presented. VHDL was chosen to provide portability to various platforms and simulation tools. The models, in conjunction with most commercially available simulators, will perform all of the transactions specified in IEEE std 960-1989. The models may be used to study the behavior of electrical schematics and other software models and detect violations of the FASTBUS protocol. For example, a hardware design of a slave module could be studied, protocol violations detected and corrected before committing money to prototype development. The master model accepts a stream of high level commands from an ASCII file to initiate FASTBUS transactions. The high level command language is based on the FASTBUS standard routines listed in IEEE std 1177-1989. Using this standard-based command language to direct the model of the master, hardware engineers can simulate FASTBUS transactions in the language used by physicists and programmers to operate FASTBUS systems. 15 refs., 6 figs

  20. Evaluation of a visual risk communication tool: effects on knowledge and perception of blood transfusion risk.

    Science.gov (United States)

    Lee, D H; Mehta, M D

    2003-06-01

    Effective risk communication in transfusion medicine is important for health-care consumers, but understanding the numerical magnitude of risks can be difficult. The objective of this study was to determine the effect of a visual risk communication tool on the knowledge and perception of transfusion risk. Laypeople were randomly assigned to receive transfusion risk information with either a written or a visual presentation format for communicating and comparing the probabilities of transfusion risks relative to other hazards. Knowledge of transfusion risk was ascertained with a multiple-choice quiz and risk perception was ascertained by psychometric scaling and principal components analysis. Two-hundred subjects were recruited and randomly assigned. Risk communication with both written and visual presentation formats increased knowledge of transfusion risk and decreased the perceived dread and severity of transfusion risk. Neither format changed the perceived knowledge and control of transfusion risk, nor the perceived benefit of transfusion. No differences in knowledge or risk perception outcomes were detected between the groups randomly assigned to written or visual presentation formats. Risk communication that incorporates risk comparisons in either written or visual presentation formats can improve knowledge and reduce the perception of transfusion risk in laypeople.