WorldWideScience

Sample records for source tool written

  1. Improving the use of historical written sources in paleopathology.

    Science.gov (United States)

    Mitchell, Piers D

    2017-12-01

    The texts written by the people of past societies can provide key information that enhances our understanding of disease in the past. Written sources and art can describe cultural contexts that not only help us interpret lesions in excavated human remains, but also provide evidence for past disease events themselves. However, in recent decades many biohistorical articles have been published that claim to diagnose diseases present in past celebrities or well-known individuals, often using less-than-scholarly methodology. This article aims to help researchers use historical written sources and artwork responsibly, thus improving our understanding of health and disease in the past. It explores a broad range of historical sources, from medical texts and histories to legal documents and tax records, and it highlights how the key to interpreting any past text is to understand who wrote it, when it was written, and why it was written. Case studies of plague epidemics, crucifixion, and the spinal deformity of King Richard III are then used to highlight how we might better integrate archaeological and historical evidence. When done well, integrating evidence from both archaeological and historical sources increases the probability of a complete and well-balanced understanding of disease in past societies. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Pylinguistics: an open source library for readability assessment of texts written in Portuguese

    Directory of Open Access Journals (Sweden)

    Castilhos, S.

    2016-12-01

    Full Text Available Readability assessment is an important task in automatic text simplification that aims to identify the complexity of a text by computing a set of metrics. In this paper, we present the development and assessment of an open source library called Pylinguistics for readability assessment of texts written in Portuguese. Additionally, to illustrate the possibilities of our tool, this work also presents an empirical analysis of the readability of Brazilian scientific news dissemination.
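
    As a generic illustration of the kind of metric such a library computes (this is not Pylinguistics' actual API), a readability score can be derived from average sentence length and average syllables per word. The sketch below uses the classic Flesch Reading Ease formula, with a naive vowel-group syllable counter standing in for a proper Portuguese syllabifier; adapted variants of the formula exist for Portuguese.

        # Rough readability-metric sketch (illustrative; not the Pylinguistics API).
        # Flesch Reading Ease = 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)
        import re

        def count_syllables(word: str) -> int:
            # Crude stand-in for a Portuguese syllabifier: count vowel groups.
            return max(1, len(re.findall(r"[aeiouáéíóúâêôãõà]+", word.lower())))

        def flesch(text: str) -> float:
            sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
            words = re.findall(r"\w+", text)
            syllables = sum(count_syllables(w) for w in words)
            return (206.835
                    - 1.015 * len(words) / len(sentences)
                    - 84.6 * syllables / len(words))

        print(flesch("O gato subiu no telhado. Ele dormiu ao sol."))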

  3. Integrating Technology Tools for Students Struggling with Written Language

    Science.gov (United States)

    Fedora, Pledger

    2015-01-01

    This exploratory study was designed to assess the experience of preservice teachers when integrating written language technology and their likelihood of applying that technology in their future classrooms. Results suggest that after experiencing technology integration, preservice teachers are more likely to use it in their future teaching.

  4. Eyewitness Culture and History: Primary Written Sources. The Iconoclast.

    Science.gov (United States)

    McMurtry, John

    1995-01-01

    Asserts that contemporary history and historiography are "official" history that ignores the daily struggles of people for their continued survival. Argues that, while public illiteracy has nearly disappeared, individuals are ignorant of the wealth of primary-source materials of other cultures' histories. (CFR)

  5. Mushu, a free- and open source BCI signal acquisition, written in Python.

    Science.gov (United States)

    Venthur, Bastian; Blankertz, Benjamin

    2012-01-01

    The following paper describes Mushu, a signal acquisition software for retrieval and online streaming of electroencephalography (EEG) data. It is written for, but not limited to, the needs of Brain-Computer Interfacing (BCI). Its main goal is to provide a unified interface to EEG data regardless of the amplifier used. It runs under all major operating systems, like Windows, Mac OS and Linux, is written in Python, and is free and open source software licensed under the terms of the GNU General Public License.
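
    The "unified interface regardless of amplifier" idea can be sketched abstractly. The following is a hypothetical illustration of an amplifier-agnostic acquisition API (all names invented; this is not Mushu's actual interface):

        # Hypothetical sketch of an amplifier-agnostic EEG acquisition interface,
        # in the spirit described above (not Mushu's real API).
        from abc import ABC, abstractmethod

        class Amplifier(ABC):
            """Common interface; each vendor-specific driver implements it."""

            @abstractmethod
            def start(self) -> None: ...

            @abstractmethod
            def get_data(self) -> list[list[float]]:
                """Return samples acquired since the last call."""

            @abstractmethod
            def stop(self) -> None: ...

        class ReplayAmplifier(Amplifier):
            """Toy driver that replays pre-recorded data, useful for testing."""
            def __init__(self, recording):
                self._recording = list(recording)
            def start(self):
                pass
            def get_data(self):
                return [self._recording.pop(0)] if self._recording else []
            def stop(self):
                pass

        # BCI code depends only on the Amplifier interface, not on the hardware:
        amp = ReplayAmplifier([[0.1, 0.2], [0.3, 0.4]])
        amp.start()
        while (block := amp.get_data()):
            print(block)  # feed into online processing
        amp.stop()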

  6. The students’ use of written and internet sources and electronic media for assessment in slovene

    Directory of Open Access Journals (Sweden)

    Petra Hromin

    2015-06-01

    Full Text Available The article presents the frequency with which secondary school students use written and online sources as well as electronic media when preparing for in-class examinations in Slovene language and literature. Within the scope of the above-mentioned aspects we controlled for the age of students and the type of secondary school programme. In the first part of the article the concepts of information and communication technology/multimedia, e-learning and student activity are defined. In the second half of the article we present the results of the research, which show the frequency of use of written and web sources as well as electronic media. These results show that for oral examinations of knowledge of grammar and literature the use of the notebook prevails, while for written examinations the use of the course book predominates. The frequency of use of World Wide Web sources and electronic media increases with age and with the level of difficulty of the education programme. Thus the use of the notebook is most prevalent in vocational schools, whereas the use of the course book predominates in technical gimnazija and general gimnazija programmes.

  7. Reasons for the fall: Written sources and Material evidence for the collapse of Great Moravia

    Directory of Open Access Journals (Sweden)

    Maddalena Betti

    2016-09-01

    Full Text Available This paper re-examines the causes of the fall of Great Moravia, traditionally associated with the expansion of the Magyars into the Danube basin between the end of the ninth and the beginning of the tenth century. It first analyses the written sources, and in particular the Annals of Fulda, which it is argued describe the gradual marginalisation of the polity’s political influence and agency in the region. Second, on the basis of archaeological evidence, the paper attempts to demonstrate that Moravia’s political crisis was closely tied to its fragile socio-economic foundations.

  8. [Written personalized action plan for atopic dermatitis: a patient education tool].

    Science.gov (United States)

    Gabeff, R; Assathiany, R; Barbarot, S; Salinier, C; Stalder, J-F

    2014-07-01

    Atopic dermatitis (AD) is the most frequent children's chronic skin disease. Management of AD can be difficult because local treatments must be adapted to the skin's condition. Between consultations, sudden changes in the state of the disease can make it difficult to manage local treatment. Parents and children need information that will help them adapt their treatment to the course of their disease. Aiming to enable parents to better treat their atopic child by themselves, we have developed a personalized action plan in order to simplify, personalize, and adapt the medical prescription to the state of the disease. The Personalized Written Action Plan for Atopics (PA2P) is based on the model used in the treatment of asthma, with integrated specificities for AD in children. The aim of this study was to assess the feasibility and pertinence of the PA2P for pediatricians to use in private practice. A total of 479 pediatricians answered a questionnaire sent by e-mail. The vast majority of the respondents gave positive reviews of the tool: 99% of the pediatricians declared the tool to be pertinent, qualifying it as clear and logical. The PA2P appeared to be appropriate for the atopic patient because it improves the families' involvement in the application of local treatment by offering personalized care and by simplifying the doctor's prescription. Finally, 72% of doctors responding to the questionnaire were willing to take part in future studies involving parents. More than a gadget, the PA2P could become a useful tool for therapeutic patient education. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  9. Intrusion Detection using Open Source Tools

    OpenAIRE

    Jack TIMOFTE

    2008-01-01

    We have witnessed in the recent years that open source tools have gained popularity among all types of users, from individuals or small businesses to large organizations and enterprises. In this paper we will present three open source IDS tools: OSSEC, Prelude and SNORT.

  10. Deserts and holy mountains of medieval Serbia: Written sources, spatial patterns, architectural designs

    Directory of Open Access Journals (Sweden)

    Popović Danica

    2007-01-01

    Full Text Available Essential concepts in Christian thought and practice, the desert and the holy mountain denote a particular kind of monastic and sacral space. They are secluded from the world, intended for asceticism, and ambivalent in nature: they are inhospitable and menacing zones populated with demons, but also a monastic paradise, places for spiritual conversion and encounter with the divine. From earliest times, deserts and holy mountains had a few distinguishing characteristics. All forms of monastic life, from communal to solitary, were practiced side by side there. Monks of a special make-up and distinction, known as holy men, who were often also founders of illustrious communities, future saints and miracle-workers, acted there. Furthermore, these locales were important spiritual and book-making centres, and therefore strongholds of Orthodoxy. When trying to research the Serbian material on this topic, we face a specific situation: few surviving sources on the one hand, and devastated monuments on the other. The ultimate consequence is that the entire subject has been neglected. Therefore the study of the Serbian deserts and holy mountains requires a very complex interdisciplinary approach with systematic fieldwork as its essential part. It should address the following issues: corroboration, on the basis of written sources, of the reception of the concept of the monastic desert and holy mountain in a particular, regional, context; the distinct means and mechanisms employed in their physical realization; interpretation of their function; and the recognition of patterns preserved in the surviving physical structures. Even the results obtained so far appear to be relevant enough to be included in the sacral topography of the Christian world. The author of this study gives particular attention to the detailed analysis of written sources of various genres - diplomatic sources, hagiographic material, liturgical texts, observation notes - in order to establish the

  11. Introducing Product Lines through Open Source Tools

    OpenAIRE

    Haugen, Øystein

    2008-01-01

    We present an approach to introducing product lines to companies that lowers their initial risk by applying open source tools and a smooth learning curve to the use and creation of domain-specific modeling combined with standardized variability modeling.

  12. A study of potential sources of linguistic ambiguity in written work instructions.

    Energy Technology Data Exchange (ETDEWEB)

    Matzen, Laura E.

    2009-11-01

    This report describes the results of a small experimental study that investigated potential sources of ambiguity in written work instructions (WIs). The English language can be highly ambiguous because words with different meanings can share the same spelling. Previous studies in the nuclear weapons complex have shown that ambiguous WIs can lead to human error, which is a major cause for concern. To study possible sources of ambiguity in WIs, we determined which of the recommended action verbs in the DOE and BWXT writer's manuals have numerous meanings to their intended audience, making them potentially ambiguous. We used cognitive psychology techniques to conduct a survey in which technicians who use WIs in their jobs indicated the first meaning that came to mind for each of the words. Although the findings of this study are limited by the small number of respondents, we identified words that had many different meanings even within this limited sample. WI writers should pay particular attention to these words and to their most frequent meanings so that they can avoid ambiguity in their writing.

  13. Students' Engagement with a Collaborative Wiki Tool Predicts Enhanced Written Exam Performance

    Science.gov (United States)

    Stafford, Tom; Elgueta, Herman; Cameron, Harriet

    2014-01-01

    We introduced voluntary wiki-based exercises to a long-running cognitive psychology course, part of the core curriculum for an undergraduate degree in psychology. Over 2 yearly cohorts, students who used the wiki more also scored higher on the final written exam. Using regression analysis, it is possible to account for students' tendency to score…

  14. System level modelling with open source tools

    DEFF Research Database (Denmark)

    Jakobsen, Mikkel Koefoed; Madsen, Jan; Niaki, Seyed Hosein Attarzadeh

    , called ForSyDe. ForSyDe is available under the open source approach, which allows small and medium enterprises (SMEs) to get easy access to advanced modeling capabilities and tools. We give an introduction to the design methodology through the system level modeling of a simple industrial use case, and we

  15. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical to remove both bottlenecks in throughput as well as fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  16. Students’ engagement with a collaborative wiki tool predicts enhanced written exam performance

    Directory of Open Access Journals (Sweden)

    Tom Stafford

    2014-08-01

    Full Text Available We introduced voluntary wiki-based exercises to a long-running cognitive psychology course, part of the core curriculum for an undergraduate degree in psychology. Over 2 yearly cohorts, students who used the wiki more also scored higher on the final written exam. Using regression analysis, it is possible to account for students’ tendency to score well on other psychology exams, thus statistically removing some obvious candidate third factors, such as general talent or enthusiasm for psychology, which might drive this correlation. Such an analysis shows that both high- and low-grading students who used the wiki got higher scores on the final exam, with engaged wiki users scoring an average of an extra 5 percentage points. We offer an interpretation of the mechanisms of action in terms of the psychological literature on learning and memory.
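
    The analysis described above can be reproduced in outline with standard tools. A minimal sketch, using invented toy data and hypothetical column names (wiki_edits, other_exam_mean, final_exam; not the authors' actual dataset):

        import pandas as pd
        import statsmodels.formula.api as smf

        # Toy data standing in for the two cohorts (values invented for illustration).
        df = pd.DataFrame({
            "wiki_edits":      [0, 2, 5, 9, 1, 7, 3, 12, 0, 6],
            "other_exam_mean": [55, 60, 62, 70, 58, 68, 61, 74, 52, 66],
            "final_exam":      [54, 63, 66, 78, 57, 74, 64, 83, 50, 71],
        })

        # Regress final-exam score on wiki engagement while controlling for
        # performance on other exams (a proxy for general ability/enthusiasm,
        # as in the analysis described above).
        model = smf.ols("final_exam ~ wiki_edits + other_exam_mean", data=df).fit()
        print(model.params)  # the wiki_edits coefficient estimates the extra marks
                             # per unit of engagement, holding ability constant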

  17. Plasma sources for EUV lithography exposure tools

    International Nuclear Information System (INIS)

    Banine, Vadim; Moors, Roel

    2004-01-01

    The source is an integral part of an extreme ultraviolet lithography (EUVL) tool. Such a source, as well as the EUVL tool itself, has to fulfil extremely high demands, both technical and cost-oriented. The EUVL tool operates at a wavelength in the range 13-14 nm, which requires a major re-thinking of state-of-the-art lithography systems operating in the DUV range. The light production mechanism changes from conventional lamps and lasers to relatively high temperature emitting plasmas. The light transport, mainly refractive for DUV, must become reflective for EUV. The source specifications are derived from the customer requirements for the complete tool, which are: throughput, cost of ownership (CoO) and imaging quality. The EUVL system is considered a follow-up of existing DUV-based lithography technology and, while improving the feature resolution, it has to maintain high wafer throughput performance, which is driven by the overall CoO picture. This in turn puts quite high requirements on the collectable in-band power produced by an EUV source. Critical dimension (CD) control requirements, increased due to improved feature resolution, together with reflective optics restrictions, necessitate pulse-to-pulse repeatability, spatial stability control and repetition rates that are substantially better than those of current optical systems. Altogether, the following aspects of the source specification will be addressed: the operating wavelength, the EUV power, the hot spot size, the collectable angle, the repetition rate, the pulse-to-pulse repeatability and the debris-induced lifetime of components

  18. Open-source tools for data mining.

    Science.gov (United States)

    Zupan, Blaz; Demsar, Janez

    2008-03-01

    With a growing volume of biomedical databases and repositories, the need to develop a set of tools to address their analysis and support knowledge discovery is becoming acute. The data mining community has developed a substantial set of techniques for computational treatment of these data. In this article, we discuss the evolution of open-source toolboxes that data mining researchers and enthusiasts have developed over the span of a few decades and review several currently available open-source data mining suites. The approaches we review are diverse in data mining methods and user interfaces and also demonstrate that the field and its tools are ready to be fully exploited in biomedical research.

  19. A dynamic regression analysis tool for quantitative assessment of bacterial growth written in Python.

    Science.gov (United States)

    Hoeflinger, Jennifer L; Hoeflinger, Daniel E; Miller, Michael J

    2017-01-01

    Herein, an open-source method to generate quantitative bacterial growth data from high-throughput microplate assays is described. The bacterial lag time, maximum specific growth rate, doubling time and delta OD are reported. Our method was validated by carbohydrate utilization of lactobacilli, and visual inspection revealed 94% of regressions were deemed excellent. Copyright © 2016 Elsevier B.V. All rights reserved.
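
    As a rough illustration of the kind of regression such a tool performs (not the authors' code; the logistic model and data below are assumptions), one can fit a growth curve to blank-corrected optical-density readings and derive the doubling time and a tangent-based lag estimate from the fit:

        # Sketch: fit a logistic growth curve OD(t) = A / (1 + exp(-mu*(t - tm)))
        # to microplate optical-density data. Illustrative only; the slope
        # parameter mu is used here as a stand-in for the maximum specific
        # growth rate, and the data are invented.
        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, A, mu, tm):
            # A: delta OD (asymptotic rise), mu: slope parameter (1/h),
            # tm: time of the inflection point (h)
            return A / (1.0 + np.exp(-mu * (t - tm)))

        t = np.array([0, 1, 2, 3, 4, 5, 6, 8, 10, 12.0])           # hours
        od = np.array([0.05, 0.06, 0.09, 0.15, 0.27, 0.45,
                       0.66, 0.92, 1.01, 1.04])                    # blank-corrected OD

        (A, mu, tm), _ = curve_fit(logistic, t, od, p0=[1.0, 1.0, 4.0])
        doubling_time = np.log(2) / mu   # hours per doubling
        lag_time = tm - 2.0 / mu         # tangent-at-inflection lag estimate
        print(f"mu={mu:.2f}/h, Td={doubling_time:.2f} h, deltaOD={A:.2f}, "
              f"lag={lag_time:.2f} h")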

  20. Sourcing While Reading Divergent Expert Accounts: Pathways from Views of Knowing to Written Argumentation

    Science.gov (United States)

    Barzilai, Sarit; Tzadok, Eynav; Eshet-Alkalai, Yoram

    2015-01-01

    Sourcing is vital for knowledge construction from online information sources, yet learners may find it difficult to engage in effective sourcing. Sourcing can be particularly challenging when lay readers encounter conflicting expert accounts of controversial topics, a situation which is increasingly common when learning online. The aim of this…

  1. EUV sources for the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Damen, Marcel; Derra, Günther; Franken, Oliver; Janssen, Maurice; Jonkers, Jeroen; Klein, Jürgen; Kraus, Helmar; Krücken, Thomas; List, Andreas; Loeken, Micheal; Mader, Arnaud; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prümmer, Ralph; Rosier, Oliver; Schwabe, Stefan; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2006-03-01

    In this paper, we report on the recent progress of the Philips extreme UV source. The Philips source concept is based on a discharge plasma ignited in a Sn vapor plume that is ablated by a laser pulse. Using rotating electrodes covered with a regenerating tin surface, the problems of electrode erosion and power scaling are fundamentally solved. Most of the work of the past year has been dedicated to developing a lamp system that operates very reliably and stably under full scanner remote control. Topics addressed were the development of the scanner interface, a dose control system, thermo-mechanical design, positional stability of the source, tin handling, and many more. The resulting EUV source, the Philips NovaTin(R) source, can operate at more than 10 kW electrical input power and delivers 200 W in-band EUV into 2π continuously. The source is very small, so nearly 100% of the EUV radiation can be collected within etendue limits. The lamp system is fully automated and can operate unattended under full scanner remote control. 500 million shots of continuous operation without interruption have been realized; electrode lifetime is at least 2 billion shots. Three sources are currently being prepared, two of which will be integrated into the first EUV Alpha Demonstration tools of ASML. The debris problem was reduced to a level that is well acceptable for scanner operation. First, a considerable reduction of the Sn emission of the source was realized. The debris mitigation system is based on a two-step concept using a foil-trap-based stage and a chemical cleaning stage. Both steps were improved considerably. A collector lifetime of 1 billion shots is achieved, after which a cleaning would be applied. The cleaning step has been verified to work with tolerable Sn residues. From the experimental results, a total collector lifetime of more than 10 billion shots can be expected.

  2. [Jan Fryderyk Wolfgang's autobiography (1850) in the light of hand-written and printed sources].

    Science.gov (United States)

    Kuźnicka, B

    2001-01-01

    The archival collection of the Lithuanian Academy of Sciences in Vilnius (Wilno) contains many manuscripts relating to the scientific work of Jan Fryderyk Wolfgang (1776-1859), professor of pharmacy and pharmacology at the Wilno University in the years 1807-1831, the founder and main figure of the Wilno pharmacognostic school, a botanist with substantial achievements in wide-ranging research on the flora of the Wilno region, as well as a historian of pharmacy. The most interesting of the manuscripts include Wolfgang's Autobiografia [Autobiography], written in 1850, and a list of his publications covering a total of 57 items (including some that have hitherto remained unknown), a work entitled Historya Farmakologii i Farmacyi [History of pharmacology and pharmacy], and a particularly valuable manuscript (666 + 12 sheets) entitled Farmakologiia [Pharmacology]. Worth mentioning are also two catalogues of books from Wolfgang's library: one compiled by Wolfgang himself (37 sheets) and the other by Adam Ferdynand Adamowicz. The content of the autobiography manuscript is contained on five sheets. The author of the present article analyzes the document, comparing the information contained in it with the biographies of J. F. Wolfgang that have been published so far (these being primarily the biography by Dominik Cezary Chodźko, published in 1863, and that by Witold Włodzimierz Głowacki of 1960). The text of the autobiography is quoted in full, together with numerous comments. The analysis of the manuscript as well as the biographical data contained in the above-mentioned biographies indicate that Wolfgang had great achievements as a scientist (in both research and organizational work), as a champion of public causes and as an educator of a generation of botanists-pharmacognostics. It also transpires from the autobiography, as well as from the research by historians, that he was a very good and trusting person, who readily granted his collaborators access to his research

  3. An Open Source Tool to Test Interoperability

    Science.gov (United States)

    Bermudez, L. E.

    2012-12-01

    Scientists interact with information at various levels, from gathering of the raw observed data to accessing portrayed, processed, quality-controlled data. Geoinformatics tools help scientists with the acquisition, storage, processing, dissemination and presentation of geospatial information. Most of the interactions occur in a distributed environment between software components that take the role of either client or server. The communication between components includes protocols, encodings of messages and managing of errors. Testing of these communication components is important to guarantee proper implementation of standards. The communication between clients and servers can be ad hoc or follow standards. By following standards, interoperability between components increases while the time needed to develop new software decreases. The Open Geospatial Consortium (OGC) not only coordinates the development of standards but also, within the Compliance Testing Program (CITE), provides a testing infrastructure to test clients and servers. The OGC Web-based Test Engine Facility, based on TEAM Engine, allows developers to test Web services and clients for correct implementation of OGC standards. TEAM Engine is a Java open source facility, available at SourceForge, that can be run via the command line, deployed in a web servlet container or integrated in a developer's environment via Maven. TEAM Engine uses the Compliance Test Language (CTL) and TestNG to test HTTP requests, SOAP services and XML instances against schemas and Schematron-based assertions of any type of web service, not only OGC services. For example, the OGC Web Feature Service (WFS) 1.0.0 test has more than 400 test assertions. Some of these assertions include conformance of HTTP responses, conformance of GML-encoded data, proper values for elements and attributes in the XML, and correct error responses. This presentation will provide an overview of TEAM Engine, an introduction of how to test via the OGC Testing web site and

  4. The processes of social complexity in the Northwest of the Iberian Peninsula: archaeology and written sources

    Directory of Open Access Journals (Sweden)

    Sastre, Inés

    2004-12-01

    Full Text Available The general aim of this paper is the analysis of the forms of social inequality in Northwestern “castro” communities, taking into account both the regional diversity and the Roman influence before the conquest. Two models are proposed: one segmentary and the other of large castro settlements, in order to define and interpret the diversity of social relations and the processes of change. This allows us to overcome the traditional homogenising points of view and to put forward an integral analysis from both the archaeological record and the ancient literary sources.

    The objective of this paper is to analyse the forms of social inequality in the castro societies of the Northwest, taking into account regional diversity as well as the role played by Roman influence prior to the conquest in the development of these communities. Two interpretative models are used: that of segmentary castros and that of large castro settlements. This makes it possible to affirm that social relations in the pre-Roman Northwest were articulated in diverse forms that cannot be reduced to the uniform, homogenising models traditionally applied. On this basis, the information from the literary sources can be integrated into the study in a manner convergent with the archaeological analysis.

  5. Megaliths as land-marks. Chronicle of the territorial role of the megalithic monuments through written sources

    Directory of Open Access Journals (Sweden)

    Martinón-Torres, Marcos

    2001-06-01

    Full Text Available Megalithic monuments have played different roles throughout history. One of them is a spatial function, i.e. as landmarks. The aim of this paper has been to collect and analyse every written reference concerning Galician megaliths operating as landmarks between the 6th and 19th centuries AD. On this basis, the evolution of this social-territorial function of the monuments through time is reconstructed, and an interpretative hypothesis for this phenomenon is proposed. Finally, the importance of reviewing written sources as a methodology for archaeological survey and for studies of the topographic settings of monuments is emphasised.

    Throughout history, megalithic monuments have performed, among other functions, a spatial one, as territorial markers. For this article, the written references to Galician megaliths functioning as spatial markers or identifiers between the 6th and 19th centuries AD are collected and analysed. From this record of sources, the evolution of this social-territorial role of the monuments in different periods is reconstructed. An interpretative model for this phenomenon is proposed, and the review of written sources is assessed as a methodology for archaeological survey and for studies of the siting of megaliths.

  6. The Exercise: An Exercise Generator Tool for the SOURCe Project

    Science.gov (United States)

    Kakoyianni-Doa, Fryni; Tziafa, Eleni; Naskos, Athanasios

    2016-01-01

    The Exercise, an Exercise generator in the SOURCe project, is a tool that complements the properties and functionalities of the SOURCe project, which includes the search engine for the Searchable Online French-Greek parallel corpus for the UniveRsity of Cyprus (SOURCe) (Kakoyianni-Doa & Tziafa, 2013), the PENCIL (an alignment tool)…

  7. Serbian Written Sources on the Tatars and the Golden Horde (first half of the 14th century

    Directory of Open Access Journals (Sweden)

    Aleksandar Uzelac

    2014-01-01

    Full Text Available Serbian narrative and documentary texts written in the first half of the XIV century represent valuable source material for the research of Tatar political and military influence in the Balkan lands. Most important among them are the Vita of King Stephen Uroš II Milutin (1282–1321), extant in three different editions, and the Vita of Archbishop Daniel II (1324–1337). The first offers insight into the relations between the Kingdom of Serbia and the powerful Juchid prince Nogai, while in the latter, the key role of Tatar contingents in the internal power struggle between King Milutin and his brother Stephen Dragutin is mentioned. The presence of Tatars in the Battle of Velbazhd (1330), fought between Serbia and the Bulgarian Empire, is also attested in various sources, including the so-called Old Serbian chronicles and the Code of Law of Emperor Stephen Dušan (1349). Another group of sources analyzed in the text are several apocryphal writings of South Slavic literature. Their value lies in the fact that they reflect the image of the Tatars in the eyes of the Balkan Slavs. Last, but not least important, testimony of Tatar activities in Serbian lands is preserved in place-names of Tatar origin, recorded in royal charters issued by Milutin's son Stephen (1321–1331) and grandson Stephen Dušan (1331–1355).

  8. A survey of open source tools for business intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2005-01-01

    The industrial use of open source Business Intelligence (BI) tools is not yet common. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider three Extract-Transform-Load (ETL) tools, three On-Line Analytical Processing (OLAP) servers, two OLAP clients, and four database management systems (DBMSs). Further, we describe the licenses that the products are released under. It is argued that the ETL tools are still not very

  9. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    2009-01-01

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we co...

  10. Open Source for Knowledge and Learning Management: Strategies beyond Tools

    Science.gov (United States)

    Lytras, Miltiadis, Ed.; Naeve, Ambjorn, Ed.

    2007-01-01

    In the last years, knowledge and learning management have made a significant impact on the IT research community. "Open Source for Knowledge and Learning Management: Strategies Beyond Tools" presents learning and knowledge management from a point of view where the basic tools and applications are provided by open source technologies.…

  11. Open Source Approach to Project Management Tools

    Directory of Open Access Journals (Sweden)

    Romeo MARGEA

    2011-01-01

    Full Text Available Managing large projects involving different groups of people and complex tasks can be challenging. The solution is to use project management software, which allows a more efficient management of projects. However, well-known project management systems can be costly and may require expensive custom servers. Even if free software is not as complex as Microsoft Project, it is worth noting that not all projects need all the features, amenities and power of such systems. There are free and open source software alternatives that meet the needs of most projects, and that allow Web access based on different platforms and locations. A starting stage in adopting OSS in-house is finding and identifying an existing open source solution. In this paper we present an overview of Open Source Project Management Software (OSPMS) based on articles, reviews, books and developers' web sites, covering those that seem to be the most popular software in this category.

  12. Tracking PACS usage with open source tools.

    Science.gov (United States)

    French, Todd L; Langer, Steve G

    2011-08-01

    A typical choice faced by Picture Archiving and Communication System (PACS) administrators is deciding how many PACS workstations are needed and where they should be sited. Oftentimes, the social consequences of having too few are severe enough to encourage oversupply and underutilization. This is costly, at best in terms of hardware and electricity, and at worst (depending on the PACS licensing and support model) in capital costs and maintenance fees. The PACS administrator needs tools to assess accurately the use to which her fleet is being subjected, and thus make informed choices before buying more workstations. Lacking a vended solution for this challenge, we developed our own.

  13. Building Eclectic Personal Learning Landscapes with Open Source Tools

    NARCIS (Netherlands)

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005,

  14. Building Eclectic Personal Learning Landscapes with Open Source Tools

    OpenAIRE

    Kalz, Marco

    2008-01-01

    Kalz, M. (2005). Building Eclectic Personal Learning Landscapes with Open Source Tools. In F. de Vries, G. Attwell, R. Elferink & A. Tödt (Eds.), Open Source for Education in Europe. Research & Practice (= Proceedings of the Open Source for Education in Europe Conference) (pp. 163-168). 2005, Heerlen, The Netherlands.

  15. An open-source optimization tool for solar home systems: A case study in Namibia

    International Nuclear Information System (INIS)

    Campana, Pietro Elia; Holmberg, Aksel; Pettersson, Oscar; Klintenberg, Patrik; Hangula, Abraham; Araoz, Fabian Benavente; Zhang, Yang; Stridh, Bengt; Yan, Jinyue

    2016-01-01

    Highlights: • An open-source optimization tool for solar home systems (SHSs) design is developed. • The optimization tool is written in MS Excel-VBA. • The optimization tool is validated with a commercial and an open-source software. • The optimization tool has the potential of improving future SHS installations. - Abstract: Solar home systems (SHSs) represent a viable technical solution for providing electricity to households and improving standards of living in areas not reached by the national grid or local grids. For this reason, several rural electrification programmes in developing countries, including Namibia, have been relying on SHSs to electrify rural off-grid communities. However, the limited technical know-how of service providers, often resulting in over- or under-sized SHSs, is an issue that has to be solved to avoid dissatisfaction of SHSs' users. The solution presented here is to develop an open-source software that service providers can use to optimally design SHS components based on the specific electricity requirements of the end-user. The aim of this study is to develop and validate an optimization model written in MS Excel-VBA which calculates the optimal capacities of SHS components, guaranteeing minimum costs and maximum system reliability. The results obtained with the developed tool showed good agreement with a commercial software package and a computational code used in research activities. When applying the developed optimization tool to existing systems, the results identified several components that were incorrectly sized. The tool thus has the potential to improve future SHS installations, contributing to increased satisfaction of end-users.
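
    The optimization described (minimum cost subject to a reliability requirement) can be sketched generically. The following is a minimal illustration, not the authors' Excel-VBA model: it enumerates candidate PV/battery sizes against a toy hourly profile and keeps the cheapest pair whose loss-of-power-supply probability (LPSP) stays within bounds. All profiles, prices and the 5% threshold are assumptions.

        # Generic SHS sizing sketch by enumeration (illustrative assumptions only).
        import numpy as np

        hours = 24 * 365
        # Toy solar profile in kW per kWp (daily sine, clipped at night).
        solar = np.clip(np.sin(np.linspace(0, 2 * np.pi * 365, hours)), 0, None)
        load = np.full(hours, 0.05)  # constant 50 W household load (assumed)

        def lpsp(pv_kwp, batt_kwh, eff=0.9):
            soc, unmet = batt_kwh, 0.0
            for s, l in zip(solar, load):
                net = pv_kwp * s - l                      # hourly energy balance (kWh)
                if net >= 0:
                    soc = min(batt_kwh, soc + net * eff)  # charge with losses
                else:
                    draw = min(soc, -net)
                    soc -= draw
                    unmet += (-net - draw)                # energy not served
            return unmet / load.sum()

        candidates = [(pv, b) for pv in (0.1, 0.2, 0.3, 0.4) for b in (0.5, 1.0, 2.0)]
        cost = {c: 800 * c[0] + 150 * c[1] for c in candidates}  # assumed $/kWp, $/kWh
        feasible = [c for c in candidates if lpsp(*c) <= 0.05]
        print(min(feasible, key=cost.get))  # cheapest reliable (PV kWp, battery kWh)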

  16. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

    Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools as early as 2004. Operation of these tools in a research environment gave early learning for the development of EUV sources for Alpha- and Beta-Tools. Further experiments with these sources were performed for basic understanding of EUV source technology and its limits, especially the achievable power and reliability. The intermediate focus power of the Alpha-Tool sources under development is measured at values above 10 W. Debris mitigation schemes were successfully integrated into the sources, leading to reasonable collector mirror lifetimes, with a target of 10 billion pulses, due to the effective debris flux reduction. Source collector mirrors which withstand the radiation and temperature load of Xenon-fueled sources have been developed in cooperation with MediaLario Technologies to support intermediate focus power well above 10 W. To fulfill the requirements for high-volume chip manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser-excited droplet targets. The source concept is called the laser assisted droplet RDE source. The fuel of these sources has been selected to be tin. The conversion efficiency achieved with the laser assisted droplet RDE source is 2-3x higher compared to Xenon. Very high pulse energies well above 200 mJ / 2π sr have been measured with first prototypes of the laser assisted droplet RDE source. If it is possible to maintain these high pulse energies at higher repetition rates, a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data, the new concept is expected to be scalable to an intermediate focus power at the 300 W level.

  17. Open source Modeling and optimization tools for Planning

    Energy Technology Data Exchange (ETDEWEB)

    Peles, S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-10

    The existing tools and software used for planning and analysis in California are either expensive, difficult to use, or not generally accessible to a large number of participants. These limitations restrict the availability of participants for larger scale energy and grid studies in the state. The proposed initiative would build upon federal and state investments in open source software, and create and improve open source tools for use in the state planning and analysis activities. Computational analysis and simulation frameworks in development at national labs and universities can be brought forward to complement existing tools. An open source platform would provide a path for novel techniques and strategies to be brought into the larger community and reviewed by a broad set of stakeholders.

  18. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    Science.gov (United States)

    Saito, Jim

    1987-01-01

    The user's guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, the document contains the current descriptions of the AED V&V tools and provides information that augments NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  19. Open Source and Proprietary Project Management Tools for SMEs.

    Directory of Open Access Journals (Sweden)

    Veronika Abramova

    2017-05-01

    Full Text Available The dimensional growth and increasing difficulty of project management promoted the development of different tools that serve to facilitate project management and to track project schedule, resources and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large number of project management software packages are available to assist project teams during the entire project lifecycle. We present the main differences between open source and proprietary project management tools and how those could be important for SMEs, describing the key features and how those can assist the project manager and the development team. In this paper, we analyse four open-source project management tools: OpenProject, ProjectLibre, Redmine and LibrePlan, and four proprietary tools: Bitrix24, JIRA, Microsoft Project and Asana.

  20. Open source tools for ATR development and performance evaluation

    Science.gov (United States)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools; should I buy off-the-shelf tools or should I develop my own. Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to maintain licenses, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  1. Determining the sources of fine-grained sediment using the Sediment Source Assessment Tool (Sed_SAT)

    Science.gov (United States)

    Gorman Sanisaca, Lillian E.; Gellis, Allen C.; Lorenz, David L.

    2017-07-27

    A sound understanding of the sources contributing to instream sediment flux in a watershed is important when developing total maximum daily load (TMDL) management strategies designed to reduce suspended sediment in streams. Sediment fingerprinting and sediment budget approaches are two techniques that, when used jointly, can qualify and quantify the major sources of sediment in a given watershed. The sediment fingerprinting approach uses trace element concentrations from samples in known potential source areas to determine a clear signature of each potential source. A mixing model is then used to determine the relative source contribution to the target suspended sediment samples. The computational steps required to apportion sediment for each target sample are quite involved and time intensive, a problem the Sediment Source Assessment Tool (Sed_SAT) addresses. Sed_SAT is a user-friendly statistical model that guides the user through the necessary steps in order to quantify the relative contributions of sediment sources in a given watershed. The model is written using the statistical software R (R Core Team, 2016b) and utilizes Microsoft Access® as a user interface, but requires no prior knowledge of R or Microsoft Access® to run the model successfully. Sed_SAT identifies outliers, corrects for differences in size and organic content in the source samples relative to the target samples, evaluates the conservative behavior of tracers used in fingerprinting by applying a “Bracket Test,” identifies tracers with the highest discriminatory power, and provides robust error analysis through a Monte Carlo simulation following the mixing model. Quantifying sediment source contributions using the sediment fingerprinting approach provides local, State, and Federal land management agencies with important information needed to implement effective strategies to reduce sediment. Sed_SAT is designed to assist these agencies in applying the sediment fingerprinting
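
    The mixing-model step can be illustrated generically. The sketch below is not Sed_SAT itself (which is written in R and adds size/organic corrections and Monte Carlo error analysis); it simply estimates non-negative source proportions, summing to one, that best reproduce a target sample's tracer concentrations via constrained least squares. All tracer values are invented.

        # Generic sediment-fingerprinting mixing model (illustration only).
        import numpy as np
        from scipy.optimize import minimize

        # Rows: potential sources; columns: tracer concentrations (assumed values).
        sources = np.array([[12.0, 340.0, 1.8],    # e.g. cropland
                            [ 7.5, 210.0, 0.9],    # e.g. stream banks
                            [15.2, 400.0, 2.6]])   # e.g. forest
        target = np.array([10.1, 295.0, 1.5])      # suspended-sediment sample

        def sse(p):
            # Sum of squared relative errors between mixed and observed tracers.
            mixed = p @ sources
            return np.sum(((target - mixed) / target) ** 2)

        n = len(sources)
        res = minimize(sse, np.full(n, 1.0 / n), method="SLSQP",
                       bounds=[(0, 1)] * n,
                       constraints=[{"type": "eq", "fun": lambda p: p.sum() - 1}])
        print(res.x)  # estimated proportion of sediment from each source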

  2. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  3. A Survey of Open Source Tools for Business Intelligence

    DEFF Research Database (Denmark)

    Thomsen, Christian; Pedersen, Torben Bach

    The industrial use of open source Business Intelligence (BI) tools is becoming more common, but is still not as widespread as for other types of software. It is therefore of interest to explore which possibilities are available for open source BI and compare the tools. In this survey paper, we consider the capabilities of a number of open source tools for BI. In the paper, we consider a number of Extract-Transform-Load (ETL) tools, database management systems (DBMSs), On-Line Analytical Processing (OLAP) servers, and OLAP clients. We find that, unlike the situation a few years ago, there now...

  4. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  5. Large Data Visualization with Open-Source Tools

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    Visualization and post-processing of large data have become increasingly challenging and require more and more tools to support the diversity of data to process. In this seminar, we will present a suite of open-source tools supported and developed by Kitware to perform large-scale data visualization and analysis. In particular, we will present ParaView, an open-source tool for parallel visualization of massive datasets, the Visualization Toolkit (VTK), an open-source toolkit for scientific visualization, and Tangelohub, a suite of tools for large data analytics. About the speaker Julien Jomier is directing Kitware's European subsidiary in Lyon, France, where he focuses on European business development. Julien works on a variety of projects in the areas of parallel and distributed computing, mobile computing, image processing, and visualization. He is one of the developers of the Insight Toolkit (ITK), the Visualization Toolkit (VTK), and ParaView. Julien is also leading the CDash project, an open-source co...

  6. E-Sourcing platforms as reasonable marketing tools for suppliers

    OpenAIRE

    Göbl, Martin; Greiter, Thomas

    2014-01-01

    Research questions: E-sourcing platforms often offer purchasing organisations easy access to a high number of relevant suppliers, their goods and services and the according prices. For the suppliers, e-sourcing platforms are a good and easy possibility to present their products and services to the relevant buyers and to get in contact with potential customers. The subject of this research will be the question whether e-sourcing platforms are also a reasonable marketing tool for suppliers in or...

  7. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  8. Synchrotron light sources: A powerful tool for science and technology

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1996-01-01

    A new generation of synchrotron light sources is producing extremely bright beams of vacuum-ultraviolet and x-ray radiation, powerful new tools for research in a wide variety of basic and applied sciences. Spectromicroscopy using high spectral and spatial resolution is a new way of seeing, offering many opportunities in the study of matter. Development of a new light source provides the country or region of the world in which the light source is located many new opportunities: a focal point for research in many scientific and technological areas, a means of upgrading the technology infrastructure of the country, a means of training students, and a potential service to industry. A light source for Southeast Asia would thus be a major resource for many years. Scientists and engineers from light sources around the world look forward to providing assistance to make this a reality in Southeast Asia

  9. Decision support tool for diagnosing the source of variation

    Science.gov (United States)

    Masood, Ibrahim; Azrul Azhad Haizan, Mohamad; Norbaya Jumali, Siti; Ghazali, Farah Najihah Mohd; Razali, Hazlin Syafinaz Md; Shahir Yahya, Mohd; Azlan, Mohd Azwir bin

    2017-08-01

    Identifying the source of unnatural variation (SOV) in a manufacturing process is essential for quality control. Shewhart control chart patterns (CCPs) are commonly used to monitor SOV. However, a proper interpretation of CCPs and their associated SOV requires a highly skilled industrial practitioner. Lack of knowledge in process engineering will lead to erroneous corrective action. The objective of this study is to design the operating procedures of a computerized decision support tool (DST) for process diagnosis. The DST is a tool embedded in a CCP recognition scheme. The design methodology involves analysis of the relationships between geometrical features, manufacturing processes and CCPs. The DST contains information about CCPs and their possible root-cause errors, and descriptions of SOV phenomena such as process deterioration due to tool bluntness, tool offset, loading error, and changes in material hardness. The DST will be useful for an industrial practitioner in carrying out effective troubleshooting.

  10. Open source intelligence: A tool to combat illicit trafficking

    Energy Technology Data Exchange (ETDEWEB)

    Sjoeberg, J [Swedish Armed Forces HQ, Stockholm (Sweden)

    2001-10-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information, as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media, or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source does not say anything about the information itself; it only refers to whether or not the information is classified.

  11. Open source intelligence: A tool to combat illicit trafficking

    International Nuclear Information System (INIS)

    Sjoeberg, J.

    2001-01-01

    The purpose of my presentation is to provide some thoughts on Open Sources and how Open Sources can be used as tools for detecting illicit trafficking and proliferation. To fulfill this purpose I would like to deal with the following points during my presentation: What is Open Source? How can it be defined? - Different sources - Methods. Open Source information can be defined as publicly available information, as well as other unclassified information that has limited public distribution or access. It comes in print, electronic or oral form. It can be found distributed either to the mass public by print or electronic media, or to a much more limited customer base like companies, experts or specialists of some kind, including the so-called gray literature. Open Source information is not a single source but a multi-source. Thus, the term Open Source does not say anything about the information itself; it only refers to whether or not the information is classified.

  12. jSPyDB, an open source database-independent tool for data management

    Science.gov (United States)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-12-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Exploiting fast access libraries such as SQLAlchemy, the tool is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the ability to execute arbitrary SQL statements directly.
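
    As a rough sketch of the server-side idea (an illustration of the approach, not jSPyDB's actual code; the database URL and table handling are assumptions), SQLAlchemy can reflect an existing schema and serve rows as JSON:

        import json
        from sqlalchemy import create_engine, MetaData

        # Reflect an existing database schema (the URL is a placeholder).
        engine = create_engine("sqlite:///example.db")
        metadata = MetaData()
        metadata.reflect(bind=engine)

        def dump_table(name: str, limit: int = 10) -> str:
            """Read a few rows from a table and return them as JSON."""
            table = metadata.tables[name]
            with engine.connect() as conn:
                result = conn.execute(table.select().limit(limit))
                cols = result.keys()
                rows = [dict(zip(cols, row)) for row in result]
            return json.dumps(rows, default=str)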

  13. jSPyDB, an open source database-independent tool for data management

    International Nuclear Information System (INIS)

    Pierro, Giuseppe Antonio; Cavallari, Francesca; Di Guida, Salvatore; Innocente, Vincenzo

    2011-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Exploiting fast access libraries such as SQLAlchemy, the tool is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. Moreover, thanks to jQuery libraries, this tool supports export of data in different formats, such as XML and JSON. Finally, by using a set of pre-defined functions, users are allowed to create their customized views for better data visualization. In this way, we optimize the performance of database servers by avoiding short connections and concurrent sessions. In addition, security is enforced since users are not given the ability to execute arbitrary SQL statements directly.

  14. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  15. Plasma diagnostic tools for optimizing negative hydrogen ion sources

    International Nuclear Information System (INIS)

    Fantz, U.; Falter, H.D.; Franzen, P.; Speth, E.; Hemsworth, R.; Boilson, D.; Krylov, A.

    2006-01-01

    The powerful diagnostic tool of optical emission spectroscopy is used to measure the plasma parameters in negative hydrogen ion sources based on the surface mechanism. Results for electron temperature, electron density, atomic-to-molecular hydrogen density ratio, and gas temperature are presented for two types of sources, an rf source and an arc source, which are currently under development for a neutral beam heating system of ITER. The amount of cesium in the plasma volume is obtained from cesium radiation: the Cs neutral density is five to ten orders of magnitude lower than the hydrogen density, and the Cs ion density is two to three orders of magnitude lower than the electron density in front of the grid. It is shown that monitoring of cesium lines is very useful for tracking the cesium balance in the source. From a line-ratio method, negative ion densities are determined. In a well-conditioned source the negative ion density is of the same order of magnitude as the electron density and correlates with extracted current densities.

  16. jSPyDB, an open source database-independent tool for data management

    CERN Document Server

    Pierro, Giuseppe Antonio

    2010-01-01

    Nowadays, the number of commercial tools available for accessing databases, built on Java or .Net, is increasing. However, many of these applications have several drawbacks: usually they are not open-source, they provide interfaces only with a specific kind of database, and they are platform-dependent and very CPU and memory consuming. jSPyDB is a free web-based tool written in Python and JavaScript. It relies on jQuery and Python libraries, and is intended to provide a simple handler to different database technologies inside a local web browser. Exploiting fast access libraries such as SQLAlchemy, the tool is easy to install and configure. The design of this tool envisages three layers. The front-end client side in the local web browser communicates with a backend server. Only the server is able to connect to the different databases for the purposes of performing data definition and manipulation. The server makes the data available to the client, so that the user can display and handle them safely. ...

  17. ProteoWizard: open source software for rapid proteomics tools development.

    Science.gov (United States)

    Kessner, Darren; Chambers, Matt; Burke, Robert; Agus, David; Mallick, Parag

    2008-11-01

    The ProteoWizard software project provides a modular and extensible set of open-source, cross-platform tools and libraries. The tools perform proteomics data analyses; the libraries enable rapid tool creation by providing a robust, pluggable development framework that simplifies and unifies data file access, and performs standard proteomics and LCMS dataset computations. The library, which has been written using modern C++ techniques and design principles and supports a variety of platforms with native compilers, contains readers and writers for the mzML data format. The software has been released under the Apache v2 license specifically to ensure it can be used in both academic and commercial projects. In addition to the library, we also introduce a rapidly growing set of companion tools whose implementation helps to illustrate the simplicity of developing applications on top of the ProteoWizard library. Cross-platform software that compiles using native compilers (i.e. GCC on Linux, MSVC on Windows and Xcode on OS X) is available for download free of charge at http://proteowizard.sourceforge.net. This website also provides code examples and documentation. It is our hope that the ProteoWizard project will become a standard platform for proteomics development; consequently, code use, contribution and further development are strongly encouraged.

  18. The Status of Kasimov Chinggisids during the Reigns of Vasily II and Ivan III according to Written Sources

    OpenAIRE

    M.A. Nesin

    2017-01-01

    Research objective: To study both the status of the Kasimov Chinggisids and the attitude toward the service Tatars and the Tatars allied to Moscow during the reigns of the great Moscow princes Vasily II and Ivan III. Research materials: Published and unpublished sources: books of official orders in the Russian state, chronicles, acts, diplomatic documents, etc. Results and novelty of the research: This work is the first comprehensive study of this topic. The ...

  19. Business intelligence tools for radiology: creating a prototype model using open-source tools.

    Science.gov (United States)

    Prevedello, Luciano M; Andriole, Katherine P; Hanson, Richard; Kelly, Pauline; Khorasani, Ramin

    2010-04-01

    Digital radiology departments could benefit from the ability to integrate and visualize data (e.g. information reflecting complex workflow states) from all of their imaging and information management systems in one composite presentation view. Leveraging data warehousing tools developed in the business world may be one way to achieve this capability. Collectively, the concept of managing the information available in such a data repository is known as Business Intelligence (BI). This paper describes the concepts used in Business Intelligence, their importance to modern radiology, and the steps used in the creation of a prototype model of a data warehouse for BI using open-source tools.

  20. Commissioning software tools at the Advanced Photon Source

    International Nuclear Information System (INIS)

    Emery, L.

    1995-01-01

    A software tool-oriented approach has been adopted in the commissioning of the Advanced Photon Source (APS) at Argonne National Laboratory, particularly in the commissioning of the Positron Accumulator Ring (PAR). The general philosophy is to decompose a complicated procedure involving measurement, data processing, and control into a series of simpler steps, each accomplished by a generic toolkit program. The implementation is greatly facilitated by adopting the SDDS (self-describing data set) protocol, which comes with its own toolkit. The combined toolkit has made accelerator physics measurements easier. For instance, the measurement of the optical functions of the PAR and the beamlines connected to it has been largely automated. Complicated measurements are feasible with a combination of tools running independently.

  1. The Status of Kasimov Chinggisids during the Reigns of Vasily II and Ivan III according to Written Sources

    Directory of Open Access Journals (Sweden)

    M.A. Nesin

    2017-06-01

    Research objective: To study both the status of the Kasimov Chinggisids and the attitude toward the service Tatars and the Tatars allied to Moscow during the reigns of the great Moscow princes Vasily II and Ivan III. Research materials: Published and unpublished sources: books of official orders in the Russian state, chronicles, acts, diplomatic documents, etc. Results and novelty of the research: This work is the first comprehensive study of this topic. The author comes to the conclusion that Kasimov's rulers during this period were considered servitors, vassals in relation to the Moscow principality. This was facilitated by the demesnial position of Kasimov, which was considered part of the grand prince's domain rather than a sovereign splinter of the Golden Horde. The Kasimov Chinggisids did not have a military force large enough to create an independent khanate in the steppe capable of competing, for example, with Kazan; instead they relied on the strong Moscow principality, performing military service for it, receiving money for their maintenance, and obeying the grand prince's orders. At the same time, Ivan III even considered the possibility of replacing Daniyar in Kasimov with another serving prince during his lifetime. In addition, the author shows how much the attitude towards the service Tatars and the Tatar detachments allied to Moscow differed in different Russian lands.

  2. Open Source Tools for Assessment of Global Water Availability, Demands, and Scarcity

    Science.gov (United States)

    Li, X.; Vernon, C. R.; Hejazi, M. I.; Link, R. P.; Liu, Y.; Feng, L.; Huang, Z.; Liu, L.

    2017-12-01

    Water availability and water demand are essential factors for estimating water scarcity conditions. To reproduce historical observations and to quantify future changes in water availability and water demand, two open source tools have been developed by the JGCRI (Joint Global Change Research Institute): Xanthos and GCAM-STWD. Xanthos is a gridded global hydrologic model designed to quantify and analyze water availability in 235 river basins. Xanthos uses runoff generation and river routing modules to simulate both historical and future estimates of total runoff and streamflows on a monthly time step at a spatial resolution of 0.5 degrees. GCAM-STWD is a spatiotemporal water disaggregation model used with the Global Change Assessment Model (GCAM) to spatially downscale global water demands for six major end-use sectors (irrigation, domestic, electricity generation, mining, and manufacturing) from the region scale to the scale of 0.5 degrees. GCAM-STWD then temporally downscales the gridded annual global water demands to monthly results. These two tools, written in Python, can be integrated to assess global, regional or basin-scale water scarcity or water stress. Both tools are extensible, to ensure flexibility and promote contributions from researchers who utilize GCAM and study global water use and supply.
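
    Conceptually, combining the two outputs reduces to comparing gridded demand against gridded availability. A minimal numpy sketch of a water scarcity index on a 0.5-degree grid (synthetic data and the commonly cited 0.4 severe-stress threshold; not the tools' actual code):

        import numpy as np

        # Hypothetical 0.5-degree grids (360 x 720 cells), synthetic values.
        availability = np.random.gamma(2.0, 50.0, size=(360, 720))  # runoff
        demand = np.random.gamma(1.5, 20.0, size=(360, 720))        # withdrawals

        # Water scarcity index: withdrawals-to-availability ratio.
        wsi = np.where(availability > 0, demand / availability, np.inf)

        severely_stressed = (wsi > 0.4).mean()  # ratio > 0.4: severe stress
        print(f"Cells under severe water stress: {severely_stressed:.2%}")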

  3. Microbial source tracking: a tool for identifying sources of microbial contamination in the food chain.

    Science.gov (United States)

    Fu, Ling-Lin; Li, Jian-Rong

    2014-01-01

    The ability to trace fecal indicators and food-borne pathogens to their point of origin has major ramifications for the food industry, food regulatory agencies, and public health. Such information would enable food producers and processors to better understand sources of contamination and thereby take corrective actions to prevent transmission. Microbial source tracking (MST), which is currently largely focused on determining sources of fecal contamination in waterways, is also providing the scientific community with tools for tracking both fecal bacteria and food-borne pathogen contamination in the food chain. Approaches to MST are commonly classified as library-dependent methods (LDMs) or library-independent methods (LIMs). These tools will have widespread applications, including use for regulatory compliance, pollution remediation, and risk assessment. These tools will reduce the incidence of illness associated with food and water. Our aim in this review is to highlight the use of molecular MST methods in application to understanding the source and transmission of food-borne pathogens. Moreover, the future directions of MST research are also discussed.

  4. VSEARCH: a versatile open source tool for metagenomics.

    Science.gov (United States)

    Rognes, Torbjørn; Flouri, Tomáš; Nichols, Ben; Quince, Christopher; Mahé, Frédéric

    2016-01-01

    VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling, while on a par with USEARCH for paired
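
    The full-dynamic-programming strategy contrasted here with seed-and-extend is classic Needleman-Wunsch global alignment. A toy Python version for illustration (VSEARCH itself is vectorised, multithreaded compiled code; the scoring values below are arbitrary):

        def global_alignment_score(a, b, match=2, mismatch=-4, gap=-2):
            """Needleman-Wunsch global alignment score via full dynamic programming."""
            rows, cols = len(a) + 1, len(b) + 1
            score = [[0] * cols for _ in range(rows)]
            for i in range(1, rows):
                score[i][0] = i * gap
            for j in range(1, cols):
                score[0][j] = j * gap
            for i in range(1, rows):
                for j in range(1, cols):
                    diag = score[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
                    score[i][j] = max(diag, score[i - 1][j] + gap, score[i][j - 1] + gap)
            return score[-1][-1]

        print(global_alignment_score("GATTACA", "GCATGCA"))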

  5. VSEARCH: a versatile open source tool for metagenomics

    Directory of Open Access Journals (Sweden)

    Torbjørn Rognes

    2016-10-01

    Background VSEARCH is an open source and free of charge multithreaded 64-bit tool for processing and preparing metagenomics, genomics and population genomics nucleotide sequence data. It is designed as an alternative to the widely used USEARCH tool (Edgar, 2010) for which the source code is not publicly available, algorithm details are only rudimentarily described, and only a memory-confined 32-bit version is freely available for academic use. Methods When searching nucleotide sequences, VSEARCH uses a fast heuristic based on words shared by the query and target sequences in order to quickly identify similar sequences; a similar strategy is probably used in USEARCH. VSEARCH then performs optimal global sequence alignment of the query against potential target sequences, using full dynamic programming instead of the seed-and-extend heuristic used by USEARCH. Pairwise alignments are computed in parallel using vectorisation and multiple threads. Results VSEARCH includes most commands for analysing nucleotide sequences available in USEARCH version 7 and several of those available in USEARCH version 8, including searching (exact or based on global alignment), clustering by similarity (using length pre-sorting, abundance pre-sorting or a user-defined order), chimera detection (reference-based or de novo), dereplication (full length or prefix), pairwise alignment, reverse complementation, sorting, and subsampling. VSEARCH also includes commands for FASTQ file processing, i.e., format detection, filtering, read quality statistics, and merging of paired reads. Furthermore, VSEARCH extends functionality with several new commands and improvements, including shuffling, rereplication, masking of low-complexity sequences with the well-known DUST algorithm, a choice among different similarity definitions, and FASTQ file format conversion. VSEARCH is here shown to be more accurate than USEARCH when performing searching, clustering, chimera detection and subsampling

  6. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

    In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multiple translations of database entries in a Ruby on Rails application. Research regarding successes of similar tools has been utilized in providing a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and provide opportunities for cost-effective, multi-language publishing.

  7. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
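
    The process plug-in idea can be sketched generically: new analysis methods register themselves against a common interface and are invoked by name. A schematic Python rendering of the pattern (the SBRT itself is written in Java, so this is purely illustrative and not its actual API):

        # Generic plug-in registry pattern (illustrative; not the SBRT's API).
        PLUGINS = {}

        def register(name):
            """Class decorator that registers an analysis process by name."""
            def wrap(cls):
                PLUGINS[name] = cls
                return cls
            return wrap

        @register("network_summary")
        class NetworkSummary:
            def run(self, network):
                # A real plug-in would analyse the stoichiometric network here.
                return {"reactions": len(network)}

        def run_process(name, data):
            return PLUGINS[name]().run(data)

        print(run_process("network_summary", [("A + B", "C")]))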

  8. Open source tools for standardized privacy protection of medical images

    Science.gov (United States)

    Lien, Chung-Yueh; Onken, Michael; Eichelberg, Marco; Kao, Tsair; Hein, Andreas

    2011-03-01

    In addition to the primary care context, medical images are often useful for research projects and community healthcare networks, so-called "secondary use". Patient privacy becomes an issue in such scenarios since the disclosure of personal health information (PHI) has to be prevented in a sharing environment. In general, most PHI should be completely removed from the images according to the respective privacy regulations, but some basic, less sensitive data is usually required for accurate image interpretation. Our objective is to utilize and enhance these specifications in order to provide reliable software implementations for de- and re-identification of medical images suitable for online and offline delivery. DICOM (Digital Imaging and Communications in Medicine) images are de-identified by replacing PHI-specific information with values still being reasonable for imaging diagnosis and patient indexing. In this paper, this approach is evaluated based on a prototype implementation built on top of the open source framework DCMTK (DICOM Toolkit), utilizing standardized de- and re-identification mechanisms. A set of tools has been developed for DICOM de-identification that meets the privacy requirements of an offline and online sharing environment and fully relies on standard-based methods.
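
    The prototype described is built on the C++ DCMTK. As a rough Python illustration of the same replace-rather-than-strip idea, using the pydicom library (an assumption for this sketch, not the authors' implementation; the tag list is an illustrative subset, whereas real profiles follow DICOM PS3.15):

        import pydicom

        # Replace PHI with values that remain usable for diagnosis and
        # patient indexing (illustrative subset of tags only).
        REPLACEMENTS = {
            "PatientName": "ANONYMOUS^PATIENT",
            "PatientID": "RESEARCH-0001",
            "PatientBirthDate": "19000101",
        }

        def deidentify(path_in, path_out):
            ds = pydicom.dcmread(path_in)
            for keyword, value in REPLACEMENTS.items():
                if keyword in ds:
                    setattr(ds, keyword, value)
            ds.remove_private_tags()  # drop vendor-private elements
            ds.save_as(path_out)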

  9. An Earthquake Information Service with Free and Open Source Tools

    Science.gov (United States)

    Schroeder, M.; Stender, V.; Jüngling, S.

    2015-12-01

    At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context, the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. In order to meet the stated objectives, to give a quick overview of the seismicity of a particular region, and to compare the current situation to historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. This web service also integrates historical and current earthquake information from the USGS earthquake database, and more historical events from various other catalogues like Pacheco, the International Seismological Centre (ISC) and more. This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is the ability to constrain the displayed time range with a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was realized using the newest JavaScript libraries, which enable it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.
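
    Current event data of the kind such an application displays can be pulled from the public USGS FDSN event web service. A minimal sketch using the requests library (the endpoint and parameter names are the public API's; the query values are arbitrary):

        import requests

        URL = "https://earthquake.usgs.gov/fdsnws/event/1/query"
        params = {
            "format": "geojson",
            "starttime": "2015-01-01",
            "endtime": "2015-12-31",
            "minmagnitude": 6.0,
        }

        features = requests.get(URL, params=params, timeout=30).json()["features"]
        for quake in features[:5]:
            props = quake["properties"]
            print(props["mag"], props["place"])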

  10. Open source engineering and sustainability tools for the built environment

    NARCIS (Netherlands)

    Coenders, J.L.

    2013-01-01

    This paper presents two novel open source software developments for design and engineering in the built environment. The first development, called “sustainability-open” [1], aims on providing open source design, analysis and assessment software source code for (environmental) performance of

  11. Open Source and Proprietary Project Management Tools for SMEs.

    OpenAIRE

    Veronika Abramova; Francisco Pires; Jorge Bernardino

    2017-01-01

    The dimensional growth and increasing difficulty of project management promoted the development of different tools that serve to facilitate project management and track project schedules, resources and overall progress. These tools offer a variety of features, from task and time management up to integrated CRM (Customer Relationship Management) and ERP (Enterprise Resource Planning) modules. Currently, a large amount of project management software is available to assist project teams during t...

  12. Beam simulation tools for GEANT4 (and neutrino source applications)

    International Nuclear Information System (INIS)

    Elvira, V. Daniel; Lebrun, Paul; Spentzouris, Panagiotis

    2002-01-01

    Geant4 is a toolkit developed by a collaboration of physicists and computer professionals in the High Energy Physics field for the simulation of the passage of particles through matter. The motivation for the development of the Beam Tools is to extend Geant4 applications to accelerator physics. Although there are many computer programs for beam physics simulations, Geant4 is ideal for modeling a beam going through material, or a system with a beam line integrated with a complex detector. There are many examples in the current international High Energy Physics programs, such as studies related to a future Neutrino Factory, a Linear Collider, and a Very Large Hadron Collider.

  13. High Fidelity Tool for Noise Source Identification, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — Thorough understanding of airframe and propulsion aerodynamic noise sources and the subsequent acoustic propagation to the farfield is necessary to the design and...

  14. Survivability as a Tool for Evaluating Open Source Software

    Science.gov (United States)

    2015-06-01

    ...tremendously successful in certain applications such as the Mozilla Firefox web browser and the Apache web server [10]. Open source software is often... ...source versions (such as Internet Explorer compared to Mozilla Firefox), which typically conclude that vulnerabilities are, in fact, much more...

  15. Integrating Philips' extreme UV source in the alpha-tools

    Science.gov (United States)

    Pankert, Joseph; Apetz, Rolf; Bergmann, Klaus; Derra, Guenther; Janssen, Maurice; Jonkers, Jeroen; Klein, Jurgen; Kruecken, Thomas; List, Andreas; Loeken, Michael; Metzmacher, Christof; Neff, Willi; Probst, Sven; Prummer, Ralph; Rosier, Oliver; Seiwert, Stefan; Siemons, Guido; Vaudrevange, Dominik; Wagemann, Dirk; Weber, Achim; Zink, Peter; Zitzen, Oliver

    2005-05-01

    The paper describes recent progress in the development of the Philips EUV source. Progress has been realized on many fronts. Integration of the source into a scanner has primarily been studied on the Xe source because it has a high degree of maturity; we report on integration with a collector, the associated collector lifetime, and optical characteristics. A collector lifetime in excess of one billion shots could be demonstrated. Next, an active dose control system was developed and tested on the Xe lamp; the resulting dose stability is better than 0.2% for an exposure window of 100 pulses. The second part of the paper reports on progress in the development of the Philips Sn source. First, the details of the concept are described. It is based on a laser-triggered vacuum arc, which is an extension of previous designs. The source is fitted with rotating electrodes covered with a Sn film that is constantly regenerated. Hence, by the very design of the source, it is scalable to very high power levels and, moreover, fundamentally solves the notorious problem of electrode erosion. Power values of 260 W into 2π sr are reported, along with stable, long-life operation of the lamp. The paper also addresses the problem of debris generation and mitigation for the Sn source. The problem is attacked by a combined strategy: protection of the collector by traditional means (e.g. fields, foil traps...), and design of the gas atmosphere according to the principles of the well-known halogen cycles in incandescent lamps. These principles have been studied in the lighting industry for decades and rely on the excessively high vapor pressures of metal halides. Transferred to the Sn source, they allow tin residues that would otherwise irreversibly deposit on the collector to be pumped away.

  16. Methods and apparatus for safely handling radioactive sources in measuring-while-drilling tools

    International Nuclear Information System (INIS)

    Wraight, P.D.

    1989-01-01

    This patent describes a method for removing a chemical radioactive source from an MWD tool which is coupled in a drill string supported by a drilling rig while a borehole is drilled, and which includes logging means for measuring formation characteristics in response to irradiation of the adjacent formations by the radioactive source during the drilling operation. The steps of the method are: halting the drilling operation and removing the drill string from the borehole, so that the MWD tool can be moved to a work station at the surface where the source is at a safe working distance from the drilling rig and accessible by way of one end of the MWD tool; positioning a radiation shield at a location adjacent to the one end of the MWD tool, where the shield is ready to receive the source as it is moved away from the other end of the MWD tool, and then moving the source away from the other end of the MWD tool so as to enclose the source within the shield; and, once the source is enclosed within the shield, removing the shield together with the enclosed source from the MWD tool for transfer to another work station.

  17. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking; many more do so for reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge of this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-focused medium.

  18. Regulatory inspection: a powerful tool to control industrial radioactive sources

    International Nuclear Information System (INIS)

    Silva, F.C.A. da; Leocadio, J.C.; Ramalho, A.T.

    2008-01-01

    An important contribution to Brazilian development, especially to the quality control of products, is the use of radiation sources by conventional industries. There are in Brazil roughly 3,000 radioactive sources spread among 950 industrial facilities. The main industrial practices involved are: industrial radiography, industrial irradiators, industrial accelerators, petroleum well logging, and nuclear gauges. More than 1,800 Radiation Protection Officers (RPOs) have been qualified to work in these practices. The present work gives a brief description of the safety control over industrial radioactive installations performed by the Brazilian Regulatory Authority, i.e. the National Commission of Nuclear Energy (CNEN). This paper also describes the national system for radiation safety inspections, the regulatory infrastructure and the national inventory of industrial installations. The inspections are based on specific indicators, and their periodicity depends on the risk and type of installation. The present work discusses some relevant aspects that must be considered during inspections in order to make them more efficient in controlling the sources. One of these aspects regards the evaluation of the storage place for the sources, a very important parameter for preventing future risky situations. (author)

  19. Educatie en open-source software: meer dan gratis tools.

    NARCIS (Netherlands)

    Bakker, de G.M.

    2008-01-01

    Open source software is increasingly being used in education, as Gijs de Bakker* also notes. As a rule, there is not much more behind it: 'It is generally no more than a clever way to obtain free and usually good-quality software.' As far as De Bakker is concerned, a

  20. Development and Validation of a Standardized Tool for Prioritization of Information Sources.

    Science.gov (United States)

    Akwar, Holy; Kloeze, Harold; Mukhi, Shamir

    2016-01-01

    To validate the utility and effectiveness of a standardized tool for the prioritization of information sources for early detection of diseases. The tool was developed with input from diverse public health experts garnered through a survey. Ten raters used the tool to evaluate ten information sources, and reliability among raters was computed. The SAS Proc Mixed procedure with a random effect statement and SAS macros were used to compute the multiple raters' Fleiss kappa agreement and Kendall's coefficient of concordance. The ten disparate information sources evaluated obtained the following composite scores: ProMed 91%; WAHID 90%; Eurosurv 87%; MediSys 85%; SciDaily 84%; EurekAl 83%; CSHB 78%; GermTrax 75%; Google 74%; and CBC 70%. A Fleiss kappa agreement of 50.7% was obtained for the ten information sources, and 72.5% for a subset of five sources, the latter being substantial agreement and validating the utility and effectiveness of the tool. This study validated the utility and effectiveness of a standardized criteria tool developed to prioritize information sources. The new tool was used to identify five information sources suited for use by the KIWI system in the CEZD-IIR project to improve surveillance of infectious diseases. The tool can be generalized to situations where prioritization of numerous information sources is necessary.
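
    The agreement statistic reported above can be reproduced outside SAS. A small numpy implementation of Fleiss' kappa for an items-by-categories table of rating counts (the example table is synthetic, not the study's ratings):

        import numpy as np

        def fleiss_kappa(counts):
            """Fleiss' kappa for an (items x categories) count table,
            where each row sums to the number of raters."""
            n = counts.sum(axis=1)[0]                    # raters per item
            p_j = counts.sum(axis=0) / counts.sum()      # category shares
            p_i = (np.square(counts).sum(axis=1) - n) / (n * (n - 1))
            p_bar, p_e = p_i.mean(), np.square(p_j).sum()
            return (p_bar - p_e) / (1 - p_e)

        # Example: 10 information sources rated by 10 raters into 3 bands.
        rng = np.random.default_rng(0)
        table = rng.multinomial(10, [0.5, 0.3, 0.2], size=10)
        print(round(fleiss_kappa(table), 3))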

  1. Vapor Intrusion Estimation Tool for Unsaturated Zone Contaminant Sources. User’s Guide

    Science.gov (United States)

    2016-08-30

    ...estimation process when applying the tool. The tool described here is focused on vapor-phase diffusion from the current vadose zone source, and is not... ...from the current defined vadose zone source). The estimated soil gas contaminant concentration obtained from the pre-modeled scenarios for a building... ...need a full site-specific numerical model to assess the impacts beyond the current vadose zone source.

  2. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

    Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results We introduce a free, easy-to-use, open-source, integrated software platform calle...

  3. Open source tools for large-scale neuroscience.

    Science.gov (United States)

    Freeman, Jeremy

    2015-06-01

    New technologies for monitoring and manipulating the nervous system promise exciting biology but pose challenges for analysis and computation. Solutions can be found in the form of modern approaches to distributed computing, machine learning, and interactive visualization. But embracing these new technologies will require a cultural shift: away from independent efforts and proprietary methods and toward an open source and collaborative neuroscience. Copyright © 2015 The Author. Published by Elsevier Ltd. All rights reserved.

  4. WRITTEN COMMUNICATION IN BUSINESS

    OpenAIRE

    Oana COSMAN

    2013-01-01

    The article examines the work of researchers primarily interested in the investigation of written communication in business settings. The author regards 'business discourse' as a field of study with distinct features in the domain of discourse analysis. Thus, the paper overviews the most important contributions to the development of written business discourse with a number of landmark studies. To gain a greater understanding of written business discourse, the author also investigates some...

  5. Neural Monkey: An Open-source Tool for Sequence Learning

    Directory of Open Access Journals (Sweden)

    Helcl Jindřich

    2017-04-01

    In this paper, we announce the development of Neural Monkey, an open-source neural machine translation (NMT) and general sequence-to-sequence learning system built on top of the TensorFlow machine learning library. The system provides a high-level API tailored for fast prototyping of complex architectures with multiple sequence encoders and decoders. Models' overall architecture is specified in easy-to-read configuration files. The long-term goal of the Neural Monkey project is to create and maintain a growing collection of implementations of recently proposed components or methods, and it is therefore designed to be easily extensible. Trained models can be deployed either for batch data processing or as a web service. In the presented paper, we describe the design of the system and introduce the reader to running experiments using Neural Monkey.

  6. Open source tools and toolkits for bioinformatics: significance, and where are we?

    Science.gov (United States)

    Stajich, Jason E; Lapp, Hilmar

    2006-09-01

    This review summarizes important work in open-source bioinformatics software that has occurred over the past couple of years. The survey is intended to illustrate how programs and toolkits whose source code has been developed or released under an Open Source license have changed informatics-heavy areas of life science research. Rather than creating a comprehensive list of all tools developed over the last 2-3 years, we use a few selected projects encompassing toolkit libraries, analysis tools, data analysis environments and interoperability standards to show how freely available and modifiable open-source software can serve as the foundation for building important applications, analysis workflows and resources.

  7. Anaerobes as Sources of Bioactive Compounds and Health Promoting Tools.

    Science.gov (United States)

    Mamo, Gashaw

    Aerobic microorganisms have been sources of medicinal agents for several decades, and an impressive variety of drugs have been isolated from their cultures, studied and formulated to treat or prevent diseases. On the other hand, anaerobes, which are believed to be the oldest life forms on earth and to have evolved remarkably diverse physiological functions, have largely been neglected as sources of bioactive compounds. However, results obtained from the limited research done so far show that anaerobes are capable of producing a range of interesting bioactive compounds that can promote human health. In fact, some of these bioactive compounds are found to be novel in their structure and/or mode of action. Anaerobes play health-promoting roles through their bioactive products as well as through the application of whole cells. The bioactive compounds produced by these microorganisms include antimicrobial agents and substances such as immunomodulators and vitamins. Bacteriocins produced by anaerobes have been in use as preservatives for about 40 years. Because these substances are effective at low concentrations, encounter relatively little resistance from bacteria and are safe to use, there is a growing interest in these antimicrobial agents. Moreover, several antibiotics have been reported from the cultures of anaerobes. Closthioamide and andrimid, produced by Clostridium cellulolyticum and Pantoea agglomerans, respectively, are examples of novel antibiotics of anaerobe origin. The discovery of such novel bioactive compounds is expected to encourage further studies which can potentially lead to tapping of the antibiotic production potential of this fascinating group of microorganisms. Anaerobes are widely used in the preparation of fermented foods and beverages. During the fermentation processes, these organisms produce a number of bioactive compounds including anticancer, antihypertensive and antioxidant substances. The well-known health promoting effect of fermented food is mostly due to these

  8. Applying open source data visualization tools to standard based medical data.

    Science.gov (United States)

    Kopanitsa, Georgy; Taranik, Maxim

    2014-01-01

    Presentation of medical data in personal health records (PHRs) requires flexible, platform-independent tools to ensure easy access to the information. The different backgrounds of patients, especially elderly people, require simple graphical presentation of the data. Data in PHRs can be collected from heterogeneous sources. The use of standard-based medical data allows the development of generic visualization methods. Focusing on the deployment of open source tools, in this paper we applied JavaScript libraries to create data presentations for standard-based medical data.

  9. Development and validation of an open source quantification tool for DSC-MRI studies.

    Science.gov (United States)

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as the gold standard was obtained (R² > 0.8, and values are within the 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
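
    In DSC quantification, relative cerebral blood volume is commonly taken as the area under the tissue concentration-time curve, normalised by the arterial input function. A schematic numpy version with synthetic curves (an illustration of the principle, not the plugin's code):

        import numpy as np

        t = np.linspace(0, 60, 120)                   # seconds
        aif = np.exp(-((t - 20) / 5) ** 2)            # synthetic arterial input
        tissue = 0.05 * np.exp(-((t - 24) / 7) ** 2)  # synthetic tissue curve

        # Relative CBV: area under the tissue curve normalised by AIF area.
        rcbv = np.trapz(tissue, t) / np.trapz(aif, t)
        print(f"relative CBV = {rcbv:.4f}")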

  10. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    OpenAIRE

    Evviva Weinraub Lajoie; Trey Terrell; Susan McEvoy; Eva Kaplan; Ariel Schwartz; Esther Ajambo

    2014-01-01

    In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in Rural Kenya, to provide users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile optimized platform using open source libraries such as Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 to allow for multipl...

  11. BAT: An open-source, web-based audio events annotation tool

    OpenAIRE

    Blai Meléndez-Catalan; Emilio Molina; Emilia Gómez

    2017-01-01

    In this paper we present BAT (BMAT Annotation Tool), an open-source, web-based tool for the manual annotation of events in audio recordings developed at BMAT (Barcelona Music and Audio Technologies). The main feature of the tool is that it provides an easy way to annotate the salience of simultaneous sound sources. Additionally, it allows to define multiple ontologies to adapt to multiple tasks and offers the possibility to cross-annotate audio data. Moreover, it is easy to install and deploy...

  12. “Materials for the Dictionary of the Old Russian Language in the Written Records” by I.I. Sreznevskiy As the Source of Diachronic Research of the Substantive Word-Formation

    Directory of Open Access Journals (Sweden)

    Anastasiya Yuryevna Vekolova

    2015-12-01

    The article presents the results of diachronic research on word formation based on the 'Materials for the Dictionary of the Old Russian Language in the Written Records' by I.I. Sreznevskiy, which is characterized as the most important source of lexicographical material for diachronic research. The dictionary is the only completed lexicographical source that reflects the language of the 11th-17th centuries. It includes samples of Old Slavic and Old Russian written monuments, thus presenting lexis from a variety of sources. Its entries provide data on the lexical, and in particular the word-building, system of the Old Russian language. The significance of the 'Materials for the Dictionary of the Old Russian Language in the Written Records' for diachronic research of substantive word formation is demonstrated by the system of Old Russian substantive derivatives with evaluative suffixes identified in the research. Productive modification formants are revealed, and their morphological characteristics are considered. Special attention is paid to the analysis of suffixal frequency. On the basis of the dictionary data, the connotation of affixes is characterized and variants of suffixes are given. It is noted that these morphemes carry a positive or negative assessment, and the compiler of the dictionary pays attention to this connotation. The indication of a word's usage allows the boundaries of suffixes to be defined. Examples of derivatives with evaluative affixes are given in context. It is emphasized that the presence of usage examples aids systematic comprehension of the material.

  13. RETRANS - A tool to verify the functional equivalence of automatically generated source code with its specification

    International Nuclear Information System (INIS)

    Miedl, H.

    1998-01-01

    Following the relevant technical standards (e.g. IEC 880), it is necessary to verify each step in the development process of safety-critical software. This also holds for the verification of automatically generated source code. To avoid human errors during this verification step and to limit the cost, a tool should be used that is developed independently of the development of the code generator. For this purpose, ISTec has developed the tool RETRANS, which demonstrates the functional equivalence of automatically generated source code with its underlying specification. (author)

  14. Text mining and visualization case studies using open-source tools

    CERN Document Server

    Chisholm, Andrew

    2016-01-01

    Text Mining and Visualization: Case Studies Using Open-Source Tools provides an introduction to text mining using some of the most popular and powerful open-source tools: KNIME, RapidMiner, Weka, R, and Python. The contributors, all highly experienced with text mining and open-source software, explain how text data are gathered and processed from a wide variety of sources, including books, server access logs, websites, social media sites, and message boards. Each chapter presents a case study that you can follow as part of a step-by-step, reproducible example. You can also easily apply and extend the techniques to other problems. All the examples are available on a supplementary website. The book shows you how to exploit your text data, offering successful application examples and blueprints for you to tackle your text mining tasks and benefit from open and freely available tools. It gets you up to date on the latest and most powerful tools, the data mining process, and specific text mining activities.

  15. Total organic carbon, an important tool in an holistic approach to hydrocarbon source fingerprinting

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, P.D.; Burns, W.A.; Page, D.S.; Bence, A.E.; Mankiewicz, P.J.; Brown, J.S.; Douglas, G.S. [Battelle Member Inst., Waltham, MA (United States)]

    2002-07-01

    The identification and allocation of multiple hydrocarbon sources in marine sediments is best achieved using an holistic approach. Total organic carbon (TOC) is one important tool that can constrain the contributions of specific sources and rule out incorrect source allocations in cases where inputs are dominated by fossil organic carbon. In a study of benthic sediments from Prince William Sound (PWS) and the Gulf of Alaska (GOA), we find excellent agreement between measured TOC and the TOC calculated from hydrocarbon fingerprint matches of polycyclic aromatic hydrocarbons (PAH) and chemical biomarkers. Confirmation by two such independent source indicators (TOC and fingerprint matches) provides evidence that source allocations determined by the fingerprint matches are robust and that the major TOC sources have been correctly identified. Fingerprint matches quantify the hydrocarbon contributions of various sources to the benthic sediments and the degree of hydrocarbon winnowing by waves and currents. TOC contents are then calculated using the source allocation results from the fingerprint matches and the TOCs of the contributing sources. Comparisons of the actual sediment TOC values and those calculated from source allocations support our earlier published findings that the natural petrogenic hydrocarbon background in sediments of this area comes from eroding Tertiary shales and associated oil seeps along the northern GOA coast, and exclude thermally mature area coals from being important contributors to the PWS background due to their high TOC content.
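
    The consistency check described is simple mass balance: the TOC predicted from fingerprint-derived source fractions is the fraction-weighted sum of the source TOC contents. A numeric sketch with invented values (illustrating why a high-TOC source such as coal would inflate the prediction):

        # Hypothetical source fractions from fingerprint matching and
        # assumed source TOC contents (weight fraction); values invented.
        fractions = {"shale/seep": 0.70, "coal": 0.05, "biogenic": 0.25}
        toc = {"shale/seep": 0.010, "coal": 0.600, "biogenic": 0.020}

        predicted_toc = sum(fractions[s] * toc[s] for s in fractions)
        print(f"predicted sediment TOC = {predicted_toc:.3f}")  # vs measured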

  16. An Open-Source Tool Set Enabling Analog-Digital-Software Co-Design

    Directory of Open Access Journals (Sweden)

    Michelle Collins

    2016-02-01

    This paper presents an analog-digital hardware-software co-design environment for simulating and programming reconfigurable systems. The tool simulates and designs, as well as enables experimental measurements after compiling to configurable systems, in the same integrated design tool framework. High-level software in Scilab/Xcos (open-source programs similar to MATLAB/Simulink) converts the high-level block description by the user to blif format (sci2blif), which acts as an input to the modified VPR tool, including the code vpr2swcs, encoding the specific platform through specific architecture files and resulting in a targetable switch list for the resulting configurable analog-digital system. The resulting tool uses an analog and mixed-signal library of components, enabling users and future researchers access to the basic analog operations/computations that are possible.

  17. ThinkHazard!: an open-source, global tool for understanding hazard information

    Science.gov (United States)

    Fraser, Stuart; Jongman, Brenden; Simpson, Alanna; Nunez, Ariel; Deparday, Vivien; Saito, Keiko; Murnane, Richard; Balog, Simone

    2016-04-01

    Rapid and simple access to value-added natural hazard and disaster risk information is a key issue for various stakeholders in the development and disaster risk management (DRM) domains. Accessing available data often requires specialist knowledge of heterogeneous data, which are often highly technical and can be difficult for non-specialists in DRM to find and exploit. Thus, the availability, accessibility and processing of these information sources are crucial issues, and an important reason why many development projects suffer significant impacts from natural hazards. The World Bank's Global Facility for Disaster Reduction and Recovery (GFDRR) is currently developing a new open-source tool to address this knowledge gap: ThinkHazard! The main aim of the ThinkHazard! project is to develop an analytical tool dedicated to facilitating improvements in knowledge and understanding of natural hazards among non-specialists in DRM. It also aims to provide users with relevant guidance and information on handling the threats posed by the natural hazards present in a chosen location. Furthermore, all aspects of this tool will be open and transparent, in order to give users enough information to understand its operational principles. In this presentation, we will explain the technical approach behind the tool, which translates state-of-the-art probabilistic natural hazard data into understandable hazard classifications and practical recommendations. We will also demonstrate the functionality of the tool and discuss limitations from a scientific as well as an operational perspective.
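
    The translation step described, from probabilistic hazard data to an understandable class, can be sketched as thresholding on the frequency of a damaging event. The thresholds below are invented for illustration and are not ThinkHazard!'s actual rules:

        def hazard_class(return_period_years):
            """Map the return period of a damaging event to a simple class."""
            frequency = 1.0 / return_period_years  # events per year
            if frequency >= 1 / 100:
                return "high"
            if frequency >= 1 / 1000:
                return "medium"
            return "low"

        print(hazard_class(50))    # high
        print(hazard_class(500))   # medium
        print(hazard_class(5000))  # low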

  18. Semantic integration of gene expression analysis tools and data sources using software connectors

    Science.gov (United States)

    2013-01-01

    Background The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data is available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only the access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. Then, we have defined a number of activities and associated guidelines to prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools
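
    The connector idea, data source access combined with transformation rules applied on exchange, can be captured in a small interface. A Python sketch under assumed names (not the paper's actual framework):

        class Connector:
            """Wraps a data source and applies transformation rules."""

            def __init__(self, fetch):
                self.fetch = fetch      # callable returning a list of dicts
                self.rules = []         # transformations applied in order

            def add_rule(self, rule):
                self.rules.append(rule)
                return self

            def read(self):
                records = self.fetch()
                for rule in self.rules:
                    records = [rule(r) for r in records]
                return records

        def normalise(record):
            """Example rule: map a platform-specific field to a shared term."""
            out = dict(record)
            if "probe_set_id" in out:
                out["gene_id"] = out.pop("probe_set_id")
            return out

        source = Connector(lambda: [{"probe_set_id": "1007_s_at", "value": 8.2}])
        print(source.add_rule(normalise).read())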

  19. Methods and tools to evaluate the availability of renewable energy sources

    International Nuclear Information System (INIS)

    Angelis-Dimakis, Athanasios; Kartalidis, Avraam; Biberacher, Markus; Gadocha, Sabine; Dominguez, Javier; Pinedo, Irene; Fiorese, Giulia; Gnansounou, Edgard; Panichelli, Luis; Guariso, Giorgio; Robba, Michela

    2011-01-01

    The recent statements of both the European Union and the US Presidency pushed in the direction of using renewable forms of energy, in order to act against climate change induced by the growing concentration of carbon dioxide in the atmosphere. In this paper, a survey of the methods and tools presently available to determine the potential and exploitable energy in the most important renewable sectors (i.e., solar, wind, wave, biomass and geothermal energy) is presented. Moreover, the challenges for each renewable resource are highlighted, as are the available tools that can help in evaluating the use of a mix of different sources. (author)

  20. Tracing catchment fine sediment sources using the new SIFT (SedIment Fingerprinting Tool) open source software.

    Science.gov (United States)

    Pulley, S; Collins, A L

    2018-09-01

    The mitigation of diffuse sediment pollution requires reliable provenance information so that measures can be targeted. Sediment source fingerprinting represents one approach for supporting these needs, but recent methodological developments have resulted in an increasing complexity of data processing methods, rendering the approach less accessible to non-specialists. A comprehensive new software programme (SIFT; SedIment Fingerprinting Tool) has therefore been developed which guides the user through critical data analysis decisions and automates all calculations. Multiple source group configurations and composite fingerprints are identified and tested using multiple methods of uncertainty analysis. This aims to explore the sediment provenance information provided by the tracers more comprehensively than a single model, and allows model configurations with high uncertainties to be rejected. This paper provides an overview of its application to an agricultural catchment in the UK to determine if the approach used can provide a reduction in uncertainty and increase in precision. Five source group classifications were used: three formed using k-means cluster analysis with 2, 3 and 4 clusters, and two a priori groupings based upon catchment geology. Three different composite fingerprints were used for each classification, and bi-plots, range tests, tracer variability ratios and virtual mixtures tested the reliability of each model configuration. Some model configurations performed poorly when apportioning the composition of virtual mixtures, and different model configurations could produce different sediment provenance results despite using composite fingerprints able to discriminate robustly between the source groups. Despite this uncertainty, dominant sediment sources were identified, and those in close proximity to each sediment sampling location were found to be of greatest importance. This new software, by integrating recent methodological developments in
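
    Two of the steps described above, k-means grouping of source samples and the unmixing of a virtual mixture, can be sketched as follows. The tracer names and values are invented, and the non-negative least-squares unmixing stands in for whatever mixing model SIFT actually applies.

```python
# Hedged sketch: group source samples with k-means, then apportion a virtual
# mixture of the group-mean profiles by non-negative least squares.
import numpy as np
from scipy.optimize import nnls
from sklearn.cluster import KMeans

# Rows = source samples, columns = tracer concentrations (invented values).
sources = np.array([[12.0, 3.1], [11.5, 2.9], [4.2, 8.8], [4.8, 9.1]])
groups = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(sources)

# Mean tracer profile per source group.
profiles = np.vstack([sources[groups == g].mean(axis=0) for g in range(2)])

# Virtual mixture: 70% of the first profile, 30% of the second.
mixture = 0.7 * profiles[0] + 0.3 * profiles[1]

# Non-negative unmixing; proportions are normalised to sum to one.
weights, _ = nnls(profiles.T, mixture)
print(weights / weights.sum())  # ~ [0.7, 0.3]
```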

  1. Identifying Sources of Clinical Conflict: A Tool for Practice and Training in Bioethics Mediation.

    Science.gov (United States)

    Bergman, Edward J

    2015-01-01

    Bioethics mediators manage a wide range of clinical conflict emanating from diverse sources. Parties to clinical conflict are often not fully aware of, nor willing to express, the true nature and scope of their conflict. As such, a significant task of the bioethics mediator is to help define that conflict. The ability to assess and apply the tools necessary for an effective mediation process can be facilitated by each mediator's creation of a personal compendium of sources that generate clinical conflict, to provide an orientation for the successful management of complex dilemmatic cases. Copyright 2015 The Journal of Clinical Ethics. All rights reserved.

  2. An open source GIS-based tool to integrate the fragmentation mechanism in rockfall propagation

    Science.gov (United States)

    Matas, Gerard; Lantada, Nieves; Gili, Josep A.; Corominas, Jordi

    2015-04-01

    Rockfalls are frequent instability processes in road cuts, open pit mines and quarries, steep slopes and cliffs. Even though the stability of rock slopes can be determined using analytical approaches, the assessment of large rock cliffs requires simplifying assumptions due to the difficulty of working with a large number of joints and the scatter in both their orientations and strength parameters. The attitude and persistency of joints within the rock mass define the size of kinematically unstable rock volumes. Furthermore, a rock block will eventually split into several fragments during its propagation downhill due to its impact with the ground surface. Knowledge of the size, energy, trajectory… of each block resulting from fragmentation is critical in determining the vulnerability of buildings and protection structures. The objective of this contribution is to present a simple and open source tool to simulate the fragmentation mechanism in rockfall propagation models and in the calculation of impact energies. This tool includes common modes of motion for falling boulders based on the previous literature. The final tool is being implemented in a GIS (Geographic Information Systems) environment using open source Python programming. The tool under development will be simple, modular, compatible with any GIS environment, open source, and able to model rockfall phenomena correctly. It could be used in any area susceptible to rockfalls after prior adjustment of the parameters. After the adjustment of the model parameters to a given area, a simulation could be performed to obtain maps of kinetic energy, frequency, stopping density and passing heights. This GIS-based tool and the analysis of the fragmentation laws using data collected from recent rockfalls have been developed within the RockRisk Project (2014-2016). This project is funded by the Spanish Ministerio de Economía y Competitividad and entitled "Rockfalls in cliffs: risk quantification and its prevention" (BIA2013-42582-P).
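
    A fragmentation step of the general kind described can be sketched as below: on impact a block splits into fragments that conserve mass and share the surviving kinetic energy. The fragment count, random size shares and energy-retention coefficient are illustrative assumptions, not the project's calibrated fragmentation law.

```python
# Minimal fragmentation sketch: mass-conserving split with a simple
# proportional division of the post-impact kinetic energy.
import random

def fragment(mass_kg, velocity_ms, n_fragments=3, energy_retention=0.7):
    energy = 0.5 * mass_kg * velocity_ms**2 * energy_retention
    shares = [random.random() for _ in range(n_fragments)]
    total = sum(shares)
    fragments = []
    for s in shares:
        m = mass_kg * s / total          # mass conservation
        e = energy * s / total           # energy split proportional to mass
        v = (2.0 * e / m) ** 0.5         # resulting fragment velocity
        fragments.append({"mass_kg": m, "velocity_ms": v})
    return fragments

random.seed(1)
for f in fragment(1000.0, 15.0):
    print(f)
```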

  3. iPhone Open Application Development Write Native Applications Using the Open Source Tool Chain

    CERN Document Server

    Zdziarski, Jonathan

    2008-01-01

    Developers everywhere are eager to create applications for the iPhone, and many of them prefer the open source, community-developed tool chain to Apple's own toolkit. This new edition of iPhone Open Application Development covers the latest version of the open toolkit -- now updated for Apple's iPhone 2.x software and iPhone 3G -- and explains in clear language how to create applications using Objective-C and the iPhone API.

  4. A Benchmarking Analysis of Open-Source Business Intelligence Tools in Healthcare Environments

    Directory of Open Access Journals (Sweden)

    Andreia Brandão

    2016-10-01

    In recent years, a wide range of Business Intelligence (BI) technologies have been applied to different areas in order to support the decision-making process. BI enables the extraction of knowledge from stored data. The healthcare industry is no exception, and so BI applications have been under investigation across multiple units of different institutions. Thus, in this article, we analyze some open-source/free BI tools on the market and their applicability in the clinical sphere, taking into consideration the general characteristics of the clinical environment. For this purpose, six BI tools were selected, analyzed, and tested in a practical environment. Then, a comparison metric and a ranking were defined for the tested applications in order to choose the one that best applies to the extraction of useful knowledge from clinical data in a healthcare environment. Finally, a pervasive BI platform was developed using a real case in order to prove the tool's viability.

  5. Interactive, open source, travel time scenario modelling: tools to facilitate participation in health service access analysis.

    Science.gov (United States)

    Fisher, Rohan; Lassa, Jonatan

    2017-04-18

    Modelling travel time to services has become a common public health tool for planning service provision, but the usefulness of these analyses is constrained by the availability of accurate input data and limitations inherent in the assumptions and parameterisation. This is particularly an issue in the developing world, where access to basic data is limited and travel is often complex and multi-modal. Improving the accuracy and relevance in this context requires greater accessibility to, and flexibility in, travel time modelling tools to facilitate the incorporation of local knowledge and the rapid exploration of multiple travel scenarios. The aim of this work was to develop simple, open source, adaptable, interactive travel time modelling tools to allow greater access to and participation in service access analysis. Described are three interconnected applications designed to reduce some of the barriers to the more widespread use of GIS analysis of service access and allow for complex spatial and temporal variations in service availability. These applications are an open source GIS tool-kit and two geo-simulation models. The development of these tools was guided by health service issues from a developing world context but they present a general approach to enabling greater access to and flexibility in health access modelling. The tools demonstrate a method that substantially simplifies the process for conducting travel time assessments and demonstrate a dynamic, interactive approach in an open source GIS format. In addition, this paper provides examples from empirical experience where these tools have informed better policy and planning. Travel and health service access is complex and cannot be reduced to a few static modelled outputs. The approaches described in this paper use a unique set of tools to explore this complexity, promote discussion and build understanding with the goal of producing better planning outcomes. The accessible, flexible, interactive and
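
    The core of a raster travel-time model can be sketched as a Dijkstra accumulation of traversal cost outward from a service location. The grid of per-cell crossing times below is invented, and the real tools additionally handle multi-modal travel and temporal variation in service availability.

```python
# Dijkstra travel-time accumulation over a grid of per-cell crossing times.
import heapq

def travel_time(grid, start):
    rows, cols = len(grid), len(grid[0])
    best = {start: 0.0}
    queue = [(0.0, start)]
    while queue:
        t, (r, c) = heapq.heappop(queue)
        if t > best.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nt = t + 0.5 * (grid[r][c] + grid[nr][nc])  # average edge cost
                if nt < best.get((nr, nc), float("inf")):
                    best[(nr, nc)] = nt
                    heapq.heappush(queue, (nt, (nr, nc)))
    return best

minutes = [[1, 1, 5], [2, 8, 2], [1, 1, 1]]
print(travel_time(minutes, (0, 0))[(2, 2)])  # minutes from a clinic at (0, 0)
```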

  6. Online characterization of planetary surfaces: PlanetServer, an open-source analysis and visualization tool

    Science.gov (United States)

    Marco Figuera, R.; Pham Huu, B.; Rossi, A. P.; Minin, M.; Flahaut, J.; Halder, A.

    2018-01-01

    The lack of open-source tools for hyperspectral data visualization and analysis creates a demand for new tools. In this paper we present the new PlanetServer, a set of tools comprising a web Geographic Information System (GIS) and a recently developed Python Application Programming Interface (API) capable of visualizing and analyzing a wide variety of hyperspectral data from different planetary bodies. Current WebGIS open-source tools are evaluated in order to give an overview and contextualize how PlanetServer can help in these matters. The web client is thoroughly described, as are the datasets available in PlanetServer. The Python API is also described, together with the motivation for its development. Two examples of mineral characterization of hydrosilicates such as chlorites, prehnites and kaolinites in the Nili Fossae area on Mars are presented. As the results show positive outcomes in hyperspectral analysis and visualization compared to the previous literature, we suggest the PlanetServer approach for such investigations.

  7. EEGNET: An Open Source Tool for Analyzing and Visualizing M/EEG Connectome.

    Science.gov (United States)

    Hassan, Mahmoud; Shamas, Mohamad; Khalil, Mohamad; El Falou, Wassim; Wendling, Fabrice

    2015-01-01

    The brain is a large-scale complex network often referred to as the "connectome". Exploring the dynamic behavior of the connectome is a challenging issue, as both excellent time and space resolution are required. In this context, magneto/electroencephalography (M/EEG) are effective neuroimaging techniques allowing for analysis of the dynamics of functional brain networks at scalp level and/or at reconstructed sources. However, a tool that can cover all the processing steps of identifying brain networks from M/EEG data is still missing. In this paper, we report a novel software package, called EEGNET, running under MATLAB (MathWorks, Inc.) and allowing for analysis and visualization of functional brain networks from M/EEG recordings. EEGNET is developed to analyze networks either at the level of scalp electrodes or at the level of reconstructed cortical sources. It includes i) basic steps in preprocessing M/EEG signals, ii) the solution of the inverse problem to localize/reconstruct the cortical sources, iii) the computation of functional connectivity among signals collected at surface electrodes and/or time courses of reconstructed sources, and iv) the computation of network measures based on graph theory analysis. EEGNET is unique in combining M/EEG functional connectivity analysis with the computation of network measures derived from graph theory. The first version of EEGNET is easy to use, flexible and user friendly. EEGNET is an open source tool and can be freely downloaded from this webpage: https://sites.google.com/site/eegnetworks/.
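
    Step iv) can be illustrated with a minimal sketch computing two common graph measures from a functional connectivity matrix; the 4-channel matrix below is invented, and this is a generic illustration rather than EEGNET code.

```python
# Graph measures from a (symmetric) functional connectivity matrix.
import numpy as np

conn = np.array([[0.0, 0.8, 0.1, 0.3],
                 [0.8, 0.0, 0.5, 0.2],
                 [0.1, 0.5, 0.0, 0.6],
                 [0.3, 0.2, 0.6, 0.0]])

strength = conn.sum(axis=1)              # weighted node degree (strength)
adj = (conn > 0.4).astype(int)           # threshold into a binary graph
n = adj.shape[0]
density = adj.sum() / (n * (n - 1))      # fraction of possible directed edges
print(strength, density)
```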

  8. An integrated, open-source set of tools for urban vulnerability monitoring from Earth observation data

    Science.gov (United States)

    De Vecchi, Daniele; Harb, Mostapha; Dell'Acqua, Fabio; Aurelio Galeazzo, Daniel

    2015-04-01

    Aim: The paper introduces an integrated set of open-source tools designed to process medium- and high-resolution imagery with the aim of extracting vulnerability indicators [1]. Problem: In the context of risk monitoring [2], a series of vulnerability proxies can be defined, such as the extension of a built-up area or building regularity [3]. Different open-source C and Python libraries are already available for image processing and geospatial information (e.g. OrfeoToolbox, OpenCV and GDAL). They include basic processing tools but not vulnerability-oriented workflows. Therefore, it is of significant importance to provide end-users with a set of tools capable of returning information at a higher level. Solution: The proposed set of Python algorithms is a combination of low-level image processing and geospatial information handling tools along with high-level workflows. In particular, two main products are released under the GPL license: developer-oriented source code and a QGIS plugin. These tools were produced within the SENSUM project framework (ended December 2014), where the main focus was on earthquake and landslide risk. Further development and maintenance is guaranteed by the decision to include them in the platform designed within the FP7 RASOR project. Conclusion: Given the lack of a unified software suite for vulnerability indicator extraction, the proposed solution can provide inputs for already available models like the Global Earthquake Model. The inclusion of the proposed set of algorithms within the RASOR platform can guarantee support and enlarge the community of end-users. Keywords: Vulnerability monitoring, remote sensing, optical imagery, open-source software tools. References [1] M. Harb, D. De Vecchi, F. Dell'Acqua, "Remote sensing-based vulnerability proxies in the EU FP7 project SENSUM", Symposium on earthquake and landslide risk in Central Asia and Caucasus: exploiting remote sensing and geo-spatial information management, 29-30th January 2014

  9. Multiband Study of Radio Sources of the RCR Catalogue with Virtual Observatory Tools

    Directory of Open Access Journals (Sweden)

    Zhelenkova O. P.

    2012-09-01

    We present early results of our multiband study of the RATAN Cold Revised (RCR) catalogue obtained from seven cycles of the “Cold” survey carried out with the RATAN-600 radio telescope at 7.6 cm in 1980-1999, at the declination of the SS 433 source. We used the 2MASS and LAS UKIDSS infrared surveys, the DSS-II and SDSS DR7 optical surveys, as well as the USNO-B1 and GSC-II catalogues, and the VLSS, TXS, NVSS, FIRST and GB6 radio surveys to accumulate information about the sources. For radio sources that have no detectable candidate in the optical or infrared catalogues, we additionally looked through images in several bands from the SDSS, LAS UKIDSS, DPOSS and 2MASS surveys, and also used co-added frames in different bands. We reliably identified 76% of the radio sources of the RCR catalogue. We used the ALADIN and SAOImage DS9 scripting capabilities, the interoperability services of ALADIN and TOPCAT, and other Virtual Observatory (VO) tools and resources, such as CASJobs, NED, VizieR, and WSA, for effective data access, visualization and analysis. Without VO tools it would have been problematic to perform our study.

  10. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and IKE of the University of Stuttgart developed, during 1990 and 1991 within the frame of the Shared Cost Action on Reactor Safety, the informatics structure of the European Source TERm Evaluation System (ESTER). This work made tools available that allow both code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal hydraulic conditions. Therefore, for the development of ESTER it was important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. Through the work performed during this project, the ESTER tools became the most modern informatics tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.)

  11. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    Science.gov (United States)

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
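
    As a short illustration of the Python bindings mentioned above, the sketch below reads a C3D acquisition and pulls one marker's trajectory. It assumes the classic btk Python API; the file name "walk.c3d" and marker label "LASI" are placeholders for illustration.

```python
# Reading marker trajectories from a C3D file with BTK's Python bindings.
import btk

reader = btk.btkAcquisitionFileReader()
reader.SetFilename("walk.c3d")   # placeholder file name
reader.Update()
acq = reader.GetOutput()

print(acq.GetPointFrequency())           # sampling rate in Hz
lasi = acq.GetPoint("LASI").GetValues()  # (n_frames, 3) marker coordinates
print(lasi.shape)
```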

  12. A new energy analysis tool for ground source heat pump systems

    Energy Technology Data Exchange (ETDEWEB)

    Michopoulos, A.; Kyriakis, N. [Process Equipment Design Laboratory, Mechanical Engineering Department, Aristotle University of Thessaloniki, POB 487, 541 24 Thessaloniki (Greece)

    2009-09-15

    A new tool, suitable for the energy analysis of vertical ground source heat pump systems, is presented. The tool is based on analytical equations describing the heat exchanged with the ground, developed in the Matlab® environment. The time step of the simulation can be freely chosen by the user (e.g. 1, 2 h etc.) and the calculation time required is very short. The heating and cooling loads of the building, at the aforementioned time step, are needed as input, along with the thermophysical properties of the soil and of the ground heat exchanger, the operating characteristic curves of the system's heat pumps and the basic ground source heat exchanger dimensions. The results include the electricity consumption of the system and the heat absorbed from or rejected to the ground. The efficiency of the tool is verified through comparison with actual electricity consumption data collected from an existing large-scale ground coupled heat pump installation over a three-year period. (author)
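
    The kind of analytical ground-response equation such tools build on can be illustrated with the classic infinite line source solution for the temperature change at radius r after time t under a constant heat rate q per unit borehole length. The soil properties and load below are typical illustrative values, not the paper's data.

```python
# Infinite line source model: dT = q/(4*pi*k) * E1(r^2 / (4*alpha*t)).
from math import pi
from scipy.special import exp1  # exponential integral E1

def line_source_delta_t(q_w_per_m, k_w_mk, alpha_m2_s, r_m, t_s):
    return q_w_per_m / (4.0 * pi * k_w_mk) * exp1(r_m**2 / (4.0 * alpha_m2_s * t_s))

# 30 W/m extraction, k = 2 W/(m K), alpha = 1e-6 m^2/s, borehole wall at 0.06 m,
# after 30 days of continuous operation.
print(line_source_delta_t(30.0, 2.0, 1e-6, 0.06, 3600.0 * 24 * 30))
```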

  13. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
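
    One analysis step such notebooks typically contain, threshold-based spike detection, can be sketched as below. The trace here is synthetic (noise plus injected spikes); real recordings would be loaded from file.

```python
# Threshold-crossing spike detection on a synthetic extracellular trace.
import numpy as np

rng = np.random.default_rng(0)
fs = 20_000                                   # sampling rate, Hz
trace = rng.normal(0, 1, fs)                  # one second of noise
trace[[3000, 9000, 15000]] -= 12              # three negative-going spikes

# Robust noise estimate: sigma ~ median(|x|)/0.6745; threshold at -5 sigma.
threshold = -5 * np.median(np.abs(trace)) / 0.6745
crossings = np.flatnonzero((trace[1:] < threshold) & (trace[:-1] >= threshold))
print((crossings + 1) / fs)                   # spike times in seconds
```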

  14. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    International Nuclear Information System (INIS)

    Minelli, Annalisa; Marchesini, Ivan; Taylor, Faith E.; De Rosa, Pierluigi; Casagrande, Luca; Cenci, Michele

    2014-01-01

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods

  15. An open source GIS tool to quantify the visual impact of wind turbines and photovoltaic panels

    Energy Technology Data Exchange (ETDEWEB)

    Minelli, Annalisa, E-mail: Annalisa.Minelli@univ-brest.fr [Institut Universitaire Européen de la Mer, Université de la Bretagne Occidentale, Rue Dumont D'Urville, 29280 Plouzané (France); Marchesini, Ivan, E-mail: Ivan.Marchesini@irpi.cnr.it [National Research Council (CNR), Research Institute for Geo-hydrological Protection (IRPI), Strada della Madonna Alta 126, 06125 Perugia (Italy); Taylor, Faith E., E-mail: Faith.Taylor@kcl.ac.uk [Earth and Environmental Dynamics Research Group, Department of Geography, King's College London, Strand, London WC2R 2LS (United Kingdom); De Rosa, Pierluigi, E-mail: Pierluigi.Derosa@unipg.it [Physics and Geology Department, University of Perugia, Via Zefferino Faina 4, 06123 Perugia (Italy); Casagrande, Luca, E-mail: Luca.Casagrande@gfosservices.it [Gfosservices S.A., Open Source GIS-WebGIS Solutions, Spatial Data Infrastructures, Planning and Counseling, Via F.lli Cairoli 24, 06127 Perugia (Italy); Cenci, Michele, E-mail: mcenci@regione.umbria.it [Servizio Energia qualità dell'ambiente, rifiuti, attività estrattive, Regione Umbria, Corso Vannucci 96, 06121 Perugia (Italy)

    2014-11-15

    Although there are clear economic and environmental incentives for producing energy from solar and wind power, there can be local opposition to their installation due to their impact upon the landscape. To date, no international guidelines exist to guide quantitative visual impact assessment of these facilities, making the planning process somewhat subjective. In this paper we demonstrate the development of a method and an Open Source GIS tool to quantitatively assess the visual impact of these facilities using line-of-sight techniques. The methods here build upon previous studies by (i) more accurately representing the shape of energy producing facilities, (ii) taking into account the distortion of the perceived shape and size of facilities caused by the location of the observer, (iii) calculating the possible obscuring of facilities caused by terrain morphology and (iv) allowing the combination of various facilities to more accurately represent the landscape. The tool has been applied to real and synthetic case studies and compared to recently published results from other models, and demonstrates an improvement in accuracy of the calculated visual impact of facilities. The tool is named r.wind.sun and is freely available from GRASS GIS AddOns. - Highlights: • We develop a tool to quantify wind turbine and photovoltaic panel visual impact. • The tool is freely available to download and edit as a module of GRASS GIS. • The tool takes into account visual distortion of the shape and size of objects. • The accuracy of calculation of visual impact is improved over previous methods.

  16. Reconsidering Written Language

    Directory of Open Access Journals (Sweden)

    Gopal P. Sarma

    2015-07-01

    A number of elite thinkers in Europe during the 16th and 17th centuries pursued an agenda which historian Paolo Rossi calls the “quest for a universal language,” a quest which was deeply interwoven with the emergence of the scientific method. From a modern perspective, one of the many surprising aspects of these efforts is that they relied on a diverse array of memorization techniques as foundational elements. In the case of Leibniz’s universal calculus, the ultimate vision was to create a pictorial language that could be learned by anyone in a matter of weeks and which would contain within it a symbolic representation of all domains of contemporary thought, ranging from the natural sciences, to theology, to law. In this brief article, I explore why this agenda might have been appealing to thinkers of this era by examining ancient and modern memory feats. As a thought experiment, I suggest that a society built entirely upon memorization might be less limited than we might otherwise imagine, and furthermore, that cultural norms discouraging the use of written language might have had implications for the development of scientific methodology. Viewed in this light, the efforts of Leibniz and others seem significantly less surprising. I close with some general observations about cross-cultural origins of scientific thought.

  17. DeltaSA tool for source apportionment benchmarking, description and sensitivity analysis

    Science.gov (United States)

    Pernigotti, D.; Belis, C. A.

    2018-05-01

    DeltaSA is an R-package and a Java on-line tool developed at the EC-Joint Research Centre to assist and benchmark source apportionment applications. Its key functionalities support two critical tasks in this kind of study: the assignment of a factor to a source in factor analytical models (source identification) and the model performance evaluation. The source identification is based on the similarity between a given factor and source chemical profiles from public databases. The model performance evaluation is based on statistical indicators used to compare model output with reference values generated in intercomparison exercises. The reference values are calculated as the ensemble average of the results reported by participants that have passed a set of testing criteria based on chemical profile and time series similarity. In this study, a sensitivity analysis of the model performance criteria is accomplished using the results of a synthetic dataset where "a priori" references are available. The consensus-modulated standard deviation p_unc provides the best choice for the model performance evaluation when a conservative approach is adopted.
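
    The source-identification step can be illustrated by scoring a factor profile against candidate source profiles and keeping the best match. The species values below are invented, and DeltaSA itself uses curated profile databases and additional indicators beyond this simple correlation.

```python
# Match a factor profile to candidate source profiles by Pearson correlation.
import numpy as np

factor = np.array([0.42, 0.30, 0.08, 0.20])
candidates = {
    "traffic":      np.array([0.45, 0.28, 0.07, 0.20]),
    "biomass burn": np.array([0.10, 0.15, 0.55, 0.20]),
}
scores = {name: np.corrcoef(factor, prof)[0, 1]
          for name, prof in candidates.items()}
print(max(scores, key=scores.get), scores)
```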

  18. Rapid development of medical imaging tools with open-source libraries.

    Science.gov (United States)

    Caban, Jesus J; Joshi, Alark; Nagy, Paul

    2007-11-01

    Rapid prototyping is an important element in researching new imaging analysis techniques and developing custom medical applications. In the last ten years, the open source community has grown significantly, as have the number of open source libraries and freely available frameworks for biomedical research. What they offer is now considered standard in medical image analysis, computer-aided diagnosis, and medical visualization. A cursory review of the peer-reviewed literature in imaging informatics (indeed, in almost any information technology-dependent scientific discipline) indicates the current reliance on open source libraries to accelerate development and validation of processes and techniques. In this survey paper, we review and compare a few of the most successful open source libraries and frameworks for medical application development. Our dual intentions are to provide evidence that these approaches already constitute a vital and essential part of medical image analysis, diagnosis, and visualization and to motivate the reader to use open source libraries and software for rapid prototyping of medical applications and tools.

  19. A spectroscopic tool for identifying sources of origin for materials of military interest

    Science.gov (United States)

    Miziolek, Andrzej W.; De Lucia, Frank C.

    2014-05-01

    There is a need to identify the source of origin for many items of military interest, including ammunition and weapons that may be circulated and traded in illicit markets. Both fieldable systems (man-portable or handheld) and benchtop systems in field and home-base laboratories are desired for screening and attribution purposes. Laser Induced Breakdown Spectroscopy (LIBS) continues to show significant capability as a promising new tool for materials identification, matching, and provenance. With the use of broadband, high-resolution spectrometer systems, LIBS devices can not only determine the elemental inventory of a sample, but are also capable of elemental fingerprinting to signify sources of origin of various materials. We present the results of an initial study to differentiate and match spent cartridges from different manufacturers and countries. We have found that, using Partial Least Squares Discriminant Analysis (PLS-DA), we are able to achieve on average 93.3% true positives and 5.3% false positives. These results add to the large body of publications that have demonstrated that LIBS is a particularly suitable tool for source-of-origin determinations.
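
    PLS-DA in the spirit described above can be sketched with scikit-learn's PLSRegression applied to one-hot class labels. The "spectra" below are random stand-ins for LIBS emission intensities, not real cartridge data, and the class structure is invented.

```python
# PLS-DA sketch: PLS regression on one-hot labels, class = largest score.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
n_per_class, n_channels = 20, 50
# Two "manufacturers" with slightly shifted mean spectra.
X = np.vstack([rng.normal(0.0, 1.0, (n_per_class, n_channels)),
               rng.normal(0.8, 1.0, (n_per_class, n_channels))])
y = np.repeat([0, 1], n_per_class)
Y = np.eye(2)[y]                              # one-hot encoding

model = PLSRegression(n_components=3).fit(X, Y)
pred = model.predict(X).argmax(axis=1)        # predicted class per spectrum
print((pred == y).mean())                     # training accuracy
```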

  20. EpiTools: An Open-Source Image Analysis Toolkit for Quantifying Epithelial Growth Dynamics.

    Science.gov (United States)

    Heller, Davide; Hoppe, Andreas; Restrepo, Simon; Gatti, Lorenzo; Tournier, Alexander L; Tapon, Nicolas; Basler, Konrad; Mao, Yanlan

    2016-01-11

    Epithelia grow and undergo extensive rearrangements to achieve their final size and shape. Imaging the dynamics of tissue growth and morphogenesis is now possible with advances in time-lapse microscopy, but a true understanding of their complexities is limited by the lack of automated image analysis tools to extract quantitative data. To overcome such limitations, we have designed a new open-source image analysis toolkit called EpiTools. It provides user-friendly graphical user interfaces for accurately segmenting and tracking the contours of cell membrane signals obtained from 4D confocal imaging. It is designed for a broad audience, especially biologists with no computer-science background. Quantitative data extraction is integrated into a larger bioimaging platform, Icy, to increase the visibility and usability of our tools. We demonstrate the usefulness of EpiTools by analyzing Drosophila wing imaginal disc growth, revealing previously overlooked properties of this dynamic tissue, such as the patterns of cellular rearrangements. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  1. Global search tool for the Advanced Photon Source Integrated Relational Model of Installed Systems (IRMIS) database

    International Nuclear Information System (INIS)

    Quock, D.E.R.; Cianciarulo, M.B.

    2007-01-01

    The Integrated Relational Model of Installed Systems (IRMIS) is a relational database tool that has been implemented at the Advanced Photon Source to maintain an updated account of approximately 600 control system software applications, 400,000 process variables, and 30,000 control system hardware components. To effectively display this large amount of control system information to operators and engineers, IRMIS was initially built with nine Web-based viewers: Applications Organizing Index, IOC, PLC, Component Type, Installed Components, Network, Controls Spares, Process Variables, and Cables. However, since each viewer is designed to provide details from only one major category of the control system, the necessity for a one-stop global search tool for the entire database became apparent. The user requirements for extremely fast database search time and ease of navigation through search results led to the choice of Asynchronous JavaScript and XML (AJAX) technology in the implementation of the IRMIS global search tool. Unique features of the global search tool include a two-tier display of search results and a database integrity validation and reporting mechanism.

  2. 42: An Open-Source Simulation Tool for Study and Design of Spacecraft Attitude Control Systems

    Science.gov (United States)

    Stoneking, Eric

    2018-01-01

    Simulation is an important tool in the analysis and design of spacecraft attitude control systems. The speaker will discuss the simulation tool, called simply 42, that he has developed over the years to support his own work as an engineer in the Attitude Control Systems Engineering Branch at NASA Goddard Space Flight Center. 42 was intended from the outset to be high-fidelity and powerful, but also fast and easy to use. 42 is publicly available as open source since 2014. The speaker will describe some of 42's models and features, and discuss its applicability to studies ranging from early concept studies through the design cycle, integration, and operations. He will outline 42's architecture and share some thoughts on simulation development as a long-term project.

  3. An Open-Source Web-Based Tool for Resource-Agnostic Interactive Translation Prediction

    Directory of Open Access Journals (Sweden)

    Daniel Torregrosa

    2014-09-01

    We present a web-based open-source tool for interactive translation prediction (ITP) and describe its underlying architecture. ITP systems assist human translators by making context-based computer-generated suggestions as they type. Most of the ITP systems in the literature are strongly coupled with a statistical machine translation system that is conveniently adapted to provide the suggestions. Our system, however, follows a resource-agnostic approach and suggestions are obtained from any unmodified black-box bilingual resource. This paper reviews our ITP method and describes the architecture of Forecat, a web tool, partly based on the recent technology of web components, that eases the use of our ITP approach in any web application requiring this kind of translation assistance. We also evaluate the performance of our method when using an unmodified Moses-based statistical machine translation system as the bilingual resource.

  4. WannierTools: An open-source software package for novel topological materials

    Science.gov (United States)

    Wu, QuanSheng; Zhang, ShengNan; Song, Hai-Feng; Troyer, Matthias; Soluyanov, Alexey A.

    2018-03-01

    We present WannierTools, an open-source software package for the investigation of novel topological materials. The code works in the tight-binding framework, which can be generated by the software package Wannier90 (Mostofi et al., 2008). It can help to classify the topological phase of a given material by calculating the Wilson loop, and can obtain the surface-state spectrum, which is probed by angle-resolved photoemission spectroscopy (ARPES) and scanning tunneling microscopy (STM) experiments. It also identifies the positions of Weyl/Dirac points and nodal-line structures, and calculates the Berry phase around a closed momentum loop and the Berry curvature in a part of the Brillouin zone (BZ).

  5. Genome sequencing of bacteria: sequencing, de novo assembly and rapid analysis using open source tools.

    Science.gov (United States)

    Kisand, Veljo; Lettieri, Teresa

    2013-04-01

    De novo genome sequencing of previously uncharacterized microorganisms has the potential to open up new frontiers in microbial genomics by providing insight into both functional capabilities and biodiversity. Until recently, Roche 454 pyrosequencing was the NGS method of choice for de novo assembly because it generates hundreds of thousands of long reads. Tools for processing NGS data are increasingly free and open source and are often adopted for both their high quality and their role in promoting academic freedom. The error rate of pyrosequencing the Alcanivorax borkumensis genome was such that thousands of insertions and deletions were artificially introduced into the finished genome. Despite a high coverage (~30 fold), it did not allow the reference genome to be fully mapped. Reads from regions with errors had low quality, low coverage, or were missing. The main defects of the reference mapping were the introduction of artificial indels into contigs through lower than 100% consensus and distracted gene calling due to artificial stop codons. No assembler was able to perform de novo assembly comparable to reference mapping. Automated annotation tools performed similarly on reference-mapped and de novo draft genomes, and annotated most CDSs in the de novo assembled draft genomes. Free and open source software (FOSS) tools for assembly and annotation of NGS data are being developed rapidly to provide accurate results with less computational effort. Usability is not a high priority and these tools currently do not allow the data to be processed without manual intervention. Despite this, genome assemblers now readily assemble medium-short reads into long contigs (>97-98% genome coverage). A notable gap in pyrosequencing technology is the quality of base pair calling and conflicting base pairs between single reads at the same nucleotide position. Regardless, using draft whole genomes that are not finished and remain fragmented into tens of contigs allows one to characterize

  6. Pika: A snow science simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.; Johnson, M.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually into various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase-field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture and crack propagation (via the extended finite-element method), flow in porous media, and others. The heat conduction, tensor mechanics, and phase-field modules, in particular, are well-suited for snow science problems. Pika--an open-source MOOSE-based application--is capable of simulating both 3D, coupled nonlinear continuum heat transfer and large-deformation mechanics applications (such as settlement) and phase-field based micro-structure applications. Additionally, these types of problems may be coupled tightly in a single solve or across length and time scales using a loosely coupled Picard iteration approach. In addition to the wide range of physics capabilities, MOOSE-based applications also inherit an extensible testing framework, graphical user interface, and documentation system; tools that allow MOOSE and other applications to adhere to nuclear software quality standards. The snow science community can learn from the nuclear industry and harness the existing effort to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other applications. The snow science community should build on existing tools to enable collaboration between researchers and practitioners throughout the world, and advance the

  7. Open source tools for the information theoretic analysis of neural data

    Directory of Open Access Journals (Sweden)

    Robin A. A Ince

    2010-05-01

    The recent and rapid development of open-source software tools for the analysis of neurophysiological datasets consisting of multiple simultaneous recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and integrate the information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in Matlab and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.

  8. CellProfiler and KNIME: open source tools for high content screening.

    Science.gov (United States)

    Stöter, Martin; Niederlein, Antje; Barsacchi, Rico; Meyenhofer, Felix; Brandl, Holger; Bickle, Marc

    2013-01-01

    High content screening (HCS) has established itself in the world of the pharmaceutical industry as an essential tool for drug discovery and drug development. HCS is currently starting to enter the academic world and might become a widely used technology. Given the diversity of problems tackled in academic research, HCS could experience some profound changes in the future, mainly with more imaging modalities and smart microscopes being developed. Two of the limitations to the establishment of HCS in academia are flexibility and cost. Flexibility is important to be able to adapt the HCS setup to accommodate the multiple different assays typical of academia. Many cost factors cannot be avoided, but the costs of the software packages necessary to analyze large datasets can be reduced by using Open Source software. We present and discuss the Open Source software CellProfiler for image analysis and KNIME for data analysis and data mining, which provide software solutions that increase flexibility and keep costs low.

  9. Open source tools for the information theoretic analysis of neural data.

    Science.gov (United States)

    Ince, Robin A A; Mazzoni, Alberto; Petersen, Rasmus S; Panzeri, Stefano

    2010-01-01

    The recent and rapid development of open source software tools for the analysis of neurophysiological datasets consisting of simultaneous multiple recordings of spikes, field potentials and other neural signals holds the promise for a significant advance in the standardization, transparency, quality, reproducibility and variety of techniques used to analyze neurophysiological data and for the integration of information obtained at different spatial and temporal scales. In this review we focus on recent advances in open source toolboxes for the information theoretic analysis of neural responses. We also present examples of their use to investigate the role of spike timing precision, correlations across neurons, and field potential fluctuations in the encoding of sensory information. These information toolboxes, available both in MATLAB and Python programming environments, hold the potential to enlarge the domain of application of information theory to neuroscience and to lead to new discoveries about how neurons encode and transmit information.
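
    The basic quantity such toolboxes compute can be sketched with a minimal plugin (direct-method) estimate of the mutual information between a discrete stimulus and a binned spike-count response. Real toolboxes add the sampling-bias corrections this sketch omits, and the Poisson responses below are synthetic.

```python
# Plugin estimate of I(S;R) in bits from paired stimulus/response samples.
import numpy as np

def mutual_information(stimuli, responses):
    s_vals, s_idx = np.unique(stimuli, return_inverse=True)
    r_vals, r_idx = np.unique(responses, return_inverse=True)
    joint = np.zeros((s_vals.size, r_vals.size))
    for i, j in zip(s_idx, r_idx):
        joint[i, j] += 1
    joint /= joint.sum()                              # empirical P(s, r)
    ps = joint.sum(1, keepdims=True)                  # P(s)
    pr = joint.sum(0, keepdims=True)                  # P(r)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pr)[nz])).sum())

rng = np.random.default_rng(0)
stim = rng.integers(0, 2, 1000)
counts = rng.poisson(1 + 3 * stim)    # spike count depends on the stimulus
print(mutual_information(stim, counts))  # bits per trial
```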

  10. The Efficacy of Social Media as a Research Tool and Information Source for Safeguards Verification

    International Nuclear Information System (INIS)

    Skoeld, T.; Feldman, Y.

    2015-01-01

    The IAEA Department of Safeguards aims to provide credible assurances to the international community that States are fulfilling their safeguards obligations in that all nuclear material remains in peaceful use. In order to draw a soundly-based safeguards conclusion for a State that has a safeguards agreement in force with the IAEA, the Department establishes a knowledge base of the State's nuclear-related infrastructure and activities against which a State's declarations are evaluated for correctness and completeness. Open source information is one stream of data that is used in the evaluation of nuclear fuel cycle activities in the State. The Department is continuously working to ensure that it has access to the most up-to-date, accurate, relevant and credible open source information available, and has begun to examine the use of social media as a new source of information. The use of social networking sites has increased exponentially in the last decade. In fact, social media has emerged as the key vehicle for delivering and acquiring information in near real-time. Therefore, it has become necessary for the open source analyst to consider social media as an essential element in the broader concept of open source information. Characteristics such as "immediacy", "recency" and "interactivity", which set social networks apart from the "traditional media", are also the attributes that present a challenge for using social media as an efficient information-delivery platform and a credible source of information. New tools and technologies for social media analytics have begun to emerge to help systematically monitor and mine this large body of data. The paper will survey the social media landscape in an effort to identify platforms that could be of value for safeguards verification purposes. It will explore how a number of social networking sites, such as Twitter

  11. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields

    Science.gov (United States)

    Sapozhnikov, Oleg A.; Tsysar, Sergey A.; Khokhlova, Vera A.; Kreider, Wayne

    2015-01-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors. PMID:26428789

  12. Acoustic holography as a metrological tool for characterizing medical ultrasound sources and fields.

    Science.gov (United States)

    Sapozhnikov, Oleg A; Tsysar, Sergey A; Khokhlova, Vera A; Kreider, Wayne

    2015-09-01

    Acoustic holography is a powerful technique for characterizing ultrasound sources and the fields they radiate, with the ability to quantify source vibrations and reduce the number of required measurements. These capabilities are increasingly appealing for meeting measurement standards in medical ultrasound; however, associated uncertainties have not been investigated systematically. Here errors associated with holographic representations of a linear, continuous-wave ultrasound field are studied. To facilitate the analysis, error metrics are defined explicitly, and a detailed description of a holography formulation based on the Rayleigh integral is provided. Errors are evaluated both for simulations of a typical therapeutic ultrasound source and for physical experiments with three different ultrasound sources. Simulated experiments explore sampling errors introduced by the use of a finite number of measurements, geometric uncertainties in the actual positions of acquired measurements, and uncertainties in the properties of the propagation medium. Results demonstrate the theoretical feasibility of keeping errors less than about 1%. Typical errors in physical experiments were somewhat larger, on the order of a few percent; comparison with simulations provides specific guidelines for improving the experimental implementation to reduce these errors. Overall, results suggest that holography can be implemented successfully as a metrological tool with small, quantifiable errors.
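
    The holographic projection step at the heart of such methods can be sketched with an angular-spectrum propagator, a spectral cousin of the Rayleigh-integral formulation the authors use. The grid, frequency and medium values below are illustrative, and a Gaussian field stands in for a measured hologram.

```python
# Angular-spectrum propagation of a measured pressure plane; negative dz
# projects the field back toward the source. Evanescent waves are discarded
# for numerical stability.
import numpy as np

f, c = 1.0e6, 1500.0                  # 1 MHz in water
k = 2.0 * np.pi * f / c
n, dx = 256, 0.2e-3                   # 256 x 256 plane, 0.2 mm pitch

x = (np.arange(n) - n / 2) * dx
X, Y = np.meshgrid(x, x)
p_meas = np.exp(-(X**2 + Y**2) / (2 * (2e-3) ** 2)).astype(complex)

kx = 2.0 * np.pi * np.fft.fftfreq(n, dx)
KX, KY = np.meshgrid(kx, kx)
kz = np.sqrt(np.maximum(k**2 - KX**2 - KY**2, 0.0))  # evanescent kz set to 0
mask = (KX**2 + KY**2) < k**2                        # keep propagating waves

def propagate(p, dz):
    """Shift the field by dz along the propagation axis."""
    return np.fft.ifft2(np.fft.fft2(p) * mask * np.exp(1j * kz * dz))

p_source = propagate(p_meas, -30e-3)  # reconstruct 30 mm behind the hologram
print(np.abs(p_source).max())
```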

  13. Tools for Trade Analysis and Open Source Information Monitoring for Non-proliferation

    International Nuclear Information System (INIS)

    Cojazzi, G.G.M.; Versino, C.; Wolfart, E.; Renda, G.; Janssens, W.A.M.; )

    2015-01-01

    The new state-level approach being proposed by the IAEA envisions an objective-based and information-driven safeguards approach utilizing all relevant information to improve the effectiveness and efficiency of safeguards. To this end the IAEA also makes use of open source information, here broadly defined as any information that is neither classified nor proprietary. It includes, but is not limited to: media sources, government and non-governmental reports and analyses, commercial data, and scientific/technical literature, including trade data. Within the EC support programme to the IAEA, JRC has surveyed and catalogued open sources of import-export customs trade data and developed tools for supporting the use of the related databases in safeguards. The JRC software The Big Table (TBT) supports, inter alia: a) searching through a collection of reference documents relevant to trade analysis (legal/regulatory documents, technical handbooks); b) the selection of items of interest to specific verifications; and c) the mapping of these items to customs commodities searchable in trade databases. In the field of open source monitoring, JRC is developing and operating a "Nuclear Security Media Monitor" (NSMM), a web-based multilingual news aggregation system that automatically collects news articles from pre-defined web sites. NSMM is a domain-specific version of the general JRC Europe Media Monitor (EMM). NSMM has been established within the EC support programme with the aim of streamlining the IAEA's process of open source information monitoring. The first part of the paper will recall the trade data sources relevant for non-proliferation and then illustrate the main features of TBT, recently coupled with the IAEA Physical Model, and new visualization techniques applied to trade data. The second part will present the main aspects of the NSMM, illustrating some of the uses made of it at JRC. (author)

  14. Demonstrating High-Accuracy Orbital Access Using Open-Source Tools

    Science.gov (United States)

    Gilbertson, Christian; Welch, Bryan

    2017-01-01

    Orbit propagation is fundamental to almost every space-based analysis. Currently, many system analysts use commercial software to predict the future positions of orbiting satellites. This is one of many capabilities that can be replicated, with great accuracy, without using expensive, proprietary software. NASA's SCaN (Space Communication and Navigation) Center for Engineering, Networks, Integration, and Communications (SCENIC) project plans to provide its analysis capabilities using a combination of internal and open-source software, allowing for a much greater measure of customization and flexibility, while reducing recurring software license costs. MATLAB and the open-source Orbit Determination Toolbox created by Goddard Space Flight Center (GSFC) were utilized to develop tools with the capability to propagate orbits, perform line-of-sight (LOS) availability analyses, and visualize the results. The developed programs are modular and can be applied for mission planning and viability analysis in a variety of Solar System applications. The tools can perform two- and N-body orbit propagation, find inter-satellite and satellite-to-ground-station LOS access (accounting for intermediate oblate-spheroid body blocking, geometric restrictions of the antenna field-of-view (FOV), and relativistic corrections), and create animations of planetary movement, satellite orbits, and LOS accesses. The code is the basis for SCENIC's broad analysis capabilities, including dynamic link analysis, dilution-of-precision navigation analysis, and orbital availability calculations.
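
    The basic propagation capability can be sketched with a two-body integrator. The minimal SciPy example below uses Earth's gravitational parameter and an illustrative 7000 km circular orbit; it is a generic sketch, not the SCENIC implementation.

```python
# Two-body orbit propagation: integrate r'' = -mu * r / |r|^3 for one period.
import numpy as np
from scipy.integrate import solve_ivp

MU = 398600.4418  # Earth's gravitational parameter, km^3/s^2

def two_body(t, state):
    r = state[:3]
    a = -MU * r / np.linalg.norm(r) ** 3
    return np.concatenate((state[3:], a))

r0 = np.array([7000.0, 0.0, 0.0])                 # km
v0 = np.array([0.0, np.sqrt(MU / 7000.0), 0.0])   # circular-orbit speed, km/s
period = 2 * np.pi * np.sqrt(7000.0**3 / MU)      # orbital period, s

sol = solve_ivp(two_body, (0.0, period), np.concatenate((r0, v0)),
                rtol=1e-9, atol=1e-9)
print(sol.y[:3, -1])  # should return near the initial position after one period
```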

  15. Open-Source tools: Incidence in the wireless security of the Technical University of Babahoyo

    Directory of Open Access Journals (Sweden)

    Joffre León-Acurio

    2018-02-01

    Full Text Available Computer security is a fundamental part of an organization, especially in Higher Education institutions, where there is very sensitive information, vulnerable to different methods of intrusion, the most common being free access through wireless points. The main objective of this research is to analyze the impact of the open source tools in charge of managing the security information of the wireless network, such as OSSIM, a set of active and passive components used to manage events generated by traffic within the network. This research presents the use of free software as a viable, low-cost option to solve the problems that afflict students, such as lack of access to academic services and problems of wireless interconnectivity, with the purpose of restoring students' confidence in the services offered by the institution for research-related development, guaranteeing free, unrestricted access to the internet. The level of dissatisfaction on the part of the students confirms the problem presented at the Technical University of Babahoyo and underlines the positive influence of the Open-Source tools on the institution's wireless security.

  16. Managing research and surveillance projects in real-time with a novel open-source eManagement tool designed for under-resourced countries.

    Science.gov (United States)

    Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas

    2016-09-01

    A software tool is developed to facilitate data entry and to monitor research projects in under-resourced countries in real time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open source software Open Data Kit (ODK). Users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania that successfully used the eManagement tool: 1) a clinical trial; 2) a longitudinal tuberculosis (TB) cohort study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitized X-rays, and send text message reminders to patients; 3) an intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients and sent automated messages to remind pharmacy clients to visit a TB clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples and collected clinical and laboratory data. The user-friendly, open source odk_planner is a simple, but multi-functional, Web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Role of Open Source Tools and Resources in Virtual Screening for Drug Discovery.

    Science.gov (United States)

    Karthikeyan, Muthukumarasamy; Vyas, Renu

    2015-01-01

    Advancement in chemoinformatics research, in parallel with the availability of high performance computing platforms, has made handling of large scale multi-dimensional scientific data for high throughput drug discovery easier. In this study we have explored publicly available molecular databases with the help of integrated, open-source-based in-house molecular informatics tools for virtual screening. The virtual screening literature of the past decade has been extensively investigated and thoroughly analyzed to reveal interesting patterns with respect to the drug, target, scaffold and disease space. The review also focuses on the integrated chemoinformatics tools that are capable of harvesting chemical data from textual literature information and transforming it into truly computable chemical structures, identification of unique fragments and scaffolds from a class of compounds, automatic generation of focused virtual libraries, computation of molecular descriptors for structure-activity relationship studies, and application of conventional filters used in lead discovery along with in-house developed exhaustive PTC (Pharmacophore, Toxicophores and Chemophores) filters and machine learning tools for the design of potential disease-specific inhibitors. A case study on kinase inhibitors is provided as an example.

  18. MASTODON: A geosciences simulation tool built using the open-source framework MOOSE

    Science.gov (United States)

    Slaughter, A.

    2017-12-01

    The Department of Energy (DOE) is currently investing millions of dollars annually in various modeling and simulation tools for all aspects of nuclear energy. An important part of this effort includes developing applications based on the open-source Multiphysics Object Oriented Simulation Environment (MOOSE; mooseframework.org) from Idaho National Laboratory (INL). Thanks to the efforts of the DOE and outside collaborators, MOOSE currently contains a large set of physics modules, including phase field, level set, heat conduction, tensor mechanics, Navier-Stokes, fracture (extended finite-element method), and porous media, among others. The tensor mechanics and contact modules, in particular, are well suited for nonlinear geosciences problems. Multi-hazard Analysis for STOchastic time-DOmaiN phenomena (MASTODON; https://seismic-research.inl.gov/SitePages/Mastodon.aspx)--a MOOSE-based application--is capable of analyzing the response of 3D soil-structure systems to external hazards, with current development focused on earthquakes. It is capable of simulating seismic events and can perform extensive "source-to-site" simulations including earthquake fault rupture, nonlinear wave propagation, and nonlinear soil-structure interaction analysis. MASTODON also includes a dynamic probabilistic risk assessment capability that enables analysts to not only perform deterministic analyses, but also easily perform probabilistic or stochastic simulations for the purpose of risk assessment. Although MASTODON has been developed for the nuclear industry, it can be used to assess the risk for any structure subjected to earthquakes. The geosciences community can learn from the nuclear industry and harness the enormous effort underway to build simulation tools that are open, modular, and share a common framework. In particular, MOOSE-based multiphysics solvers are inherently parallel, dimension agnostic, adaptive in time and space, fully coupled, and capable of interacting with other

  19. Exploring TechQuests Through Open Source and Tools That Inspire Digital Natives

    Science.gov (United States)

    Hayden, K.; Ouyang, Y.; Kilb, D.; Taylor, N.; Krey, B.

    2008-12-01

    "There is little doubt that K-12 students need to understand and appreciate the Earth on which they live. They can achieve this understanding only if their teachers are well prepared". Dan Barstow, Director of Center for Earth and Space Science Education at TERC. The approach of San Diego County's Cyberinfrastructure Training, Education, Advancement, and Mentoring (SD Cyber-TEAM) project is to build understandings of Earth systems for middle school teachers and students through a collaborative that has engaged the scientific community in the use of cyber-based tools and environments for learning. The SD Cyber-TEAM has used Moodle, an open source management system with social networking tools, that engage digital native students and their teachers in collaboration and sharing of ideas and research related to Earth science. Teachers participate in on-line professional dialog through chat, wikis, blogs, forums, journals and other tools and choose the tools that will best fit their classroom. The use of Moodle during the Summer Cyber Academy developed a cyber-collaboratory environment where teaching strategies were discussed, supported and actualized by participants. These experiences supported digital immigrants (teachers) in adapting teaching strategies using technologies that are most attractive and familiar to students (digital natives). A new study by the National School Boards Association and Grunwald Associates LLC indicated that "the online behaviors of U.S. teens and 'tweens shows that 96 percent of students with online access use social networking technologies, such as chatting, text messaging, blogging, and visiting online communities such as Facebook, MySpace, and Webkinz". While SD Cyber-TEAM teachers are implementing TechQuests in classrooms they use these social networking elements to capture student interest and address the needs of digital natives. Through the Moodle environment, teachers have explored a variety of learning objects called Tech

  20. Isotope ratio mass spectrometry as a tool for source inference in forensic science: A critical review.

    Science.gov (United States)

    Gentile, Natacha; Siegwolf, Rolf T W; Esseiva, Pierre; Doyle, Sean; Zollinger, Kurt; Delémont, Olivier

    2015-06-01

    Isotope ratio mass spectrometry (IRMS) has been used in numerous fields of forensic science in a source inference perspective. This review compiles the studies published on the application of isotope ratio mass spectrometry (IRMS) to the traditional fields of forensic science so far. It completes the review of Benson et al. [1] and synthesises the extent of knowledge already gathered in the following fields: illicit drugs, flammable liquids, human provenancing, microtraces, explosives and other specific materials (packaging tapes, safety matches, plastics, etc.). For each field, a discussion assesses the state of science and highlights the relevance of the information in a forensic context. Through the different discussions which mark out the review, the potential and limitations of IRMS, as well as the needs and challenges of future studies are emphasized. The paper elicits the various dimensions of the source which can be obtained from the isotope information and demonstrates the transversal nature of IRMS as a tool for source inference. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  1. OSS4EVA: Using Open-Source Tools to Fulfill Digital Preservation Requirements

    Directory of Open Access Journals (Sweden)

    Heidi Dowding

    2016-10-01

    Full Text Available This paper builds on the findings of a workshop held at the 2015 International Conference on Digital Preservation (iPRES), entitled “Using Open-Source Tools to Fulfill Digital Preservation Requirements” (OSS4PRES hereafter). This day-long workshop brought together participants from across the library and archives community, including practitioners, proprietary vendors, and representatives from open-source projects. The resulting conversations were surprisingly revealing: while OSS’ significance within the preservation landscape was made clear, participants noted that there are a number of roadblocks that discourage or altogether prevent its use in many organizations. Overcoming these challenges will be necessary to further widespread, sustainable OSS adoption within the digital preservation community. This article will mine the rich discussions that took place at OSS4PRES to (1) summarize the workshop’s key themes and major points of debate, (2) provide a comprehensive analysis of the opportunities, gaps, and challenges that using OSS entails at a philosophical, institutional, and individual level, and (3) offer a tangible set of recommendations for future work designed to broaden community engagement and enhance the sustainability of open source initiatives, drawing on both participants’ experience as well as additional research.

  2. ALPHACAL: A new user-friendly tool for the calibration of alpha-particle sources.

    Science.gov (United States)

    Timón, A Fernández; Vargas, M Jurado; Gallardo, P Álvarez; Sánchez-Oro, J; Peralta, L

    2018-05-01

    In this work, we present and describe the program ALPHACAL, specifically developed for the calibration of alpha-particle sources. Being purpose-built, it is more user-friendly and less time-consuming than multipurpose codes developed for a wide range of applications. The program is based on the recently developed code AlfaMC, which specifically simulates the transport of alpha particles. Both cylindrical and point sources mounted on the surface of polished backings can be simulated, as is the convention in experimental measurements of alpha-particle sources. In addition to the efficiency calculation and determination of the backscattering coefficient, some additional tools are available to the user, such as visualization of the energy spectrum, use of energy cut-offs, and low-energy tail corrections. ALPHACAL has been implemented in C++ using the QT library, so it is available for Windows, MacOS and Linux platforms. It is free and can be provided upon request to the authors. Copyright © 2018 Elsevier Ltd. All rights reserved.
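
    For readers unfamiliar with this kind of calculation, the sketch below estimates the purely geometric counting efficiency of a point source facing a coaxial circular detector by Monte Carlo. It is an illustrative toy, not AlfaMC: real codes also transport the particles through matter (energy loss, scattering, backscattering from the backing), and the dimensions and function name here are invented for the example.

    ```python
    # Toy Monte Carlo estimate of geometric counting efficiency for a point
    # alpha source below a coaxial circular detector (not the AlfaMC transport
    # code: no energy loss, scattering, or backscattering is modeled).
    import numpy as np

    rng = np.random.default_rng(42)

    def geometric_efficiency(n, det_radius_mm=20.0, source_det_dist_mm=10.0):
        # Sample isotropic emission directions over the full sphere
        cos_theta = rng.uniform(-1.0, 1.0, n)
        # Only upward-going particles can reach the detector plane
        up = cos_theta > 0
        # Radial distance where the trajectory crosses the detector plane
        tan_theta = np.sqrt(1 - cos_theta[up] ** 2) / cos_theta[up]
        r_hit = source_det_dist_mm * tan_theta
        hits = np.count_nonzero(r_hit <= det_radius_mm)
        return hits / n

    # Analytic check: (1 - d / sqrt(d^2 + R^2)) / 2 = 0.276 for these dimensions
    print(geometric_efficiency(1_000_000))  # ~0.28
    ```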

  3. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    Science.gov (United States)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    The BYMUR software aims to provide an easy-to-use open source tool for both computing multi-risk and managing/visualizing/comparing all the inputs (e.g. hazard, fragilities and exposure) as well as the corresponding results (e.g. risk curves, risk indexes). For all inputs, a complete management of inter-model epistemic uncertainty is considered. The BYMUR software will be one of the final products provided by the homonymous ByMuR project (http://bymur.bo.ingv.it/) funded by the Italian Ministry of Education, Universities and Research (MIUR), focused on (i) providing a quantitative and objective general method for a comprehensive long-term multi-risk analysis in a given area, accounting for inter-model epistemic uncertainty through Bayesian methodologies, and (ii) applying the methodology to seismic, volcanic and tsunami risks in Naples (Italy). More specifically, the BYMUR software will be able to separately account for the probabilistic hazard assessment of different kinds of hazardous phenomena, the relative (time-dependent/independent) vulnerabilities and exposure data, and their possible (predefined) interactions: the software will analyze these inputs and use them to estimate both the single- and multi-risk associated with a specific target area. In addition, it will be possible to connect the software to further tools (e.g., a full hazard analysis), allowing a dynamic I/O of results. The use of the Python programming language guarantees that the final software will be open source and platform independent. Moreover, thanks to the integration of some of the most popular and rich-featured Python scientific modules (Numpy, Matplotlib, Scipy) with the wxPython graphical user toolkit, the final tool will be equipped with a comprehensive Graphical User Interface (GUI) able to control and visualize (in the form of tables, maps and/or plots) any stage of the multi-risk analysis. The additional features of importing/exporting data in MySQL databases and/or standard XML formats (for
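
    As a conceptual illustration of the kind of single-risk computation such a tool combines, the sketch below folds a discretized hazard curve into a fragility curve and an exposure value to obtain an annual expected loss. All numbers are invented, and the treatment of epistemic uncertainty, time dependence, and risk interactions that BYMUR targets is deliberately omitted.

    ```python
    # Conceptual single-risk sketch: hazard (annual probability of each
    # intensity level) x fragility (damage probability given intensity)
    # x exposure -> annual expected loss. Illustrative numbers only.
    import numpy as np

    intensity = np.array([0.1, 0.2, 0.3, 0.4, 0.5])         # e.g. PGA in g
    p_annual = np.array([0.05, 0.02, 0.008, 0.003, 0.001])  # hazard curve
    p_damage = np.array([0.01, 0.05, 0.20, 0.45, 0.75])     # fragility curve
    exposure_value = 1e6                                    # value at the site

    expected_loss = exposure_value * np.sum(p_annual * p_damage)
    print(f"Annual expected loss: {expected_loss:,.0f}")
    ```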

  4. A framework for air quality monitoring based on free public data and open source tools

    Science.gov (United States)

    Nikolov, Hristo; Borisova, Denitsa

    2014-10-01

    In recent years, space agencies (e.g. NASA, ESA) have increasingly adopted a policy of providing Earth observation (EO) data and end products concerning air quality, especially in large urban areas, at no cost to researchers and SMEs. Those EO data are complemented by an increasing amount of in-situ data, also provided at no cost either by national authorities or of crowdsourced origin. This accessibility, together with the increased processing capabilities of free and open source software, is a prerequisite for the creation of a solid framework for air modeling in support of decision making at medium and large scale. An essential part of this framework is a web-based GIS mapping tool responsible for dissemination of the output generated. In this research an attempt is made to establish a running framework based solely on openly accessible data on air quality and on a set of freely available software tools for processing and modeling, taking into account the present status quo in Bulgaria. Among the primary sources of data on gases and dust particles, especially for larger urban areas, are the National Institute of Meteorology and Hydrology of Bulgaria (NIMH) and the National System for Environmental Monitoring managed by the Bulgarian Executive Environmental Agency (ExEA). Both authorities provide data for concentrations of several gases, including CO, CO2, NO2 and SO2, and fine suspended dust (PM10, PM2.5) on a monthly (for some data, daily) basis. In the framework proposed, these data will complement the data from satellite-based sensors such as the OMI instrument aboard the EOS-Aura satellite and the TROPOMI instrument payload of the future ESA Sentinel-5P mission. An integral part of the framework is the up-to-date land use/land cover map provided by the EEA through the GIO Land CORINE initiative. This map is also a product of EO data distributed at the European level. First and above all, our effort is focused on provision to the

  5. FREEWAT: an HORIZON 2020 project to build open source tools for water management.

    Science.gov (United States)

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-04-01

    tools for better producing feasibility and management plans; (ii) a set of activities devoted to fixing bugs and to providing a well-integrated interface for the different tools implemented. Further capabilities to be integrated are: - a dedicated module for water management and planning that will help to manage and aggregate all the distributed data coming from the simulation scenarios; - a whole module for calibration, uncertainty and sensitivity analysis; - a module for solute transport in the unsaturated zone; - a module for crop growth and water requirements in agriculture; - tools for dealing with groundwater quality issues; - tools for the analysis, interpretation and visualization of hydrogeological data. Through creating a common environment among water researchers/professionals, policy makers and implementers, FREEWAT's main impact will be on enhancing science-based and participatory approaches and evidence-based decision making in water resource management, hence producing relevant and appropriate outcomes for policy implementation. The Consortium is constituted by partners from various water sectors from 10 EU countries, plus Turkey and Ukraine. Synergies with the UNESCO HOPE initiative on free and open source software in water management greatly boost the value of the project. Large stakeholder involvement is expected to guarantee results dissemination and exploitation. Acknowledgements This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's Horizon 2020 research and innovation programme under Grant Agreement n. 642224. References MARSOL (2014). Demonstrating Managed Aquifer Recharge as a Solution to Water Scarcity and Drought www.marsol.eu [accessed 4 January 2015] Rossetto, R., Borsi, I., Schifani, C., Bonari, E., Mogorovich P. & Primicerio M. (2013) - SID&GRID: integrating hydrological modeling in GIS environment hydroinformatics system for the management of the water resource. Rendiconti Online Societa

  6. An open source automatic quality assurance (OSAQA) tool for the ACR MRI phantom.

    Science.gov (United States)

    Sun, Jidi; Barnes, Michael; Dowling, Jason; Menk, Fred; Stanwell, Peter; Greer, Peter B

    2015-03-01

    Routine quality assurance (QA) is necessary and essential to ensure MR scanner performance. This includes geometric distortion, slice positioning and thickness accuracy, high contrast spatial resolution, intensity uniformity, ghosting artefact and low contrast object detectability. However, this manual process can be very time consuming. This paper describes the development and validation of an open source tool to automate the MR QA process, which aims to increase physicist efficiency and improve the consistency of QA results by reducing human error. The OSAQA software was developed in Matlab and the source code is available for download from http://jidisun.wix.com/osaqa-project/. During program execution QA results are logged for immediate review and are also exported to a spreadsheet for long-term machine performance reporting. For the automatic contrast QA test, a user-specific contrast evaluation was designed to improve accuracy for individuals on different display monitors. American College of Radiology QA images were acquired over a period of 2 months to compare manual QA with the results from the proposed OSAQA software. OSAQA was found to significantly reduce the QA time from approximately 45 to 2 min. The manual and OSAQA results were found to agree within the recommended criteria, with differences that were insignificant compared to the criteria themselves. The intensity homogeneity filter is necessary to obtain an image with acceptable quality while keeping the high contrast spatial resolution within the recommended criterion. The OSAQA tool has been validated on scanners with different field strengths and manufacturers. A number of suggestions have been made to improve both the phantom design and QA protocol in the future.
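
    One of the ACR tests such a tool automates is intensity uniformity. The sketch below computes the percent integral uniformity (PIU) used in the ACR protocol, PIU = 100 x (1 - (Smax - Smin)/(Smax + Smin)); this is not OSAQA's Matlab code but an illustrative Python rewrite with a simplified ROI treatment (the full test uses small search ROIs inside a large centered region).

    ```python
    # Sketch of one OSAQA-style check: percent integral uniformity (PIU)
    # on the ACR uniformity slice, using the standard ACR formula.
    import numpy as np

    def percent_integral_uniformity(image, mask):
        """image: 2D array of pixel intensities; mask: boolean ROI."""
        roi = image[mask]
        s_max, s_min = roi.max(), roi.min()
        return 100.0 * (1.0 - (s_max - s_min) / (s_max + s_min))

    # Example with a synthetic slice and a circular ROI
    img = np.full((256, 256), 1000.0)
    img[100:110, 100:110] = 940.0        # a mild shading artifact
    yy, xx = np.mgrid[:256, :256]
    mask = (yy - 128) ** 2 + (xx - 128) ** 2 < 100 ** 2
    print(percent_integral_uniformity(img, mask))  # ~96.9
    ```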

  7. Beam diagnostic tools for the negative hydrogen ion source test facility ELISE

    International Nuclear Information System (INIS)

    Nocentini, Riccardo; Fantz, Ursel; Franzen, Peter; Froeschle, Markus; Heinemann, Bernd; Riedl, Rudolf; Ruf, Benjamin; Wuenderlich, Dirk

    2013-01-01

    Highlights: ► We present an overview of beam diagnostic tools foreseen for the new testbed ELISE. ► A sophisticated diagnostic calorimeter allows beam profile measurement. ► A tungsten wire mesh in the beam path provides a qualitative picture of the beam. ► Stripping losses and beam divergence are measured by Hα Doppler shift spectroscopy. -- Abstract: The test facility ELISE, presently being commissioned at IPP, is a first step in the R and D roadmap for the RF driven ion source and extraction system of the ITER NBI system. The “half-size” ITER-like test facility includes a negative hydrogen ion source that can be operated for 1 h. ELISE is expected to extract an ion beam of 20 A at 60 kV for 10 s every 3 min, therefore delivering a total power of 1.2 MW. The extraction area has a geometry that closely reproduces the ITER design, with the same width and half the height, i.e. 1 m × 1 m. This paper presents an overview of beam diagnostic tools foreseen for ELISE. For the commissioning phase, a simple beam dump with basic diagnostic capabilities has been installed. In the second phase, the beam dump will be replaced by a more sophisticated diagnostic calorimeter to allow beam profile measurement. Additionally, a tungsten wire mesh will be introduced in the beam path to provide a qualitative picture of beam size and position. Stripping losses and beam divergence will be measured by means of Hα Doppler shift spectroscopy. An absolute calibration is foreseen in order to measure beam intensity.

  8. OLS Client and OLS Dialog: Open Source Tools to Annotate Public Omics Datasets.

    Science.gov (United States)

    Perez-Riverol, Yasset; Ternent, Tobias; Koch, Maximilian; Barsnes, Harald; Vrousgou, Olga; Jupp, Simon; Vizcaíno, Juan Antonio

    2017-10-01

    The availability of user-friendly software to annotate biological datasets and experimental details is becoming essential in data management practices, both in local storage systems and in public databases. The Ontology Lookup Service (OLS, http://www.ebi.ac.uk/ols) is a popular centralized service to query, browse and navigate biomedical ontologies and controlled vocabularies. Recently, the OLS framework has been completely redeveloped (version 3.0), including enhancements in the data model, such as added support for Web Ontology Language based ontologies, among many other improvements. However, the new OLS is not backwards compatible, and new software tools are needed to enable access to this widely used framework now that the previous version is no longer available. Here we present the OLS Client as a free, open-source Java library to retrieve information from the new version of the OLS. It enables rapid tool creation by providing a robust, pluggable programming interface and common data model to programmatically access the OLS. The library has already been integrated and is routinely used by several bioinformatics resources and related data annotation tools. We also introduce an updated version of the OLS Dialog (version 2.0), a Java graphical user interface that can be easily plugged into Java desktop applications to access the OLS. The software and related documentation are freely available at https://github.com/PRIDE-Utilities/ols-client and https://github.com/PRIDE-Toolsuite/ols-dialog. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  9. Development of a Monte Carlo multiple source model for inclusion in a dose calculation auditing tool.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Fontenot, Jonas; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core Houston (IROC-H) (formerly the Radiological Physics Center) has reported varying levels of agreement in their anthropomorphic phantom audits. There is reason to believe one source of error in this observed disagreement is the accuracy of the dose calculation algorithms and heterogeneity corrections used. To audit this component of the radiotherapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Elekta 6 MV and 10 MV therapeutic x-ray beams were commissioned based on measurement of central axis depth dose data for a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open field measurements consisting of depth dose data and dose profiles for field sizes ranging from 3 × 3 cm² to 30 × 30 cm². The models were then benchmarked against measurements in IROC-H's anthropomorphic head and neck and lung phantoms. Validation results showed 97.9% and 96.8% of depth dose data passed a ±2% Van Dyk criterion for the 6 MV and 10 MV models, respectively. Dose profile comparisons showed an average agreement using a ±2%/2 mm criterion of 98.0% and 99.0% for the 6 MV and 10 MV models, respectively. Phantom plan comparisons were evaluated using a ±3%/2 mm gamma criterion, and average passing rates between Monte Carlo and measurements were 87.4% and 89.9% for the 6 MV and 10 MV models, respectively. Accurate multiple source models for Elekta 6 MV and 10 MV x-ray beams have been developed for inclusion in an independent dose calculation tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
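
    A minimal sketch of a dose-difference check in the spirit of the ±2% Van Dyk criterion quoted above is given below: it reports the fraction of points agreeing within a tolerance referenced to the global maximum dose. A full audit tool would add distance-to-agreement handling (the gamma test) and careful treatment of low-dose and high-gradient regions; all numbers here are invented.

    ```python
    # Minimal +/-2% dose-difference pass-rate sketch (a full Van Dyk/gamma
    # evaluation also handles distance-to-agreement; that is omitted here).
    import numpy as np

    def pass_rate(calculated, measured, tolerance=0.02):
        """Fraction of points with |calc - meas| <= tolerance * max dose."""
        calculated = np.asarray(calculated, dtype=float)
        measured = np.asarray(measured, dtype=float)
        criterion = tolerance * measured.max()
        return np.mean(np.abs(calculated - measured) <= criterion)

    meas = np.array([0.2, 0.5, 0.8, 1.0, 0.9])   # normalized depth doses
    calc = np.array([0.21, 0.49, 0.81, 1.0, 0.93])
    print(f"{100 * pass_rate(calc, meas):.1f}% of points pass +/-2%")  # 80.0%
    ```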

  10. Sharing clinical decisions for multimorbidity case management using social network and open-source tools.

    Science.gov (United States)

    Martínez-García, Alicia; Moreno-Conde, Alberto; Jódar-Sánchez, Francisco; Leal, Sandra; Parra, Carlos

    2013-12-01

    The professionals valued positively all the items in the questionnaire. As part of the SCP, open source tools for CDS will be incorporated to provide recommendations for medication and problem interactions, as well as to calculate indexes or scales from validated questionnaires. They will receive the patient summary information provided by the regional Electronic Health Record system through a web service, with the information defined according to the virtual Medical Record specification. Clinical Wall has been developed to allow communication and coordination between the healthcare professionals involved in multimorbidity patient care. Agreed decisions concerned coordination of appointment changes, patient conditions, diagnostic tests, and prescription changes and renewals. The application of interoperability standards and open source software can bridge the gap between knowledge and clinical practice, while enabling interoperability and scalability. Open source combined with the social network encourages adoption and facilitates collaboration. Although the results obtained for the use indicators are still not as high as expected, based on the promising results obtained in the acceptance questionnaire of the SMP, we expect that the new CDS tools will increase use by health professionals. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Tool supported modeling of sensor communication networks by using finite-source priority retrial queues

    Directory of Open Access Journals (Sweden)

    Tamas Berczes

    2012-06-01

    Full Text Available The main aim of the present paper is to draw the attention of the readers of this special issue to the modeling issues of sensor networks. The novelty of this investigation is the introduction of server vacations combined with priority customers for finite-source retrial queues and its application to wireless sensor networks. In this paper we analyze a priority finite-source retrial queue with repeated vacations. Two types of priority customers are defined: customers with priority 1 (P1) go directly to an ordinary FIFO queue, while customers with priority 2 (P2) that find the server busy or unavailable go to the orbit. These customers stay in the orbit and retry their request until they find the server idle and available. We assume that P1 customers have non-preemptive priority over P2 customers. The server starts with a listening period, and if no customers arrive during this period it enters vacation mode. When the vacation period is terminated, the node wakes up. If there is a P1 customer in the queue the server begins to serve it, and when there are no P1 customers, the node remains awake for an exponentially distributed time period. If that period expires without arrivals the node enters the next sleeping period. All random variables involved in the model construction are assumed to be independent and exponentially distributed. Our main interest is to give the main steady-state performance measures of the system, computed with the help of the MOSEL tool. Several figures illustrate the effect of the input parameters on the mean response time.
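
    The flavor of such a model is easy to convey by simulation. The sketch below is a deliberately stripped-down discrete-event simulation of a finite-source retrial queue in Python: one customer class, no priorities and no server vacations (the paper's model adds both), with invented rates. It estimates the time-averaged orbit size, one of the steady-state measures a tool like MOSEL computes analytically.

    ```python
    # Simplified finite-source retrial queue: idle sources generate requests;
    # a request finding the server busy joins the orbit and retries after an
    # exponential delay. No priorities or vacations; all rates illustrative.
    import heapq, random

    random.seed(0)
    N, LAMBDA, MU, NU = 10, 0.2, 1.0, 0.5  # sources, gen., service, retry rates
    T_END = 100000.0

    events = [(random.expovariate(LAMBDA), "arrival", i) for i in range(N)]
    heapq.heapify(events)
    busy, orbit, area_orbit, last_t = False, 0, 0.0, 0.0

    while events:
        t, kind, src = heapq.heappop(events)
        if t > T_END:
            break
        area_orbit += orbit * (t - last_t)   # time-average accumulator
        last_t = t
        if kind in ("arrival", "retry"):
            if kind == "retry":
                orbit -= 1
            if busy:
                orbit += 1                   # blocked: (re)join the orbit
                heapq.heappush(events, (t + random.expovariate(NU), "retry", src))
            else:
                busy = True
                heapq.heappush(events, (t + random.expovariate(MU), "departure", src))
        else:  # departure: server frees up, the source idles then regenerates
            busy = False
            heapq.heappush(events, (t + random.expovariate(LAMBDA), "arrival", src))

    print(f"Mean orbit size: {area_orbit / last_t:.3f}")
    ```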

  12. Climate modeling - a tool for the assessment of the paleodistribution of source and reservoir rocks

    Energy Technology Data Exchange (ETDEWEB)

    Roscher, M.; Schneider, J.W. [Technische Univ. Bergakademie Freiberg (Germany). Inst. fuer Geologie; Berner, U. [Bundesanstalt fuer Geowissenschaften und Rohstoffe, Hannover (Germany). Referat Organische Geochemie/Kohlenwasserstoff-Forschung

    2008-10-23

    In an on-going project of BGR and TU Bergakademie Freiberg, numeric paleo-climate modeling is used as a tool for the assessment of the paleo-distribution of organic-rich deposits as well as of reservoir rocks. This modeling approach is based on new ideas concerning the formation of the Pangea supercontinent. The new plate tectonic concept is supported by paleomagnetic data, as it fits the 95% confidence interval of published data. Six Permocarboniferous time slices (340, 320, 300, 290, 270, 255 Ma) were chosen within a first paleo-climate modeling approach as they represent the most important changes of the Late Paleozoic climate development. The digital maps have a resolution of 2.8° x 2.8° (T42), suitable for high-resolution climate modeling using the PLASIM model. CO2 concentrations of the paleo-atmosphere and paleo-insolation values have been estimated by published methods. For the purpose of validation, quantitative model output had to be transformed into qualitative parameters in order to be able to compare digital data with the qualitative data of geologic indicators. The model output of surface temperatures and precipitation was therefore converted into climate zones. The reconstructed occurrences of geological indicators like aeolian sands, evaporites, reefs, coals, oil source rocks, tillites, phosphorites and cherts were then compared to the computed paleo-climate zones. Examples of the Permian Pangea show a very good agreement between model results and geological indicators. From the modeling approach we are able to identify climatic processes which lead to the deposition of hydrocarbon source and reservoir rocks. The regional assessment of such atmospheric processes may be used for the identification of the paleo-distribution of organic-rich deposits or rock types suitable to form hydrocarbon reservoirs. (orig.)

  13. Model-based evaluation of the use of polycyclic aromatic hydrocarbons molecular diagnostic ratios as a source identification tool

    International Nuclear Information System (INIS)

    Katsoyiannis, Athanasios; Breivik, Knut

    2014-01-01

    Polycyclic Aromatic Hydrocarbons (PAHs) molecular diagnostic ratios (MDRs) are unitless concentration ratios of pair-PAHs with the same molecular weight (MW); MDRs have long been used as a tool for PAH source identification purposes. In the present paper, the efficiency of the MDR methodology is evaluated through the use of a multimedia fate model, the calculation of characteristic travel distances (CTDs) and the estimation of air concentrations for individual PAHs as a function of distance from an initial point source. The results show that PAHs with the same MW are sometimes characterized by substantially different CTDs, and therefore their air concentrations, and hence MDRs, are predicted to change as the distance from the original source increases. Of the assessed pair-PAHs, the biggest CTD difference is seen for Fluoranthene (107 km) vs. Pyrene (26 km). This study provides a strong indication that MDRs are of limited use as a source identification tool. -- Highlights: • Model-based evaluation of the efficiency of PAH molecular diagnostic ratios. • Individual PAHs are characterized by different characteristic travel distances. • MDRs are shown to be a limited tool for source identification. • Use of MDRs for other environmental media is likely unfeasible. -- PAH molecular diagnostic ratios that change greatly as a function of distance from the emitting source are unsuitable for source identification purposes
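
    To make the quantity concrete: an MDR is nothing more than a ratio of two same-MW PAH concentrations, compared against literature thresholds. The sketch below computes the widely used Fluoranthene/(Fluoranthene + Pyrene) ratio; the threshold values are those commonly cited in the MDR literature, not values from this paper, and the concentrations are invented.

    ```python
    # Sketch of the MDR calculation the paper evaluates: fluoranthene vs.
    # pyrene, both MW 202. Thresholds (0.4, 0.5) are commonly cited in the
    # MDR literature, not taken from this paper.
    def flt_pyr_ratio(c_fluoranthene, c_pyrene):
        return c_fluoranthene / (c_fluoranthene + c_pyrene)

    ratio = flt_pyr_ratio(c_fluoranthene=1.8, c_pyrene=1.2)  # ng/m3, invented
    if ratio < 0.4:
        origin = "petrogenic"
    elif ratio < 0.5:
        origin = "petroleum combustion"
    else:
        origin = "biomass/coal combustion"
    print(f"Flt/(Flt+Pyr) = {ratio:.2f} -> suggested origin: {origin}")
    ```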

  14. Independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool

    International Nuclear Information System (INIS)

    Madni, I.K.; Eltawila, F.

    1994-01-01

    MELCOR is a fully integrated computer code that models all phases of the progression of severe accidents in light water reactor nuclear power plants, and is being developed for the US Nuclear Regulatory Commission (NRC) by Sandia National Laboratories (SNL). Brookhaven National Laboratory (BNL) has a program with the NRC called ''MELCOR Verification, Benchmarking, and Applications,'' whose aim is to provide an independent assessment of MELCOR as a severe accident thermal-hydraulic/source term analysis tool. The scope of this program is to perform quality control verification on all released versions of MELCOR, to benchmark MELCOR against more mechanistic codes and experimental data from severe fuel damage tests, and to evaluate the ability of MELCOR to simulate long-term severe accident transients in commercial LWRs by applying the code to model both BWRs and PWRs. Under this program, BNL provided input to the NRC-sponsored MELCOR Peer Review, and is currently contributing to the MELCOR Cooperative Assessment Program (MCAP). This paper presents a summary of MELCOR assessment efforts at BNL and their contribution to NRC goals with respect to MELCOR.

  15. Helioviewer.org: An Open-source Tool for Visualizing Solar Data

    Science.gov (United States)

    Hughitt, V. Keith; Ireland, J.; Schmiedel, P.; Dimitoglou, G.; Mueller, D.; Fleck, B.

    2009-05-01

    As the amount of solar data available to scientists continues to increase at ever faster rates, it is important that there exist simple tools for navigating this data quickly and with a minimal amount of effort. By combining heterogeneous solar physics datatypes such as full-disk images and coronagraphs, along with feature and event information, Helioviewer offers a simple and intuitive way to browse multiple datasets simultaneously. Images are stored in a repository using the JPEG 2000 format and tiled dynamically upon a client's request. By tiling images and serving only the portions of the image requested, it is possible for the client to work with very large images without having to fetch all of the data at once. Currently, Helioviewer enables users to browse the entire SOHO data archive, updated hourly, as well as feature/event data from eight different catalogs, including active region, flare, coronal mass ejection, and type II radio burst data. In addition to a focus on intercommunication with other virtual observatories and browsers (VSO, HEK, etc.), Helioviewer will offer a number of externally available application programming interfaces (APIs) to enable easy third-party use, adoption and extension. Future functionality will include: support for additional data sources including TRACE, SDO and STEREO, dynamic movie generation, a navigable timeline of recorded solar events, social annotation, and basic client-side image processing.

  16. IPeak: An open source tool to combine results from multiple MS/MS search engines.

    Science.gov (United States)

    Wen, Bo; Du, Chaoqin; Li, Guilin; Ghali, Fawaz; Jones, Andrew R; Käll, Lukas; Xu, Shaohang; Zhou, Ruo; Ren, Zhe; Feng, Qiang; Xu, Xun; Wang, Jun

    2015-09-01

    Liquid chromatography coupled tandem mass spectrometry (LC-MS/MS) is an important technique for detecting peptides in proteomics studies. Here, we present an open source software tool, termed IPeak, a peptide identification pipeline that is designed to combine the Percolator post-processing algorithm with a multi-search strategy to enhance the sensitivity of peptide identifications without compromising accuracy. IPeak provides a graphical user interface (GUI) as well as a command-line interface, is implemented in Java, and works on all three major operating system platforms: Windows, Linux/Unix and OS X. IPeak has been designed to work with the mzIdentML standard from the Proteomics Standards Initiative (PSI) as an input and output, and has also been fully integrated into the associated mzidLibrary project, providing access to the overall pipeline, as well as modules for calling Percolator on individual search engine result files. The integration thus enables IPeak (and Percolator) to be used in conjunction with any software packages implementing the mzIdentML data standard. IPeak is freely available and can be downloaded under an Apache 2.0 license at https://code.google.com/p/mzidentml-lib/. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  17. Differential Expression and Functional Analysis of High-Throughput -Omics Data Using Open Source Tools.

    Science.gov (United States)

    Kebschull, Moritz; Fittler, Melanie Julia; Demmer, Ryan T; Papapanou, Panos N

    2017-01-01

    Today, -omics analyses, including the systematic cataloging of messenger RNA and microRNA sequences or DNA methylation patterns in a cell population, organ, or tissue sample, allow for an unbiased, comprehensive genome-level analysis of complex diseases, offering a large advantage over earlier "candidate" gene or pathway analyses. A primary goal in the analysis of these high-throughput assays is the detection of those features, among several thousand, that differ between different groups of samples. In the context of oral biology, our group has successfully utilized -omics technology to identify key molecules and pathways in different diagnostic entities of periodontal disease. A major issue when inferring biological information from high-throughput -omics studies is the fact that the sheer volume of high-dimensional data generated by contemporary technology is not appropriately analyzed using common statistical methods employed in the biomedical sciences. In this chapter, we outline a robust and well-accepted bioinformatics workflow for the initial analysis of -omics data generated using microarrays or next-generation sequencing technology using open-source tools. Starting with quality control measures and necessary preprocessing steps for data originating from different -omics technologies, we next outline a differential expression analysis pipeline that can be used for data from both microarray and sequencing experiments, and offers the possibility to account for random or fixed effects. Finally, we present an overview of the possibilities for a functional analysis of the obtained data.
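
    The core of such a differential expression pipeline can be sketched compactly: test each feature between groups, then control the false discovery rate across the thousands of parallel tests. The code below uses plain t-tests and a hand-rolled Benjamini-Hochberg step on synthetic data; production pipelines (e.g. limma, DESeq2) use moderated statistics and count models instead.

    ```python
    # Minimal differential-expression sketch: per-feature t-tests between two
    # groups plus Benjamini-Hochberg FDR control, on synthetic data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n_features, n_per_group = 5000, 6
    group_a = rng.normal(0.0, 1.0, (n_features, n_per_group))
    group_b = rng.normal(0.0, 1.0, (n_features, n_per_group))
    group_b[:50] += 2.0  # spike in 50 truly differential features

    t, p = stats.ttest_ind(group_a, group_b, axis=1)

    def benjamini_hochberg(pvals, alpha=0.05):
        """Return a boolean mask of features significant at the given FDR."""
        m = len(pvals)
        order = np.argsort(pvals)
        thresholds = alpha * (np.arange(1, m + 1) / m)
        passed = pvals[order] <= thresholds
        k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
        mask = np.zeros(m, dtype=bool)
        mask[order[:k]] = True
        return mask

    print(benjamini_hochberg(p).sum(), "features significant at 5% FDR")
    ```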

  18. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    Science.gov (United States)

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  19. Optical Tooling and its Uses at the Spallation Neutron Source (SNS)

    CERN Document Server

    Helus, Scott; Error, Joseph; Fazekas, Julius; Maines, James

    2005-01-01

    Optical tooling has been a mainstay of the accelerator alignment community for decades. Even now in the age of electronic survey equipment, optical tooling remains a viable alternative, and at times the only alternative. At SNS, we combine traditional optical tooling alignment methods, instrumentation, and techniques, with the more modern electronic techniques. This paper deals with the integration of optical tooling into the electronic survey world.

  20. For whom were Gospels written?

    Directory of Open Access Journals (Sweden)

    Richard Bauckham

    1999-12-01

    Full Text Available This article challenges the current consensus in Gospels scholarship that each Gospel was written for a specific church or group of churches. It argues that, since all our evidence about the early Christian movement shows it to have been a network of communities in constant, close communication, since all our evidence about early Christian leaders, such as might have written Gospels, shows them to have been typically people who travelled widely around the churches, and since, moreover, the evidence we have about early Christian literature shows that it did in fact circulate rapidly and widely, the strong probability is that the Gospels were written for general circulation around all the churches.

  1. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    International Nuclear Information System (INIS)

    Zaffino, Paolo; Spadea, Maria Francesca; Raudaschl, Patrik; Fritscher, Karl; Sharp, Gregory C.

    2016-01-01

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on the anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch by using the same parameters as identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy applications, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was equal to 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both for commercial and for research purposes) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against

  2. Technical Note: PLASTIMATCH MABS, an open source tool for automatic image segmentation

    Energy Technology Data Exchange (ETDEWEB)

    Zaffino, Paolo; Spadea, Maria Francesca [Department of Experimental and Clinical Medicine, Magna Graecia University of Catanzaro, Catanzaro 88100 (Italy); Raudaschl, Patrik; Fritscher, Karl [Institute for Biomedical Image Analysis, Private University of Health Sciences, Medical Informatics and Technology, Hall in Tirol 6060 (Austria); Sharp, Gregory C. [Department for Radiation Oncology, Massachusetts General Hospital, Boston, Massachusetts 02114 (United States)

    2016-09-15

    Purpose: Multiatlas based segmentation is widely used in many clinical and research applications. Due to its good performance, it has recently been included in some commercial platforms for radiotherapy planning and surgery guidance. However, to date, software without restrictions on the anatomical district or image modality has been missing. In this paper we introduce PLASTIMATCH MABS, an open source software that can be used with any image modality for automatic segmentation. Methods: The PLASTIMATCH MABS workflow consists of two main parts: (1) an offline phase, where optimal registration and voting parameters are tuned, and (2) an online phase, where a new patient is labeled from scratch by using the same parameters as identified in the former phase. Several registration strategies, as well as different voting criteria, can be selected. A flexible atlas selection scheme is also available. To prove the effectiveness of the proposed software across anatomical districts and image modalities, it was tested on two very different scenarios: head and neck (H&N) CT segmentation for radiotherapy applications, and magnetic resonance image brain labeling for neuroscience investigation. Results: For the neurological study, the minimum Dice coefficient was equal to 0.76 (investigated structures: left and right caudate, putamen, thalamus, and hippocampus). For the head and neck case, the minimum Dice was 0.42 for the most challenging structures (optic nerves and submandibular glands) and 0.62 for the other ones (mandible, brainstem, and parotid glands). The time required to obtain the labels was compatible with a real clinical workflow (35 and 120 min). Conclusions: The proposed software fills a gap in the multiatlas based segmentation field, since all currently available tools (both for commercial and for research purposes) are restricted to a well specified application. Furthermore, it can be adopted as a platform for exploring MABS parameters and as a reference implementation for comparing against
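
    The online phase described above ends with label fusion: each registered atlas casts a per-voxel vote and the consensus label wins. The sketch below shows plain majority voting on a toy image in Python; it illustrates the principle only, it is not PLASTIMATCH code, and PLASTIMATCH MABS offers additional voting criteria beyond this one.

    ```python
    # Label-fusion step at the heart of multiatlas segmentation: after each
    # atlas has been registered to the new patient and its labels warped,
    # a per-voxel majority vote produces the consensus label.
    import numpy as np

    def majority_vote(warped_labels):
        """warped_labels: shape (n_atlases, *image_shape), integer labels."""
        warped_labels = np.asarray(warped_labels)
        n_labels = warped_labels.max() + 1
        # Count votes for each label at each voxel, then take the argmax
        votes = np.stack([(warped_labels == lab).sum(axis=0)
                          for lab in range(n_labels)])
        return votes.argmax(axis=0)

    atlases = np.array([  # three atlases, a 2x4 toy "image", labels 0/1
        [[0, 1, 1, 0], [0, 0, 1, 1]],
        [[0, 1, 0, 0], [0, 1, 1, 1]],
        [[1, 1, 1, 0], [0, 0, 1, 0]],
    ])
    print(majority_vote(atlases))  # [[0 1 1 0] [0 0 1 1]]
    ```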

  3. Free and Open Source Tools (FOSTs): An Empirical Investigation of Pre-Service Teachers' Competencies, Attitudes, and Pedagogical Intentions

    Science.gov (United States)

    Asing-Cashman, Joyce G.; Gurung, Binod; Limbu, Yam B.; Rutledge, David

    2014-01-01

    This study examines the digital native pre-service teachers' (DNPSTs) perceptions of their competency, attitude, and pedagogical intention to use free and open source tools (FOSTs) in their future teaching. Participants were 294 PSTs who responded to pre-course surveys at the beginning of an educational technology course. Using the structural…

  4. Failure to Follow Written Procedures

    Science.gov (United States)

    2017-12-01

    Most tasks in aviation have a mandated written procedure to be followed, specifically under Title 14 of the Code of Federal Regulations (CFR), Section 43.13(a). However, the incidence of Failure to Follow Procedure (FFP) events continues to be a major iss...

  5. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    Science.gov (United States)

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired-quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.

  6. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    Directory of Open Access Journals (Sweden)

    Khushi Matloob

    2013-02-01

    Full Text Available Abstract Background Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results We have developed two Java based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient’s clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934
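
    Conceptually, the splitting step is simple tiling plus an emptiness filter. The sketch below does the same with Pillow in Python rather than the Java NDPI-Splitter (NDPI files need a dedicated reader; the file names, tile size, and blank-tile threshold are invented for the example).

    ```python
    # Sketch of what NDPI-Splitter does conceptually: cut a large scan into
    # fixed-size TIFF tiles and skip (filter out) nearly-empty ones.
    import numpy as np
    from PIL import Image

    def split_into_tiles(path, tile_size=2048, empty_std_threshold=5.0):
        Image.MAX_IMAGE_PIXELS = None  # allow very large scans
        img = Image.open(path)
        w, h = img.size
        for x in range(0, w, tile_size):
            for y in range(0, h, tile_size):
                tile = img.crop((x, y, min(x + tile_size, w),
                                 min(y + tile_size, h)))
                # Treat low-variance (blank background) tiles as empty
                if np.asarray(tile.convert("L")).std() < empty_std_threshold:
                    continue
                tile.save(f"tile_{x}_{y}.tif")

    split_into_tiles("slide.tif")  # hypothetical input file
    ```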

  7. Minimal Poems Written in 1979 Minimal Poems Written in 1979

    Directory of Open Access Journals (Sweden)

    Sandra Sirangelo Maggio

    2008-04-01

    Full Text Available The reading of M. van der Slice's Minimal Poems Written in 1979 (the work, actually, has no title) reminded me of a book I saw a long time ago, called Truth, which had not even a single word printed inside. In either case we have a sample of how often eccentricities can prove efficient means of artistic creativity, in this new literary trend known as Minimalism.

  8. Readability of Written Materials for CKD Patients: A Systematic Review.

    Science.gov (United States)

    Morony, Suzanne; Flynn, Michaela; McCaffery, Kirsten J; Jansen, Jesse; Webster, Angela C

    2015-06-01

    The "average" patient has a literacy level of US grade 8 (age 13-14 years), but this may be lower for people with chronic kidney disease (CKD). Current guidelines suggest that patient education materials should be pitched at a literacy level of around 5th grade (age 10-11 years). This study aims to evaluate the readability of written materials targeted at patients with CKD. Systematic review. Patient information materials aimed at adults with CKD and written in English. Patient education materials designed to be printed and read, sourced from practices in Australia and online at all known websites run by relevant international CKD organizations during March 2014. Quantitative analysis of readability using Lexile Analyzer and Flesch-Kincaid tools. We analyzed 80 materials. Both Lexile Analyzer and Flesch-Kincaid analyses suggested that most materials required a minimum of grade 9 (age 14-15 years) schooling to read them. Only 5% of materials were pitched at the recommended level (grade 5). Readability formulas have inherent limitations and do not account for visual information. We did not consider other media through which patients with CKD may access information. Although the study covered materials from the United States, United Kingdom, and Australia, all non-Internet materials were sourced locally, and it is possible that some international paper-based materials were missed. Generalizability may be limited due to exclusion of non-English materials. These findings suggest that patient information materials aimed at patients with CKD are pitched above the average patient's literacy level. This issue is compounded by cognitive decline in patients with CKD, who may have lower literacy than the average patient. It suggests that information providers need to consider their audience more carefully when preparing patient information materials, including user testing with a low-literacy patient population. Copyright © 2015 National Kidney Foundation, Inc. Published by

  9. RdTools: An Open Source Python Library for PV Degradation Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Deceglie, Michael G [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Jordan, Dirk [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Nag, Ambarish [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Deline, Christopher A [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Shinn, Adam [kWh Analytics

    2018-05-04

    RdTools is a set of Python tools for the analysis of photovoltaic data. In particular, PV production data are evaluated over several years to obtain rates of performance degradation over time. RdTools can handle both high-frequency (hourly or better) and low-frequency (daily, weekly, etc.) datasets. Best results are obtained with higher-frequency data.
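    As a minimal sketch of how such an analysis might look, the fragment below feeds a synthetic, already-normalized energy series (2%/year decay plus noise) to RdTools' year-on-year degradation routine; exact function signatures may differ between RdTools versions.

```python
import numpy as np
import pandas as pd
import rdtools

idx = pd.date_range("2015-01-01", periods=4 * 365, freq="D")
decay = 1 - 0.02 * np.arange(len(idx)) / 365.0           # built-in 2 %/yr degradation
energy = pd.Series(decay * (1 + 0.01 * np.random.randn(len(idx))), index=idx)

# year-on-year degradation rate with confidence interval
rd, rd_ci, info = rdtools.degradation_year_on_year(energy)
print(f"degradation rate: {rd:.2f} %/yr (CI {rd_ci[0]:.2f} to {rd_ci[1]:.2f})")
```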

  10. Overview of the tool-flow for the Montium Processing Tile

    NARCIS (Netherlands)

    Smit, Gerardus Johannes Maria; Rosien, M.A.J.; Guo, Y.; Heysters, P.M.

    This paper presents an overview of a tool chain to support a transformational design methodology. The tool can be used to compile code written in a high-level source language, like C, to a coarse-grain reconfigurable architecture. The source code is first translated into a Control Data Flow Graph

  11. An open source software tool to assign the material properties of bone for ABAQUS finite element simulations.

    Science.gov (United States)

    Pegg, Elise C; Gill, Harinderjit S

    2016-09-06

    A new software tool to assign the material properties of bone to an ABAQUS finite element mesh was created and compared with Bonemat, a similar tool originally designed to work with Ansys finite element models. Our software tool (py_bonemat_abaqus) was written in Python, which is the chosen scripting language for ABAQUS. The purpose of this study was to compare the software packages in terms of the material assignment calculation and processing speed. Three element types were compared (linear hexahedral (C3D8), linear tetrahedral (C3D4) and quadratic tetrahedral elements (C3D10)), both individually and as part of a mesh. Comparisons were made using a CT scan of a hemi-pelvis as a test case. A small difference, of -0.05 kPa on average, was found between Bonemat version 3.1 (the current version) and our Python package. Errors were found in the previous release of Bonemat (version 3.0 downloaded from www.biomedtown.org) during calculation of the quadratic tetrahedron Jacobian, and conversion of the apparent density to modulus when integrating over the Young's modulus field. These issues caused up to 2 GPa error in the modulus assignment. For these reasons, we recommend users upgrade to the most recent release of Bonemat. Processing speeds were assessed for the three different element types. Our Python package took significantly longer (110 s on average) to perform the calculations compared with the Bonemat software (10 s). Nevertheless, the workflow advantages of the package and added functionality make 'py_bonemat_abaqus' a useful tool for ABAQUS users. Copyright © 2016 Elsevier Ltd. All rights reserved.
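    The density-to-modulus step that such tools automate can be sketched briefly. In the Python below, the CT calibration constants and the power-law coefficients are illustrative placeholders; real studies take them from the scanner calibration and from the literature for the anatomical site.

```python
import numpy as np

def hu_to_density(hu, slope=0.0008, intercept=0.05):
    """Apparent density [g/cm^3] from CT Hounsfield units (linear calibration)."""
    return slope * hu + intercept

def density_to_modulus(rho, a=6850.0, b=1.49):
    """Young's modulus [MPa] from apparent density via a power law E = a * rho^b."""
    return a * np.power(np.clip(rho, 1e-6, None), b)

hu_per_element = np.array([150.0, 600.0, 1200.0])   # mean HU sampled per element
E = density_to_modulus(hu_to_density(hu_per_element))
print(E.round(1))  # one modulus per element, ready to group into material bins
```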

  12. "I CAMMINI DELLA REGINA" - Open Source based tools for preserving and culturally exploring historical traffic routes.

    Science.gov (United States)

    Cannata, Massimiliano; Colombo, Massimo; Antonovic, Milan; Cardoso, Mirko; Delucchi, Andrea; Gianocca, Giancarlo; Brovelli, Maria Antonia

    2015-04-01

    "I CAMMINI DELLA REGINA" (The Via Regina Paths) is an Interreg project funded within the transnational cooperation program between Italy and Switzerland 2007-2013. The aim of this project is the preservation and valorization of the cultural heritage linked to the walking historically paths crossing, connecting and serving the local territories. With the approach of leveraging the already existing tools, which generally consist of technical descriptions of the paths, the project uses the open source geospatial technologies to deploy innovative solutions which can fill some of the gaps in historical-cultural tourism offers. The Swiss part, and particularly the IST-SUPSI team, has been focusing its activities in the realization of two innovative solutions: a mobile application for the survey of historical paths and a storytelling system for immersive cultural exploration of the historical paths. The former, based on Android, allows to apply in a revised manner a consolidated and already successfully used methodology of survey focused on the conservation of the historical paths (Inventory of historical traffic routes in Switzerland). Up to now operators could rely only on hand work based on a combination of notes, pictures and GPS devices synthesized in manually drawn maps; this procedure is error prone and shows many problems both in data updating and extracting for elaborations. Thus it has been created an easy to use interface which allows to map, according to a newly developed spatially enabled data model, paths, morphological elements, and multimedia notes. When connected to the internet the application can send the data to a web service which, after applying linear referencing and further elaborating the data, makes them available using open standards. The storytelling system has been designed to provide users with cultural insights embedded in a multimedial and immersive geospatial portal. Whether the tourist is exploring physically or virtually the desired

  13. SedInConnect: a stand-alone, free and open source tool for the assessment of sediment connectivity

    Science.gov (United States)

    Crema, Stefano; Cavalli, Marco

    2018-02-01

    There is a growing call, within the scientific community, for solid theoretical frameworks and usable indices/models to assess sediment connectivity. Connectivity plays a significant role in characterizing structural properties of the landscape and, when considered in combination with forcing processes (e.g., rainfall-runoff modelling), can represent a valuable analysis for improved landscape management. In this work, the authors present the development and application of SedInConnect: a free, open source and stand-alone application for the computation of the Index of Connectivity (IC), as expressed in Cavalli et al. (2013), with the addition of specific innovative features. The tool is intended to have a wide variety of users, both from the scientific community and from the authorities involved in environmental planning. Thanks to its open source nature, the tool can be adapted and/or integrated according to the users' requirements. Furthermore, presenting an easy-to-use interface and being a stand-alone application, the tool can help management experts in the quantitative assessment of sediment connectivity in the context of hazard and risk assessment. An application to a sample dataset and an overview of up-to-date applications of the approach and of the tool show the development potential of such analyses. The modelled connectivity, in fact, appears suitable not only to characterize sediment dynamics at the catchment scale but also to integrate prediction models and to serve as a tool aiding geomorphological interpretation.
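    For readers unfamiliar with the index, Cavalli et al. (2013) define IC = log10(Dup/Ddn), where Dup = W*S*sqrt(A) combines the mean weighting factor, mean slope and area of the upslope contributing area, and Ddn sums d_i/(W_i*S_i) along the downslope flow path. A toy Python sketch (all input numbers invented) follows:

```python
import math

def index_of_connectivity(w_mean, s_mean, area_m2, path):
    """path: list of (length_i, w_i, s_i) steps along the downslope flow line."""
    d_up = w_mean * s_mean * math.sqrt(area_m2)
    d_dn = sum(d / (w * s) for d, w, s in path)
    return math.log10(d_up / d_dn)

# a pixel with 5 ha upslope area and a two-step downslope path
print(index_of_connectivity(0.6, 0.25, 5.0e4, [(10.0, 0.6, 0.3), (25.0, 0.4, 0.1)]))
```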

  14. Development of a tool dedicated to the evaluation of hydrogen term source for technological Wastes: assumptions, physical models, and validation

    Energy Technology Data Exchange (ETDEWEB)

    Lamouroux, C. [CEA Saclay, Nuclear Energy Division /DANS, Department of physico-chemistry, 91191 Gif sur yvette (France); Esnouf, S. [CEA Saclay, DSM/IRAMIS/SIS2M/Radiolysis Laboratory , 91191 Gif sur yvette (France); Cochin, F. [Areva NC,recycling BU, DIRP/RDP tour Areva, 92084 Paris La Defense (France)

    2013-07-01

    In radioactive waste packages, hydrogen is generated, on the one hand, from the radiolysis of wastes (mainly organic materials) and, on the other hand, from the radiolysis of the water content of the cement matrix. In order to assess hydrogen generation, two tools based on operational models have been developed. One is dedicated to the determination of the hydrogen source term arising from the radiolysis of the wastes: the STORAGE tool (Simulation Tool Of Emission Radiolysis Gas); the other deals with the hydrogen source term produced by radiolysis of the cement matrices (the Damar tool). The approach used by the STORAGE tool for assessing the production rate of radiolysis gases is divided into five steps: 1) specification of the package data, in particular the inventories and radiological materials defined for a package medium; 2) determination of the radiochemical yields for the different constituents and the associated behaviour laws; this determination of radiochemical yields is made from the PRELOG database, in which radiochemical yields under different irradiation conditions have been compiled; 3) definition of hypotheses concerning the composition and the distribution of contamination inside the package, to allow assessment of the power absorbed by the constituents; 4) summation of all the contributions; and finally, 5) validation calculations by comparison with a reduced sampling of packages. Comparisons with measured values confirm the conservative character of the methodology and give confidence in the safety margins for the safety analysis report.
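    The core bookkeeping behind such a source-term estimate is the product of a radiochemical yield and an absorbed dose power. The sketch below is a simplification with invented G-values and powers, not an extract from STORAGE or the PRELOG database:

```python
# 1 molecule per 100 eV expressed in mol/J (1 / (100 * 1.602e-19 J) / 6.022e23)
MOL_PER_J_PER_G_UNIT = 1.036e-7

def h2_rate_mol_per_s(g_value_per_100ev, absorbed_power_w):
    """H2 production rate = radiochemical yield G(H2) x power absorbed by the material."""
    return g_value_per_100ev * MOL_PER_J_PER_G_UNIT * absorbed_power_w

# e.g. a PVC-like waste (G ~ 0.7 molecules/100 eV) absorbing 0.5 W of radiation
rate = h2_rate_mol_per_s(0.7, 0.5)
print(f"{rate:.2e} mol H2/s, about {rate * 3.15e7 * 22.4:.1f} L(STP) per year")
```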

  15. Simulations of a spectral gamma-ray logging tool response to a surface source distribution on the borehole wall

    International Nuclear Information System (INIS)

    Wilson, R.D.; Conaway, J.G.

    1991-01-01

    We have developed Monte Carlo and discrete ordinates simulation models for the large-detector spectral gamma-ray (SGR) logging tool in use at the Nevada Test Site. Application of the simulation models produced spectra for source layers on the borehole wall, either from potassium-bearing mudcakes or from plate-out of radon daughter products. Simulations show that the shape and magnitude of gamma-ray spectra from sources distributed on the borehole wall depend on radial position within the air-filled borehole as well as on hole diameter. No such dependence is observed for sources uniformly distributed in the formation. In addition, sources on the borehole wall produce anisotropic angular fluxes at the higher scattered energies and at the source energy. These differences in borehole effects and in angular flux are important to the process of correcting SGR logs for the presence of potassium mudcakes; they also suggest a technique for distinguishing between spectral contributions from formation sources and sources on the borehole wall. These results imply the existence of a standoff effect not present for spectra measured in air-filled boreholes from formation sources. 5 refs., 11 figs

  16. A Metric Tool for Predicting Source Code Quality from a PDL Design

    OpenAIRE

    Henry, Sallie M.; Selig, Calvin

    1987-01-01

    The software crisis has increased the demand for automated tools to assist software developers in the production of quality software. Quality metrics have given software developers a tool to measure software quality. These measurements, however, are available only after the software has been produced. Due to high cost, software managers are reluctant to redesign and reimplement low quality software. Ideally, a life cycle which allows early measurement of software quality is a necessary ingre...

  17. Financial analysis as a marketing tool in the process of awareness increase in the area of renewable energy sources

    Directory of Open Access Journals (Sweden)

    Marcela Taušová

    2007-04-01

    Full Text Available Alternative sources of energy represent a great area of progress nowadays. The trend of the 21st century is energy-demanding, with an increasing tendency to use fossil fuels, the sources of which are, however, limited. The article deals with the necessity of using marketing tools with the aim of increasing the share of these energy resources on the Slovak market. The result will be a financial advantage for future users on one side and an increase in the volume of sales for vendors on the other.

  18. Parallel Beam Dynamics Simulation Tools for Future Light Source Linac Modeling

    International Nuclear Information System (INIS)

    Qiang, Ji; Pogorelov, Ilya v.; Ryne, Robert D.

    2007-01-01

    Large-scale modeling on parallel computers is playing an increasingly important role in the design of future light sources. Such modeling provides a means to accurately and efficiently explore issues such as limits to beam brightness, emittance preservation, the growth of instabilities, etc. Recently the IMPACT code suite was enhanced to be applicable to future light source design. Simulations with IMPACT-Z were performed using up to one billion simulation particles for the main linac of a future light source to study the microbunching instability. Combined with the time domain code IMPACT-T, it is now possible to perform large-scale start-to-end linac simulations for future light sources, including the injector, main linac, chicanes, and transfer lines. In this paper we provide an overview of the IMPACT code suite, its key capabilities, and recent enhancements pertinent to accelerator modeling for future linac-based light sources

  19. A new software for deformation source optimization, the Bayesian Earthquake Analysis Tool (BEAT)

    Science.gov (United States)

    Vasyura-Bathke, H.; Dutta, R.; Jonsson, S.; Mai, P. M.

    2017-12-01

    Modern studies of crustal deformation and the related source estimation, including magmatic and tectonic sources, increasingly use non-linear optimization strategies to estimate geometric and/or kinematic source parameters, often considering geodetic and seismic data jointly. Bayesian inference is increasingly being used for estimating posterior distributions of deformation source model parameters, given measured/estimated/assumed data and model uncertainties. For instance, some studies consider uncertainties of a layered medium and propagate these into source parameter uncertainties, while others use informative priors to reduce the model parameter space. In addition, innovative sampling algorithms have been developed to efficiently explore the high-dimensional parameter spaces. Compared to earlier studies, these improvements have resulted in overall more robust source model parameter estimates that include uncertainties. However, the computational burden of these methods is high and estimation codes are rarely made available along with the published results. Even if the codes are accessible, it is usually challenging to assemble them into a single optimization framework as they are typically coded in different programming languages. Therefore, further progress and future applications of these methods/codes are hampered, while reproducibility and validation of results has become essentially impossible. In the spirit of providing open-access and modular codes to facilitate progress and reproducible research in deformation source estimations, we undertook the effort of developing BEAT, a Python package that comprises all the above-mentioned features in one single programming environment. The package builds on the pyrocko seismological toolbox (www.pyrocko.org), and uses the pymc3 module for Bayesian statistical model fitting. BEAT is an open-source package (https://github.com/hvasbath/beat), and we encourage and solicit contributions to the project. Here, we
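    The flavour of such pymc3-based estimation can be conveyed in a toy example. The following is not BEAT code: the forward model is an invented point-source decay curve and the priors and noise level are arbitrary, but the pattern of defining priors, a forward model, and a likelihood, then sampling the posterior, is the same.

```python
import numpy as np
import pymc3 as pm

x = np.linspace(0.0, 20.0, 40)                       # distance from source [km]
true_amp, true_depth = 5.0, 3.0
data = true_amp * true_depth**2 / (x**2 + true_depth**2) ** 1.5
data += 0.1 * np.random.randn(x.size)                # measurement noise

with pm.Model():
    amp = pm.Uniform("amp", 0.0, 20.0)               # weakly informative priors
    depth = pm.Uniform("depth", 0.1, 10.0)
    mu = amp * depth**2 / (x**2 + depth**2) ** 1.5   # forward model
    pm.Normal("obs", mu=mu, sigma=0.1, observed=data)
    trace = pm.sample(1000, tune=1000, cores=1)

print(pm.summary(trace, var_names=["amp", "depth"]))
```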

  20. Improving mass measurement accuracy in mass spectrometry based proteomics by combining open source tools for chromatographic alignment and internal calibration.

    Science.gov (United States)

    Palmblad, Magnus; van der Burgt, Yuri E M; Dalebout, Hans; Derks, Rico J E; Schoenmaker, Bart; Deelder, André M

    2009-05-02

    Accurate mass determination enhances peptide identification in mass spectrometry based proteomics. We here describe the combination of two previously published open source software tools to improve mass measurement accuracy in Fourier transform ion cyclotron resonance mass spectrometry (FTICRMS). The first program, msalign, aligns one MS/MS dataset with one FTICRMS dataset. The second software, recal2, uses peptides identified from the MS/MS data for automated internal calibration of the FTICR spectra, resulting in sub-ppm mass measurement errors.
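    The recalibration idea can be sketched compactly: take the theoretical masses of the MS/MS-identified peptides, fit a correction to the measured m/z values, and apply it to the whole spectrum. The Python below uses invented peak values and a simple linear law, whereas real FTICR calibration functions are instrument-specific.

```python
import numpy as np

measured = np.array([500.2132, 842.5101, 1045.5642, 1296.6853])      # observed m/z
theoretical = np.array([500.2101, 842.5094, 1045.5637, 1296.6848])   # from MS/MS IDs

a, b = np.polyfit(measured, theoretical, 1)   # m/z_true ~ a * m/z_meas + b
recalibrated = a * measured + b

ppm_before = (measured - theoretical) / theoretical * 1e6
ppm_after = (recalibrated - theoretical) / theoretical * 1e6
print(ppm_before.round(2), ppm_after.round(2))   # errors shrink toward sub-ppm
```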

  1. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    Science.gov (United States)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (currently > 6000 active users, from an estimated community of comparable size) embeds the tools of the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and with security; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware like multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open source licensing to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven datasets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared privately by "sneaker-net". As a partial solution to this, we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for usage like uncertainty quantification for hazard analysis using physical

  2. Public data and open source tools for multi-assay genomic investigation of disease.

    Science.gov (United States)

    Kannan, Lavanya; Ramos, Marcel; Re, Angela; El-Hachem, Nehme; Safikhani, Zhaleh; Gendoo, Deena M A; Davis, Sean; Gomez-Cabrero, David; Castelo, Robert; Hansen, Kasper D; Carey, Vincent J; Morgan, Martin; Culhane, Aedín C; Haibe-Kains, Benjamin; Waldron, Levi

    2016-07-01

    Molecular interrogation of a biological sample through DNA sequencing, RNA and microRNA profiling, proteomics and other assays, has the potential to provide a systems level approach to predicting treatment response and disease progression, and to developing precision therapies. Large publicly funded projects have generated extensive and freely available multi-assay data resources; however, bioinformatic and statistical methods for the analysis of such experiments are still nascent. We review multi-assay genomic data resources in the areas of clinical oncology, pharmacogenomics and other perturbation experiments, population genomics and regulatory genomics and other areas, and tools for data acquisition. Finally, we review bioinformatic tools that are explicitly geared toward integrative genomic data visualization and analysis. This review provides starting points for accessing publicly available data and tools to support development of needed integrative methods. © The Author 2015. Published by Oxford University Press.

  3. STUDENT OPINION TOWARDS USING AN OPEN SOURCE LEARNING MANAGEMENT SYSTEM TOGETHER WITH A COLLABORATIVE TOOL

    Directory of Open Access Journals (Sweden)

    Nadire Cavus

    2008-12-01

    Full Text Available This paper is about a pilot study which was carried out at the Near East University during the 2004/5 Fall Semester using the Moodle LMS together with the GREWPtool collaborative editor. The system was tested with 36 students taking the Java and Pascal programming courses. The results of the pilot study showed that a Learning Management System can be made more efficient if it is enhanced by a collaborative learning tool. Our results have also shown that programming languages such as Pascal and Java can be taught successfully in a web-based environment using an LMS together with a collaborative tool.

  4. Promoting Strong Written Communication Skills

    Science.gov (United States)

    Narayanan, M.

    2015-12-01

    The reason that an improvement in the quality of technical writing is still needed in the classroom is that universities are facing challenging problems not only on the technological front but also on the socio-economic front. Universities are actively responding to the changes that are taking place in the global consumer marketplace. Obviously, there are numerous benefits to promoting strong written communication skills. They can be summarized into the following six categories. First, and perhaps most important, the university achieves learner satisfaction: the learner has documented, verbally, that the necessary knowledge has been successfully acquired. This results in learner loyalty that in turn will attract more qualified learners. Second, quality communication lowers the cost per pupil, consequently resulting in increased productivity backed by a stronger economic structure and forecast. Third, quality communication helps to improve the cash flow and cash reserves of the university. Fourth, high quality communication enables the university to justify the high costs of tuition and fees. Fifth, better quality in written communication skills results in attracting top-quality learners, which leads to happier and more satisfied learners, not to mention greater prosperity for the university as a whole. Sixth, quality written communication skills result in reduced complaints, and thus fewer hours spent answering or correcting the situation. The university faculty and staff are thus able to devote more time to scholarly activities, meaningful research and productive community service.

  5. AHP Expert Programme As A Tool For Unsealed Sources Contamination Control Of The Environment

    International Nuclear Information System (INIS)

    Amin, E.T.; Ibrahim, M.S.; Hussein, A.Z.

    2007-01-01

    Unsealed sources of radionuclides are widely used in hot laboratories of medical centers and hospitals; they can be easily dispersed and may be inadvertently taken into the body. The presence of radioactive substances inside the human body generates a risk of internal intakes of radionuclides and retention in organ tissue. In order to control any contamination arising from unsealed sources, an AHP programme (a PC programme) has been developed which includes data on most of the unsealed sources used in the hot laboratories of nuclear medicine units at hospitals and medical centers. A sequence of questions is presented by the programme concerning the address of the place and the uses, activity and half-life of the unsealed radioisotopes that may cause contamination. The programme also gives output information about the hospital using the unsealed source and its location, which facilitates emergency planning and contamination control for the environment.

  6. ADVANCED TOOLS FOR ASSESSING SELECTED PRESCRIPTION AND ILLICIT DRUGS IN TREATED SEWAGE EFFLUENTS AND SOURCE WATERS

    Science.gov (United States)

    The purpose of this poster is to present the application and assessment of advanced technologies in a real-world environment - wastewater effluent and source waters - for detecting six drugs (azithromycin, fluoxetine, omeprazole, levothyroxine, methamphetamine, and methylenedioxy...

  7. Free and Open Source GIS Tools: Role and Relevance in the Environmental Assessment Community

    Science.gov (United States)

    The presence of an explicit geographical context in most environmental decisions can complicate assessment and selection of management options. These decisions typically involve numerous data sources, complex environmental and ecological processes and their associated models, ris...

  8. Environmental and Landscape Remote Sensing Using Free and Open Source Image Processing Tools

    Science.gov (United States)

    As global climate change and human activities impact the environment, there is a growing need for scientific tools to monitor and measure environmental conditions that support human and ecological health. Remotely sensed imagery from satellite and airborne platforms provides a g...

  9. Compressed-air power tools in orthopaedic surgery: exhaust air is a potential source of contamination.

    Science.gov (United States)

    Sagi, H C; DiPasquale, Thomas; Sanders, Roy; Herscovici, Dolfi

    2002-01-01

    To determine if the exhaust from surgical compressed-air power tools contains bacteria and if the exhaust leads to contamination of sterile surfaces. Bacteriologic study of orthopaedic power tools. Level I trauma center operative theater. None. Part I: exhaust from two sterile compact air drills was sampled directly at the exhaust port. Part II: exhaust from the drills was directed at sterile agar plates from varying distances; the agar plates represented sterile surfaces within the operative field. Part III: control cultures. A battery-powered drill was operated over open agar plates in similar fashion to the compressed-air drills. Agar plates left open in the operative theater served as controls to rule out atmospheric contamination. Random cultures were taken from agar plates, gloves, drills, and hoses. Incidence of positive cultures. In Part I, all filters from both compressed-air drill exhausts were culture negative (P = 0.008). In Part II, the incidence of positive cultures for air drills number one and number two was 73% and 82%, respectively. The most commonly encountered organisms were coagulase-negative Staphylococcus and Micrococcus species. All control cultures from agar plates, the battery-powered drill, gloves, and hoses were negative. Exhaust from compressed-air power tools in orthopaedic surgery may contribute to the dissemination of bacteria onto the surgical field. We do not recommend the use of compressed-air power tools that do not have a contained exhaust.

  10. Testing an Open Source installation and server provisioning tool for the INFN CNAF Tierl Storage system

    International Nuclear Information System (INIS)

    Pezzi, M; Favaro, M; Gregori, D; Ricci, P P; Sapunenko, V

    2014-01-01

    In large computing centers, such as the INFN CNAF Tier1 [1], it is essential to be able to configure all the machines, depending on use, in an automated way. For several years Quattor [2], a server provisioning tool, has been used at the Tier1 and is currently in production. Nevertheless, we have recently started a comparison study involving other tools able to provide specific server installation and configuration features and also to offer a fully customizable solution as an alternative to Quattor. Our choice at the moment fell on the integration of two tools: Cobbler [3] for the installation phase and Puppet [4] for the server provisioning and management operations. The tools should provide the following properties in order to replicate and gradually improve the current system features: implement a check for storage-specific constraints, such as a kernel-module blacklist at boot time to avoid undesired SAN (Storage Area Network) access during disk partitioning; a simple and effective mechanism for kernel upgrade and downgrade; the ability to set the package provider using yum, rpm or apt; easy-to-use virtual machine installation support, including bonding and specific Ethernet configurations; and scalability for managing thousands of nodes and parallel installations. This paper describes the results of the comparison and the tests carried out to verify the requirements and the suitability of the new system in the INFN-T1 environment.

  11. Total organic carbon, an important tool in a holistic approach to hydrocarbon source fingerprinting

    International Nuclear Information System (INIS)

    Boehm, P.D.; Burns, W.A.; Page, D.S.; Bence, A.E.; Mankiewicz, P.J.; Brown, J.S.; Douglas, G.S.

    2002-01-01

    Total organic carbon (TOC) was used to verify the consistency of source allocation results for the natural petrogenic hydrocarbon background of the northern Gulf of Alaska and Prince William Sound where the Exxon Valdez oil spill occurred in 1989. The samples used in the study were either pre-spill sediments or from the seafloor outside the spill path. It is assumed that the natural petrogenic hydrocarbon background in the area comes from either seep oil residues and shale erosion including erosion from petroleum source rock shales, or from coals including those of the Bering River coalfields. The objective of this study was to use the TOC calculations to discriminate between the two very different sources. TOC can constrain the contributions of specific sources and rule out incorrect source allocations, particularly when inputs are dominated by fossil organic carbon. The benthic sediments used in this study showed excellent agreement between measured TOC and calculated TOC from hydrocarbon fingerprint matches of polycyclic aromatic hydrocarbons (PAH) and chemical biomarkers. TOC and fingerprint matches confirmed that TOC sources were properly identified. The matches quantify the hydrocarbon contributions of different sources to the benthic sediments and the degree of hydrocarbon winnowing by waves and currents. It was concluded that the natural petrogenic hydrocarbon background in the sediments in the area comes from eroding Tertiary shales and oil seeps along the northern Gulf of Alaska coast. Thermally mature area coals are excluded from being important contributors to the background at Prince William Sound because of their high TOC content. 26 refs., 4 figs

  12. Total organic carbon, an important tool in a holistic approach to hydrocarbon source fingerprinting

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, P.D. [Battelle, Waltham, MA (United States); Burns, W.A. [W.A. Burns Consulting Services, Houston, TX (United States); Page, D.S. [Bowdoin College, Brunswick, ME (United States); Bence, A.E.; Mankiewicz, P.J. [ExxonMobil Upstream Research Co., Houston, TX (United States); Brown, J.S.; Douglas, G.S. [Battelle, Duxbury, MA (United States)

    2002-07-01

    Total organic carbon (TOC) was used to verify the consistency of source allocation results for the natural petrogenic hydrocarbon background of the northern Gulf of Alaska and Prince William Sound where the Exxon Valdez oil spill occurred in 1989. The samples used in the study were either pre-spill sediments or from the seafloor outside the spill path. It is assumed that the natural petrogenic hydrocarbon background in the area comes from either seep oil residues and shale erosion including erosion from petroleum source rock shales, or from coals including those of the Bering River coalfields. The objective of this study was to use the TOC calculations to discriminate between the two very different sources. TOC can constrain the contributions of specific sources and rule out incorrect source allocations, particularly when inputs are dominated by fossil organic carbon. The benthic sediments used in this study showed excellent agreement between measured TOC and calculated TOC from hydrocarbon fingerprint matches of polycyclic aromatic hydrocarbons (PAH) and chemical biomarkers. TOC and fingerprint matches confirmed that TOC sources were properly identified. The matches quantify the hydrocarbon contributions of different sources to the benthic sediments and the degree of hydrocarbon winnowing by waves and currents. It was concluded that the natural petrogenic hydrocarbon background in the sediments in the area comes from eroding Tertiary shales and oil seeps along the northern Gulf of Alaska coast. Thermally mature area coals are excluded from being important contributors to the background at Prince William Sound because of their high TOC content. 26 refs., 4 figs.

  13. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet
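    The pixel-wise fitting that MRmap automates reduces, for a multi-echo T2 measurement, to a mono-exponential fit per pixel. The Python sketch below (synthetic echo times and signals; MRmap itself was written in a high-level graphics language, not Python) shows the per-pixel step:

```python
import numpy as np
from scipy.optimize import curve_fit

def model(te, s0, t2):
    # mono-exponential decay S(TE) = S0 * exp(-TE / T2)
    return s0 * np.exp(-te / t2)

te = np.array([10.0, 30.0, 50.0, 70.0, 90.0])          # echo times [ms]
signal = model(te, 1000.0, 45.0) + 5.0 * np.random.randn(te.size)

(s0, t2), _ = curve_fit(model, te, signal, p0=(signal[0], 50.0))
print(f"T2 = {t2:.1f} ms")   # repeat per pixel to build the map
```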

  14. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.

  15. ThManager: An Open Source Tool for Creating and Visualizing SKOS

    Directory of Open Access Journals (Sweden)

    Javier Lacasta

    2007-09-01

    Full Text Available Knowledge organization systems denote formally represented knowledge that is used within the context of digital libraries to improve data sharing and information retrieval. To increase their use, and to reuse them when possible, it is vital to manage them adequately and to provide them in a standard interchange format. Simple Knowledge Organization System (SKOS) seems to be the most promising representation for the type of knowledge models used in digital libraries, but there is a lack of tools that are able to properly manage it. This work presents a tool that fills this gap, facilitating the use of such models in different environments and using SKOS as an interchange format.
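    For illustration, a SKOS fragment of the kind such tools manage can be produced with the rdflib Python library; the vocabulary URI and the two concepts below are invented for the example.

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

EX = Namespace("http://example.org/thesaurus/")   # placeholder vocabulary URI
g = Graph()
g.bind("skos", SKOS)

g.add((EX.hydrology, RDF.type, SKOS.Concept))
g.add((EX.hydrology, SKOS.prefLabel, Literal("hydrology", lang="en")))
g.add((EX.groundwater, RDF.type, SKOS.Concept))
g.add((EX.groundwater, SKOS.broader, EX.hydrology))   # thesaurus-style hierarchy

print(g.serialize(format="turtle"))
```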

  16. The Advanced Light Source: A new tool for research in atomic and molecular physics

    International Nuclear Information System (INIS)

    Schlachter, F.; Robinson, A.

    1991-04-01

    The Advanced Light Source at the Lawrence Berkeley Laboratory will be the world's brightest synchrotron radiation source in the extreme ultraviolet and soft x-ray regions of the spectrum when it begins operation in 1993. It will be available as a national user facility to researchers in a broad range of disciplines, including materials science, atomic and molecular physics, chemistry, biology, imaging, and technology. The high brightness of the ALS will be particularly well suited to high-resolution studies of tenuous targets, such as excited atoms, ions, and clusters. 13 figs., 4 tabs

  17. The use of virtual reality as an information tool on externalities of energy sources

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Maria I.B.; Mol, Antonio C.A.; Lapa, Celso M.F., E-mail: isabel@ien.gov.br, E-mail: mol@ien.gov.br, E-mail: lapa@ien.gov.br [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)

    2017-07-01

    Almost daily, communication media make some reference to the need to combat the indiscriminate use of fossil fuels and to use less polluting energy sources. In this scenario, nuclear energy should be presented as an option, but it is still surrounded by many myths. Thus, to inform young people about the characteristics of the main sources that compose the Brazilian energy matrix, it is necessary to promote the transfer of knowledge and to demystify the nuclear sector in a playful and responsible way. (author)

  18. The use of virtual reality as an information tool on externalities of energy sources

    International Nuclear Information System (INIS)

    Silva, Maria I.B.; Mol, Antonio C.A.; Lapa, Celso M.F.

    2017-01-01

    Almost daily, communication media make some reference to the need to combat the indiscriminate use of fossil fuels and to use less polluting energy sources. In this scenario, nuclear energy should be presented as an option, but it is still surrounded by many myths. Thus, to inform young people about the characteristics of the main sources that compose the Brazilian energy matrix, it is necessary to promote the transfer of knowledge and to demystify the nuclear sector in a playful and responsible way. (author)

  19. Rocker: Open source, easy-to-use tool for AUC and enrichment calculations and ROC visualization.

    Science.gov (United States)

    Lätti, Sakari; Niinivehmas, Sanna; Pentikäinen, Olli T

    2016-01-01

    The receiver operating characteristic (ROC) curve, together with the calculation of the area under the curve (AUC), is a useful tool to evaluate the performance of biomedical and chemoinformatics data. For example, in virtual drug screening ROC curves are very often used to visualize the efficiency of the application used to separate active ligands from inactive molecules. Unfortunately, most of the available tools for ROC analysis are implemented in commercially available software packages, or are plugins in statistical software, which are not always the easiest to use. Here, we present Rocker, a simple ROC curve visualization tool that can be used for the generation of publication quality images. Rocker also includes automatic calculation of the AUC for the ROC curve and the Boltzmann-enhanced discrimination of ROC (BEDROC). Furthermore, in virtual screening campaigns it is often important to understand the early enrichment of active ligand identification; for this, Rocker offers an automated calculation routine. To enable further development of Rocker, it is freely available (MIT-GPL license) for use and modification from our web-site (http://www.jyu.fi/rocker).
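    Rocker's two headline numbers are easy to reproduce in miniature. The sketch below computes the AUC as the Mann-Whitney pairwise statistic and a simple enrichment factor for the top 25% of an invented ranked list; it is a didactic stand-in, not Rocker's own implementation (which also covers BEDROC).

```python
import numpy as np

scores = np.array([0.95, 0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2])
labels = np.array([1, 1, 0, 1, 0, 0, 1, 0])        # 1 = active, 0 = inactive

# AUC = fraction of (active, inactive) pairs ranked correctly (ties count half)
pos, neg = scores[labels == 1], scores[labels == 0]
auc = np.mean([(p > n) + 0.5 * (p == n) for p in pos for n in neg])

# early enrichment: active rate in the top 25% relative to the overall rate
order = np.argsort(-scores)
top = order[: len(scores) // 4]
ef = labels[top].mean() / labels.mean()
print(f"AUC = {auc:.2f}, EF(25%) = {ef:.2f}")
```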

  20. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.

  1. Crowd Sourcing for Conservation: Web 2.0 a Powerful Tool for Biologists

    Directory of Open Access Journals (Sweden)

    William E. Boyd

    2012-05-01

    Full Text Available The advent and adoption of Web 2.0 technologies offers a powerful approach to enhancing the capture of information in natural resource ecology, notably community knowledge of species distributions. Such information has previously been collected using, for example, postal surveys; these are typically inefficient, with low response rates, high costs, and requiring respondents to be spatially literate. Here we describe an example, using the Google Maps Application Programming Interface, to discuss the opportunities such tools provide to conservation biology. Toad Tracker was created as a prototype to demonstrate the utility of this technology to document the distribution of an invasive vertebrate pest species, the cane toad, within Australia. While the technological aspects of this tool are satisfactory, manager resistance towards its use raises issues around the public nature of the technology, the collaborative (non-expert) role in data collection, and data ownership. We conclude by suggesting that, for such tools to be accepted by non-innovation adopters, work is required on the technological aspects and, importantly, a cultural change is required to create an environment of acceptance of the shifting relationship between authority, expertise and knowledge.

  2. A new tool that links landscale connectivity and source-sink dynamics to population viability

    Science.gov (United States)

    The importance of connectivity and source-sink dynamics to conservation planning is widely appreciated. But the use of these concepts in practical applications such as the identification of critical habitat has been slowed because few models are designed to identify demographic s...

  3. Comparing a phased combination of acoustical radiosity and the image source method with other simulation tools

    DEFF Research Database (Denmark)

    Marbjerg, Gerd Høy; Brunskog, Jonas; Jeong, Cheol-Ho

    2015-01-01

    A phased combination of acoustical radiosity and the image source method (PARISM) has been developed in order to be able to model both specular and diffuse reflections with angle-dependent and complex-valued acoustical descriptions of the surfaces. It is of great interest to model both specular...

  4. Development of Environmental Decision Support System: Unifying Cross-Discipline Data Access Through Open Source Tools

    Science.gov (United States)

    Freeman, S.; Darmenova, K.; Higgins, G. J.; Apling, D.

    2012-12-01

    A common theme when it comes to accessing climate and environmental datasets is that it can be difficult to answer the five basic questions: Who, What, When, Where, and Why. Sometimes even the act of locating a dataset or determining how it was generated can prove difficult. It is even more challenging for non-scientific individuals such as planners and policy makers who need to access and include such information in their work. Our Environmental Decision Support System (EDSS) attempts to address this issue by integrating several open source packages to create a simple yet robust web application for aggregating, searching, viewing, and downloading environmental information for both scientists and decision makers alike. The system comprises several open source components, each playing an important role in the EDSS. The Geoportal web application provides an intuitive interface for searching and managing metadata ingested from datasets/data sources. The GeoServer and ncWMS web applications provide overlays and information for visual presentations of the data through web mapping services (WMS) by ingesting ESRI shapefiles, NetCDF, and HDF files. Users of the EDSS can browse the catalog of available products, enter a simple search string, or constrain searches by temporal and spatial extents. Combined with a custom visualization web application, the EDSS provides a simple yet efficient means for users to not only access and manipulate climate and environmental data, but also trace the data source and the analytical methods used in the final decision aid products.
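    A client-side request against such a WMS stack can be sketched with the OWSLib Python package. The endpoint URL and layer name below are placeholders, not the EDSS's actual services:

```python
from owslib.wms import WebMapService

# connect to a hypothetical GeoServer/ncWMS endpoint
wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")

# fetch a rendered map tile for a named layer over the continental US
img = wms.getmap(layers=["climate:surface_temperature"],
                 srs="EPSG:4326",
                 bbox=(-125.0, 24.0, -66.0, 50.0),
                 size=(800, 400),
                 format="image/png",
                 transparent=True)
with open("temperature.png", "wb") as f:
    f.write(img.read())
```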

  5. Development of a Protocol and a Screening Tool for Selection of DNAPL Source Area Remediation

    Science.gov (United States)

    2012-02-01


  6. Moving converter as the possible tool for producing ultra-cold neutrons on pulsed neutron sources

    International Nuclear Information System (INIS)

    Pokotilovskij, Yu.N.

    1991-01-01

    A method is proposed for producing ultra-cold neutrons (UCN) at aperiodic pulsed neutron sources. It is based on the use of a fast-moving cooled converter of UCN during the neutron pulse and includes the trapping of the generated UCNs in a moving trap. 6 refs.; 2 figs.

  7. Open Source Tools for Numerical Simulation of Urban Greenhouse Gas Emissions

    Science.gov (United States)

    Nottrott, A.; Tan, S. M.; He, Y.

    2016-12-01

    There is a global movement toward urbanization. Approximately 7% of the global population lives in just 28 megacities, occupying less than 0.1% of the total land area used by human activity worldwide. These cities contribute a significant fraction of the global budget of anthropogenic primary pollutants and greenhouse gases. The 27 largest cities consume 9.9%, 9.3%, 6.7% and 3.0% of global gasoline, electricity, energy and water use, respectively. This impact motivates novel approaches to quantify and mitigate the growing contribution of megacity emissions to global climate change. Cities are characterized by complex topography, inhomogeneous turbulence, and variable pollutant source distributions. These features create a scale separation between local sources and urban scale emissions estimates known as the Grey-Zone. Modern computational fluid dynamics (CFD) techniques provide a quasi-deterministic, physically based toolset to bridge the scale separation gap between source level dynamics, local measurements, and urban scale emissions inventories. CFD has the capability to represent complex building topography and capture detailed 3D turbulence fields in the urban boundary layer. This presentation discusses the application of OpenFOAM to urban CFD simulations of natural gas leaks in cities. OpenFOAM is open source software for advanced numerical simulation of engineering and environmental fluid flows. When combined with free or low cost computer-aided drawing and GIS, OpenFOAM generates a detailed, 3D representation of urban wind fields. OpenFOAM was applied to model methane (CH4) emissions from various components of the natural gas distribution system, to investigate the impact of urban meteorology on mobile CH4 measurements. The numerical experiments demonstrate that CH4 concentration profiles are highly sensitive to the relative location of emission sources and buildings. Sources separated by distances of 5-10 meters showed significant differences in

  8. Multi-stage ranking of emergency technology alternatives for water source pollution accidents using a fuzzy group decision making tool.

    Science.gov (United States)

    Qu, Jianhua; Meng, Xianlin; You, Hong

    2016-06-05

    Due to the increasing number of unexpected water source pollution events, selection of the most appropriate disposal technology for a specific pollution scenario is of crucial importance to the security of urban water supplies. However, the formulation of the optimum option is considerably difficult owing to the substantial uncertainty of such accidents. In this research, a multi-stage technical screening and evaluation tool is proposed to determine the optimal technique scheme, considering the areas of pollutant elimination both in drinking water sources and in water treatment plants. In stage 1, a CBR-based group decision tool was developed to screen available technologies for different scenarios. Then, the threat degree caused by the pollution was estimated in stage 2 using a threat evaluation system and was partitioned into four levels. For each threat level, a corresponding set of technique evaluation criteria weights was obtained using Group-G1. To identify the optimal alternatives corresponding to the different threat levels, an extension of TOPSIS, a multi-criteria interval-valued trapezoidal fuzzy decision making technique containing the four arrays of criteria weights, to a group decision environment was investigated in stage 3. The effectiveness of the developed tool was illustrated by two actual thallium-contamination scenarios associated with different threat levels. Copyright © 2016 Elsevier B.V. All rights reserved.
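    The paper extends TOPSIS to interval-valued trapezoidal fuzzy group decisions; the sketch below shows only the classical crisp TOPSIS core it builds on. The alternatives-by-criteria matrix, the weights, and the benefit/cost flags are invented.

```python
import numpy as np

X = np.array([[7.0, 9.0, 4.0],     # candidate disposal technologies (rows)
              [8.0, 6.0, 6.0],     # scored against three criteria (columns)
              [5.0, 8.0, 9.0]])
w = np.array([0.5, 0.3, 0.2])      # criteria weights (e.g., from Group-G1)
benefit = np.array([True, True, False])   # the third criterion is a cost

V = w * X / np.linalg.norm(X, axis=0)            # weighted normalized matrix
ideal = np.where(benefit, V.max(0), V.min(0))    # positive ideal solution
anti = np.where(benefit, V.min(0), V.max(0))     # negative ideal solution
d_pos = np.linalg.norm(V - ideal, axis=1)
d_neg = np.linalg.norm(V - anti, axis=1)
closeness = d_neg / (d_pos + d_neg)              # rank alternatives high to low
print(closeness.round(3), "-> best:", int(closeness.argmax()))
```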

  9. Extending the 4I Organizational Learning Model: Information Sources, Foraging Processes and Tools

    Directory of Open Access Journals (Sweden)

    Tracy A. Jenkin

    2013-08-01

    Full Text Available The continued importance of organizational learning has recently led to several calls for further developing the theory. This article addresses these calls by extending Crossan, Lane and White's (1999) 4I model to include a fifth process, information foraging, and a fourth level, the tool. The resulting 5I organizational learning model can be generalized to a number of learning contexts, especially those that involve understanding and making sense of data and information. Given the need for organizations to both innovate and increase productivity, and the volumes of data and information that are available to support both, the 5I model addresses an important organizational issue.

  10. A Systematic Approach for Evaluating BPM Systems: Case Studies on Open Source and Proprietary Tools

    OpenAIRE

    Delgado , Andrea; Calegari , Daniel; Milanese , Pablo; Falcon , Renatta; García , Esteban

    2015-01-01

    Business Process Management Systems (BPMS) provide support for modeling, developing, deploying, executing and evaluating business processes in an organization. Selecting a BPMS is not a trivial task, not only due to the many existing alternatives, both in the open source and proprietary realms, but also because it requires a thorough evaluation of its capabilities, contextualizing them in the organizational environment in which they w...

  11. FREEWAT: FREE and open source software tools for WATer resource management

    OpenAIRE

    Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura

    2015-01-01

    FREEWAT is a HORIZON 2020 project financed by the EU Commission under the call WATER INNOVATION: BOOSTING ITS VALUE FOR EUROPE. FREEWAT's main result will be an open source and public domain GIS-integrated modelling environment for the simulation of water quantity and quality in surface water and groundwater, with an integrated water management and planning module. FREEWAT aims at promoting water resource management by simplifying the application of the Water Framework Directive and other EU wa...

  12. Open-source tool for automatic import of coded surveying data to multiple vector layers in GIS environment

    Directory of Open Access Journals (Sweden)

    Eva Stopková

    2016-12-01

    Full Text Available This paper deals with a tool that enables the import of coded data in a single text file to more than one vector layer (including attribute tables), together with automatic drawing of line and polygon objects and with optional conversion to CAD. The Python script v.in.survey is available as an add-on for the open-source software GRASS GIS (GRASS Development Team). The paper describes a case study based on surveying at the archaeological mission at Tell el-Retaba (Egypt). Advantages of the tool (e.g. significant optimization of surveying work) and its limits (demands on keeping conventions for the points' name coding) are discussed here as well. Possibilities for future development are suggested (e.g. generalization of the points' name coding or more complex attribute table creation).

  13. 47 CFR 76.936 - Written decision.

    Science.gov (United States)

    2010-10-01

    ... CABLE TELEVISION SERVICE Cable Rate Regulation § 76.936 Written decision. (a) A franchising authority... of interested parties. A franchising authority is not required to issue a written decision that...

  14. Three new nondestructive evaluation tools based on high flux neutron sources

    International Nuclear Information System (INIS)

    Hubbard, C.R.; Raine, D.; Peascoe, R.; Wright, M.

    1997-01-01

    Nondestructive evaluation methods and systems based on specific attributes of neutron interactions with materials are being developed. The special attributes of neutrons are low attenuation in most engineering materials, strong interaction with low-Z elements, and epithermal neutron absorption resonances. The three methods under development at ORNL include neutron-based tomography and radiography; through-thickness, nondestructive texture mapping; and internal, noninvasive temperature measurement. All three techniques require high-flux sources such as the High Flux Isotope Reactor, a steady state source, or the Oak Ridge Electron Linear Accelerator, a pulsed neutron source. Neutrons are quite penetrating in most engineering materials and thus can be useful to detect internal flaws and features. Hydrogen atoms, such as in a hydrocarbon fuel, lubricant, or a metal hydride, are relatively opaque to neutron transmission, and thus neutron-based tomography/radiography is ideal to image their presence. Texture, the nonrandom orientation of crystalline grains within materials, can be mapped nondestructively using neutron diffraction methods. Epithermal neutron resonance absorption is being studied as a noncontacting temperature sensor. This paper highlights the underlying physics of the methods, progress in development, and the potential benefits for science and industry of the three facilities

  15. Increasing rigor in NMR-based metabolomics through validated and open source tools.

    Science.gov (United States)

    Eghbalnia, Hamid R; Romero, Pedro R; Westler, William M; Baskaran, Kumaran; Ulrich, Eldon L; Markley, John L

    2017-02-01

    The metabolome, the collection of small molecules associated with an organism, is a growing subject of inquiry, with the data utilized for data-intensive systems biology, disease diagnostics, biomarker discovery, and the broader characterization of small molecules in mixtures. Owing to their close proximity to the functional endpoints that govern an organism's phenotype, metabolites are highly informative about functional states. The field of metabolomics identifies and quantifies endogenous and exogenous metabolites in biological samples. Information acquired from nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and the published literature, as processed by statistical approaches, is driving increasingly wider applications of metabolomics. This review focuses on the role of databases and software tools in advancing the rigor, robustness, reproducibility, and validation of metabolomics studies. Copyright © 2016. Published by Elsevier Ltd.

  16. A safe private nuclear tool-the miniature neutron source reactor

    International Nuclear Information System (INIS)

    Zhou Yongmao

    1987-01-01

    The prototype miniature neutron source reactor (MNSR) designed by the China Institute of Atomic Energy has been operated successfully for more than 3 years, and the practical experience enriches the original design idea. The commercial MNSR is under design study and development in the following respects: 1. prolonging the control rod cycle duration and core burn-up life; 2. increasing the neutron flux per unit power. Obviously, the MNSR will show more advantages in extending its application area and in providing a core that uses low-enrichment fuel. (Liu)

  17. Using open source computational tools for predicting human metabolic stability and additional absorption, distribution, metabolism, excretion, and toxicity properties.

    Science.gov (United States)

    Gupta, Rishi R; Gifford, Eric M; Liston, Ted; Waller, Chris L; Hohman, Moses; Bunin, Barry A; Ekins, Sean

    2010-11-01

    Ligand-based computational models could be more readily shared between researchers and organizations if they were generated with open source molecular descriptors [e.g., chemistry development kit (CDK)] and modeling algorithms, because this would negate the requirement for proprietary commercial software. We initially evaluated open source descriptors and model building algorithms using a training set of approximately 50,000 molecules and a test set of approximately 25,000 molecules with human liver microsomal metabolic stability data. A C5.0 decision tree model demonstrated that CDK descriptors together with a set of Smiles Arbitrary Target Specification (SMARTS) keys had good statistics [κ = 0.43, sensitivity = 0.57, specificity = 0.91, and positive predicted value (PPV) = 0.64], equivalent to those of models built with commercial Molecular Operating Environment 2D (MOE2D) and the same set of SMARTS keys (κ = 0.43, sensitivity = 0.58, specificity = 0.91, and PPV = 0.63). Extending the dataset to ∼193,000 molecules and generating a continuous model using Cubist with a combination of CDK and SMARTS keys or MOE2D and SMARTS keys confirmed this observation. When the continuous predictions and actual values were binned to get a categorical score we observed a similar κ statistic (0.42). The same combination of descriptor set and modeling method was applied to passive permeability and P-glycoprotein efflux data with similar model testing statistics. In summary, open source tools demonstrated predictive results comparable to those of commercial software with attendant cost savings. We discuss the advantages and disadvantages of open source descriptors and the opportunity for their use as a tool for organizations to share data precompetitively, avoiding repetition and assisting drug discovery.
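
    To make the quoted categorical statistics concrete, the following minimal sketch computes Cohen's kappa, sensitivity, specificity, and PPV from a 2x2 confusion matrix; the counts are invented for illustration and are not from the study.

        # Categorical model statistics from a 2x2 confusion matrix.
        def binary_stats(tp, fn, fp, tn):
            n = tp + fn + fp + tn
            sensitivity = tp / (tp + fn)   # fraction of actives recovered
            specificity = tn / (tn + fp)
            ppv = tp / (tp + fp)           # positive predicted value
            observed = (tp + tn) / n
            # Chance agreement, for Cohen's kappa.
            expected = ((tp + fn) * (tp + fp) + (fn + tn) * (fp + tn)) / n**2
            kappa = (observed - expected) / (1 - expected)
            return kappa, sensitivity, specificity, ppv

        print(binary_stats(tp=570, fn=430, fp=320, tn=3680))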

  18. Characterization of sildenafil citrate tablets of different sources by near infrared chemical imaging and chemometric tools.

    Science.gov (United States)

    Sabin, Guilherme P; Lozano, Valeria A; Rocha, Werickson F C; Romão, Wanderson; Ortiz, Rafael S; Poppi, Ronei J

    2013-11-01

    The chemical imaging technique by near infrared spectroscopy was applied to characterize formulations in tablets of sildenafil citrate from six different sources. Five formulations were provided by the Brazilian Federal Police and correspond to several trademarks marketed illegally, and one was an authentic sample of Viagra. In the first step of the study, multivariate curve resolution was chosen for estimating the distribution map of concentration of the active ingredient in tablets from different sources, where the chemical composition of all excipient constituents was not truly known. In such cases it is very difficult to establish an appropriate calibration technique in which only the information on sildenafil is considered, independently of the excipients. This determination was possible only by exploiting the second-order advantage, whereby the analyte can be quantified in the presence of unknown interferences. In a second step, the normalized histograms of images of the active ingredient were grouped according to their similarities by hierarchical cluster analysis. Finally it was possible to recognize the patterns of the distribution maps of concentration of sildenafil citrate, distinguishing the true formulation of Viagra. This concept can be used to improve the knowledge of industrial products and processes, as well as for the characterization of counterfeit drugs. Copyright © 2013. Published by Elsevier B.V.

  19. The Microbial DNA Index System (MiDIS): A tool for microbial pathogen source identification

    Energy Technology Data Exchange (ETDEWEB)

    Velsko, S. P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2010-08-09

    The microbial DNA Index System (MiDIS) is a concept for a microbial forensic database and investigative decision support system that can be used to help investigators identify the sources of microbial agents that have been used in a criminal or terrorist incident. The heart of the proposed system is a rigorous method for calculating source probabilities by using certain fundamental sampling distributions associated with the propagation and mutation of microbes on disease transmission networks. This formalism has a close relationship to mitochondrial and Y-chromosomal human DNA forensics, and the proposed decision support system is somewhat analogous to the CODIS and SWGDAM mtDNA databases. The MiDIS concept does not involve the use of opportunistic collections of microbial isolates and phylogenetic tree building as a basis for inference. A staged approach can be used to build MiDIS as an enduring capability, beginning with a pilot demonstration program that must meet user expectations for performance and validation before evolving into a continuing effort. Because MiDIS requires input from a broad array of expertise including outbreak surveillance, field microbial isolate collection, microbial genome sequencing, disease transmission networks, and laboratory mutation rate studies, it will be necessary to assemble a national multi-laboratory team to develop such a system. The MiDIS effort would lend direction and focus to the national microbial genetics research program for microbial forensics, and would provide an appropriate forensic framework for interfacing to future national and international disease surveillance efforts.

  20. Airline Transport Pilot-Airplane (Air Carrier) Written Test Guide.

    Science.gov (United States)

    Federal Aviation Administration (DOT), Washington, DC. Flight Standards Service.

    Presented is information useful to applicants who are preparing for the Airline Transport Pilot-Airplane (Air Carrier) Written Test. The guide describes the basic aeronautical knowledge and associated requirements for certification, as well as information on source material, instructions for taking the official test, and questions that are…

  1. Operation modes of the FALCON ion source as a part of the AMS cluster tool

    Directory of Open Access Journals (Sweden)

    Girka Oleksii

    2015-06-01

    Full Text Available The paper investigates the options to increase the production yield of temperature-compensated surface acoustic wave (SAW) devices with a defined range of operational frequencies. The paper focuses on the preparation of large wafers with SiO2 and AlN/Si3N4 depositions. Stability of the intermediate SiO2 layer is achieved by combining high power density UV radiation with annealing in a high humidity environment. A uniform thickness of the capping AlN layer is achieved by local high-rate etching with a focused ion beam emitted by the FALCON ion source. Operation parameters and limitations of the etching process are discussed.

  2. The use of open source bioinformatics tools to dissect transcriptomic data.

    Science.gov (United States)

    Nitsche, Benjamin M; Ram, Arthur F J; Meyer, Vera

    2012-01-01

    Microarrays are a valuable technology to study fungal physiology on a transcriptomic level. Various microarray platforms are available, comprising both single- and two-channel arrays. Despite different technologies, preprocessing of microarray data generally includes quality control, background correction, normalization, and summarization of probe level data. Subsequently, depending on the experimental design, diverse statistical analyses can be performed, including the identification of differentially expressed genes and the construction of gene coexpression networks. We describe how Bioconductor, a collection of open source and open development packages for the statistical programming language R, can be used for dissecting microarray data. We provide fundamental details that facilitate the process of getting started with R and Bioconductor. Using two publicly available microarray datasets from Aspergillus niger, we give detailed protocols on how to identify differentially expressed genes and how to construct gene coexpression networks.
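
    The protocols themselves are written for R/Bioconductor; purely as a language-neutral illustration of the "differentially expressed genes" step, the sketch below runs per-gene t-tests with Benjamini-Hochberg FDR control on synthetic data in Python.

        # Toy differential-expression call: t-test per gene, BH correction.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        control = rng.normal(0.0, 1.0, size=(500, 4))   # 500 genes x 4 arrays
        treated = rng.normal(0.0, 1.0, size=(500, 4))
        treated[:25] += 2.0                             # spike in 25 true changes

        _, p = stats.ttest_ind(treated, control, axis=1)

        # Benjamini-Hochberg: p_(i) * n / i, made monotone from the end.
        order = np.argsort(p)
        adj = p[order] * len(p) / np.arange(1, len(p) + 1)
        adj = np.minimum.accumulate(adj[::-1])[::-1]
        print(f"{np.sum(adj < 0.05)} genes called at FDR 0.05")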

  3. The Advanced Light Source: A new tool for research in atomic physics

    International Nuclear Information System (INIS)

    Schlachter, A.S.

    1990-09-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory in Berkeley, California, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Undulators will generate high-brightness, partially coherent, plane polarized, soft-x-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV. Wigglers and bend magnets will generate high fluxes of x-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms. 7 refs., 3 figs

  4. Phytoscreening as an efficient tool to delineate chlorinated solvent sources at a chlor-alkali facility.

    Science.gov (United States)

    Yung, Loïc; Lagron, Jérôme; Cazaux, David; Limmer, Matt; Chalot, Michel

    2017-05-01

    Chlorinated ethenes (CE) are among the most common volatile organic compounds (VOC) that contaminate groundwater, currently representing a major source of pollution worldwide. Phytoscreening has been developed and employed through different applications at numerous sites, where it was generally useful for the detection of subsurface chlorinated solvents. We aimed at delineating subsurface CE contamination at a chlor-alkali facility using tree core data that we compared with soil data. For this investigation a total of 170 trees from experimental zones was sampled and analyzed for perchloroethene (PCE) and trichloroethene (TCE) concentrations, measured by solid phase microextraction gas chromatography coupled to mass spectrometry. Within the panel of tree genera sampled, Quercus and Ulmus appeared to be efficient biomonitors of subjacent TCE and PCE contamination, in addition to the well known and widely used Populus and Salix genera. Among the 28 trees located above the dense non-aqueous phase liquid (DNAPL) zone, 19 tree cores contained detectable amounts of CE, with concentrations ranging from 3 to 3000 μg L-1. Our tree core dataset was found to be well related to soil gas sampling results, although the tree coring data were more informative. Our data further emphasized the need for choosing the relevant tree species and sampling periods, as well as taking into consideration the nature of the soil and its heterogeneity. Overall, this low-invasive screening method appeared useful to delineate contaminants at a small-scale site impacted by multiple sources of chlorinated solvents. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. The tissue microarray OWL schema: An open-source tool for sharing tissue microarray data

    Directory of Open Access Journals (Sweden)

    Hyunseok P Kang

    2010-01-01

    Full Text Available Background: Tissue microarrays (TMAs) are enormously useful tools for translational research, but incompatibilities in database systems between various researchers and institutions prevent the efficient sharing of data that could help realize their full potential. Resource Description Framework (RDF) provides a flexible method to represent knowledge in triples, which take the form Subject-Predicate-Object. All data resources are described using Uniform Resource Identifiers (URIs), which are global in scope. We present an OWL (Web Ontology Language) schema that expands upon the TMA data exchange specification to address this issue and assist in data sharing and integration. Methods: A minimal OWL schema was designed containing only concepts specific to TMA experiments. More general data elements were incorporated from predefined ontologies such as the NCI thesaurus. URIs were assigned using the Linked Data format. Results: We present examples of files utilizing the schema and conversion of XML data (similar to the TMA DES) to OWL. Conclusion: By utilizing predefined ontologies and globally unique identifiers, this OWL schema provides a solution to the limitations of XML, which represents concepts defined in a localized setting. This will help increase the utilization of tissue resources, facilitating collaborative translational research efforts.
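
    The RDF triple idea is easy to demonstrate with the rdflib Python library; in the sketch below the namespace URI and property names are invented placeholders, not terms from the published schema.

        # Representing a TMA core as RDF triples (Subject-Predicate-Object).
        from rdflib import Graph, Literal, Namespace, RDF

        TMA = Namespace("http://example.org/tma#")  # hypothetical namespace

        g = Graph()
        core = TMA["core_0001"]
        g.add((core, RDF.type, TMA.TissueCore))
        g.add((core, TMA.diagnosis, Literal("adenocarcinoma")))
        g.add((core, TMA.stainIntensity, Literal(2)))

        print(g.serialize(format="turtle"))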

  6. A multi-source feedback tool for measuring a subset of Pediatrics Milestones.

    Science.gov (United States)

    Schwartz, Alan; Margolis, Melissa J; Multerer, Sara; Haftel, Hilary M; Schumacher, Daniel J

    2016-10-01

    The Pediatrics Milestones Assessment Pilot employed a new multisource feedback (MSF) instrument to assess nine Pediatrics Milestones among interns and subinterns in the inpatient context. To report validity evidence for the MSF tool for informing milestone classification decisions. We obtained MSF instruments from different raters per learner per rotation. We present evidence for validity based on the unified validity framework. One hundred and ninety-two interns and 41 subinterns at 18 Pediatrics residency programs received a total of 1084 MSF forms from faculty (40%), senior residents (34%), nurses (22%), and other staff (4%). Variance in ratings was associated primarily with rater (32%) and learner (22%). The milestone factor structure fit the data better than simpler structures. In all domains except professionalism, ratings by nurses were significantly lower than those by faculty, and ratings by other staff were significantly higher. Ratings were higher when the rater had observed the learner for longer periods and had a positive global opinion of the learner. Ratings of interns and subinterns did not differ, except for ratings by senior residents. MSF-based scales correlated with summative milestone scores. We obtained moderately reliable MSF ratings of interns and subinterns in the inpatient context to inform some milestone assignments.

  7. CrossCheck: an open-source web tool for high-throughput screen data analysis.

    Science.gov (United States)

    Najafov, Jamil; Najafov, Ayaz

    2017-07-19

    Modern high-throughput screening methods allow researchers to generate large datasets that potentially contain important biological information. However, oftentimes, picking relevant hits from such screens and generating testable hypotheses requires training in bioinformatics and the skills to efficiently perform database mining. There are currently no tools available to the general public that allow users to cross-reference their screen datasets with published screen datasets. To this end, we developed CrossCheck, an online platform for high-throughput screen data analysis. CrossCheck is a centralized database that allows effortless comparison of a user-entered list of gene symbols with 16,231 published datasets. These datasets include published data from genome-wide RNAi and CRISPR screens, interactome proteomics and phosphoproteomics screens, cancer mutation databases, low-throughput studies of major cell signaling mediators, such as kinases, E3 ubiquitin ligases and phosphatases, and gene ontological information. Moreover, CrossCheck includes a novel database of predicted protein kinase substrates, which was developed using proteome-wide consensus motif searches. CrossCheck dramatically simplifies high-throughput screen data analysis and enables researchers to dig deep into the published literature and streamline data-driven hypothesis generation. CrossCheck is freely accessible as a web-based application at http://proteinguru.com/crosscheck.
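
    At its core, this kind of cross-referencing is set intersection between the user's gene symbols and each stored dataset; a toy sketch (the dataset names and contents below are invented):

        # Cross-reference a user gene list against published screen datasets.
        user_genes = {"RIPK1", "TBK1", "CASP8"}

        published = {
            "kinome_rnai_screen": {"RIPK1", "PLK1", "AURKA"},
            "phosphoproteomics": {"TBK1", "RIPK1", "MTOR"},
        }

        for name, genes in published.items():
            hits = sorted(user_genes & genes)
            print(f"{name}: {len(hits)} hit(s) -> {hits}")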

  8. Hybrid Ground-Source Heat Pump Installations: Experiences, Improvements, and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Scott Hackel; Amanda Pertzborn

    2011-06-30

    One innovation to ground-source heat pump (GSHP, or GHP) systems is the hybrid GSHP (HyGSHP) system, which can dramatically decrease the first cost of GSHP systems by using conventional technology (such as a cooling tower or a boiler) to meet a portion of the peak heating or cooling load. This work uses three case studies (two cooling-dominated, one heating-dominated) to demonstrate the performance of the hybrid approach. Three buildings were studied for a year; the measured data was used to validate models of each system. The models were used to analyze further improvements to the hybrid approach, and establish that this approach has positive impacts, both economically and environmentally. Lessons learned by those who design and operate the systems are also documented, including discussions of equipment sizing, pump operation, and cooling tower control. Finally, the measured data sets and models that were created during this work are described; these materials have been made freely available for further study of hybrid systems.

  9. The Advanced Light Source at Lawrence Berkeley Laboratory: a new tool for research in atomic physics

    International Nuclear Information System (INIS)

    Schlachter, A.S.; Robinson, A.L.

    1991-01-01

    The Advanced Light Source, a third-generation national synchrotron-radiation facility now under construction at the Lawrence Berkeley Laboratory, is scheduled to begin serving qualified users across a broad spectrum of research areas in the spring of 1993. Based on a low-emittance electron storage ring optimized to operate at 1.5 GeV, the ALS will have 10 long straight sections available for insertion devices (undulators and wigglers) and 24 high-quality bend-magnet ports. The short pulse width (30-50 ps) will be ideal for time-resolved measurements. Undulators will generate high-brightness partially coherent soft X-ray and ultraviolet (XUV) radiation from below 10 eV to above 2 keV; this radiation is plane polarized. Wigglers and bend magnets will extend the spectrum by generating high fluxes of X-rays to photon energies above 10 keV. The ALS will have an extensive research program in which XUV radiation is used to study matter in all its varied gaseous, liquid, and solid forms. The high brightness will open new areas of research in the materials sciences, such as spatially resolved spectroscopy (spectromicroscopy), and in biology, such as X-ray microscopy with element-specific sensitivity; the high flux will allow measurements in atomic physics and chemistry to be made with tenuous gas-phase targets. Technological applications could include lithography and nano-fabrication. (orig.)

  10. Baseliner: An open-source, interactive tool for processing sap flux data from thermal dissipation probes

    Directory of Open Access Journals (Sweden)

    A. Christopher Oishi

    2016-01-01

    Full Text Available Estimating transpiration from woody plants using thermal dissipation sap flux sensors requires careful data processing. Currently, researchers accomplish this using spreadsheets, or by personally writing scripts for statistical software programs (e.g., R, SAS). We developed the Baseliner software to help establish a standardized protocol for processing sap flux data. Baseliner enables users to QA/QC data and process data using a combination of automated steps, visualization, and manual editing. Data processing requires establishing a zero-flow reference value, or "baseline", which varies among sensors and with time. Since no set of algorithms currently exists to reliably QA/QC and estimate the zero-flow baseline, Baseliner provides a graphical user interface to allow visual inspection and manipulation of data. Data are first automatically processed using a set of user defined parameters. The user can then view the data for additional, manual QA/QC and baseline identification using mouse and keyboard commands. The open-source software allows for user customization of data processing algorithms as improved methods are developed.
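
    For orientation, the sketch below shows the arithmetic that follows once a zero-flow baseline is chosen, using the widely cited Granier (1985) calibration; the probe readings are synthetic, and Baseliner's actual value lies in the interactive QA/QC around this step.

        # Convert thermal-dissipation probe readings to sap flux density.
        import numpy as np

        dT = np.array([9.8, 9.9, 10.0, 8.4, 7.1, 7.9, 9.6])  # probe Delta-T, deg C

        dT_max = dT.max()              # zero-flow baseline (assumes no night flow)
        k = (dT_max - dT) / dT         # dimensionless flow index
        flux = 118.99e-6 * k ** 1.231  # sap flux density, m3 m-2 s-1 (Granier)
        print(flux)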

  11. Early Detection of Apathetic Phenotypes in Huntington's Disease Knock-in Mice Using Open Source Tools.

    Science.gov (United States)

    Minnig, Shawn; Bragg, Robert M; Tiwana, Hardeep S; Solem, Wes T; Hovander, William S; Vik, Eva-Mari S; Hamilton, Madeline; Legg, Samuel R W; Shuttleworth, Dominic D; Coffey, Sydney R; Cantle, Jeffrey P; Carroll, Jeffrey B

    2018-02-02

    Apathy is one of the most prevalent and progressive psychiatric symptoms in Huntington's disease (HD) patients. However, preclinical work in HD mouse models tends to focus on molecular and motor, rather than affective, phenotypes. Measuring behavior in mice often produces noisy data and requires large cohorts to detect phenotypic rescue with appropriate power. The operant equipment necessary for measuring affective phenotypes is typically expensive, proprietary to commercial entities, and bulky, which can render adequately sized mouse cohorts cost-prohibitive. Thus, we describe here a home-built, open-source alternative to commercial hardware that is reliable, scalable, and reproducible. Using off-the-shelf hardware, we adapted and built several rodent operant buckets (ROBuckets) to test HttQ111/+ mice for attention deficits in fixed ratio (FR) and progressive ratio (PR) tasks. We find that, despite normal reward attainment in the FR task, HttQ111/+ mice exhibit reduced PR performance at 9-11 months of age, suggesting motivational deficits. We replicated this in two independent cohorts, demonstrating the reliability and utility of both the apathetic phenotype, and these ROBuckets, for preclinical HD studies.
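
    As a sketch of what a progressive ratio task measures, the snippet below generates a response requirement per reward under an assumed linear increment rule (the actual ROBucket schedule may differ) and reports the breakpoint, the motivation index used in such studies.

        # Progressive-ratio schedule sketch; the linear increment rule is an
        # assumption for illustration, not the ROBucket's actual schedule.
        def progressive_ratios(start=1, increment=2, n_rewards=10):
            """Responses required for each successive reward."""
            return [start + increment * i for i in range(n_rewards)]

        rewards_earned = 6  # e.g. the animal stopped responding after 6 rewards
        breakpoint_ = progressive_ratios()[rewards_earned - 1]
        print(f"breakpoint = {breakpoint_} responses")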

  12. Writing in the air: A visualization tool for written languages.

    Directory of Open Access Journals (Sweden)

    Yoshihiro Itaguchi

    Full Text Available The present study investigated interactions between cognitive processes and finger actions called "kusho," meaning "air-writing" in Japanese. Kanji-culture individuals often employ kusho behavior, moving their fingers as a substitute for a pen, mostly when they are trying to recall the shape of a Kanji character or the spelling of an English word. To further examine the visualization role of kusho behavior in cognitive processing, we conducted a Kanji construction task in which a stimulus (i.e., the sub-parts to be constructed) was presented simultaneously. In addition, we conducted a Kanji vocabulary test to reveal the relation between the kusho benefit and vocabulary size. The experiment provided two sets of novel findings. First, executing kusho behavior improved task performance (correct responses) as long as the participants watched their finger movements while solving the task. This result supports the idea that visual feedback from kusho behavior helps cognitive processing for the task. Second, task performance was positively correlated with the vocabulary score when stimuli were presented for a relatively long time, whereas the kusho benefit and vocabulary score were not correlated regardless of stimulus-presentation time. These results imply that a longer stimulus presentation could allow participants to utilize their lexical resources for solving the task. The current findings together support the visualization role of kusho behavior, adding experimental evidence for the view that there are interactions between cognition and motor behavior.

  13. An Open-Source Label Atlas Correction Tool and Preliminary Results on Huntington's Disease Whole-Brain MRI Atlases.

    Science.gov (United States)

    Forbes, Jessica L; Kim, Regina E Y; Paulsen, Jane S; Johnson, Hans J

    2016-01-01

    The creation of high-quality medical imaging reference atlas datasets with consistent dense anatomical region labels is a challenging task. Reference atlases have many uses in medical image applications and are essential components of atlas-based segmentation tools commonly used for producing personalized anatomical measurements for individual subjects. The process of manual identification of anatomical regions by experts is regarded as a so-called gold standard; however, it is usually impractical because of the labor-intensive costs. Further, as the number of regions of interest increases, these manually created atlases often contain many small inconsistently labeled or disconnected regions that need to be identified and corrected. This project proposes an efficient process to drastically reduce the time necessary for manual revision in order to improve atlas label quality. We introduce the LabelAtlasEditor tool, a SimpleITK-based open-source label atlas correction tool distributed within the image visualization software 3D Slicer. LabelAtlasEditor incorporates several 3D Slicer widgets into one consistent interface and provides label-specific correction tools, allowing for rapid identification, navigation, and modification of the small, disconnected erroneous labels within an atlas. The technical details for the implementation and performance of LabelAtlasEditor are demonstrated using an application of improving a set of 20 Huntington's Disease-specific multi-modal brain atlases. Additionally, we present the advantages and limitations of automatic atlas correction. After the correction of atlas inconsistencies and small, disconnected regions, the number of unidentified voxels for each dataset was reduced on average by 68.48%.

  14. HF turbulence as a source of novel diagnostics tool for space plasma

    International Nuclear Information System (INIS)

    Rothkaehl, H.; Klos, Z.; Thide, B.; Bergman, J.

    2005-01-01

    Turbulence and instabilities can be produced by sources of free energy in the form of natural and anthropogenic perturbations. Space turbulence acts as a tracer of the various physical processes acting in these regions and gives access to them, but on the other hand it disturbs the propagation of radio waves and the ability to detect targets of interest. To understand the properties of the solar-terrestrial environment and to develop a quantitative model of the magnetosphere-ionosphere-thermosphere subsystem, which is strongly coupled via the electric field, particle precipitation, heat flows and small-scale interactions, it is necessary to design and build a new generation of multipoint, multi-sensor diagnostics, as proposed by the LOFAR/LOIS facility complementary to space-borne satellite experiments. Ground-based multi-frequency and multi-polarization LOIS antenna clusters, together with cluster observations in space, should help solve problems of space physics and describe long-term environmental changes. Real-time access to the gathered data, relevant to the impact of environmental physical conditions on communications and global positioning systems, will create the possibility to improve the quality of different space-related services. Investigation and monitoring of the Earth environment will be coordinated with the space-borne COMPAS 2 experiment. The newly designed radio spectrometer will investigate the still largely unknown mechanisms which govern these turbulent interactions of natural and man-made origin. The main aim of this presentation is to show the general architecture of the LOIS and COMPAS 2 experiments and their scientific challenges. The description of electromagnetic Earth environments in the HF range will be emphasized as well. (author)

  15. Sources

    International Nuclear Information System (INIS)

    Duffy, L.P.

    1991-01-01

    This paper discusses the sources of radiation in the narrow perspective of radioactivity, and the even narrower perspective of those sources that concern environmental management and restoration activities at DOE facilities, as well as a few related sources: sources of irritation, sources of inflammatory jingoism, and sources of information. First, the sources of irritation fall into three categories: no reliable scientific ombudsman to speak without bias and prejudice for the public good; technical jargon with unclear definitions within the radioactive nomenclature; and a scientific community that keeps a low profile with regard to public information. The next area of personal concern is the sources of inflammation. These include such things as: plutonium being described as the most dangerous substance known to man; the amount of plutonium required to make a bomb; talk of transuranic waste containing plutonium and its health effects; TMI-2 and Chernobyl being described as Siamese twins; inadequate information on low-level disposal sites and current regulatory requirements under 10 CFR 61; and enhanced engineered waste disposal not being presented to the public accurately. Finally, there are numerous sources of disinformation regarding low-level and high-level radiation, the elusive nature of the scientific community, the Federal and State Health Agencies' resources to address comparative risk, and regulatory agencies speaking out without the support of the scientific community.

  16. A CDL written intercom editor

    International Nuclear Information System (INIS)

    Marinescu, D.S.; Shirikov, V.P.

    1977-01-01

    This report contains a formalized description of the program EDITOR, the most important part of the INTERCOM teleprocessing system driving terminals for CDC 6000-series and CYBER computers. CDL (Compiler Description Language) is used for this description. EDITOR is a tool for text file acquisition and modification. It also provides the possibility to execute some commands to the computer software. The implementation-independent description of EDITOR may be used for implementing EDITOR-like programs for different types of computers (in particular, small computers).

  17. Deriving phenological metrics from NDVI through an open source tool developed in QGIS

    Science.gov (United States)

    Duarte, Lia; Teodoro, A. C.; Gonçalves, Hernãni

    2014-10-01

    Vegetation indices have been commonly used over the past 30 years for studying vegetation characteristics using images collected by remote sensing satellites. One of the most commonly used is the Normalized Difference Vegetation Index (NDVI). The various stages that green vegetation undergoes during a complete growing season can be summarized through time-series analysis of NDVI data. The analysis of such time series allows for extracting key phenological variables or metrics of a particular season. These characteristics may not necessarily correspond directly to conventional, ground-based phenological events, but they do provide indications of ecosystem dynamics. A complete list of the phenological metrics that can be extracted from smoothed, time-series NDVI data is available in the USGS online resources (http://phenology.cr.usgs.gov/methods_deriving.php). This work aims to develop an open source application to automatically extract these phenological metrics from a set of satellite input data. The main advantage of QGIS for this specific application lies in the ease and speed of developing new plug-ins in Python, based on the experience of the research group in other related works. QGIS has its own application programming interface (API) with functionalities and programs to develop new features. The toolbar developed for this application was implemented using the plug-in NDVIToolbar.py. The user introduces the raster files as input and obtains a plot and a report with the metrics. The report includes the following eight metrics: SOST (Start Of Season - Time), the day of the year identified as having a consistent upward trend in the NDVI time series; SOSN (Start Of Season - NDVI), the NDVI value associated with SOST; EOST (End of Season - Time), the day of year identified at the end of a consistent downward trend in the NDVI time series; EOSN (End of Season - NDVI) corresponding to the NDVI value
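
    A hedged sketch of the derivation on a synthetic NDVI curve follows; real implementations (including the USGS method) smooth the series and demand consistent trends, which this simplification glosses over.

        # Toy extraction of SOST/SOSN and EOST/EOSN from an NDVI time series.
        import numpy as np

        days = np.arange(1, 366, 16)                              # 16-day composites
        ndvi = 0.30 + 0.40 * np.exp(-((days - 200) / 60.0) ** 2)  # synthetic season

        rising = np.diff(ndvi) > 0
        falling = np.diff(ndvi) < 0
        sos = int(np.argmax(rising))                        # first upward step
        eos = len(falling) - int(np.argmax(falling[::-1]))  # end of last downward run

        print(f"SOST={days[sos]}  SOSN={ndvi[sos]:.2f}")
        print(f"EOST={days[eos]}  EOSN={ndvi[eos]:.2f}")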

  18. Open Source Web Tool for Tracking in a Lowcost Mobile Mapping System

    Science.gov (United States)

    Fissore, F.; Pirotti, F.; Vettore, A.

    2017-11-01

    alternative solution to other more expensive MMSs. The first objective of this paper is to report on the development of a prototype MMS for the collection of geospatial data, based on the assembly of low cost sensors managed through a web interface developed using open source libraries. The main goal is to provide a system accessible by any type of user and flexible to any type of upgrade or to the introduction of new sensor models or versions thereof. After a presentation of the hardware components used in our system, a more detailed description is provided of the software developed for the management of the MMS, which is the innovative part of the project. In line with the worldwide demand for big data available through the web from everywhere in the world (Pirotti et al., 2011), the proposed solution allows data to be retrieved from a web interface. This is part of a project for the development of a new web infrastructure at the University of Padua (but it will be available for external users as well), in order to ease collaboration between researchers from different areas. Finally, strengths, weaknesses and future developments of the low cost MMS are discussed.

  19. Development of a Modular Laboratory Information Management System (LIMS) for NAA Laboratories Using Open-Source Developing Tools

    International Nuclear Information System (INIS)

    Bounakhla, Moussa; Amsil, Hamid; Embarch, K.; Bounouira, Hamid

    2018-01-01

    CNESTEN designed and developed a modular Laboratory Information Management System (LIMS) for the NAA laboratory using open-source development tools. This LIMS provides a personalized web space for managing sample acquisition and preparation, spectra processing, and final analysis of the sample. The system also helps dematerialize the process for irradiation requests and for the acquisition of new equipment and samples. It allows managing the documents circulating between the different actors of the LIMS. Modules for concentration determination and facility characterization are also included in this LIMS. New modules, such as spectra fitting and true-coincidence and attenuation corrections, can be developed and integrated individually into this system. All data, including nuclear data libraries, are stored in a single remote database via an intranet network to allow instantaneous multi-user access. (author)

  20. Nitrogen Source Inventory and Loading Tool: An integrated approach toward restoration of water-quality impaired karst springs.

    Science.gov (United States)

    Eller, Kirstin T; Katz, Brian G

    2017-07-01

    Nitrogen (N) from anthropogenic sources has contaminated groundwater used as drinking water in addition to impairing water quality and ecosystem health of karst springs. The Nitrogen Source Inventory and Loading Tool (NSILT) was developed as an ArcGIS and spreadsheet-based approach that provides spatial estimates of current nitrogen (N) inputs to the land surface and loads to groundwater from nonpoint and point sources within the groundwater contributing area. The NSILT involves a three-step approach where local and regional land use practices and N sources are evaluated to: (1) estimate N input to the land surface, (2) quantify subsurface environmental attenuation, and (3) assess regional recharge to the aquifer. NSILT was used to assess nitrogen loading to groundwater in two karst spring areas in west-central Florida: Rainbow Springs (RS) and Kings Bay (KB). The karstic Upper Floridan aquifer (UFA) is the source of water discharging to the springs in both areas. In the KB study area (predominantly urban land use), septic systems and urban fertilizers contribute 48% and 22%, respectively, of the estimated total annual N load to groundwater 294,400 kg-N/yr. In contrast for the RS study area (predominantly agricultural land use), livestock operations and crop fertilizers contribute 50% and 13%, respectively, of the estimated N load to groundwater. Using overall groundwater N loading rates for the KB and RS study areas, 4.4 and 3.3 kg N/ha, respectively, and spatial recharge rates, the calculated groundwater nitrate-N concentration (2.1 mg/L) agreed closely with the median nitrate-N concentration (1.7 mg/L) from groundwater samples in agricultural land use areas in the RS study area for the period 2010-2014. NSILT results provide critical information for prioritizing and designing restoration efforts for water-quality impaired springs and spring runs affected by multiple sources of nitrogen loading to groundwater. The calculated groundwater N concentration for
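
    A quick mass-balance check of the reported numbers: an areal N load divided by areal recharge yields a groundwater concentration. The recharge value below is an assumption chosen to reproduce the abstract's 2.1 mg/L, not a figure taken from the study.

        # Areal N load / areal recharge -> expected groundwater concentration.
        load_kg_per_ha = 3.3   # Rainbow Springs areal N loading, kg N/ha/yr
        recharge_mm = 157.0    # assumed annual recharge, mm/yr

        load_mg = load_kg_per_ha * 1e6  # mg N per ha per yr
        recharge_l = recharge_mm * 1e4  # litres per ha per yr (1 mm over 1 ha = 10,000 L)
        print(f"{load_mg / recharge_l:.1f} mg/L")  # -> 2.1 mg/L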

  1. Open source GIS based tools to improve hydrochemical water resources management in EU H2020 FREEWAT platform

    Science.gov (United States)

    Criollo, Rotman; Velasco, Violeta; Vázquez-Suñé, Enric; Nardi, Albert; Marazuela, Miguel A.; Rossetto, Rudy; Borsi, Iacopo; Foglia, Laura; Cannata, Massimiliano; De Filippis, Giovanna

    2017-04-01

    Due to the general increase of water scarcity (Steduto et al., 2012), water quantity and quality must be well known to ensure proper access to water resources in compliance with local and regional directives. This can be supported by tools which facilitate the process of data management and analysis. Such analyses have to provide researchers, professionals, policy makers and users with the ability to improve the management of water resources within standard regulatory guidelines. Compliance with the established standard regulatory guidelines (with a special focus on requirements deriving from the GWD) requires effective monitoring, evaluation, and interpretation of a large number of physical and chemical parameters. These datasets have to be assessed and interpreted by: (i) integrating data from different sources, gathered with different data access techniques and formats; (ii) managing data with varying temporal and spatial extents; (iii) integrating groundwater quality information with other relevant information, such as further hydrogeological data (Velasco et al., 2014), and pre-processing these data, generally for the realization of groundwater models. In this context, the Hydrochemical Analysis Tools, akvaGIS Tools, have been implemented within the H2020 FREEWAT project, which aims to support water resource management by modelling water resources in an open source GIS platform (QGIS desktop). The main goal of akvaGIS Tools is to improve water quality analysis through different capabilities that improve the case study's conceptual model, managing all related data in a geospatial database (implemented in SpatiaLite), and through a set of tools for improving the harmonization, integration, standardization, visualization and interpretation of the hydrochemical data. To achieve this, different commands cover a wide range of methodologies for querying, interpreting, and comparing groundwater quality data and facilitate the pre-processing analysis for

  2. Natural killer cells as a promising tool to tackle cancer-A review of sources, methodologies, and potentials.

    Science.gov (United States)

    Preethy, Senthilkumar; Dedeepiya, Vidyasagar Devaprasad; Senthilkumar, Rajappa; Rajmohan, Mathaiyan; Karthick, Ramalingam; Terunuma, Hiroshi; Abraham, Samuel J K

    2017-07-04

    Immune cell-based therapies are emerging as a promising tool to tackle malignancies, both solid tumors and selected hematological tumors. An extensive body of literature has documented their safety and the added survival benefits when such cell-based therapies are combined with existing treatment options. Numerous methodologies for the processing and in vitro expansion of immune cells, such as dendritic cells, natural killer (NK) cells, NKT cells, αβ T cells, so-called activated T lymphocytes, γδ T cells, cytotoxic T lymphocytes, and lymphokine-activated killer cells, have been reported for use in cell-based therapies. Among this handful of immune cells of significance, the NK cells stand apart from the rest for not only their direct cytotoxic ability against cancer cells but also their added advantages, which include their capability of (i) acting through both innate and adaptive immune mechanisms, (ii) tackling viruses too, giving benefits in conditions where viral infections culminate in cancer, and (iii) destroying cancer stem cells, thereby preventing resistance to chemotherapy and radiotherapy. This review thoroughly analyses the sources of such NK cells, methods for expansion, and the future potential of taking in vitro expanded allogeneic NK cells with good cytotoxic ability as a drug for treating cancer and/or viral infection, and even as a prophylactic tool for prevention of cancer after initial remission.

  3. sources

    Directory of Open Access Journals (Sweden)

    Shu-Yin Chiang

    2002-01-01

    Full Text Available In this paper, we study simplified models of an ATM (Asynchronous Transfer Mode) multiplexer network with Bernoulli random traffic sources. Based on the model, the performance measures are analyzed under the different output service schemes.

  4. M2Lite: An Open-source, Light-weight, Pluggable and Fast Proteome Discoverer MSF to mzIdentML Tool.

    Science.gov (United States)

    Aiyetan, Paul; Zhang, Bai; Chen, Lily; Zhang, Zhen; Zhang, Hui

    2014-04-28

    Proteome Discoverer is one of many tools used for protein database searching and peptide-to-spectrum assignment in mass spectrometry-based proteomics. However, the inadequacy of conversion tools makes it challenging to compare and integrate its results with those of other analytical tools. Here we present M2Lite, an open-source, light-weight, easily pluggable and fast conversion tool. M2Lite converts Proteome Discoverer-derived MSF files to the proteomics community-defined standard, the mzIdentML file format. M2Lite's source code is available as open-source at https://bitbucket.org/paiyetan/m2lite/src and its compiled binaries and documentation can be freely downloaded at https://bitbucket.org/paiyetan/m2lite/downloads.
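
    Proteome Discoverer's .msf files are SQLite databases under the hood, which is what makes a light-weight converter practical. The sketch below merely lists the tables of such a file; it does not reflect M2Lite's actual implementation, and the file name is hypothetical (table layouts also vary between Proteome Discoverer versions).

        # Peek inside an .msf result file, which is a SQLite database.
        import sqlite3

        con = sqlite3.connect("results.msf")  # hypothetical file
        tables = con.execute(
            "SELECT name FROM sqlite_master WHERE type = 'table'"
        ).fetchall()
        print([t[0] for t in tables])  # e.g. peptide and spectrum tables
        con.close()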

  5. Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool.

    Science.gov (United States)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-07

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  6. Compound-specific stable carbon isotopic composition of petroleum hydrocarbons as a tool for tracing the source of oil spills

    International Nuclear Information System (INIS)

    Li Yun; Xiong Yongqiang; Yang Wanying; Xie Yueliang; Li Siyuan; Sun Yongge

    2009-01-01

    With the increasing demand for and consumption of crude oils, oil spill accidents happen frequently during the transportation of crude oils and oil products, and the environmental hazard they pose has become increasingly serious in China. The exact identification of the source of spilled oil can act as forensic evidence in the investigation and handling of oil spill accidents. In this study, a weathering simulation experiment demonstrates that the mass loss of crude oils caused by short-term weathering mainly occurs within the first 24 h after a spill, and is dominated by the depletion of low-molecular weight hydrocarbons (<C18 n-alkanes). Short-term weathering has no significant effect on δ13C values of individual n-alkanes (C12-C33), suggesting that a stable carbon isotope profile of n-alkanes can be a useful tool for tracing the source of an oil spill, particularly for weathered oils or those with a relatively low concentration or absence of sterane and terpane biomarkers

  7. Recommendations for reducing ambiguity in written procedures.

    Energy Technology Data Exchange (ETDEWEB)

    Matzen, Laura E.

    2009-11-01

    Previous studies in the nuclear weapons complex have shown that ambiguous work instructions (WIs) and operating procedures (OPs) can lead to human error, which is a major cause for concern. This report outlines some of the sources of ambiguity in written English and describes three recommendations for reducing ambiguity in WIs and OPs. The recommendations are based on commonly used research techniques in the fields of linguistics and cognitive psychology. The first recommendation is to gather empirical data that can be used to improve the recommended word lists that are provided to technical writers. The second recommendation is to hold a review in which new WIs and OPs are checked for ambiguities and clarity. The third recommendation is to use self-paced reading time studies to identify any remaining ambiguities before the new WIs and OPs are put into use. If these three steps are followed for new WIs and OPs, the likelihood of human errors related to ambiguity could be greatly reduced.

  8. Feedforward: helping students interpret written feedback

    OpenAIRE

    Hurford, Donna; Read, Andrew

    2008-01-01

    "Assessment for Learning is the process of seeking and interpreting evidence for use by learners... "(Assessment Reform Group, 2002, p.2): for the Higher Education tutor, written feedback forms an integral part of this. This short article reports on teaching methods to engage students in feedback and assessment of their written work.

  9. Sports metaphors in Polish written commentaries on politics

    OpenAIRE

    Jarosław Wiliński

    2015-01-01

    This paper seeks to investigate what sports metaphors are used in Polish written commentaries on politics and what special purpose they serve. In particular, the paper examines structural metaphors that come from the lexicon of popular sports, such as boxing, racing, track and field athletics, sailing, etc. The language data, derived from English Internet websites, has been grouped and discussed according to source domains. Applying George Lakoff and Mark Johnson’s approach to metaphor, the p...

  10. Development of a web GIS application for emissions inventory spatial allocation based on open source software tools

    Science.gov (United States)

    Gkatzoflias, Dimitrios; Mellios, Giorgos; Samaras, Zissis

    2013-03-01

    Combining emission inventory methods and geographic information systems (GIS) remains a key issue for environmental modelling and management purposes. This paper examines the development of a web GIS application as part of an emission inventory system that produces maps and files with spatially allocated emissions in a grid format. The study is not confined to the maps produced but also presents the features and capabilities of a web application that can be used by any user, even without prior knowledge of the GIS field. The development of the application was based on open source software tools such as MapServer for the GIS functions, PostgreSQL and PostGIS for data management, and HTML, PHP and JavaScript as programming languages. In addition, background processes are used in an innovative manner to handle the time-consuming and computationally costly procedures of the application. Furthermore, a web map service was created to provide maps to other clients, such as the Google Maps API v3 that is used as part of the user interface. The output of the application includes maps in vector and raster format, maps with temporal resolution on a daily and hourly basis, grid files that can be used by air quality management systems, and grid files consistent with the European Monitoring and Evaluation Programme grid. Although the system was developed and validated for the Republic of Cyprus, covering a remarkably wide range of pollutants and emission sources, it can be easily customized for use in other countries or smaller areas, as long as geospatial and activity data are available.
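
    To give a flavour of the spatial-allocation core in PostGIS terms, the sketch below pro-rates line-source emissions over grid cells by intersected length; every table, column, and connection detail is invented for illustration.

        # Allocate road (line) emissions to grid cells by intersected length.
        import psycopg2

        SQL = """
        SELECT g.cell_id,
               SUM(r.nox_t_per_yr *
                   ST_Length(ST_Intersection(r.geom, g.geom)) / ST_Length(r.geom))
        FROM roads r
        JOIN grid g ON ST_Intersects(r.geom, g.geom)
        GROUP BY g.cell_id;
        """

        with psycopg2.connect("dbname=emissions") as con:  # hypothetical DB
            with con.cursor() as cur:
                cur.execute(SQL)
                for cell_id, nox in cur.fetchall():
                    print(cell_id, nox)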

  11. New target prediction and visualization tools incorporating open source molecular fingerprints for TB Mobile 2.0.

    Science.gov (United States)

    Clark, Alex M; Sarker, Malabika; Ekins, Sean

    2014-01-01

    We recently developed a freely available mobile app (TB Mobile) for both iOS and Android platforms that displays Mycobacterium tuberculosis (Mtb) active molecule structures and their targets with links to associated data. The app was developed to make target information available to as large an audience as possible. We now report a major update of the iOS version of the app. This includes enhancements that use an implementation of ECFP_6 fingerprints that we have made open source. Using these fingerprints, the user can propose compounds with possible anti-TB activity and view the compounds within a cluster landscape. Proposed compounds can also be compared to existing target data, using a naïve Bayesian scoring system to rank probable targets. We have curated an additional 60 new compounds and their targets for Mtb and added these to the original set of 745 compounds. We have also curated 20 further compounds (many without targets in TB Mobile) to evaluate this version of the app with 805 compounds and associated targets. TB Mobile can now manage a small collection of compounds that can be imported from external sources, or exported by various means such as email or app-to-app inter-process communication. This means that TB Mobile can be used as a node within a growing ecosystem of mobile apps for cheminformatics. It can also cluster compounds and use internal algorithms to help identify potential targets based on molecular similarity. TB Mobile represents a valuable dataset, data-visualization aid and target prediction tool.
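
    A toy sketch of the similarity machinery behind such target ranking: fingerprints treated as bit sets and compared by Tanimoto similarity. The fingerprints and target names below are placeholders, not real ECFP_6 output or curated data.

        # Rank targets by Tanimoto similarity of fingerprint bit sets.
        def tanimoto(a: set, b: set) -> float:
            return len(a & b) / len(a | b)

        query = {1, 5, 9, 12, 40}              # proposed compound's fingerprint
        actives = {
            "InhA": {1, 5, 9, 13, 41},         # known actives per target
            "KatG": {2, 6, 22, 40},
        }
        for target, fp in sorted(actives.items(),
                                 key=lambda kv: -tanimoto(query, kv[1])):
            print(target, round(tanimoto(query, fp), 2))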

  12. A qPCR-Based Tool to Diagnose the Presence of Harmful Cyanobacteria and Cyanotoxins in Drinking Water Sources

    Directory of Open Access Journals (Sweden)

    Yi-Ting Chiu

    2017-05-01

    Full Text Available Harmful cyanobacteria have been an important concern for drinking water quality for quite some time, as they may produce cyanotoxins and odorants. Microcystis and Cylindrospermopsis are two common harmful cyanobacterial genera detected in freshwater lakes and reservoirs, with microcystins (MCs) and cylindrospermopsin (CYN) as their important metabolites, respectively. In this study, two sets of duplex qPCR systems were developed, one for quantifying potentially-toxigenic Microcystis and Microcystis, and the other one for cylindrospermopsin-producing cyanobacteria and Cylindrospermopsis. The duplex qPCR systems were developed and validated in the laboratory by using 338 samples collected from 29 reservoirs in Taiwan and her offshore islands. Results show that cell numbers of Microcystis and Cylindrospermopsis enumerated with microscopy, and MCs and CYN concentrations measured with the enzyme-linked immuno-sorbent assay method, correlated well with their corresponding gene copies determined with the qPCR systems (range of coefficients of determination R2 = 0.392−0.740). The developed qPCR approach may serve as a useful tool for the water industry to diagnose the presence of harmful cyanobacteria and the potential presence of cyanotoxins in source waters.

  13. A qPCR-Based Tool to Diagnose the Presence of Harmful Cyanobacteria and Cyanotoxins in Drinking Water Sources.

    Science.gov (United States)

    Chiu, Yi-Ting; Chen, Yi-Hsuan; Wang, Ting-Shaun; Yen, Hung-Kai; Lin, Tsair-Fuh

    2017-05-20

    Harmful cyanobacteria have been an important concern for drinking water quality for quite some time, as they may produce cyanotoxins and odorants. Microcystis and Cylindrospermopsis are two common harmful cyanobacterial genera detected in freshwater lakes and reservoirs, with microcystins (MCs) and cylindrospermopsin (CYN) as their important metabolites, respectively. In this study, two sets of duplex qPCR systems were developed, one for quantifying potentially-toxigenic Microcystis and Microcystis, and the other one for cylindrospermopsin-producing cyanobacteria and Cylindrospermopsis. The duplex qPCR systems were developed and validated in the laboratory by using 338 samples collected from 29 reservoirs in Taiwan and her offshore islands. Results show that cell numbers of Microcystis and Cylindrospermopsis enumerated with microscopy, and MCs and CYN concentrations measured with the enzyme-linked immuno-sorbent assay method, correlated well with their corresponding gene copies determined with the qPCR systems (range of coefficients of determination R² = 0.392-0.740). The developed qPCR approach may serve as a useful tool for the water industry to diagnose the presence of harmful cyanobacteria and the potential presence of cyanotoxins in source waters.
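
    For context, the sketch below shows the standard-curve arithmetic by which qPCR quantification cycles are converted to gene copies; the slope and intercept are assumed values, not the study's calibration.

        # Standard-curve conversion: Cq = intercept + slope * log10(copies).
        import numpy as np

        slope, intercept = -3.32, 38.0     # assumed standard-curve fit
        cq = np.array([24.1, 27.5, 30.2])  # measured quantification cycles
        copies = 10 ** ((cq - intercept) / slope)
        print(copies.round(0))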

  14. Development and Experimental Validation of a TRNSYS Dynamic Tool for Design and Energy Optimization of Ground Source Heat Pump Systems

    Directory of Open Access Journals (Sweden)

    Félix Ruiz-Calvo

    2017-09-01

    Full Text Available Ground source heat pump (GSHP) systems represent an efficient technology for renewable heating and cooling in buildings. To optimize not only the design but also the operation of the system, a complete dynamic model becomes a highly useful tool, since it allows testing any design modifications and different optimization strategies without actually implementing them at the experimental facility. Usually, this type of system presents strongly dynamic operating conditions. Therefore, the model should be able to predict not only the steady-state behavior of the system but also the short-term response. This paper presents a complete GSHP system model based on an experimental facility located at Universitat Politècnica de València. The installation was constructed in the framework of a European collaborative project entitled GeoCool. The model, developed in TRNSYS, has been validated against experimental data, and it accurately predicts both the short- and long-term behavior of the system.

  15. 10 CFR 2.813 - Written communications.

    Science.gov (United States)

    2010-01-01

    ... other written communications under the regulations of this chapter is requested but not required to cite whenever practical, in the upper right corner of the first page of the submission, the specific regulation...

  16. Accurate modeling of UV written waveguide components

    DEFF Research Database (Denmark)

    Svalgaard, Mikael

    BPM simulation results of UV written waveguide components that are indistinguishable from measurements can be achieved on the basis of trajectory scan data and an equivalent step index profile that is very easy to measure.

  17. Accurate modelling of UV written waveguide components

    DEFF Research Database (Denmark)

    Svalgaard, Mikael

    BPM simulation results of UV written waveguide components that are indistinguishable from measurements can be achieved on the basis of trajectory scan data and an equivalent step index profile that is very easy to measure.

  18. Evanescent fields of laser written waveguides

    Science.gov (United States)

    Jukić, Dario; Pohl, Thomas; Götte, Jörg B.

    2015-03-01

    We investigate the evanescent field at the surface of laser written waveguides. The waveguides are written by a direct femtosecond laser writing process into fused silica, which is then sanded down to expose the guiding layer. These waveguides support eigenmodes which have an evanescent field reaching into the vacuum above the waveguide. We study the governing wave equations and present solutions for the fundamental eigenmodes of the modified waveguides.

  19. Analysis and prediction of flow from local source in a river basin using a Neuro-fuzzy modeling tool.

    Science.gov (United States)

    Aqil, Muhammad; Kita, Ichiro; Yano, Akira; Nishiyama, Soichi

    2007-10-01

    Traditionally, the multiple linear regression technique has been one of the most widely used models for simulating hydrological time series. However, when the nonlinear phenomenon is significant, multiple linear regression will fail to yield an appropriate predictive model. Recently, neuro-fuzzy systems have gained much popularity for calibrating nonlinear relationships. This study evaluated the potential of a neuro-fuzzy system as an alternative to the traditional statistical regression technique for the purpose of predicting flow from a local source in a river basin. The effectiveness of the proposed identification technique was demonstrated through a simulation study of the river flow time series of the Citarum River in Indonesia. Furthermore, in order to quantify the uncertainty associated with the estimation of river flow, a Monte Carlo simulation was performed. As a comparison, a multiple linear regression analysis that was being used by the Citarum River Authority was also examined using various statistical indices. The simulation results using 95% confidence intervals indicated that the neuro-fuzzy model consistently underestimated the magnitude of high flow, while the low and medium flow magnitudes were estimated closer to the observed data. The comparison of prediction accuracy indicated that the neuro-fuzzy approach was more accurate than linear regression in predicting river flow dynamics. The neuro-fuzzy model improved the root mean square error (RMSE) and mean absolute percentage error (MAPE) of the multiple linear regression forecasts by about 13.52% and 10.73%, respectively. Considering its simplicity and efficiency, the neuro-fuzzy model is recommended as an alternative tool for modeling flow dynamics in the study area.
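
    The two scores used in that comparison are standard; a minimal sketch with made-up flow values (not the Citarum data) shows how they are computed.

```python
# Root mean square error and mean absolute percentage error for comparing
# observed river flow against model predictions; all values are illustrative.
import numpy as np

def rmse(obs, pred):
    return np.sqrt(np.mean((obs - pred) ** 2))

def mape(obs, pred):
    return 100.0 * np.mean(np.abs((obs - pred) / obs))

observed    = np.array([120.0, 85.0, 240.0, 60.0, 150.0])  # m^3/s, made up
neuro_fuzzy = np.array([115.0, 88.0, 210.0, 62.0, 148.0])
regression  = np.array([100.0, 95.0, 190.0, 75.0, 130.0])

for name, pred in [("neuro-fuzzy", neuro_fuzzy), ("linear regression", regression)]:
    print(f"{name}: RMSE = {rmse(observed, pred):.1f}, MAPE = {mape(observed, pred):.1f}%")
```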

  20. Kinetic Analysis of Dynamic Positron Emission Tomography Data using Open-Source Image Processing and Statistical Inference Tools.

    Science.gov (United States)

    Hawe, David; Hernández Fernández, Francisco R; O'Suilleabháin, Liam; Huang, Jian; Wolsztynski, Eric; O'Sullivan, Finbarr

    2012-05-01

    In dynamic mode, positron emission tomography (PET) can be used to track the evolution of injected radio-labelled molecules in living tissue. This is a powerful diagnostic imaging technique that provides a unique opportunity to probe the status of healthy and pathological tissue by examining how it processes substrates. The spatial aspect of PET is well established in the computational statistics literature. This article focuses on its temporal aspect. The interpretation of PET time-course data is complicated because the measured signal is a combination of vascular delivery and tissue retention effects. If the arterial time-course is known, the tissue time-course can typically be expressed in terms of a linear convolution between the arterial time-course and the tissue residue. In statistical terms, the residue function is essentially a survival function - a familiar life-time data construct. Kinetic analysis of PET data is concerned with estimation of the residue and associated functionals such as flow, flux, volume of distribution and transit time summaries. This review emphasises a nonparametric approach to the estimation of the residue based on a piecewise linear form. Rapid implementation of this by quadratic programming is described. The approach provides a reference for statistical assessment of widely used one- and two-compartmental model forms. We illustrate the method with data from two of the most well-established PET radiotracers, ¹⁵O-H₂O and ¹⁸F-fluorodeoxyglucose, used for assessment of blood perfusion and glucose metabolism respectively. The presentation illustrates the use of two open-source tools, AMIDE and R, for PET scan manipulation and model inference.
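
    The convolution relation described above can be made concrete numerically. Assuming a known arterial input, the sketch below recovers a discretised residue by non-negative least squares; the paper uses a piecewise-linear residue solved by quadratic programming, so this piecewise-constant version is a simplified analogue with made-up curves.

```python
# Tissue time-course as convolution of arterial input with a residue function,
# and recovery of the residue by non-negative least squares (NNLS).
import numpy as np
from scipy.optimize import nnls

dt = 1.0                             # frame spacing (s), hypothetical
t = np.arange(0.0, 60.0, dt)
arterial = t * np.exp(-t / 5.0)      # made-up arterial input Ca(t)
true_residue = np.exp(-t / 20.0)     # survival-like residue R(t), R(0) = 1

tissue = dt * np.convolve(arterial, true_residue)[: len(t)]

# Lower-triangular convolution matrix so that tissue ~= A @ residue.
A = dt * np.array([[arterial[i - j] if i >= j else 0.0
                    for j in range(len(t))] for i in range(len(t))])

residue_hat, _ = nnls(A, tissue)     # non-negativity mimics a survival function
print("recovered R(0) ~", round(residue_hat[0], 3))
```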

  1. A new tool for rapid and automatic estimation of earthquake source parameters and generation of seismic bulletins

    Science.gov (United States)

    Zollo, Aldo

    2016-04-01

    of the equivalent Wood-Anderson displacement recordings. The moment magnitude (Mw) is then estimated from the inversion of displacement spectra. The duration magnitude (Md) is rapidly computed, based on a simple and automatic measurement of the seismic wave coda duration. Starting from the magnitude estimates, other relevant pieces of information are also computed, such as the corner frequency, the seismic moment, the source radius and the seismic energy. Ground-shaking maps on a Google map are produced for peak ground acceleration (PGA), peak ground velocity (PGV) and instrumental intensity (in SHAKEMAP® format), or a plot of the measured peak ground values. Furthermore, based on a specific decisional scheme, automatic discrimination is performed between local earthquakes that occurred within the network and regional/teleseismic events that occurred outside it. Finally, for the largest events, if a sufficient number of P-wave polarity readings are available, the focal mechanism is also computed. For each event, all of the available pieces of information are stored in a local database and the results of the automatic analyses are published on an interactive web page. "The Bulletin" shows a map with the event location and stations, as well as a table listing all the events with their associated parameters. The catalogue fields are the event ID, the origin date and time, latitude, longitude, depth, Ml, Mw, Md, the number of triggered stations, the S-displacement spectra, and shaking maps. Some of these entries also provide additional information, such as the focal mechanism (when available). The picked traces are uploaded to the database, and from the web interface of the Bulletin the traces can be downloaded for more specific analysis. This innovative software represents a smart solution, with a friendly and interactive interface, for high-level seismic data analysis and it may represent a relevant tool not only for seismologists, but also for non
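
    The duration magnitude mentioned above is typically a regression of the form Md = a + b·log10(coda duration) + c·distance. A minimal sketch follows; the coefficients are placeholders borrowed from a classic coda-magnitude relation purely for illustration, not the calibration of the software described here.

```python
# Duration (coda) magnitude from coda length and epicentral distance.
import math

def duration_magnitude(coda_s, dist_km, a=-0.87, b=2.0, c=0.0035):
    """Md = a + b*log10(coda duration in s) + c*distance in km.
    Placeholder coefficients; real systems calibrate a, b, c per network."""
    return a + b * math.log10(coda_s) + c * dist_km

print(f"Md = {duration_magnitude(coda_s=45.0, dist_km=30.0):.2f}")
```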

  2. Written Language Shift among Norwegian Youth

    Directory of Open Access Journals (Sweden)

    Kamil ÖZERK

    2013-07-01

    Full Text Available In Norway there are two written Norwegian languages, Bokmål and Nynorsk. Of these two written languages Bokmål is used by the majority of the people, and Bokmål has the highest prestige in society. This article is about the shift of written language from Nynorsk to Bokmål among young people in a traditional Nynorsk district in the country. Drawing on empirical data we conclude that many adolescents are experiencing written language shift. We discuss various reasons for this phenomenon in the linguistic landscape of Norway. In our discussion we emphasize the importance of the school with regard to language maintenance and language revitalization. We call for a new language policy in the educational system that can prevent language shift. Having several dialects and two official written forms of Norwegian creates a special linguistic landscape in Norway. Despite the fact that the Norwegian language situation is in several ways unique, very little research has been done on how the existing policy works in practice. Our research reveals that the existing language policy and practice in the school system are not powerful enough to prevent language shift and language decay among youngsters; in effect, the school system functions as a factory for language shift.

  3. STRATEGIES OF EXPRESSING WRITTEN APOLOGIES IN THE ONLINE NEWSPAPERS

    Directory of Open Access Journals (Sweden)

    Cipto Wardoyo

    2015-12-01

    Full Text Available Expressing apology is a universal activity, although people have different strategies or ways to express apology depending on culture, situation, and context. Apology plays a vital role in verbal politeness; it is certainly impolite when someone does not express an apology after committing an offence against others. In pragmatics, apologies are classified under speech act theory. According to Searle (1969), an apology is an expressive speech act because it expresses the speaker's psychological attitude: an apology expresses the speaker's sorrow and regret at having offended hearers or readers. This paper discusses the strategies editors use to express written apologies in online newspapers. The objective of this paper is to explain what the strategies of written apologies in online newspapers are. The study uses a qualitative method, with a descriptive-interpretative technique for analyzing the data. Four written apologies from online newspapers serve as data sources, taken from The Jakarta Post, The Daily Express, The Sun, and Brisbane Times. The utterances in the data sources are described and analyzed following Olshtain & Cohen (1986), who identify five main strategies for expressing apologies: the Illocutionary Force Indicating Device (IFID), expression of responsibility, explanation/justification, offer of repair, and promise of forbearance. All of the written apologies examined used combined strategies: they used an IFID with the performative verbs "apologize" and "be sorry", followed by expression of responsibility, explanation, offer of repair, and promise of forbearance. Keywords: apologies, speech acts, politeness, pragmatics

  4. Interpretation of Written Contracts in England

    Directory of Open Access Journals (Sweden)

    Neil Andrews

    2014-01-01

    Full Text Available This article examines the leading principles governing interpretation of written contracts under English law. This is a comprehensive and incisive analysis of the current law and of the relevant doctrines, including the equitable principles of rectification, as well as the powers of appeal courts or of the High Court when hearing an appeal from an arbitral award. The topic of interpretation of written contracts is fast-moving. It is of fundamental importance because this is the most significant commercial focus for dispute and because of the number of cross-border transactions to which English law is expressly applied by businesses.

  5. Poling of UV-written Waveguides

    DEFF Research Database (Denmark)

    Arentoft, Jesper; Kristensen, Martin; Hübner, Jörg

    1999-01-01

    We report poling of UV-written silica waveguides. Thermal poling induces an electro-optic coefficient of 0.05 pm/V. We also demonstrate simultaneous UV-writing and UV-poling. No measurable decay in the induced electro-optic effect was detected after nine months.

  6. Oral vs. written evaluation of students

    DEFF Research Database (Denmark)

    Asklund, U.; Bendix, Lars Gotfred

    2003-01-01

    In this short paper we discuss the advantages and drawbacks of oral and written evaluation of students. First in more general terms and then followed by details of what we did in our course and our experience. Finally, we outline some topics for further study and discussions.

  7. Classifying Written Texts Through Rhythmic Features

    NARCIS (Netherlands)

    Balint, Mihaela; Dascalu, Mihai; Trausan-Matu, Stefan

    2016-01-01

    Rhythm analysis of written texts has focused on literary analysis, mainly of poetry. In this paper we investigate the relevance of rhythmic features for categorizing prose texts pertaining to different genres. Our contribution is threefold. First, we define a set of rhythmic

  8. 37 CFR 251.43 - Written cases.

    Science.gov (United States)

    2010-07-01

    ... and redirect) must be referenced. (d) In the case of a royalty fee distribution proceeding, each party... ROYALTY PANEL RULES AND PROCEDURES COPYRIGHT ARBITRATION ROYALTY PANEL RULES OF PROCEDURE Procedures of Copyright Arbitration Royalty Panels § 251.43 Written cases. (a) All parties who have filed a notice of...

  9. Comparisons between written and computerised patient histories

    NARCIS (Netherlands)

    Quaak, Martien; Westerman, R. Frans; van Bemmel, Jan H.

    1987-01-01

    Patient histories were obtained from 99 patients in three different ways: by a computerised patient interview (patient record), by the usual written interview (medical record), and by the transcribed record, which was a computerised version of the medical record. Patient complaints, diagnostic

  10. Cue Reliance in L2 Written Production

    Science.gov (United States)

    Wiechmann, Daniel; Kerz, Elma

    2014-01-01

    Second language learners reach expert levels in relative cue weighting only gradually. On the basis of ensemble machine learning models fit to naturalistic written productions of German advanced learners of English and expert writers, we set out to reverse engineer differences in the weighting of multiple cues in a clause linearization problem. We…

  11. On written expression of primary school pupils

    Directory of Open Access Journals (Sweden)

    Stevanović Jelena

    2009-01-01

    Full Text Available Normative rules of standard Serbian are acquired during primary and secondary education through the curriculum demands of Serbian language instruction, which takes place in three fields: grammar, orthography and culture of expression. The topic of interest in this paper is the quality of written expression of 6th and 7th grade pupils, in the context of all three fields specified to be mastered by the Serbian language curriculum. The research comprised 148 primary school pupils from Belgrade. Linguistic analysis was performed on spontaneously created written text, produced in conditions where the pupils were not explicitly asked to write correctly. The results indicate that the majority of pupils make spelling and grammatical errors, while meeting the condition for the basic level of mastery of Serbian language knowledge according to the standards specified for the end of compulsory education. In addition, a considerable majority of pupils have a satisfactory level of culture of written expression. Pupils more often make spelling than grammatical errors. Seventh grade pupils perform better than sixth grade pupils with respect to adhering to grammar rules and to culture of written expression, while the mark in Serbian language and the general school achievement of pupils correlate only with the degree of adherence to orthographic rules. It was concluded that not only are individual programs of support necessary for pupils who make more errors, but also national projects for the development of the linguistic competence of young people in Serbia.

  12. Learners' right to freedom of written expression

    African Journals Online (AJOL)

    Erna Kinsey

    Learners' right to freedom of written expression. W.J. van Vollenhoven. Department of Education Management and Policy Studies, University of Pretoria, Pretoria, 0002 South Africa wvvollen@postino.up.ac.za. Charles I. Glenn. Training and Policy Studies of the University Professors' Program, University of Boston. Although ...

  13. 34 CFR 32.9 - Written decision.

    Science.gov (United States)

    2010-07-01

    ... the employee has submitted the financial statement and written explanation required under § 32.4(c... stating the facts supporting the nature and origin of the debt and the hearing official's analysis... determination of the existence and the amount of the overpayment or the extreme financial hardship caused by the...

  14. Increasing advertising power via written scent references

    NARCIS (Netherlands)

    Fenko, Anna; Breulmann, Svenja; Bialkova, Svetlana

    2014-01-01

    Olfactory cues in advertisements can evoke positive consumer emotions and product attitudes, yet including real scent in advertising is not always feasible. This study aimed at investigating whether written scent references could produce effects similar to real scents. Participants in online

  15. Written mathematical traditions in Ancient Mesopotamia

    DEFF Research Database (Denmark)

    Høyrup, Jens

    2015-01-01

    Writing, as well as various mathematical techniques, was created in proto-literate Uruk in order to serve accounting, and Mesopotamian mathematics as we know it was always expressed in writing. In that sense, mathematics, generically regarded, was always part of the generic written tradition....

  16. Clinical presentation of women with pelvic source varicose veins in the perineum as a first step in the development of a disease-specific patient assessment tool.

    Science.gov (United States)

    Gibson, Kathleen; Minjarez, Renee; Ferris, Brian; Neradilek, Moni; Wise, Matthew; Stoughton, Julianne; Meissner, Mark

    2017-07-01

    Pelvic venous incompetence can cause symptomatic varicose veins in the perineum, buttock, and thigh. Presentation, symptom severity, and response to treatment of pelvic source varicose veins are not well defined. Currently available tools to measure the severity of lower extremity venous disease and its effects on quality of life may be inadequate to assess disease severity in these patients. The purpose of this study was to evaluate the histories, demographics, and clinical presentations of women with pelvic source varicose veins and to compare these data to a population of women with nonpelvic source varicose veins. A total of 72 female patients with symptomatic pelvic source varicose veins were prospectively followed up. Age, weight, height, parity, and birth weights of offspring were recorded. Both pelvic source varicose veins and saphenous incompetence were identified by duplex ultrasound. Patients were queried as to their primary symptoms, activities that made their symptoms worse, and time when their symptoms were most prominent. Severity of disease was objectively evaluated using the revised Venous Clinical Severity Score (rVCSS) and 10-point numeric pain rating scale (NPRS). Compared with women without a pelvic source of varicose veins (N = 1163), patients with pelvic source varicose veins were younger (mean, 44.6 ± 8.6 vs 52.6 ± 12.9 years; P source varicose veins are a unique subset of patients. They are younger and thinner than those with nonpelvic source varicose veins, have larger infants than the general U.S. population, and have an inverse correlation between age and pain. As the majority of premenopausal patients have increased symptoms during menses, this may be due to hormonal influence. As it is poorly associated with patient-reported discomfort, the rVCSS is a poor tool for evaluating pelvic source varicose veins. A disease-specific tool for the evaluation of pelvic source varicose veins is critically needed, and this study is a first

  17. High Throughput PBPK: Evaluating EPA's Open-Source Data and Tools for Dosimetry and Exposure Reconstruction (SOT)

    Science.gov (United States)

    To address this need, new tools have been created for characterizing, simulating, and evaluating chemical biokinetics. Physiologically-based pharmacokinetic (PBPK) models provide estimates of chemical exposures that produce potentially hazardous tissue concentrations, while tissu...

  18. Sports metaphors in Polish written commentaries on politics

    Directory of Open Access Journals (Sweden)

    Jarosław Wiliński

    2015-12-01

    Full Text Available This paper seeks to investigate what sports metaphors are used in Polish written commentaries on politics and what special purpose they serve. In particular, the paper examines structural metaphors that come from the lexicon of popular sports, such as boxing, racing, track and field athletics, sailing, etc. The language data, derived from English Internet websites, has been grouped and discussed according to source domains. Applying George Lakoff and Mark Johnson’s approach to metaphor, the paper attempts to determine both the kind of source domains from which common metaphors are drawn and to what degree structural metaphors are used. The data suggests that many structural metaphors can be found in the language of politics. They are drawn from a wide variety of sports source domains, although the domains of boxing, racing, sailing, and soccer are of particular prominence. It seems that the primary function of structural metaphors in written commentaries is to facilitate the interpretation of facts in a way that is enormously appealing to the reader.

  19. Glimpses into the transition world: New graduate nurses' written reflections.

    Science.gov (United States)

    Walton, Jo Ann; Lindsay, Natalie; Hales, Caz; Rook, Helen

    2018-01-01

    This study was born out of our reflections as educators responsible for helping new graduate nurses transition into their first year of professional practice through a formal education programme. Finding ourselves wondering about many of the questions the students raised with us, we set about looking more closely at what could be gleaned from the students' experience, captured in their written work over the course of a year. The aim was to identify the challenges and learning experiences revealed in reflective assignments written by new graduate nurses undertaking a postgraduate course as part of their transition to registered nurse practice. Data consisted of the written work of two cohorts of students who had completed a postgraduate university course as part of their transition to new graduate practice in New Zealand. Fifty-four reflective essays completed by twenty-seven participating students were collected and their contents analysed thematically. Five key themes were identified. The students' reflections noted individual attributes - personal and professional strengths and weaknesses; professional behaviour - actions such as engaging help and support, advocating for patients' needs and safety and putting their own feelings aside; and situational challenges such as communication difficulties, both systemic and interpersonal, and the pressure of competing demands. Students also identified rewards - results they experienced such as achieving the nursing outcomes they desired - and commented on reflection as a useful tool. The findings shed light on the experiences of new graduates and how they fare through this critical phase of career development. Challenges relating to the emotional labour of nursing work are particularly evident. In addition, the reflective essay is shown to be a powerful tool for assisting both new graduate nurses and their lecturers to reflect on the learning opportunities inherent in current clinical practice environments. Copyright © 2017 Elsevier Ltd

  20. SESAME: a software tool for the numerical dosimetric reconstruction of radiological accidents involving external sources and its application to the accident in Chile in December 2005.

    Science.gov (United States)

    Huet, C; Lemosquet, A; Clairand, I; Rioual, J B; Franck, D; de Carlan, L; Aubineau-Lanièce, I; Bottollier-Depois, J F

    2009-01-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. This dose distribution can be assessed by physical dosimetric reconstruction methods, using experimental or numerical techniques. This article presents the laboratory-developed SESAME (Simulation of External Source Accident with MEdical images) tool, which is specific to the dosimetric reconstruction of radiological accidents through numerical simulations combining voxel geometry with the MCNP(X) Monte Carlo radiation-matter interaction code. The experimental validation of the tool using a photon field and its application to a radiological accident in Chile in December 2005 are also described.

  1. Maintenance, operation, and research (radiation) zones (MORZ) application model - a design and operation tool for intelligent buildings with application to the advanced neutron source

    International Nuclear Information System (INIS)

    Shapira, H.B.; Brown, R.A.

    1995-01-01

    This paper describes a user-friendly application tool to assist in the design, operation and maintenance of large buildings/facilities charged with complex, extensive or elaborate activities. The model centers around a specially designed, easy-access database containing essentially all the relevant information about the facility. Our first test case is the Advanced Neutron Source (ANS) research reactor to be constructed as a center for neutron research

  2. Use of Information Technology Tools in Source Selection Decision Making: A Study on USAF's KC-X Tanker Replacement Program

    National Research Council Canada - National Science Library

    Kaymaz, Sidar; Diri, Alaattin

    2008-01-01

    ... and subjectivity is usually inevitable in this kind of a decision making process. The purpose of this project is to demonstrate how the USAF's current source selection method (color rating method...

  3. Written narrative practices in elementary school students.

    Science.gov (United States)

    Romano-Soares, Soraia; Soares, Aparecido José Couto; Cárnio, Maria Silvia

    2010-01-01

    This study promoted a written narrative production program in the third grade of an Elementary School, with the aim of analyzing two written narrative practice proposals in order to verify which resources are more efficient in benefitting the textual productions of third-grade Elementary School students. Sixty students were selected from two third-grade groups of a public Elementary School in São Paulo (Brazil). For the analysis, students were divided into two groups (Group A and Group B). Fourteen children's storybooks were used. In Group A, the story was orally told by the researchers in a colloquial manner, keeping the narrator role and the original structure proposed by the author. In Group B, the story was read in full: the book was projected onto a screen and read aloud so the students could follow the reading and observe the corresponding illustrations, with voice-changing resources used in the characters' dialogues. In the overall comparison, statistically significant results were found for moment (initial vs. final assessment) and for the interaction between groups. Both groups presented substantial development from the initial to the final assessment. The Written Narratives Promotion Program, based on the shared reading of children's storybooks, constituted a more effective strategy than telling the stories using a single reader.

  4. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    Energy Technology Data Exchange (ETDEWEB)

    Etmektzoglou, A; Mishra, P; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)

    2015-06-15

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly
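
    To make the "parameterized circle" idea concrete outside the spreadsheet, here is a hedged Python sketch of the same pattern: sample a circular trajectory from a couple of parameters and serialise it to XML. The element and attribute names (Trajectory, ControlPoint, Gantry, CouchLat) are hypothetical stand-ins, not the actual Developer Mode schema.

```python
# Parameterised circular trajectory written out as a simple XML document.
import math
import xml.etree.ElementTree as ET

def circle_trajectory(radius_deg=30.0, steps=12):
    """Yield (gantry_angle, couch_lateral) pairs along a circle."""
    for k in range(steps + 1):
        phi = 2.0 * math.pi * k / steps
        yield radius_deg * math.cos(phi), radius_deg * math.sin(phi)

root = ET.Element("Trajectory")              # hypothetical tag names throughout
for gantry, couch in circle_trajectory():
    ET.SubElement(root, "ControlPoint",
                  Gantry=f"{gantry:.2f}", CouchLat=f"{couch:.2f}")

ET.ElementTree(root).write("trajectory.xml")
```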

  5. SU-F-BRB-16: A Spreadsheet Based Automatic Trajectory GEnerator (SAGE): An Open Source Tool for Automatic Creation of TrueBeam Developer Mode Robotic Trajectories

    International Nuclear Information System (INIS)

    Etmektzoglou, A; Mishra, P; Svatos, M

    2015-01-01

    Purpose: To automate creation and delivery of robotic linac trajectories with TrueBeam Developer Mode, an open source spreadsheet-based trajectory generation tool has been developed, tested and made freely available. The computing power inherent in a spreadsheet environment plus additional functions programmed into the tool insulate users from the underlying schema tedium and allow easy calculation, parameterization, graphical visualization, validation and finally automatic generation of Developer Mode XML scripts which are directly loadable on a TrueBeam linac. Methods: The robotic control system platform that allows total coordination of potentially all linac moving axes with beam (continuous, step-and-shoot, or combination thereof) becomes available in TrueBeam Developer Mode. Many complex trajectories are either geometric or can be described in analytical form, making the computational power, graphing and programmability available in a spreadsheet environment an easy and ideal vehicle for automatic trajectory generation. The spreadsheet environment allows also for parameterization of trajectories thus enabling the creation of entire families of trajectories using only a few variables. Standard spreadsheet functionality has been extended for powerful movie-like dynamic graphic visualization of the gantry, table, MLC, room, lasers, 3D observer placement and beam centerline all as a function of MU or time, for analysis of the motions before requiring actual linac time. Results: We used the tool to generate and deliver extended SAD “virtual isocenter” trajectories of various shapes such as parameterized circles and ellipses. We also demonstrated use of the tool in generating linac couch motions that simulate respiratory motion using analytical parameterized functions. Conclusion: The SAGE tool is a valuable resource to experiment with families of complex geometric trajectories for a TrueBeam Linac. It makes Developer Mode more accessible as a vehicle to quickly

  6. Investigation and Evaluation of the open source ETL tools GeoKettle and Talend Open Studio in terms of their ability to process spatial data

    Science.gov (United States)

    Kuhnert, Kristin; Quedenau, Jörn

    2016-04-01

    Integration and harmonization of large spatial data sets has been a big issue, and not only since the introduction of the spatial data infrastructure INSPIRE. Extracting and combining spatial data from heterogeneous source formats, transforming the data to obtain the required quality for particular purposes, and loading it into a data store are common tasks. This procedure of Extraction, Transformation and Loading of data is called an ETL process. Geographic Information Systems (GIS) can take over many of these tasks, but often they are not suitable for processing large datasets. ETL tools can make the implementation and execution of ETL processes convenient and efficient. One reason for choosing ETL tools for data integration is that they ease maintenance through a clear (graphical) presentation of the transformation steps. Developers and administrators are provided with tools for identifying errors, analyzing processing performance and managing the execution of ETL processes. Another benefit of ETL tools is that most tasks require little or no scripting, so researchers without a programming background can also work with them easily. Evaluations of ETL tools for business applications have been available for a long time; however, little work has been published on the capabilities of these tools to handle spatial data. In this work, we review and compare the open source ETL tools GeoKettle and Talend Open Studio in terms of processing spatial data sets of different formats. For the evaluation, ETL processes are performed with both software packages on air quality data measured during the BÄRLIN2014 Campaign initiated by the Institute for Advanced Sustainability Studies (IASS). The aim of the BÄRLIN2014 Campaign is to better understand the sources and distribution of particulate matter in Berlin. The air quality data are available in heterogeneous formats because they were measured with different instruments. For further data analysis

  7. The use of Pb, Sr, and Hg isotopes in Great Lakes precipitation as a tool for pollution source attribution

    Science.gov (United States)

    The anthropogenic emission and subsequent deposition of heavy metals including mercury (Hg) and lead (Pb) presents human health and environmental concerns. Although it is known that local and regional sources of these metals contribute to deposition in the Great Lakes region, it ...

  8. Open-Source Tools for Enhancing Full-Text Searching of OPACs: Use of Koha, Greenstone and Fedora

    Science.gov (United States)

    Anuradha, K. T.; Sivakaminathan, R.; Kumar, P. Arun

    2011-01-01

    Purpose: There are many library automation packages available as open-source software, comprising two modules: staff-client module and online public access catalogue (OPAC). Although the OPAC of these library automation packages provides advanced features of searching and retrieval of bibliographic records, none of them facilitate full-text…

  9. Design, implementation and practice of JBEI-ICE: an open source biological part registry platform and tools.

    Science.gov (United States)

    Ham, Timothy S; Dmytriv, Zinovii; Plahar, Hector; Chen, Joanna; Hillson, Nathan J; Keasling, Jay D

    2012-10-01

    The Joint BioEnergy Institute Inventory of Composable Elements (JBEI-ICE) is an open source registry platform for managing information about biological parts. It is capable of recording information about 'legacy' parts, such as plasmids, microbial host strains and Arabidopsis seeds, as well as DNA parts in various assembly standards. ICE is built on the idea of a web of registries and thus provides strong support for distributed interconnected use. The information deposited in an ICE installation instance is accessible both via a web browser and through the web application programming interfaces, which allows automated access to parts via third-party programs. JBEI-ICE includes several useful web browser-based graphical applications for sequence annotation, manipulation and analysis that are also open source. As with open source software, users are encouraged to install, use and customize JBEI-ICE and its components for their particular purposes. As a web application programming interface, ICE provides well-developed parts storage functionality for other synthetic biology software projects. A public instance is available at public-registry.jbei.org, where users can try out features, upload parts or simply use it for their projects. The ICE software suite is available via Google Code, a hosting site for community-driven open source projects.

  10. Uses of the word "macula" in written English, 1400-present.

    Science.gov (United States)

    Schwartz, Stephen G; Leffler, Christopher T

    2014-01-01

    We compiled uses of the word "macula" in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America's Historical Newspapers, the Gale Cengage Collections, and others. "Macula" has been used: as a non-medical "spot" or "stain", literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole ("retina tinta di un color giallo") in 1782, but did not use the word "macula". "Macula lutea" was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, which appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both "macula" and "macula lutea" following the introduction of the ophthalmoscope in 1850. "Macula" has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. Copyright © 2014 Elsevier Inc. All rights reserved.

  11. A Free and Open Source Tool to Assess the Accuracy of Land Cover Maps: Implementation and Application to Lombardy Region (Italy)

    Science.gov (United States)

    Bratic, G.; Brovelli, M. A.; Molinari, M. E.

    2018-04-01

    The availability of thematic maps has significantly increased over the last few years. Validation of these maps is a key factor in assessing their suitability for different applications. The accuracy of classified data is evaluated through comparison with a reference dataset and the generation of a confusion matrix, from which many quality indexes can be derived. In this work, an ad hoc free and open source Python tool was implemented to automatically compute all the confusion matrix-derived accuracy indexes proposed in the literature. The tool was integrated into the GRASS GIS environment and successfully applied to evaluate the quality of three high-resolution global datasets (GlobeLand30, Global Urban Footprint, Global Human Settlement Layer Built-Up Grid) in the Lombardy Region area (Italy). In addition to the most commonly used accuracy measures, e.g. overall accuracy and Kappa, the tool allows computation and investigation of less known indexes such as the Ground Truth Index and the Classification Success Index. The promising tool will be further extended with spatial autocorrelation analysis functions and made available to the researcher and user community.
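
    Two of the confusion-matrix indexes the tool computes, overall accuracy and Cohen's kappa, follow directly from the matrix and its marginals; a minimal sketch with an illustrative 3×3 matrix:

```python
# Overall accuracy and Cohen's kappa from a confusion matrix
# (rows: reference classes, columns: classified data); values illustrative.
import numpy as np

cm = np.array([[50, 3, 2],
               [4, 40, 6],
               [1, 5, 39]], dtype=float)

n = cm.sum()
overall_accuracy = np.trace(cm) / n

# Chance agreement from the row/column marginals.
p_e = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2
kappa = (overall_accuracy - p_e) / (1.0 - p_e)

print(f"OA = {overall_accuracy:.3f}, kappa = {kappa:.3f}")
```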

  12. Tool for the study of matter - the spallation neutron source. Werkzeug zur Erforschung der Materie - die Spallations-Neutronenquelle

    Energy Technology Data Exchange (ETDEWEB)

    1983-01-01

    It deals with the optimal use of a whole series of matter-penetrating radiation types in the construction of a spallation neutron source, which the Kernforschungsanlage Juelich will realize in agreement with its associates. As the most modern and intense source of neutrons, protons, pions, muons, and neutrinos, this new big-science device for fundamental research in the Federal Republic of Germany shall permit researchers in the fields of solid state physics, chemistry, molecular biology, intermediate-energy nuclear physics, radiochemistry and radiopharmacology, medicine, and materials science to break new ground and to perform top-level research. All interested German research groups, and also scientists from foreign countries, shall be able to work with this trend-setting big-science device.

  13. The use of Pb, Sr, and Hg isotopes in Great Lakes precipitation as a tool for pollution source attribution

    Energy Technology Data Exchange (ETDEWEB)

    Sherman, Laura S., E-mail: lsaylors@umich.edu [University of Michigan, Department of Earth and Environmental Sciences, 1100 N. University Ave., Ann Arbor, MI 48109 (United States); Blum, Joel D. [University of Michigan, Department of Earth and Environmental Sciences, 1100 N. University Ave., Ann Arbor, MI 48109 (United States); Dvonch, J. Timothy [University of Michigan, Air Quality Laboratory, 1415 Washington Heights, Ann Arbor, MI 48109 (United States); Gratz, Lynne E. [University of Washington-Bothell, 18115 Campus Way NE, Bothell, WA 98011 (United States); Landis, Matthew S. [U.S. EPA, Office of Research and Development, Research Triangle Park, NC 27709 (United States)

    2015-01-01

    The anthropogenic emission and subsequent deposition of heavy metals including mercury (Hg) and lead (Pb) present human health and environmental concerns. Although it is known that local and regional sources of these metals contribute to deposition in the Great Lakes region, it is difficult to trace emissions from point sources to impacted sites. Recent studies suggest that metal isotope ratios may be useful for distinguishing between and tracing source emissions. We measured Pb, strontium (Sr), and Hg isotope ratios in daily precipitation samples that were collected at seven sites across the Great Lakes region between 2003 and 2007. Lead isotope ratios (²⁰⁷Pb/²⁰⁶Pb = 0.8062 to 0.8554) suggest that Pb deposition was influenced by coal combustion and processing of Mississippi Valley-Type Pb ore deposits. Regional differences in Sr isotope ratios (⁸⁷Sr/⁸⁶Sr = 0.70859 to 0.71155) are likely related to coal fly ash and soil dust. Mercury isotope ratios (δ²⁰²Hg = −1.13 to 0.13‰) also varied among the sites, likely due to regional differences in coal isotopic composition, and fractionation occurring within industrial facilities and in the atmosphere. These data represent the first combined characterization of Pb, Sr, and Hg isotope ratios in precipitation collected across the Great Lakes region. We demonstrate the utility of multiple metal isotope ratios in parallel with traditional trace element multivariate statistical modeling to enable more complete pollution source attribution. - Highlights: • We measured Pb, Sr, and Hg isotopes in precipitation from the Great Lakes region. • Pb isotopes suggest that deposition was impacted by coal combustion and metal production. • Sr isotope ratios vary regionally, likely due to soil dust and coal fly ash. • Hg isotopes vary due to fractionation occurring within facilities and the atmosphere. • Isotope results support conclusions of previous trace element receptor modeling.

  14. Modeling statistical properties of written text.

    Directory of Open Access Journals (Sweden)

    M Angeles Serrano

    Full Text Available Written text is one of the fundamental manifestations of human language, and the study of its universal regularities can give clues about how our brains process information and how we, as a society, organize and share it. Among these regularities, only Zipf's law has been explored in depth. Other basic properties, such as the existence of bursts of rare words in specific documents, have only been studied independently of each other and mainly by descriptive models. As a consequence, there is a lack of understanding of linguistic processes as complex emergent phenomena. Beyond Zipf's law for word frequencies, here we focus on burstiness, Heaps' law describing the sublinear growth of vocabulary size with the length of a document, and the topicality of document collections, which encode correlations within and across documents absent in random null models. We introduce and validate a generative model that explains the simultaneous emergence of all these patterns from simple rules. As a result, we find a connection between the bursty nature of rare words and the topical organization of texts and identify dynamic word ranking and memory across documents as key mechanisms explaining the nontrivial organization of written text. Our research can have broad implications and practical applications in computer science, cognitive science and linguistics.
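
    As a flavour of how simple rules can reproduce such regularities, the toy rich-get-richer generator below (a Simon-style process, not the authors' model) yields a Zipf-like frequency head and sublinear, Heaps-style vocabulary growth.

```python
# Simon-style process: with probability alpha coin a new word type, otherwise
# re-sample an existing token (reuse is frequency-proportional by construction).
import random
from collections import Counter

def simon_text(n_tokens=20000, alpha=0.08, seed=1):
    random.seed(seed)
    tokens, vocab, growth = [], 0, []
    for _ in range(n_tokens):
        if not tokens or random.random() < alpha:
            vocab += 1
            tokens.append(f"w{vocab}")            # coin a new word type
        else:
            tokens.append(random.choice(tokens))  # rich-get-richer reuse
        growth.append(vocab)
    return tokens, growth

tokens, growth = simon_text()
freqs = sorted(Counter(tokens).values(), reverse=True)
print("top-5 word frequencies:", freqs[:5])                # Zipf-like head
print("vocabulary at n = 10^3, 10^4:", growth[999], growth[9999])  # Heaps-like
```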

  15. Written argument underlying the Brokdorf verdict

    International Nuclear Information System (INIS)

    Anon.

    1980-01-01

    In December 1979, the Schleswig administrative court delivered its judgment (AZ.: 10 A 512/76) against the plaintiffs (four neighbouring communities and nine individuals), who had brought an action against the first part-construction permit for the Brokdorf nuclear power plant, issued on October 25, 1976. In mid-March 1980, the written argument underlying this court ruling (58 pages) was sent out. The written argument conscientiously explains the reasoning of the court, which delivered its verdict after several days of oral proceedings in October and November 1979, and clearly states the position of the court with regard to the limits of control by administrative jurisdiction as well as to the controversial legal problem of whether there is a lawful connection between the licensing in accordance with section 7, sub-section 2 of the AtG (Atomic Energy Act) and sufficient nuclear waste management provisions according to section 9a AtG. The court ruling declared the action to be substantially admissible but not well-founded. (orig./HP) [de

  16. Written culture: reading practices and printed book

    Directory of Open Access Journals (Sweden)

    Lidia Eugenia Cavalcante

    2009-07-01

    Full Text Available The history of written culture and of reading practices is the subject discussed in this article. It aims to understand the trajectory of the printed book in its materiality, as well as the processes arising from the undisputed cultural and political presence of this medium in modern society. It seeks to document the reading practices, phenomena and mutations that have strengthened this medium over the centuries, addressing the "book crisis", its causes and effects. It therefore deals with the particularities of written culture as they were realized in the Siècle des Lumières and consecrated in the workings of the spirit of the authors and readers of that time, whose propagation influenced the Western reader. It analyzes the sociological and historical conditions of the place of the modern reader between Science, Philosophy and the Novel, continuously transformed by the renewal of thought and culture.

  17. Managing Written Directives: A Software Solution to Streamline Workflow.

    Science.gov (United States)

    Wagner, Robert H; Savir-Baruch, Bital; Gabriel, Medhat S; Halama, James R; Bova, Davide

    2017-06-01

    A written directive is required by the U.S. Nuclear Regulatory Commission for any use of ¹³¹I above 1.11 MBq (30 μCi) and for patients receiving radiopharmaceutical therapy. This requirement has also been adopted and must be enforced by the agreement states. As the introduction of new radiopharmaceuticals increases therapeutic options in nuclear medicine, time spent on regulatory paperwork also increases. The pressure of managing these time-consuming regulatory requirements may heighten the potential for inaccurate or incomplete directive data and subsequent regulatory violations. To improve on the paper-trail method of directive management, we created a software tool using a Health Insurance Portability and Accountability Act (HIPAA)-compliant database. This software allows for secure data-sharing among physicians, technologists, and managers while saving time, reducing errors, and eliminating the possibility of loss and duplication. Methods: The software tool was developed using Visual Basic, which is part of the Visual Studio development environment for the Windows platform. Patient data are deposited in an Access database on a local HIPAA-compliant secure server or hard disk. Once a working version had been developed, it was installed at our institution and used to manage directives. Updates and modifications of the software were released regularly until no more significant problems were found with its operation. Results: The software has been used at our institution for over 2 y and has reliably kept track of all directives. All physicians and technologists use the software daily and find it superior to paper directives. They can retrieve active directives at any stage of completion, as well as completed directives. Conclusion: We have developed a software solution for the management of written directives that streamlines and structures the departmental workflow. This solution saves time, centralizes the information for all staff to share, and decreases

  18. Hard- and software of real time simulation tools of Electric Power System for adequate modeling power semiconductors in voltage source convertor based HVDC and FACTS

    Directory of Open Access Journals (Sweden)

    Ufa Ruslan A.

    2014-01-01

    Full Text Available The motivation for the presented research is the need to develop new methods and tools for adequate simulation of Flexible Alternating Current Transmission System (FACTS) devices and High Voltage Direct Current (HVDC) transmission systems as part of real electric power systems (EPS). To this end, a hybrid approach for advanced simulation of FACTS and HVDC systems based on voltage source converters (VSC) is proposed. The presented simulation results for the developed hybrid VSC model confirm that the model achieves the desired properties and that the proposed solutions are effective.

  19. The Focinator - a new open-source tool for high-throughput foci evaluation of DNA damage

    International Nuclear Information System (INIS)

    Oeck, Sebastian; Malewicz, Nathalie M.; Hurst, Sebastian; Rudner, Justine; Jendrossek, Verena

    2015-01-01

    The quantitative analysis of foci plays an important role in many cell biological methods such as counting of colonies or cells, organelles or vesicles, or the number of protein complexes. In radiation biology and molecular radiation oncology, DNA damage and DNA repair kinetics upon ionizing radiation (IR) are evaluated by counting protein clusters or accumulations of phosphorylated proteins recruited to DNA damage sites. Consistency in counting and interpretation of foci remains challenging. Many current software solutions describe instructions for time-consuming and error-prone manual analysis, provide incomplete algorithms for analysis or are expensive. Therefore, we aimed to develop a tool for costless, automated, quantitative and qualitative analysis of foci. For this purpose we integrated a user-friendly interface into ImageJ and selected parameters to allow automated selection of regions of interest (ROIs) depending on their size and circularity. We added different export options and a batch analysis. The use of the Focinator was tested by analyzing γ-H2AX foci in murine prostate adenocarcinoma cells (TRAMP-C1) at different time points after IR with 0.5 to 3 Gray (Gy). Additionally, measurements were performed by users with different backgrounds and experience. The Focinator turned out to be an easily adjustable tool for automation of foci counting. It significantly reduced the analysis time of radiation-induced DNA-damage foci. Furthermore, different user groups were able to achieve a similar counting velocity. Importantly, there was no difference in nuclei detection between the Focinator and ImageJ alone. The Focinator is a costless, user-friendly tool for fast high-throughput evaluation of DNA repair foci. The macro allows improved foci evaluation regarding accuracy, reproducibility and analysis speed compared to manual analysis. As an innovative option, the macro offers a combination of multichannel evaluation including colocalization analysis and the
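
    The ROI-selection logic described above (threshold, then keep regions by size and circularity) can be illustrated outside ImageJ. The Python/scikit-image sketch below runs an analogous pipeline on a synthetic image; it is not the Focinator macro itself, and the size/circularity cut-offs are arbitrary.

```python
# Analogous foci selection: threshold, label connected regions, then filter
# by area and circularity (4*pi*A/P^2; can exceed 1 on a discrete grid).
import numpy as np
from skimage import draw, filters, measure

img = np.zeros((128, 128))
for r, c in [(40, 40), (80, 90)]:            # two synthetic "foci"
    rr, cc = draw.disk((r, c), 4)
    img[rr, cc] = 1.0

mask = img > filters.threshold_otsu(img)
labels = measure.label(mask)

foci = 0
for region in measure.regionprops(labels):
    circularity = 4.0 * np.pi * region.area / max(region.perimeter, 1e-9) ** 2
    if 5 <= region.area <= 200 and circularity > 0.7:
        foci += 1
print("foci counted:", foci)
```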

  20. RDFBuilder: a tool to automatically build RDF-based interfaces for MAGE-OM microarray data sources.

    Science.gov (United States)

    Anguita, Alberto; Martin, Luis; Garcia-Remesal, Miguel; Maojo, Victor

    2013-07-01

    This paper presents RDFBuilder, a tool that enables RDF-based access to MAGE-ML-compliant microarray databases. We have developed a system that automatically transforms the MAGE-OM model and microarray data stored in the ArrayExpress database into RDF format. Additionally, the system automatically enables a SPARQL endpoint. This allows users to execute SPARQL queries for retrieving microarray data, either from specific experiments or from more than one experiment at a time. Our system optimizes response times by caching and reusing information from previous queries. In this paper, we describe our methods for achieving this transformation. We show that our approach is complementary to other existing initiatives, such as Bio2RDF, for accessing and retrieving data from the ArrayExpress database. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
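
    As an illustration of the kind of programmatic access such an endpoint enables, the sketch below issues a SPARQL query from Python via SPARQLWrapper. The endpoint URL and the class/property IRIs are hypothetical placeholders; RDFBuilder's actual vocabulary is not given in the abstract.

```python
# Query a SPARQL endpoint for experiment names (placeholder IRIs throughout).
from SPARQLWrapper import SPARQLWrapper, JSON

sparql = SPARQLWrapper("http://example.org/rdfbuilder/sparql")  # placeholder URL
sparql.setQuery("""
    SELECT ?experiment ?name WHERE {
        ?experiment a <http://example.org/mage-om/Experiment> ;
                    <http://example.org/mage-om/name> ?name .
    }
    LIMIT 10
""")
sparql.setReturnFormat(JSON)

for row in sparql.query().convert()["results"]["bindings"]:
    print(row["experiment"]["value"], row["name"]["value"])
```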

  1. Recent Additions in the Modeling Capabilities of an Open-Source Wave Energy Converter Design Tool: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Tom, N.; Lawson, M.; Yu, Y. H.

    2015-04-20

    WEC-Sim is a midfidelity numerical tool for modeling wave energy conversion devices. The code uses the MATLAB SimMechanics package to solve multibody dynamics and models wave interactions using hydrodynamic coefficients derived from frequency-domain boundary-element methods. This paper presents the new modeling features introduced in the latest release of WEC-Sim. The first feature discussed is the conversion of the fluid memory kernel to a state-space form. This enhancement offers a substantial computational benefit once hydrodynamic body-to-body coefficients are introduced, as the number of interactions increases exponentially with each additional body. Additional features include the ability to calculate the wave-excitation forces based on the instantaneous incident wave angle, allowing the device to weathervane, as well as to import a user-defined wave elevation time series. A review of the hydrodynamic theory for each feature is provided and the successful implementation is verified using test cases.
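
    For context, the "fluid memory kernel" is the radiation convolution term of the Cummins equation; the state-space feature replaces that integral with a small linear ODE system so the convolution history need not be re-evaluated at every time step. A standard, generic formulation (not necessarily WEC-Sim's exact notation) is:

```latex
% Radiation force with memory kernel K(t), infinite-frequency added mass
% A_inf, and a fitted state-space surrogate (A_s, B_s, C_s).
F_r(t) = -A_\infty\,\ddot{x}(t) - \int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau,
\qquad
\dot{\mathbf{z}}(t) = \mathbf{A}_s\,\mathbf{z}(t) + \mathbf{B}_s\,\dot{x}(t),
\qquad
\int_0^t K(t-\tau)\,\dot{x}(\tau)\,\mathrm{d}\tau \;\approx\; \mathbf{C}_s\,\mathbf{z}(t).
```

    With the surrogate in place, the cost per time step scales with the small, fixed state dimension rather than with the length of the simulated history, which is where the benefit for many interacting bodies comes from.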

  2. Ethnobotany as a pharmacological research tool and recent developments in CNS-active natural products from ethnobotanical sources.

    Science.gov (United States)

    McClatchey, Will C; Mahady, Gail B; Bennett, Bradley C; Shiels, Laura; Savo, Valentina

    2009-08-01

    The science of ethnobotany is reviewed in light of its multi-disciplinary contributions to natural product research for the development of pharmaceuticals and pharmacological tools. Some of the issues reviewed involve ethical and cultural perspectives of healthcare and medicinal plants. While these are not usually part of the discussion of pharmacology, cultural concerns potentially provide both challenges and insight for field and laboratory researchers. Plant evolutionary issues are also considered as they relate to development of plant chemistry and accessing this through ethnobotanical methods. The discussion includes presentation of a range of CNS-active medicinal plants that have been recently examined in the field, laboratory and/or clinic. Each of these plants is used to illustrate one or more aspects about the valuable roles of ethnobotany in pharmacological research. We conclude with consideration of mutually beneficial future collaborations between field ethnobotanists and pharmacologists.

  3. Talking to Texts and Sketches: The Function of Written and Graphic Mediation in Engineering Design.

    Science.gov (United States)

    Lewis, Barbara

    2000-01-01

    Describes the author's research that explores the role of language, particularly texts, in the engineering design process. Notes that results of this case study support a new "mediated" model of engineering design as an inventional activity in which designers use talk, written language, and other symbolic representations as tools to think about…

  4. A Large-Scale Analysis of Variance in Written Language.

    Science.gov (United States)

    Johns, Brendan T; Jamieson, Randall K

    2018-01-22

    The collection of very large text sources has revolutionized the study of natural language, leading to the development of several models of language learning and distributional semantics that extract sophisticated semantic representations of words based on the statistical redundancies contained within natural language (e.g., Griffiths, Steyvers, & Tenenbaum; Jones & Mewhort; Landauer & Dumais; Mikolov, Sutskever, Chen, Corrado, & Dean). The models treat knowledge as an interaction of processing mechanisms and the structure of language experience. But language experience is often treated agnostically. We report a distributional semantic analysis that shows written language in fiction books varies appreciably between books from the different genres, books from the same genre, and even books written by the same author. Given that current theories assume that word knowledge reflects an interaction between processing mechanisms and the language environment, the analysis shows the need for the field to engage in a more deliberate consideration and curation of the corpora used in computational studies of natural language processing. Copyright © 2018 Cognitive Science Society, Inc.
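
    The distributional idea shared by the cited models can be demonstrated at toy scale: count the contexts in which each word occurs and compare the resulting vectors. A minimal sketch follows, with the corpus, window size, and similarity measure chosen purely for illustration:

    ```python
    # Toy distributional semantics: words sharing contexts get similar vectors.
    import numpy as np

    corpus = "the dog chased the cat the cat chased the mouse".split()
    vocab = sorted(set(corpus))
    index = {w: i for i, w in enumerate(vocab)}
    window = 2

    counts = np.zeros((len(vocab), len(vocab)))
    for i, w in enumerate(corpus):
        for j in range(max(0, i - window), min(len(corpus), i + window + 1)):
            if i != j:
                counts[index[w], index[corpus[j]]] += 1

    def cosine(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12))

    print(cosine(counts[index["dog"]], counts[index["cat"]]))
    ```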

  5. Analysis of children's narrative written by women

    Directory of Open Access Journals (Sweden)

    Francisca Sánchez-Pinilla

    2012-11-01

    Full Text Available Our work approaches the study of a corpus of children's literature published in the adult Spanish press during the period from 1920 to 1939 and whose authorship is feminine. Its fundamental aims are therefore to make this production visible; to analyze the different aesthetic and ideological proposals formalized in this writing; and to reintegrate this artistic production into the literary system in which it took place, since it was born inside that system, and it was subsequent historical circumstances that isolated it and turned it into a minor writing. The work is organized in three phases: women authors who published in the last third of the nineteenth century; authors who published between 1900 and 1920; and authors writing between 1920 and 1939. The areas for which these women produced are the school, the family, women's associations, publishing houses, the world of drawing, and the written press.

  6. Open-source LCA tool for estimating greenhouse gas emissions from crude oil production using field characteristics.

    Science.gov (United States)

    El-Houjeiri, Hassan M; Brandt, Adam R; Duffy, James E

    2013-06-04

    Existing transportation fuel cycle emissions models are either general and calculate nonspecific values of greenhouse gas (GHG) emissions from crude oil production, or are not available for public review and auditing. We have developed the Oil Production Greenhouse Gas Emissions Estimator (OPGEE) to provide open-source, transparent, rigorous GHG assessments for use in scientific assessment, regulatory processes, and analysis of GHG mitigation options by producers. OPGEE uses petroleum engineering fundamentals to model emissions from oil and gas production operations. We introduce OPGEE and explain the methods and assumptions used in its construction. We run OPGEE on a small set of fictional oil fields and explore model sensitivity to selected input parameters. Results show that upstream emissions from petroleum production operations can vary from 3 gCO2/MJ to over 30 gCO2/MJ using realistic ranges of input parameters. Significant drivers of emissions variation are steam injection rates, water handling requirements, and rates of flaring of associated gas.
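
    The structure of such a model can be conveyed with a deliberately simplified intensity function in which each emissions driver named above contributes a linear burden. All coefficients below are invented placeholders for illustration, not OPGEE's engineering-based values:

    ```python
    # Toy illustration of field-level carbon-intensity bookkeeping in the spirit
    # of OPGEE; every coefficient is a made-up placeholder.
    def upstream_intensity_g_per_mj(steam_oil_ratio, water_oil_ratio, flare_ratio):
        base = 3.0                        # hypothetical baseline, gCO2/MJ
        steam = 6.0 * steam_oil_ratio     # steam generation burden
        water = 1.5 * water_oil_ratio     # water lifting/handling burden
        flare = 8.0 * flare_ratio         # flaring of associated gas
        return base + steam + water + flare

    print(upstream_intensity_g_per_mj(1.0, 3.0, 0.5))  # a mid-range fictional field
    ```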

  7. Lithographic measurement of EUV flare in the 0.3-NA Micro Exposure Tool optic at the Advanced Light Source

    International Nuclear Information System (INIS)

    Cain, Jason P.; Naulleau, Patrick; Spanos, Costas J.

    2005-01-01

    The level of flare present in a 0.3-NA EUV optic (the MET optic) at the Advanced Light Source at Lawrence Berkeley National Laboratory is measured using a lithographic method. Photoresist behavior at high exposure doses makes analysis difficult. Flare measurement analysis under scanning electron microscopy (SEM) and optical microscopy is compared, and optical microscopy is found to be a more reliable technique. In addition, the measured results are compared with predictions based on surface roughness measurement of the MET optical elements. When the fields in the exposure matrix are spaced far enough apart to avoid influence from surrounding fields and the data is corrected for imperfect mask contrast and aerial image proximity effects, the results match predicted values quite well. The amount of flare present in this optic ranges from 4.7% for 2 µm features to 6.8% for 500 nm features.

  8. Converging free and open source software tools for knowledge sharing in smallholder agricultural communities in Sri Lanka

    Directory of Open Access Journals (Sweden)

    Chandana Kumara Jayathilake

    2017-12-01

    Full Text Available In a world where the notion of 'sharing of knowledge' has gained much prominence in recent years, the importance of information and communication technologies (ICTs) in promoting sustainable agriculture, especially when combined with mobile and open source software technologies, is discussed critically. On this rationale, this study was carried out to explore the applicability of the concept of converging 'Free and Open Source Software (FOSS)' to promote sustainable knowledge sharing amongst the agricultural communities in Sri Lanka. A multi-stage community consultative process with a set of designated officials ("Sponsors") and a series of semi-structured questionnaire surveys with a cross section of smallholder agriculture farmers (n=246) were carried out in the Batticaloa, Kurunegala and Puttalam districts to gather baseline data. This was followed by a number of field experiments ("Campaigns") with farmers (n=340) from the same geographical areas. Two FOSS packages, namely (1) "FrontlineSMS" for text messaging and (2) "FreedomFone" for interactive voice responses, were applied to evaluate the effectiveness of knowledge sharing within the farming communities. It was found that the FOSS intervention increased text messaging and voice call usage in day-to-day agricultural communication by 26 and 8 percent, respectively. Demographic factors such as the age and income level of the farmers had a positive influence on the knowledge sharing process, and mobile telephony was the most extensive mode of communication within the communities. The outcome of the analysis, as a whole, implies that, with a fitting mechanism in place, this approach can be promoted as a "drive for positive changes" in agriculture-based rural communities in developing countries like Sri Lanka, and those in South and East Asia with similar socio-economic and cultural perspectives.

  9. iCAVE: an open source tool for visualizing biomolecular networks in 3D, stereoscopic 3D and immersive 3D.

    Science.gov (United States)

    Liluashvili, Vaja; Kalayci, Selim; Fluder, Eugene; Wilson, Manda; Gabow, Aaron; Gümüs, Zeynep H

    2017-08-01

    Visualizations of biomolecular networks assist in systems-level data exploration in many cellular processes. Data generated from high-throughput experiments increasingly inform these networks, yet current tools do not adequately scale with the concomitant increase in their size and complexity. We present an open source software platform, interactome-CAVE (iCAVE), for visualizing large and complex biomolecular interaction networks in 3D. Users can explore networks (i) in 3D using a desktop, (ii) in stereoscopic 3D using 3D-vision glasses and a desktop, or (iii) in immersive 3D within a CAVE environment. iCAVE introduces 3D extensions of known 2D network layout, clustering, and edge-bundling algorithms, as well as new 3D network layout algorithms. Furthermore, users can simultaneously query several built-in databases within iCAVE for network generation or visualize their own networks (e.g., disease, drug, protein, metabolite). iCAVE has a modular structure that allows rapid development by the addition of algorithms, datasets, or features without affecting other parts of the code. Overall, iCAVE is the first freely available open source tool that enables 3D (optionally stereoscopic or immersive) visualizations of complex, dense, or multi-layered biomolecular networks. While primarily designed for researchers utilizing biomolecular networks, iCAVE can assist researchers in any field. © The Authors 2017. Published by Oxford University Press.
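
    Readers who want to prototype a comparable view without a CAVE can approximate a 3D force-directed layout in a few lines of Python. This is only an analogy to iCAVE's built-in layouts, using networkx and matplotlib on a sample graph:

    ```python
    # Minimal 3D force-directed network view, analogous to the kind of layout
    # iCAVE renders natively.
    import networkx as nx
    import matplotlib.pyplot as plt

    g = nx.karate_club_graph()
    pos = nx.spring_layout(g, dim=3, seed=42)      # 3D spring embedding

    fig = plt.figure()
    ax = fig.add_subplot(projection="3d")
    for u, v in g.edges():
        xs, ys, zs = zip(pos[u], pos[v])           # per-axis endpoint pairs
        ax.plot(xs, ys, zs, color="gray", linewidth=0.5)
    xyz = [pos[n] for n in g.nodes()]
    ax.scatter(*zip(*xyz), s=20)
    plt.show()
    ```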

  10. Evaluation of JRC source term methodology using MAAP5 as a fast-running crisis tool for a BWR4 Mark I reactor

    International Nuclear Information System (INIS)

    Vela-García, M.; Simola, K.

    2016-01-01

    JRC participated in the OECD/NEA FASTRUN benchmark reviewing fast-running software tools to model fission product releases during accidents at nuclear power plants. The main goal of fast-running software tools is to foresee the accident progression, so that mitigating actions can be taken and the population can be adequately protected. Within FASTRUN, JRC used the MAAP 4.0.8 code and developed a methodology to obtain the source term (as activity released per radioisotope) for PWR and BWR station blackout accident scenarios. The modifications made to the MAAP models were limited to a minimum number of important parameters. This aims to reproduce a crisis situation in which there is limited time to adapt a generic input deck. This paper presents further studies, in which JRC analysed the FASTRUN BWR scenario using MAAP 5.0.2, which has the capability of calculating doses. A sensitivity study was performed with the MAAP 5.0.2 DOSE package deactivated, using the same methodology as in the case of MAAP 4.0.8 for source term calculation. The results were close to the reference LTSBO SOARCA case, independently of the methodology used. One of the benefits of using the MAAP code is the short runtime of the simulations.

  11. A Comprehensive Tool for Exploring the Availability, Scalability and Growth Potential of Conventional and Renewable Energy Sources and Technologies

    Science.gov (United States)

    Jack-Scott, E.; Arnott, J. C.; Katzenberger, J.; Davis, S. J.; Delman, E.

    2015-12-01

    It has been a generational challenge to simultaneously meet the world's energy requirements, while remaining within the bounds of acceptable cost and environmental impact. To this end, substantial research has explored various energy futures on a global scale, leaving decision-makers and the public overwhelmed by information on energy options. In response, this interactive energy table was developed as a comprehensive resource through which users can explore the availability, scalability, and growth potentials of all energy technologies currently in use or development. Extensive research from peer-reviewed papers and reports was compiled and summarized, detailing technology costs, technical considerations, imminent breakthroughs, and obstacles to integration, as well as political, social, and environmental considerations. Energy technologies fall within categories of coal, oil, natural gas, nuclear, solar, wind, hydropower, ocean, geothermal and biomass. In addition to 360 expandable cells of cited data, the interactive table also features educational windows with background information on each energy technology. The table seeks not to advocate for specific energy futures, but to succinctly and accurately centralize peer-reviewed research and information in an interactive, accessible resource. With this tool, decision-makers, researchers and the public alike can explore various combinations of energy technologies and their quantitative and qualitative attributes that can satisfy the world's total primary energy supply (TPES) while making progress towards a near zero carbon future.

  12. Evaluation of the moisture sources in two extreme landfalling atmospheric river events using an Eulerian WRF tracers tool

    Science.gov (United States)

    Eiras-Barca, Jorge; Dominguez, Francina; Hu, Huancui; Garaboa-Paz, Daniel; Miguez-Macho, Gonzalo

    2017-12-01

    A new 3-D tracer tool is coupled to the WRF model to analyze the origin of the moisture in two extreme atmospheric river (AR) events: the so-called Great Coastal Gale of 2007 in the Pacific Ocean and the Great Storm of 1987 in the North Atlantic. Results show that between 80 and 90% of the moisture advected by the ARs, and a high percentage of the total precipitation produced by the systems, have a tropical origin. The tropical contribution to precipitation is in general above 50% and largely exceeds this value in the most affected areas. Local convergence transport is responsible for the remaining moisture and precipitation. The ratio of tropical moisture to total moisture is maximized as the cold front arrives on land. Vertical cross sections of the moisture content suggest that the maximum in tropical humidity does not necessarily coincide with the low-level jet (LLJ) of the extratropical cyclone. Instead, the amount of tropical humidity is maximized in the lowest atmospheric level in southern latitudes and can be located above, below or ahead of the LLJ in northern latitudes in both analyzed cases.

  13. Evaluation of the moisture sources in two extreme landfalling atmospheric river events using an Eulerian WRF tracers tool

    Directory of Open Access Journals (Sweden)

    J. Eiras-Barca

    2017-12-01

    Full Text Available A new 3-D tracer tool is coupled to the WRF model to analyze the origin of the moisture in two extreme atmospheric river (AR) events: the so-called Great Coastal Gale of 2007 in the Pacific Ocean and the Great Storm of 1987 in the North Atlantic. Results show that between 80 and 90% of the moisture advected by the ARs, and a high percentage of the total precipitation produced by the systems, have a tropical origin. The tropical contribution to precipitation is in general above 50% and largely exceeds this value in the most affected areas. Local convergence transport is responsible for the remaining moisture and precipitation. The ratio of tropical moisture to total moisture is maximized as the cold front arrives on land. Vertical cross sections of the moisture content suggest that the maximum in tropical humidity does not necessarily coincide with the low-level jet (LLJ) of the extratropical cyclone. Instead, the amount of tropical humidity is maximized in the lowest atmospheric level in southern latitudes and can be located above, below or ahead of the LLJ in northern latitudes in both analyzed cases.

  14. Evaluation of three methods for retrospective correction of vignetting on medical microscopy images utilizing two open source software tools.

    Science.gov (United States)

    Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina

    2011-12-01

    Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used, and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times, and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting; if it is not applicable, then morphological filtering may be suggested as the retrospective alternative. © 2011 The Authors Journal of Microscopy © 2011 Royal Microscopical Society.
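
    Of the retrospective approaches, morphological filtering is the easiest to sketch: estimate the smooth illumination field with a large grayscale opening and divide it out. A minimal Python/scikit-image illustration follows; the file name and structuring-element radius are assumptions, and the study itself used Fiji and GIMP rather than this code.

    ```python
    # Sketch of retrospective vignetting correction by morphological background
    # estimation: a large opening approximates the illumination field.
    import numpy as np
    from skimage import io, morphology, img_as_float

    img = img_as_float(io.imread("slide.tif", as_gray=True))     # hypothetical file
    background = morphology.opening(img, morphology.disk(51))    # radius is a guess
    flat = img / np.clip(background, 1e-6, None)                 # divide out shading
    flat = flat / flat.max()                                     # renormalize to [0, 1]
    ```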

  15. Open-source web-enabled data management, analyses, and visualization of very large data in geosciences using Jupyter, Apache Spark, and community tools

    Science.gov (United States)

    Chaudhary, A.

    2017-12-01

    Current simulation models and sensors are producing high-resolution, high-velocity data in the geosciences domain. Knowledge discovery from these complex and large datasets requires tools that are capable of handling very large data and providing interactive data analytics features to researchers. To this end, Kitware and its collaborators are producing the open-source tools GeoNotebook, GeoJS, Gaia, and Minerva for the geosciences, which use hardware-accelerated graphics and advances in parallel and distributed processing (Celery and Apache Spark) and can be loosely coupled to solve real-world use cases. GeoNotebook (https://github.com/OpenGeoscience/geonotebook) is co-developed by Kitware and NASA-Ames and is an extension to the Jupyter Notebook. It provides interactive visualization and Python-based analysis of geospatial data and, depending on the backend (KTile or GeoPySpark), can handle data sizes from hundreds of gigabytes to terabytes. GeoNotebook uses GeoJS (https://github.com/OpenGeoscience/geojs) to render very large geospatial data on the map using the WebGL and Canvas2D APIs. GeoJS is more than just a GIS library, as users can create scientific plots such as vector and contour plots and can embed InfoVis plots using D3.js. GeoJS aims for high-performance visualization and interactive data exploration of scientific and geospatial location-aware datasets and supports features such as Point, Line, and Polygon, and advanced features such as Pixelmap, Contour, Heatmap, and Choropleth. Another of our open-source tools, Minerva (https://github.com/kitware/minerva), is a geospatial application that is built on top of the open-source web-based data management system Girder (https://github.com/girder/girder), which provides the ability to access data from HDFS or Amazon S3 buckets and provides capabilities to perform visualization and analyses on geosciences data in a web environment using GDAL and GeoPandas wrapped in a unified API provided by Gaia (https

  16. DensToolKit: A comprehensive open-source package for analyzing the electron density and its derivative scalar and vector fields

    Science.gov (United States)

    Solano-Altamirano, J. M.; Hernández-Pérez, Julio M.

    2015-11-01

    DensToolKit is a suite of cross-platform, optionally parallelized, programs for analyzing the molecular electron density (ρ) and several fields derived from it. Scalar and vector fields, such as the gradient of the electron density (∇ρ), electron localization function (ELF) and its gradient, localized orbital locator (LOL), region of slow electrons (RoSE), reduced density gradient, localized electrons detector (LED), information entropy, molecular electrostatic potential, and kinetic energy densities K and G, among others, can be evaluated on zero-, one-, two-, and three-dimensional grids. The suite includes a program for searching critical points and bond paths of the electron density, under the framework of the Quantum Theory of Atoms in Molecules. DensToolKit also evaluates the momentum-space electron density on spatial grids, and the reduced density matrix of order one along lines joining two arbitrary atoms of a molecule. The source code is distributed under the GNU-GPLv3 license, and we release the code with the intent of establishing an open-source collaborative project. The style of DensToolKit's code follows some of the guidelines of object-oriented programming. This allows us to supply the user with a simple means of implementing new scalar or vector fields, provided they are derived from any of the fields already implemented in the code. In this paper, we present some of the most salient features of the programs contained in the suite, some examples of how to run them, and the mathematical definitions of the implemented fields along with hints of how we optimized their evaluation. We benchmarked our suite against both a freely available program and a commercial package. Speed-ups of ~2×, and up to 12×, were obtained using a non-parallel compilation of DensToolKit for the evaluation of fields. DensToolKit takes similar times for finding critical points, compared to a commercial package. Finally, we present some perspectives for the future development and
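
    As a pointer to what these derived scalars look like, the reduced density gradient (one of the fields listed above) has the conventional literature definition below, quoted for orientation rather than taken from the DensToolKit documentation:

    ```latex
    s(\mathbf{r}) = \frac{\lvert \nabla\rho(\mathbf{r}) \rvert}{2\,(3\pi^{2})^{1/3}\,\rho(\mathbf{r})^{4/3}}
    ```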

  17. TH-C-12A-12: Veritas: An Open Source Tool to Facilitate User Interaction with TrueBeam Developer Mode

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, P [Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States); Varian Medical Systems, Palo Alto, CA (United States)]; Lewis, J [Brigham and Women's Hospital, Harvard Medical School, Boston, MA (United States)]; Etmektzoglou, T; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States)]

    2014-06-15

    Purpose: To address the challenges of creating delivery trajectories and imaging sequences with TrueBeam Developer Mode, a new open-source graphical XML builder, Veritas, has been developed, tested and made freely available. Veritas eliminates most of the need to understand the underlying schema and write XML scripts, by providing a graphical menu for each control point specifying the state of 30 mechanical/dose axes. All capabilities of Developer Mode are accessible in Veritas. Methods: Veritas was designed using QT Designer, a 'what-you-see-is-what-you-get' (WYSIWYG) tool for building graphical user interfaces (GUIs). Different components of the GUI are integrated using QT's signals and slots mechanism. Functionalities are added using PySide, an open source, cross platform Python binding for the QT framework. The XML code generated is immediately visible, making it an interactive learning tool. A user starts from an anonymized DICOM file or XML example and introduces delivery modifications, or begins their experiment from scratch, then uses the GUI to modify control points as desired. The software automatically generates XML plans following the appropriate schema. Results: Veritas was tested by generating and delivering two XML plans at Brigham and Women's Hospital. The first example was created to irradiate the letter 'B' with a narrow MV beam using dynamic couch movements. The second was created to acquire 4D CBCT projections for four minutes. The delivery of the letter 'B' was observed using a 2D array of ionization chambers. Both deliveries were generated quickly in Veritas by non-expert Developer Mode users. Conclusion: We introduced a new open source tool, Veritas, for generating XML plans (delivery trajectories and imaging sequences). Veritas makes Developer Mode more accessible by reducing the learning curve for quick translation of research ideas into XML plans. Veritas is an open source initiative, creating the possibility for future

  18. TH-C-12A-12: Veritas: An Open Source Tool to Facilitate User Interaction with TrueBeam Developer Mode

    International Nuclear Information System (INIS)

    Mishra, P; Lewis, J; Etmektzoglou, T; Svatos, M

    2014-01-01

    Purpose: To address the challenges of creating delivery trajectories and imaging sequences with TrueBeam Developer Mode, a new open-source graphical XML builder, Veritas, has been developed, tested and made freely available. Veritas eliminates most of the need to understand the underlying schema and write XML scripts, by providing a graphical menu for each control point specifying the state of 30 mechanical/dose axes. All capabilities of Developer Mode are accessible in Veritas. Methods: Veritas was designed using QT Designer, a 'what-you-see-is-what-you-get' (WYSIWYG) tool for building graphical user interfaces (GUIs). Different components of the GUI are integrated using QT's signals and slots mechanism. Functionalities are added using PySide, an open source, cross platform Python binding for the QT framework. The XML code generated is immediately visible, making it an interactive learning tool. A user starts from an anonymized DICOM file or XML example and introduces delivery modifications, or begins their experiment from scratch, then uses the GUI to modify control points as desired. The software automatically generates XML plans following the appropriate schema. Results: Veritas was tested by generating and delivering two XML plans at Brigham and Women's Hospital. The first example was created to irradiate the letter 'B' with a narrow MV beam using dynamic couch movements. The second was created to acquire 4D CBCT projections for four minutes. The delivery of the letter 'B' was observed using a 2D array of ionization chambers. Both deliveries were generated quickly in Veritas by non-expert Developer Mode users. Conclusion: We introduced a new open source tool, Veritas, for generating XML plans (delivery trajectories and imaging sequences). Veritas makes Developer Mode more accessible by reducing the learning curve for quick translation of research ideas into XML plans. Veritas is an open source initiative, creating the possibility for future
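
    Both records describe the same workflow: build a sequence of control points and serialize it to schema-conformant XML for Developer Mode. The Python sketch below shows the general pattern only; the element and attribute names are invented placeholders, not the actual TrueBeam schema, which Veritas exists precisely to encapsulate.

    ```python
    # Illustrative sketch of programmatic XML plan generation in the spirit of
    # Veritas. Element and attribute names are hypothetical, NOT the real schema.
    import xml.etree.ElementTree as ET

    plan = ET.Element("Plan")                       # hypothetical root element
    control_points = [(0.0, 0.0, 0.0), (10.0, 5.0, 50.0)]  # (gantry, couch, MU)
    for i, (gantry, couch, mu) in enumerate(control_points):
        cp = ET.SubElement(plan, "ControlPoint", index=str(i))
        ET.SubElement(cp, "GantryRtn").text = str(gantry)
        ET.SubElement(cp, "CouchLat").text = str(couch)
        ET.SubElement(cp, "Mu").text = str(mu)

    ET.ElementTree(plan).write("plan.xml", xml_declaration=True, encoding="utf-8")
    ```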

  19. Deep Controlled Source Electro-Magnetic Sensing: A Cost Effective, Long-Term Tool for Sequestration Monitoring

    Energy Technology Data Exchange (ETDEWEB)

    LaBrecque, Douglas [Multi-Phase Technologies, LLC, Sparks, NV (United States); Brigham, Russell D. [Multi-Phase Technologies, LLC, Sparks, NV (United States); Schmidt-Hattenburger, Conny [GFZ German Research Centre for Geoscience, Potsdam (Germany); Um, Evan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Petrov, Peter [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Daley, Thomas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2017-05-01

    The proposed system was designed to operate as a permanent, autonomous monitoring and data collection system that can provide much higher temporal data density than can be achieved economically with 3-Dimensional (3D) seismic surveys. It can operate over broad areas for long periods of time providing full 3D data sets on a monthly basis at a very low cost. By borrowing techniques commonly used in marine CSEM, structural information from background seismic surveys can be incorporated into the CSEM modeling to provide high resolution models of CO2 progression within reservoirs. The system uses borehole-based vertical-electric-dipole sources placed at reservoir depths in the formation. The electric and magnetic fields induced by this source are received on the surface using an array of stations. The project was conducted in three phases. Phase I demonstrated the feasibility of the system to collect static/reference data at the Ketzin CO2 storage pilot site in Germany. In Phase I, numerical modeling was used to determine the optimal configurations and requirements for sensor sensitivity and data accuracy. Based on the model results, existing hardware and software were modified. The CSEM system was then field tested at the Ketzin site. The data were imaged and the results were compared with independent studies of the reservoir and overburden geo-electrical characteristics. Phase II demonstrated the ability to provide sensitive, cost-effective measurement of changes in reservoir properties and changes in the overlying formations using a second round of measurements at the Ketzin site. A prototype autonomous recording system was developed and tested as a subset of the measurement points. Phase III of the project quantified the advantages (and disadvantages) of the fully autonomous data collection subsystems by comparing them with repeated measurements made with mobile stations. The Phase III also provided an additional time point in measuring post

  20. MEŞÎHAT ARŞİV KAYITLARINDA HACI BEKTAŞ VELÎ VE BEKTAŞİLİK İLE İLGİLİ YAZILI KAYNAĞIN TESPİTİ [DETECTION OF WRITTEN SOURCES RELATED TO HAJI BEKTASH VELI AND BEKTASHI ORDER IN THE MEŞÎHAT ARCHIVE RECORDS]

    Directory of Open Access Journals (Sweden)

    Uğur Sümer

    2017-09-01

    order to get their views on the information and demands included in the document, consists of three chapters and a conclusion part. The translation of the document dated 1920 was conducted by Asst. Prof. Ayhan Işık, who detected the document in the meşîhat archive of the Istanbul Mufti's Office, on behalf of the Turkish Culture and Haji Bektash Veli Association (Türk Kültürü ve Hacı Bektaş Velî Vakfı). The original images and transcription of the document are included in the appendix of the study. A simplified Turkish text containing the following information was examined by our association and evaluated against the present written sources: Haji Bektash Veli's genealogy, his spiritual lineage from Hodja Ahmed Yesevi, the education he received from Lokman Perende, his marital status (single), his successors, who were not his children but companions, the number of dervishes he raised, and the customary practices of the Bektashi order.

  1. Assessing enigmatic boulder deposits in NE Aegean Sea: importance of historical sources as tool to support hydrodynamic equations

    Directory of Open Access Journals (Sweden)

    M. Vacchi

    2012-04-01

    Full Text Available Due to their importance in the assessment of coastal hazards, several studies have focused on geomorphological and sedimentological field evidence of catastrophic wave impacts related to historical tsunami events. Among them, many authors have used boulder fields as important indicators of past tsunamis, especially in the Mediterranean Sea. The aim of this study was to understand the mechanism of deposition of clusters of large boulders, consisting of beachrock slabs, which were found on the southern coasts of Lesvos Island (NE Aegean Sea). Methods to infer the origin of boulder deposits (tsunami vs. storm wave) are often based on hydrodynamic models, even though different environmental complexities are difficult to incorporate into numerical models. In this study, hydrodynamic equations did not provide an unequivocal indication of the mechanism responsible for boulder deposition in the study area. Further analyses, ranging from geomorphologic to seismotectonic data, indicated a tsunami as the most likely cause of displacement of the boulders but still did not allow the extreme storm origin to be totally excluded. Additional historical investigations (based on tsunami catalogues, historical photos and interviews with elderly inhabitants) indicated that the boulders are likely to have been deposited by the tsunami triggered by the 6.7 Ms Chios-Karaburun earthquake of 1949 or, alternatively, by minor effects of the destructive tsunami produced by the 1956 Amorgos Island earthquake. Results of this study point out that, at the Mediterranean scale, flanking numerical models with the wealth of available historical data becomes a crucial tool in terms of prevention policies related to catastrophic coastal events.

  2. 29 CFR 100.610 - Written demand for payment.

    Science.gov (United States)

    2010-07-01

    ... Procedures § 100.610 Written demand for payment. (a) The NLRB will promptly make written demand upon the debtor for payment of money or the return of specific property. The written demand for payment will be... late charges will be 60 days from the date that the demand letter is mailed or hand-delivered. (b) The...

  3. Oral and Literate Strategies in Spoken and Written Narratives.

    Science.gov (United States)

    Tannen, Deborah

    1982-01-01

    Discusses comparative analysis of spoken and written versions of a narrative to demonstrate that features which have been identified as characterizing oral discourse are also found in written discourse and that the written short story combines syntactic complexity expected in writing with features which create involvement expected in speaking.…

  4. Import/export policy as a tool of optimal utilization of power produced by renewable energy sources

    International Nuclear Information System (INIS)

    Meibom, P.; Svendsen, T.; Soerensen, B.

    1997-10-01

    The official Danish energy plan ENERGY 21 calls for a very high penetration of wind energy in the electricity sector. This will bring issues of fitting a variable energy source into a stable supply system into focus. The present study investigates the role that collaboration with the hydro-based Scandinavian countries can offer, and particularly looks at the conditions of the Nordic power pool, asking whether it will be possible to trade wind power profitably in such a pool system. Based upon hourly simulation of the Scandinavian electricity system, with connections to the European continent, we track the fate of wind power in satisfying Danish demand, selling surpluses to and buying deficits from the pool, assuming international transmission lines of the capacity existing today or larger, and repeating the simulation for combinations of good and bad wind and hydro years, as defined by historical sets of wind speeds and hydro resources depending on precipitation and ice melting. Decisions concerning pool bidding and regulation of the remaining power system (mainly gas-fired combined heat and power plants) are made on the basis of calculated predictions based on the historical data sequences up to the time of decision, and then confronted with the subsequent data values in the set. (EG)

  5. Combining two open source tools for neural computation (BioPatRec and Netlab) improves movement classification for prosthetic control.

    Science.gov (United States)

    Prahm, Cosima; Eckstein, Korbinian; Ortiz-Catalan, Max; Dorffner, Georg; Kaniusas, Eugenijus; Aszmann, Oskar C

    2016-08-31

    Controlling a myoelectric prosthesis for the upper limbs becomes increasingly challenging for the user as more electrodes and joints become available. Motion classification based on pattern recognition with a multi-electrode array allows multiple joints to be controlled simultaneously. Previous pattern recognition studies are difficult to compare because individual research groups use their own data sets. To resolve this shortcoming and to facilitate comparisons, open access data sets were analysed using components of the BioPatRec and Netlab pattern recognition models. Performances of the artificial neural networks, linear models, and training program components were compared. Evaluation took place within the BioPatRec environment, a Matlab-based open source platform that provides feature extraction, processing and motion classification algorithms for prosthetic control. The algorithms were applied to myoelectric signals for individual and simultaneous classification of movements, with the aim of finding the best performing algorithm and network model. Evaluation criteria included classification accuracy and training time. Results in both the linear and the artificial neural network models demonstrated that Netlab's implementation using the scaled conjugate gradient training algorithm reached significantly higher accuracies than BioPatRec's. It is concluded that the best movement classification performance would be achieved by integrating Netlab training algorithms into the BioPatRec environment, so that future prosthesis training can be shortened and control made more reliable. Netlab was therefore included in the newest release of BioPatRec (v4.0).
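
    The classification step being compared reduces to training a small neural network on myoelectric feature vectors. The sketch below uses scikit-learn's multilayer perceptron on random stand-in features, since the study's actual pipeline is Matlab-based (BioPatRec with Netlab's scaled conjugate gradient training); the array sizes and movement count are illustrative assumptions.

    ```python
    # Conceptual sketch of movement classification from EMG feature vectors.
    # Random data stands in for real myoelectric features.
    import numpy as np
    from sklearn.neural_network import MLPClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    X = rng.normal(size=(600, 16))                 # 16 features per signal window
    y = rng.integers(0, 6, size=600)               # 6 candidate movements

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
    clf.fit(X_tr, y_tr)
    print("accuracy:", clf.score(X_te, y_te))      # ~chance on random data
    ```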

  6. Animal movement network analysis as a tool to map farms serving as contamination source in cattle cysticercosis

    Directory of Open Access Journals (Sweden)

    Samuel C. Aragão

    Full Text Available ABSTRACT: Bovine cysticercosis is a problem distributed worldwide that results in economic losses mainly due to the condemnation of infected carcasses. One of the difficulties in applying control measures is the identification of the source of infection, especially because cattle are typically acquired from multiple farms. Here, we tested the utility of an animal movement network, constructed with data from a farm that acquires cattle from several different farms, to map the major contributors to cysticercosis propagation. Additionally, based on the results of the network analysis, we deployed a sanitary management and drug treatment scheme to decrease the occurrence of cysticercosis on the farm. Six farms that had commercial trades were identified by the animal movement network and characterized as the main contributors to the occurrence of cysticercosis on the studied farm. The identification of farms with a putative risk of Taenia saginata infection using the animal movement network, along with proper sanitary management and drug treatment, resulted in a gradual decrease in cysticercosis prevalence, from 25% in 2010 to 3.7% in 2011 and 1.8% in 2012. These results suggest that the animal movement network can contribute towards controlling bovine cysticercosis, thus minimizing economic losses and preventing human taeniasis.
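
    The network analysis itself reduces to building a directed graph of shipments and ranking supplier farms by the infections associated with their edges. A minimal networkx sketch with invented shipment data:

    ```python
    # Rank supplier farms by the number of infected animals they shipped;
    # edge data are invented for illustration.
    import networkx as nx

    g = nx.DiGraph()
    shipments = [                                   # (source farm, destination, infected)
        ("farm_A", "fattening_farm", 12),
        ("farm_B", "fattening_farm", 1),
        ("farm_C", "fattening_farm", 7),
    ]
    for src, dst, infected in shipments:
        g.add_edge(src, dst, infected=infected)

    ranked = sorted(g.in_edges("fattening_farm", data=True),
                    key=lambda e: e[2]["infected"], reverse=True)
    for src, _, data in ranked:
        print(src, data["infected"])
    ```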

  7. Development of a flattening filter free multiple source model for use as an independent, Monte Carlo, dose calculation, quality assurance tool for clinical trials.

    Science.gov (United States)

    Faught, Austin M; Davidson, Scott E; Popple, Richard; Kry, Stephen F; Etzel, Carol; Ibbott, Geoffrey S; Followill, David S

    2017-09-01

    The Imaging and Radiation Oncology Core-Houston (IROC-H) Quality Assurance Center (formerly the Radiological Physics Center) has reported varying levels of compliance from their anthropomorphic phantom auditing program. IROC-H studies have suggested that one source of disagreement between institution-submitted calculated doses and measurement is the accuracy of the institution's treatment planning system dose calculations and the heterogeneity corrections used. In order to audit this step of the radiation therapy treatment process, an independent dose calculation tool is needed. Monte Carlo multiple source models for Varian flattening filter free (FFF) 6 MV and FFF 10 MV therapeutic x-ray beams were commissioned based on central axis depth dose data from a 10 × 10 cm² field size and dose profiles for a 40 × 40 cm² field size. The models were validated against open-field measurements in a water tank for field sizes ranging from 3 × 3 cm² to 40 × 40 cm². The models were then benchmarked against IROC-H's anthropomorphic head and neck phantom and lung phantom measurements. Validation results, assessed with a ±2%/2 mm gamma criterion, showed average agreement of 99.9% and 99.0% for central axis depth dose data for the FFF 6 MV and FFF 10 MV models, respectively. Dose profile agreement using the same evaluation technique averaged 97.8% and 97.9% for the respective models. Phantom benchmarking comparisons were evaluated with a ±3%/2 mm gamma criterion, and agreement averaged 90.1% and 90.8% for the respective models. Multiple source models for Varian FFF 6 MV and FFF 10 MV beams have been developed, validated, and benchmarked for inclusion in an independent dose calculation quality assurance tool for use in clinical trial audits. © 2017 American Association of Physicists in Medicine.
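
    The gamma criterion used in these comparisons combines a dose tolerance and a distance-to-agreement tolerance into a single pass/fail index. A simplified 1-D version, written from the standard definition rather than from IROC-H's software, illustrates the computation:

    ```python
    # Simplified 1-D gamma index (global dose normalization); real QA tools use
    # optimized 2-D/3-D searches, this is only the textbook definition.
    import numpy as np

    def gamma_1d(x, ref, ev, dose_tol=0.03, dist_tol_mm=2.0):
        """Return the gamma value at each reference point."""
        gammas = np.empty_like(ref)
        for i, (xi, di) in enumerate(zip(x, ref)):
            dd = (ev - di) / (dose_tol * ref.max())   # dose-difference term
            dx = (x - xi) / dist_tol_mm               # distance-to-agreement term
            gammas[i] = np.sqrt(dx ** 2 + dd ** 2).min()
        return gammas

    x = np.linspace(0, 100, 101)                      # depth in mm
    ref = np.exp(-x / 80.0)                           # toy reference depth dose
    ev = ref * 1.01                                   # evaluated curve, 1% hotter
    print((gamma_1d(x, ref, ev) <= 1.0).mean())       # pass rate
    ```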

  8. Verification of Simulation Tools

    International Nuclear Information System (INIS)

    Richard, Thierry

    2015-01-01

    Before qualifying a simulation tool, the requirements shall first be clearly identified, i.e.: - What type of study needs to be carried out? - What phenomena need to be modeled? This phase involves writing a precise technical specification. Once the requirements are defined, the most adapted product shall be selected from the various software options available on the market. Before using a particular version of a simulation tool to support the demonstration of nuclear safety studies, the following requirements shall be met. - An auditable quality assurance process complying with international development standards shall be developed and maintained, - A process of verification and validation (V and V) shall be implemented. This approach requires: writing a report and/or executive summary of the V and V activities, defining a validated domain (domain in which the difference between the results of the tool and those of another qualified reference is considered satisfactory for its intended use). - Sufficient documentation shall be available, - A detailed and formal description of the product (software version number, user configuration, other settings and parameters) in the targeted computing environment shall be available. - Source codes corresponding to the software shall be archived appropriately. When these requirements are fulfilled, the version of the simulation tool shall be considered qualified for a defined domain of validity, in a given computing environment. The functional verification shall ensure that: - the computer architecture of the tool does not include errors, - the numerical solver correctly represents the physical mathematical model, - equations are solved correctly. The functional verification can be demonstrated through certification or a report of Quality Assurance. The functional validation shall allow the user to ensure that the equations correctly represent the physical phenomena in the perimeter of intended use. The functional validation can

  9. The security analyzer: A security analyzer program written in Prolog

    International Nuclear Information System (INIS)

    Zimmerman, B.D.; Densley, P.J.

    1986-09-01

    The Security Analyzer is a software tool capable of analyzing the effectiveness of a facility's security system. It is written in the Prolog logic programming language, using entity-relationship data modeling techniques. The program performs the following functions: (1) provides descriptive, locational and operational status information about intrusion detectors and assessment devices (i.e., ''sensors'' and ''cameras'') upon request; (2) provides for storage and retrieval of maintenance history information for various components of the security system (including intrusion detectors), and allows for changing that information as desired; (3) provides a ''search'' mode, wherein all paths are found from any specified physical location to another specified location which satisfy user-chosen ''intruder detection'' probability and elapsed time criteria (i.e., the program finds the ''weakest paths'' from a security point of view). The first two of these functions can be provided fairly easily with a conventional database program; the third function could be provided using Fortran or some similar language, though with substantial difficulty. In the Security Analyzer program, all these functions are provided in a simple and straightforward manner. This simplicity is possible because the program is written in the symbolic (as opposed to numeric) processing language Prolog, and because the knowledge base is structured according to entity-relationship modeling principles. Also, the use of Prolog and the entity-relationship modeling technique allows the capabilities of the Security Analyzer program, both for knowledge base interrogation and for searching-type operations, to be easily expanded in ways that would be very difficult for a numeric and more algorithmically deterministic language such as Fortran to duplicate. 4 refs
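
    The third function is at heart a weighted-graph search. Under the simplifying assumption of independent detectors, maximizing the probability of traversing undetected is equivalent to minimizing the sum of -log(1 - p) over the traversed edges, which an ordinary shortest-path algorithm handles. A Python restatement follows (the original is Prolog, and the probabilities here are invented):

    ```python
    # "Weakest path" search: minimize sum of -log(1 - p_detect) over edges,
    # equivalent to maximizing the probability of evading detection.
    import math
    import networkx as nx

    g = nx.Graph()
    edges = [                                       # (from, to, detection probability)
        ("gate", "yard", 0.6), ("yard", "door", 0.3),
        ("gate", "fence", 0.2), ("fence", "door", 0.4),
        ("door", "vault", 0.8),
    ]
    for u, v, p in edges:
        g.add_edge(u, v, weight=-math.log(1.0 - p))

    path = nx.shortest_path(g, "gate", "vault", weight="weight")
    evade = math.exp(-nx.shortest_path_length(g, "gate", "vault", weight="weight"))
    print(path, f"P(undetected) = {evade:.3f}")
    ```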

  10. Chapter 27: Deja vu All Over Again: Using NVO Tools to Re-Investigate a Complete Sample of Texas Radio Survey Sources

    Science.gov (United States)

    Lucas, Ray A.; Rohde, David; Tamura, Takayuki; van Dyne, Jeffrey

    At the first NVO Summer School in September 2004, a complete sample of Texas Radio Survey sources, first derived in 1989 and subsequently observed with the VLA in A-array snapshot mode in 1990, was revisited. The original investigators had never had the occasion to reduce the A-array 5-minute snapshot data, nor to do any other significant follow-up, though the sample still seemed a possibly useful but relatively small study of radio galaxies, AGN, quasars, extragalactic sources, galaxy clusters, etc. At the time of the original sample definition in late 1989, the best optical material available for the region was the SRC-J plate from the UK Schmidt Telescope in Australia. In much more recent times, the Sloan Digital Sky Survey has included the region in its DR2 data release, so good multicolor optical imaging in a number of standard bandpasses has finally become available. These data, along with other material in the radio, infrared, and (where available) X-ray, were used to get a better preliminary idea of the nature of the objects in the 1989 sample. We also investigated one of the original questions: whether these radio sources with steeper (or at least non-flat) radio spectra were associated with galaxy clusters, and in some cases higher-redshift galaxy clusters and AGN. A rudimentary web service was created which allowed the user to perform simple cone searches and SIAP image extractions of specified field sizes for multiwavelength data across the electromagnetic spectrum, and a prototype web page was set up to display the resulting images in wavelength order across the page for sources in the sample. Finally, as an additional investigation, using radio and X-ray IDs as a proxy for AGN which might be associated with large, central cluster galaxies, positional matches of radio and X-ray sources from two much larger catalogs were done using the tool TOPCAT in order to search for the degree of correlation between ID positions, radio luminosity, and cluster

  11. RainyDay: An Online, Open-Source Tool for Physically-based Rainfall and Flood Frequency Analysis

    Science.gov (United States)

    Wright, D.; Yu, G.; Holman, K. D.

    2017-12-01

    Flood frequency analysis in ungaged or changing watersheds typically requires rainfall intensity-duration-frequency (IDF) curves combined with hydrologic models. IDF curves only depict point-scale rainfall depth, while true rainstorms exhibit complex spatial and temporal structures. Floods result from these rainfall structures interacting with watershed features such as land cover, soils, and variable antecedent conditions, as well as river channel processes. Thus, IDF curves are traditionally combined with a variety of "design storm" assumptions, such as area reduction factors and idealized rainfall space-time distributions, to translate rainfall depths into inputs suitable for flood hydrologic modeling. The impacts of such assumptions are relatively poorly understood. Meanwhile, modern precipitation estimates from gridded weather radar, grid-interpolated rain gages, satellites, and numerical weather models provide more realistic depictions of rainfall space-time structure. Usage of such datasets for rainfall and flood frequency analysis, however, is hindered by relatively short record lengths. We present RainyDay, an open-source stochastic storm transposition (SST) framework for generating large numbers of realistic rainfall "scenarios." SST "lengthens" the rainfall record by temporal resampling and geospatial transposition of observed storms to extract space-time information from regional gridded rainfall data. Relatively short (10-15 year) records of bias-corrected radar rainfall data are sufficient to estimate rainfall and flood events with much longer recurrence intervals, including 100-year and 500-year events. We describe the SST methodology as implemented in RainyDay and compare rainfall IDF results from RainyDay to conventional estimates from NOAA Atlas 14. Then, we demonstrate some of the flood frequency analysis properties that are possible when RainyDay is integrated with a distributed hydrologic model, including robust estimation of flood
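
    The resampling logic behind SST can be conveyed with a stylized loop: draw a number of storms for each synthetic year, transpose each to a random position so that only some storms actually hit the watershed, and record the annual maximum. The numbers below are invented for illustration and do not reflect RainyDay's actual algorithmic details:

    ```python
    # Stylized stochastic storm transposition: resample observed storms,
    # randomize their position, and collect synthetic annual maxima.
    import numpy as np

    rng = np.random.default_rng(1)
    storm_depths = np.array([40.0, 55.0, 72.0, 90.0, 120.0])   # mm, toy storm catalog

    def annual_max(n_storms=10):
        hits = rng.random(n_storms) < 0.3                      # 30% spatial hit rate (guess)
        depths = rng.choice(storm_depths, size=n_storms) * hits
        return depths.max()

    sims = np.array([annual_max() for _ in range(10_000)])
    print("100-yr depth estimate:", np.quantile(sims, 1 - 1 / 100))
    ```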

  12. An Analysis of Written Feedback on a PhD Thesis

    Science.gov (United States)

    Kumar, Vijay; Stracke, Elke

    2007-01-01

    This paper offers an interim analysis of written feedback on a first draft of a PhD thesis. It first looks at two sources of data: in-text feedback and overall feedback. Looking at how language is used in its situational context, we then coded the feedback and developed a model for analysis based on three fundamental functions of speech:…

  13. A Quantitative Analysis of Uncertainty in the Grading of Written Exams in Mathematics and Physics

    Science.gov (United States)

    Hammer, Hugo Lewi; Habib, Laurence

    2016-01-01

    The most common way to grade students in courses at university and university college level is to use final written exams. The aim of final exams is generally to provide a reliable and a valid measurement of the extent to which a student has achieved the learning outcomes for the course. A source of uncertainty in grading students based on an exam…

  14. Radiotherapy: an interactive learning tool

    International Nuclear Information System (INIS)

    Frenzel, T.; Kruell, A.; Schmidt, R.; Dobrucki, W.; Malys, B.

    1998-01-01

    The program is primarily intended for radiological medical technicians, student nurses, students of medicine and physics, and doctors. It is designed as a tool for vocational and further training and gives comprehensive insight into the daily routines of a radiotherapy unit. The chapters deal with: fundamental biological aspects - fundamental physical aspects - radiation sources and irradiation systems - preparatory examinations - therapies and concepts - irradiation planning - irradiation performance - termination of irradiation treatment. For every page displayed, spoken texts, written on-screen keywords, illustrations, animated sequences and a large number of videos have been combined in a way that is easy to digest. The program's software also permits handling by learners less familiar with computer-based learning. (orig.)

  15. Validation of the translation of an instrument to measure reliability of written information on treatment choices: a study on attention deficit/hyperactivity disorder (ADHD).

    Science.gov (United States)

    Montoya, A; Llopis, N; Gilaberte, I

    2011-12-01

    DISCERN is an instrument designed to help patients assess the reliability of written information on treatment choices. Originally created in English, there is no validated Spanish version of this instrument. This study seeks to validate the Spanish translation of the DISCERN instrument used as a primary measure in a multicenter study aimed at assessing the reliability of web-based information on treatment choices for attention deficit/hyperactivity disorder (ADHD). We used a modified version of a method for validating translated instruments in which the original source-language version is formally compared with the back-translated source-language version. Each item was ranked in terms of comparability of language, similarity of interpretability, and degree of understandability. Responses used Likert scales ranging from 1 to 7, where 1 indicates the best interpretability, language and understandability, and 7 indicates the worst. Assessments were performed by 20 raters fluent in the source language. The Spanish translation of DISCERN, based on ratings of comparability, interpretability and degree of understandability (mean score (SD): 1.8 (1.1), 1.4 (0.9) and 1.6 (1.1), respectively), was considered extremely comparable. All items received a score of less than three, therefore no further revision of the translation was needed. The validation process showed that the quality of the DISCERN translation was high, validating the comparable language of the translated tool for assessing written information on treatment choices for ADHD.

  16. Cascaded processing in written compound word production

    Directory of Open Access Journals (Sweden)

Raymond Bertram

    2015-04-01

    Full Text Available In this study we investigated the intricate interplay between central linguistic processing and peripheral motor processes during typewriting. Participants had to typewrite two-constituent (noun-noun) Finnish compounds in response to picture presentation while their typing behavior was registered. As dependent measures we used writing onset time to assess what processes were completed before writing and inter-key intervals to assess what processes were going on during writing. It was found that writing onset time was determined by whole word frequency rather than constituent frequencies, indicating that compound words are retrieved as whole orthographic units before writing is initiated. In addition, we found that the length of the first syllable also affects writing onset time, indicating that the first syllable is fully prepared before writing commences. The inter-key interval results showed that linguistic planning is not fully ready before writing, but cascades into the motor execution phase. More specifically, inter-key intervals were largest at syllable and morpheme boundaries, supporting the view that additional linguistic planning takes place at these boundaries. Bigram and trigram frequency also affected inter-key intervals with shorter intervals corresponding to higher frequencies. This can be explained by stronger memory traces for frequently co-occurring letter sequences in the motor memory for typewriting. These frequency effects were even larger in the second than in the first constituent, indicating that low-level motor memory starts to become more important during the course of writing compound words. We discuss our results in the light of current models of morphological processing and written word production.

  17. Cascaded processing in written compound word production.

    Science.gov (United States)

    Bertram, Raymond; Tønnessen, Finn Egil; Strömqvist, Sven; Hyönä, Jukka; Niemi, Pekka

    2015-01-01

    In this study we investigated the intricate interplay between central linguistic processing and peripheral motor processes during typewriting. Participants had to typewrite two-constituent (noun-noun) Finnish compounds in response to picture presentation while their typing behavior was registered. As dependent measures we used writing onset time to assess what processes were completed before writing and inter-key intervals to assess what processes were going on during writing. It was found that writing onset time was determined by whole word frequency rather than constituent frequencies, indicating that compound words are retrieved as whole orthographic units before writing is initiated. In addition, we found that the length of the first syllable also affects writing onset time, indicating that the first syllable is fully prepared before writing commences. The inter-key interval results showed that linguistic planning is not fully ready before writing, but cascades into the motor execution phase. More specifically, inter-key intervals were largest at syllable and morpheme boundaries, supporting the view that additional linguistic planning takes place at these boundaries. Bigram and trigram frequency also affected inter-key intervals with shorter intervals corresponding to higher frequencies. This can be explained by stronger memory traces for frequently co-occurring letter sequences in the motor memory for typewriting. These frequency effects were even larger in the second than in the first constituent, indicating that low-level motor memory starts to become more important during the course of writing compound words. We discuss our results in the light of current models of morphological processing and written word production.

  18. PRAGMATIC AND RHETORICAL STRATEGIES IN THE ENGLISH-WRITTEN JOKES

    Directory of Open Access Journals (Sweden)

    Dyah Rochmawati

    2017-05-01

    Full Text Available Understanding verbal jokes in English is problematic for English as a Foreign Language (EFL) readers, since understanding the jokes requires understanding their linguistic, cultural and social elements. Since a joke constitutes a complex and paradoxical phenomenon, it needs multiple analytical approaches—such as pragmatic and rhetorical analyses—in order to investigate the multiple layers of meanings it carries. Recently there has been a shift in humor studies, emphasizing linguistic humor and involving the field of rhetoric. These studies, however, have mostly addressed the connection between rhetoric and spoken jokes in persuasion. The present study therefore applied Austin's Speech Act Theory (1975) and Grice's Cooperative Principles (1957), and Berger's rhetorical techniques (1993) to crack the funniness of the written jokes. Specifically, the study aims at describing: how the (1) rhetorical and (2) pragmatic strategies are used in the jokes, and (3) how the pragmatic and rhetorical strategies complement each other to create humor. The study employed a qualitative research method. Some jokes were purposively selected from the Reader's Digest and two online sources: http://jokes.cc.com/ and http://www.ajokeaday.com/. Document studies were the means of data collection. The collected data were then analyzed using a qualitative content analysis. The results showed that there was a relationship between the two pragmatic theories, i.e., Speech Act Theory and Cooperative Principles, and Berger's rhetorical techniques. The results offered an alternative reading and richer understanding of how written jokes employed pragmatic and rhetorical strategies to advance their rhetorical objectives and humor functions.

  19. NOTE: Development of modified voxel phantoms for the numerical dosimetric reconstruction of radiological accidents involving external sources: implementation in SESAME tool

    Science.gov (United States)

    Courageot, Estelle; Sayah, Rima; Huet, Christelle

    2010-05-01

    Estimating the dose distribution in a victim's body is a relevant indicator in assessing biological damage from exposure in the event of a radiological accident caused by an external source. When the dose distribution is evaluated with a numerical anthropomorphic model, the posture and morphology of the victim have to be reproduced as realistically as possible. Several years ago, IRSN developed a specific software application, called the simulation of external source accident with medical images (SESAME), for the dosimetric reconstruction of radiological accidents by numerical simulation. This tool combines voxel geometry and the MCNP(X) Monte Carlo computer code for radiation-material interaction. This note presents a new functionality in this software that enables the modelling of a victim's posture and morphology based on non-uniform rational B-spline (NURBS) surfaces. The procedure for constructing the modified voxel phantoms is described, along with a numerical validation of this new functionality using a voxel phantom of the RANDO tissue-equivalent physical model.

  20. The Open Source Stochastic Building Simulation Tool SLBM and Its Capabilities to Capture Uncertainty of Policymaking in the U.S. Building Sector

    Energy Technology Data Exchange (ETDEWEB)

    Stadler, Michael; Marnay, Chris; Azevedo, Ines Lima; Komiyama, Ryoichi; Lai, Judy

    2009-05-14

    The increasing concern about climate change as well as the expected direct environmental economic impacts of global warming will put considerable constraints on the US building sector, which consumes roughly 48% of the total primary energy, making it the biggest single source of CO2 emissions. It is obvious that the battle against climate change can only be won by considering innovative building approaches and consumer behaviors and bringing new, effective low carbon technologies to the building/consumer market. However, the limited time given to mitigate climate change is unforgiving to misled research and/or policy. This is the reason why Lawrence Berkeley National Lab is working on an open source long range Stochastic Lite Building Module (SLBM) to estimate the impact of different policies and consumer behavior on the market penetration of low carbon building technologies. SLBM is designed to be a fast running, user-friendly model that analysts can readily run and modify in its entirety through a visual interface. The tool is fundamentally an engineering-economic model with technology adoption decisions based on cost and energy performance characteristics of competing technologies. It also incorporates consumer preferences and passive building systems as well as interactions between technologies (such as internal heat gains). Furthermore, everything is based on service demand, e.g., a certain temperature or luminous intensity, instead of energy intensities. The core objectives of this paper are to demonstrate the practical approach used, to start a discussion process between relevant stakeholders and to build collaborations.
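    The adoption logic described in this record can be pictured with a small sketch: competing technologies that meet the same service demand are compared on annualized cost. This is a minimal, hypothetical illustration of an engineering-economic decision rule only; the numbers and the capital-recovery formulation are assumptions for the sketch, not SLBM's actual data model or code.

    ```python
    # Hypothetical sketch of a cost-based technology adoption decision of the
    # kind an engineering-economic model performs. All figures are invented.

    def annualized_cost(capital, lifetime_yr, rate, annual_energy_kwh, energy_price):
        """Capital recovery factor times capital cost, plus annual energy cost."""
        crf = rate / (1 - (1 + rate) ** -lifetime_yr)  # standard CRF formula
        return capital * crf + annual_energy_kwh * energy_price

    # Two competing technologies meeting the same service demand (space heating).
    candidates = {
        "gas_furnace": annualized_cost(3000, 15, 0.05, 12000, 0.04),
        "heat_pump":   annualized_cost(7000, 15, 0.05, 4000, 0.12),
    }
    best = min(candidates, key=candidates.get)
    print(best, {k: round(v) for k, v in candidates.items()})
    ```

    A real model of this kind would layer consumer preferences and technology interactions on top of such a cost comparison, as the abstract notes.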

  1. Piloting a Structured Practice Audit to Assess ACGME Milestones in Written Handoff Communication in Internal Medicine.

    Science.gov (United States)

    Martin, Shannon K; Farnan, Jeanne M; McConville, John F; Arora, Vineet M

    2015-06-01

    Written communication skills are integral to patient care handoffs. Residency programs require feasible assessment tools that provide timely formative and summative feedback, ideally linked to the Accreditation Council for Graduate Medical Education Milestones. We describe the use of 1 such tool-UPDATED-to assess written handoff communication skills in internal medicine interns. During 2012-2013, the authors piloted a structured practice audit at 1 academic institution to audit written sign-outs completed by 45 interns, using the UPDATED tool, which scores 7 aspects of sign-out communication linked to milestones. Intern sign-outs were audited by trained faculty members throughout the year. Results were incorporated into intern performance reviews and Clinical Competency Committees. A total of 136 sign-outs were audited (averaging 3.1 audits per intern). In the first trimester, 14 interns (31%) had satisfactory audit results. Five interns (11%) had critical deficiencies and received immediate feedback, and the remaining 26 (58%) were assigned future audits due to missing audits or unsatisfactory scores. In the second trimester, 21 interns (68%) had satisfactory results, 1 had critical deficiencies, and 9 (29%) required future audits. Nine of the 10 remaining interns in the final trimester had satisfactory audits. Faculty time was estimated at 10 to 15 minutes per sign-out audited. The UPDATED audit is a milestone-based tool that can be used to assess written sign-out communication skills in internal medicine residency programs. Future work is planned to adapt the tool for use by senior supervisory residents to appraise sign-outs in real time.

  2. Web-based discovery, access and analysis tools for the provision of different data sources like remote sensing products and climate data

    Science.gov (United States)

    Eberle, J.; Hese, S.; Schmullius, C.

    2012-12-01

    To provide different Earth Observation products for the area of Siberia, the Siberian Earth System Science Cluster (SIB-ESS-C) was established as a spatial data infrastructure at the University of Jena (Germany), Department for Earth Observation. The infrastructure implements standards published by the Open Geospatial Consortium (OGC) and the International Organization for Standardization (ISO) for data discovery, data access and data analysis. The objective of SIB-ESS-C is to facilitate environmental research and Earth system science in Siberia. Several products from the Moderate Resolution Imaging Spectroradiometer sensor were integrated by serving ISO-compliant metadata and providing an OGC-compliant Web Map Service for data visualization and Web Coverage Services/Web Feature Service for data access. Furthermore, climate data from the World Meteorological Organization were downloaded, converted, and provided as an OGC Sensor Observation Service. Each climate data station is described with ISO-compliant metadata. All these datasets from multiple sources are provided within the SIB-ESS-C infrastructure (Figure 1), and an automatic workflow integrates updates of these datasets daily. The brokering approach within the SIB-ESS-C system is to collect data from different sources, convert the data into common data formats if necessary, and provide them with standardized Web services. Additional tools are made available within the SIB-ESS-C Geoportal for easy access to download and analysis functions (Figure 2). The data can be visualized, accessed and analysed with this Geoportal. Because the services are OGC-compliant, the data can also be accessed with other OGC-compliant clients. Figure 1: Technical concept of SIB-ESS-C providing different data sources. Figure 2: Screenshot of the web-based SIB-ESS-C system.
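    Because the services follow OGC standards, any generic client can consume them. As a sketch, the snippet below uses the OWSLib Python library to list layers and request a map from a WMS endpoint; the URL and layer name are placeholders, since the record does not give the actual SIB-ESS-C service addresses.

    ```python
    # Sketch of querying an OGC Web Map Service with OWSLib (pip install owslib).
    # Endpoint URL and layer name are hypothetical placeholders.
    from owslib.wms import WebMapService

    wms = WebMapService("https://example.org/geoserver/wms", version="1.1.1")
    print(list(wms.contents))  # layers advertised by the service

    img = wms.getmap(
        layers=["modis_ndvi"],           # hypothetical layer name
        styles=[""],                     # server default styling
        srs="EPSG:4326",
        bbox=(60.0, 50.0, 120.0, 75.0),  # rough lon/lat box over Siberia
        size=(800, 400),
        format="image/png",
    )
    with open("siberia_ndvi.png", "wb") as f:
        f.write(img.read())
    ```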

  3. The challenge of giving written thesis feedback to nursing students.

    Science.gov (United States)

    Tuvesson, Hanna; Borglin, Gunilla

    2014-11-01

    Providing effective written feedback on nursing student's assignments can be a challenging task for any assessor. Additionally, as the student groups tend to become larger, written feedback is likely to gain an overall more prominent position than verbal feedback. Lack of formal training or regular discussion in the teaching faculty about the skill set needed to provide written feedback could negatively affect the students' learning abilities. In this brief paper, we discuss written feedback practices, whilst using the Bachelor of Science in Nursing thesis as an example. Our aim is to highlight the importance of an informed understanding of the impact written feedback can have on students. Creating awareness about this can facilitate the development of more strategic and successful written feedback strategies. We end by offering examples of some relatively simple strategies for improving this practice. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Written Teacher Feedback: Aspects of Quality, Benefits and Challenges

    DEFF Research Database (Denmark)

    Holmeier, Monika; Grob, Regula; Nielsen, Jan Alexis

    2018-01-01

    was provided based on rubrics and templates for open comments. For this purpose, written teacher feedback itself, student artefacts and data from questionnaires were analysed. Furthermore, the benefits and challenges that teachers noticed in using written feedback will be examined. Finally, it will be discussed which means of support for teachers seem necessary in order to foster the implementation of written teacher feedback as part of formative assessment in inquiry-based science education…

  5. Informed consent and the readability of the written consent form.

    Science.gov (United States)

    Sivanadarajah, N; El-Daly, I; Mamarelis, G; Sohail, M Z; Bates, P

    2017-11-01

    Introduction The aim of this study was to objectively ascertain the level of readability of standardised consent forms for orthopaedic procedures. Methods Standardised consent forms (both in summary and detailed formats) endorsed by the British Orthopaedic Association (BOA) were retrieved from orthoconsent.com and assessed for readability. This involved using an online tool to calculate the validated Flesch reading ease score (FRES). This was compared with the FRES for the National Health Service (NHS) Consent Form 1. Data were analysed and interpreted according to the FRES grading table. Results The FRES for Consent Form 1 was 55.6, relating to the literacy expected of an A level student. The mean FRES for the BOA summary consent forms (n=27) was 63.6 (95% confidence interval [CI]: 61.2-66.0) while for the detailed consent forms (n=32), it was 68.9 (95% CI: 67.7-70.0). All BOA detailed forms scored >60, correlating to the literacy expected of a 13-15-year-old. The detailed forms had a higher FRES than the summary forms (p<0.001). Conclusions This study demonstrates that the BOA endorsed standardised consent forms are much easier to read and understand than the NHS Consent Form 1, with the detailed BOA forms being the easiest to read. Despite this, owing to varying literacy levels, a significant proportion of patients may struggle to give informed consent based on the written information provided to them.
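    For reference, the Flesch reading ease score used in this study is computed from average sentence length and average syllables per word: FRES = 206.835 - 1.015 * (total words / total sentences) - 84.6 * (total syllables / total words), with higher scores indicating easier text (a score above 60 corresponds to the literacy expected of a 13-15-year-old, as noted above). A minimal sketch follows; the syllable counter is a rough vowel-group heuristic, so its output only approximates the validated online tool the authors used.

    ```python
    # Minimal Flesch reading ease calculation (Flesch 1948). The syllable
    # counter below is a crude heuristic, not a dictionary-based count.
    import re

    def count_syllables(word: str) -> int:
        # Approximate syllables as runs of vowels; floor at one per word.
        return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

    def flesch_reading_ease(text: str) -> float:
        sentences = max(1, len(re.findall(r"[.!?]+", text)))
        words = re.findall(r"[A-Za-z']+", text)
        syllables = sum(count_syllables(w) for w in words)
        return 206.835 - 1.015 * (len(words) / sentences) \
                       - 84.6 * (syllables / len(words))

    print(round(flesch_reading_ease("The doctor will remove the damaged tissue."), 1))
    ```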

  6. Age of acquisition and word frequency in written picture naming.

    Science.gov (United States)

    Bonin, P; Fayol, M; Chalard, M

    2001-05-01

    This study investigates age of acquisition (AoA) and word frequency effects in both spoken and written picture naming. In the first two experiments, reliable AoA effects on object naming speed, with objective word frequency controlled for, were found in both spoken (Experiment 1) and written picture naming (Experiment 2). In contrast, no reliable objective word frequency effects were observed on naming speed, with AoA controlled for, in either spoken (Experiment 3) or written (Experiment 4) picture naming. The implications of the findings for written picture naming are briefly discussed.

  7. The neutron porosity tool

    International Nuclear Information System (INIS)

    Oelgaard, P.L.

    1988-01-01

    The report contains a review of available information on neutron porosity tools with the emphasis on dual thermal-neutron-detector porosity tools and epithermal-neutron-detector porosity tools. The general principle of such tools is discussed and theoretical models are very briefly reviewed. Available data on tool designs are summarized with special regard to the source-detector distance. Tool operational data, porosity determination and correction of measurements are briefly discussed. (author) 15 refs

  8. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction

    OpenAIRE

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2011-01-01

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scien...
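    The "shot-gun" generation of simpler sentence variants can be pictured as a cross-product over per-constituent rewrites. The sketch below is only a toy Python illustration of that combinatorial idea; BioSimplify itself is a Java tool with a full parsing pipeline, and the segmentation and variants here are hand-made assumptions.

    ```python
    # Toy illustration of "shot-gun" sentence simplification: combine variants
    # of each constituent to produce many candidate versions of a sentence.
    from itertools import product

    # Each constituent of the original sentence with simpler rewrite candidates.
    constituents = [
        ["BRCA1, a tumor suppressor gene,", "BRCA1"],
        ["is frequently mutated", "is often changed"],
        ["in hereditary breast cancer."],
    ]

    variants = [" ".join(parts) for parts in product(*constituents)]
    for v in variants:
        print(v)  # 2 x 2 x 1 = 4 candidate versions of the sentence
    ```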

  9. [Written pharmaceutical advertising--still unreliable?].

    Science.gov (United States)

    Gladsø, Kristin Haugen; Garberg, Hedda Rosland; Spigset, Olav; Slørdal, Lars

    2014-09-02

    Marketing by the pharmaceutical industry affects doctors' prescribing habits. All pharmaceutical advertising received by nine doctors in two GP offices over a period of three months was collected. The advertising material was sorted by compound. For each compound, the advert with the highest number of references was selected. The cited references were obtained, and the claims in the adverts were assessed in terms of their consistency with the source data based on the provisions in the Norwegian regulations on pharmaceuticals. The references were also assessed with regard to the incidence of conflicts of interest among authors. The doctors received a total of 270 shipments of advertising for 46 different compounds. Altogether 95% of the 173 references cited in the 46 selected adverts could be obtained. The adverts contained a total of 156 claims. Of these, 56% were assessed as correct when compared to the source data and as having clinical relevance. Altogether 75% of the journal articles reported relevant conflicts of interest for the authors. About half the claims in the adverts were found to be correct and clinically relevant. These results concur with those from a methodologically identical study based on advertising material collected in 2004. The cited literature was of varying quality and often funded by the pharmaceutical companies. The findings indicate that the target group should be sceptical of this type of marketing.

  10. Assessing the Hydrologic Performance of the EPA's Nonpoint Source Water Quality Assessment Decision Support Tool Using North American Land Data Assimilation System (Products)

    Science.gov (United States)

    Lee, S.; Ni-Meister, W.; Toll, D.; Nigro, J.; Guiterrez-Magness, A.; Engman, T.

    2010-01-01

    The accuracy of streamflow predictions in the EPA's BASINS (Better Assessment Science Integrating Point and Nonpoint Sources) decision support tool is affected by the sparse meteorological data contained in BASINS. The North American Land Data Assimilation System (NLDAS) data with high spatial and temporal resolutions provide an alternative to the NOAA National Climatic Data Center (NCDC)'s station data. This study assessed the improvement of streamflow prediction of the Hydrological Simulation Program-FORTRAN (HSPF) model contained within BASINS using the NLDAS 1/8-degree hourly precipitation and evapotranspiration estimates in seven watersheds of the Chesapeake Bay region. Our results demonstrated consistent improvements of daily streamflow predictions in five of the seven watersheds when NLDAS precipitation and evapotranspiration data were incorporated into BASINS. The improvement of using the NLDAS data is significant when a watershed's meteorological station is either far away or not in a similar climatic region. When the station is nearby, using the NLDAS data produces similar results. The correlation coefficients of the analyses using the NLDAS data were greater than 0.8, the Nash-Sutcliffe (NS) model fit efficiency greater than 0.6, and the error in the water balance was less than 5%. Our analyses also showed that the streamflow improvements were mainly contributed by the NLDAS's precipitation data and that the improvement from using NLDAS's evapotranspiration data was not significant, partially due to the constraints of current BASINS-HSPF settings. However, NLDAS's evapotranspiration data did improve the baseflow prediction. This study demonstrates the NLDAS data has the potential to improve streamflow predictions, thus aiding water quality assessment in the EPA's nonpoint source water quality assessment decision tool.
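    The model-fit statistic quoted above, the Nash-Sutcliffe efficiency, is defined as NS = 1 - sum((Qobs - Qsim)^2) / sum((Qobs - mean(Qobs))^2), where 1.0 is a perfect fit. A small sketch with made-up daily flows (the study's actual series are the Chesapeake watershed records):

    ```python
    # Nash-Sutcliffe model fit efficiency on paired observed/simulated series.
    def nash_sutcliffe(obs, sim):
        mean_obs = sum(obs) / len(obs)
        ss_res = sum((o - s) ** 2 for o, s in zip(obs, sim))
        ss_tot = sum((o - mean_obs) ** 2 for o in obs)
        return 1.0 - ss_res / ss_tot

    obs = [10.0, 14.0, 9.0, 22.0, 17.0]   # invented daily flows
    sim = [11.0, 13.5, 10.0, 20.0, 16.0]
    print(round(nash_sutcliffe(obs, sim), 3))  # 0.936; >0.6 was the study's benchmark
    ```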

  11. Justify Your Answer: The Role of Written Think Aloud in Script Concordance Testing.

    Science.gov (United States)

    Power, Alyssa; Lemay, Jean-Francois; Cooke, Suzette

    2017-01-01

    Construct: Clinical reasoning assessment is a growing area of interest in the medical education literature. Script concordance testing (SCT) evaluates clinical reasoning in conditions of uncertainty and has emerged as an innovative tool in the domain of clinical reasoning assessment. SCT quantifies the degree of concordance between a learner and an experienced clinician and attempts to capture the breadth of responses of expert clinicians, acknowledging the significant yet acceptable variation in practice under situations of uncertainty. SCT has been shown to be a valid and reliable clinical reasoning assessment tool. However, as SCT provides only quantitative information, it may not provide a complete assessment of clinical reasoning. Think aloud (TA) is a qualitative research tool used in clinical reasoning assessment in which learners verbalize their thought process around an assigned task. This study explores the use of TA, in the form of written reflection, in SCT to assess resident clinical reasoning, hypothesizing that the information obtained from the written TA would enrich the quantitative data obtained through SCT. Ninety-one pediatric postgraduate trainees and 21 pediatricians from 4 Canadian training centers completed an online test consisting of 24 SCT cases immediately followed by retrospective written TA. Six of 24 cases were selected to gather TA data. These cases were chosen to allow all phases of clinical decision making (diagnosis, investigation, and treatment) to be represented in the TA data. Inductive thematic analysis was employed when systematically reviewing TA responses. Three main benefits of adding written TA to SCT were identified: (a) uncovering instances of incorrect clinical reasoning despite a correct SCT response, (b) revealing sound clinical reasoning in the context of a suboptimal SCT response, and (c) detecting question misinterpretation. Written TA can optimize SCT by demonstrating when correct examinee responses are based on

  12. 19 CFR 148.111 - Written declaration for unaccompanied articles.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Written declaration for unaccompanied articles... of the United States § 148.111 Written declaration for unaccompanied articles. The baggage... covers articles which do not accompany him and: (a) The articles are entitled to free entry under the $1...

  13. 12 CFR 704.16 - Contracts/written agreements.

    Science.gov (United States)

    2010-01-01

    ... 12 Banks and Banking 6 2010-01-01 2010-01-01 false Contracts/written agreements. 704.16 Section 704.16 Banks and Banking NATIONAL CREDIT UNION ADMINISTRATION REGULATIONS AFFECTING CREDIT UNIONS CORPORATE CREDIT UNIONS § 704.16 Contracts/written agreements. Services, facilities, personnel, or equipment...

  14. 45 CFR 99.26 - Unsponsored written material.

    Science.gov (United States)

    2010-10-01

    ... 45 Public Welfare 1 2010-10-01 2010-10-01 false Unsponsored written material. 99.26 Section 99.26 Public Welfare DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ADMINISTRATION PROCEDURE FOR HEARINGS FOR THE CHILD CARE AND DEVELOPMENT FUND Hearing Procedures § 99.26 Unsponsored written material. Letters...

  15. Concreteness and Imagery Effects in the Written Composition of Definitions.

    Science.gov (United States)

    Sadoski, Mark; Kealy, William A.; Goetz, Ernest T.; Paivio, Allan

    1997-01-01

    In two experiments, undergraduates (n=48 and n=50) composed written definitions of concrete and abstract nouns that were matched for frequency of use and meaningfulness. Results support previous research suggesting that common cognitive mechanisms underlie production of spoken and written language as explained by dual coding theory. (SLD)

  16. 42 CFR 2.16 - Security for written records.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Security for written records. 2.16 Section 2.16 Public Health PUBLIC HEALTH SERVICE, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL PROVISIONS CONFIDENTIALITY OF ALCOHOL AND DRUG ABUSE PATIENT RECORDS General Provisions § 2.16 Security for written records...

  17. The Written Communication Skills That Matter Most for Accountants

    Science.gov (United States)

    Riley, Tracey J.; Simons, Kathleen A.

    2016-01-01

    Given the importance of effective written communication skills to the discipline of accounting, faculty must emphasize these skills in their classroom in order to adequately prepare students for successful careers in the field. Since 2000, only two studies in the accounting literature have examined which written communication skills are needed by…

  18. Towards a Theory of Vernacularisation: Insights from Written Chinese Vernaculars

    Science.gov (United States)

    Snow, Don

    2013-01-01

    This paper examines the history of four Chinese vernaculars which have developed written forms, and argues that five of the patterns Hanan identifies in the early development of Bai Hua can also be found in the early development of written Wu, Cantonese, and Minnan. In each of the cases studied, there is a clear pattern of early use of the…

  19. 49 CFR 1018.20 - Written demand for payment.

    Science.gov (United States)

    2010-10-01

    ... Collection of Claims § 1018.20 Written demand for payment. (a) The Board shall make appropriate written demand upon the debtor for payment of money in terms which specify: (1) The basis for the indebtedness... the debtor has explicitly refused to pay, or that sending a further demand is futile. Depending upon...

  20. The Influence of Process Drama on Elementary Students' Written Language

    Science.gov (United States)

    Anderson, Alida

    2012-01-01

    This article describes the influence of process drama on fourth grade students' written language productivity and specificity. Participants included 16 students with learning and/or behavioral challenges at an urban public charter school. The influence of process drama on students' written language was compared across contextualized and…

  1. Appropriating Written French: Literacy Practices in a Parisian Elementary Classroom

    Science.gov (United States)

    Rockwell, Elsie

    2012-01-01

    In this article, I examine French language instruction in an elementary classroom serving primarily children of Afro-French immigrants in Paris. I show that a prevalent French language ideology privileges written over oral expression and associates full mastery of written French with rational thought and full inclusion in the French polity. This…

  2. Quantity and quality of written feedback, action plans, and student ...

    African Journals Online (AJOL)

    Background. Mini-clinical-evaluation exercise (mini-CEX) assessment forms that have been modified with the addition of specific spaces on separate sheets are expected to improve the quantity and quality of written feedback and the action plan for further learning which is agreed upon, and to encourage written reflection.

  3. 5 CFR 179.306 - Written agreement for repayment.

    Science.gov (United States)

    2010-01-01

    ... 5 Administrative Personnel 1 2010-01-01 2010-01-01 false Written agreement for repayment. 179.306 Section 179.306 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT CIVIL SERVICE REGULATIONS CLAIMS COLLECTION STANDARDS Administrative Offset § 179.306 Written agreement for repayment. A debtor who admits...

  4. Teaching Computation in Primary School without Traditional Written Algorithms

    Science.gov (United States)

    Hartnett, Judy

    2015-01-01

    Concerns regarding the dominance of the traditional written algorithms in schools have been raised by many mathematics educators, yet the teaching of these procedures remains a dominant focus in primary schools. This paper reports on a project in one school where the staff agreed to put the teaching of the traditional written algorithm aside,…

  5. Machine Tool Software

    Science.gov (United States)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his CAD/CAM (Computer Aided Design and Manufacturing) curriculum. Professor Hack teaches the use of APT programming languages for control of metal cutting machines. Machine tool instructions are geometry definitions written in the APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  6. 14C as a tool for evaluating riverine POC sources and erosion of the Zhujiang (Pearl River) drainage basin, South China

    International Nuclear Information System (INIS)

    Wei Xiuguo; Yi Weixi; Shen Chengde; Yechieli, Yoseph; Li Ningli; Ding Ping; Wang Ning; Liu Kexin

    2010-01-01

    Radiocarbon can serve as a powerful tool for identifying sources of organic carbon and evaluating the erosion intensity in river drainage basins. In this paper we present 14C-AMS measurements of particulate organic carbon (POC) collected from the three major tributaries of the Zhujiang (Pearl River) system: the Xijiang (Western River), Beijiang (Northern River) and Dongjiang (Eastern River) rivers. Furthermore, we discuss the distribution of POC 14C apparent ages and the related watershed erosion of these rivers. Results yield Δ14C values of -425 per mille to -65 per mille, which indicate that the 14C apparent ages of suspended POC in the entire area are in the range of 540-4445 years. The POC apparent ages from Xijiang are mostly between 2000 and 4000 years, while in Dongjiang they mostly range from 540 to 1010 years. These 14C apparent ages indicate that the watershed erosion of the Xijiang is more severe than that of the Dongjiang. This is in agreement with other data showing deeper erosion in Xijiang due to human activities.
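    The apparent ages quoted above follow from the conventional radiocarbon age relation with the Libby mean life of 8033 years: age = -8033 * ln(1 + Δ14C/1000). A two-line check reproduces the endpoints of the reported range:

    ```python
    # Convert Delta-14C (per mille) to a conventional 14C apparent age.
    # Reproduces the record's range: -65 -> ~540 yr, -425 -> ~4445 yr.
    import math

    def apparent_age(delta14c_permil: float) -> float:
        return -8033.0 * math.log(1.0 + delta14c_permil / 1000.0)

    for d in (-65.0, -425.0):
        print(d, round(apparent_age(d)))
    ```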

  7. LFQProfiler and RNP(xl): Open-Source Tools for Label-Free Quantification and Protein-RNA Cross-Linking Integrated into Proteome Discoverer.

    Science.gov (United States)

    Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver

    2016-09-02

    Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is a powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for identification of peptide-RNA cross-links including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.

  8. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain

    Science.gov (United States)

    García, Miriam; Navas, Alejandro; Olza, Inés; Gómez-Baceiredo, Beatriz; Pujol, Francesc; Garralda, Eduardo; Centeno, Carlos

    2017-01-01

    Introduction The goal of palliative care (PC) is to improve the quality of life of terminal stage patients and their families. The subject frequently appears in the mass-media and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC related news items that appeared in the Spanish written media. Methodology A descriptive cross-sectional study was designed. Considering diffusion, scope and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content depository, and the search tool for each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified and a descriptive analysis was undertaken. Results A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions -PE-, 16% in their online editions -OE-) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%) and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing end of life (74%) and patient quality of life (70%). Conclusions The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, messages circulating about PC do not describe professional practice, or the contribution of the same for patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine. PMID:28968433

  9. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain.

    Directory of Open Access Journals (Sweden)

    José Miguel Carrasco

    Full Text Available The goal of palliative care (PC) is to improve the quality of life of terminal stage patients and their families. The subject frequently appears in the mass-media and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC related news items that appeared in the Spanish written media. A descriptive cross-sectional study was designed. Considering diffusion, scope and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content depository, and the search tool for each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified and a descriptive analysis was undertaken. A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions -PE-, 16% in their online editions -OE-) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%) and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing end of life (74%) and patient quality of life (70%). The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, messages circulating about PC do not describe professional practice, or the contribution of the same for patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine.

  10. What does the media say about palliative care? A descriptive study of news coverage in written media in Spain.

    Science.gov (United States)

    Carrasco, José Miguel; García, Miriam; Navas, Alejandro; Olza, Inés; Gómez-Baceiredo, Beatriz; Pujol, Francesc; Garralda, Eduardo; Centeno, Carlos

    2017-01-01

    The goal of palliative care (PC) is to improve the quality of life of terminal stage patients and their families. The subject frequently appears in the mass-media and this helps create a socially accepted identity. The aim of this study is to describe and analyse PC related news items that appeared in the Spanish written media. A descriptive cross-sectional study was designed. Considering diffusion, scope and the range in editorial policy criteria, four printed newspapers (PN) were selected, together with four exclusively digital media sources (DM). Through Mynews, a newspaper content depository, and the search tool for each DM website, articles published between 2009 and 2014 which included the terms "palliative care" and "palliative medicine" were sought. A questionnaire was created to characterise each article identified and a descriptive analysis was undertaken. A total of 627 articles were identified, of which 359 (57%) were published in PN (42% in the printed editions -PE-, 16% in their online editions -OE-) and 268 (43%) in DM. In general, they appeared mainly in sections concerning Health (23%), Culture and Society (18%) and General/Home News (15%). In PE, just 2% were found in the Health section and nearly 70% in Culture and Society and General/Home News. Most of the articles were informative in nature and contained socio-political messages (90%). Statements by PC professionals were found in 35% of the articles and by politicians in 32%. The most frequent content was related to facing end of life (74%) and patient quality of life (70%). The Spanish written media reflects the socio-political interest aroused by PC. Nevertheless, messages circulating about PC do not describe professional practice, or the contribution of the same for patients. Content more in line with clinical practice might help contribute to the development of this new area of medicine.

  11. Source Water Management for Disinfection By-Product Control using New York City's Operations Support Tool and On-Line Monitoring

    Science.gov (United States)

    Weiss, W. J.; Becker, W.; Schindler, S.

    2012-12-01

    The United States Environmental Protection Agency's 2006 Stage 2 Disinfectant/Disinfection Byproduct Rule (DBPR) for finished drinking waters is intended to reduce overall DBP levels by limiting the levels of total trihalomethanes (TTHM) and five of the haloacetic acids (HAA5). Under Stage 2, maximum contaminant levels (MCLs), 80 μg/L for TTHM and 60 μg/L for HAA5, are based on a locational running annual average for individual sites instead of the system-wide quarterly running annual average of the Stage 1 DBPR. This means compliance will have to be met at sampling locations of peak TTHM and HAA5 concentrations rather than as an average across the entire system. Compliance monitoring under the Stage 2 DBPR began on April 1, 2012. The New York City (NYC) Department of Environmental Protection (DEP) began evaluating potential impacts of the Stage 2 DBPR on NYC's unfiltered water supply in 2002 by monitoring TTHM and HAA5 levels at various locations throughout the distribution system. Initial monitoring indicated that HAA5 levels could be of concern in the future, with the potential to intermittently violate the Stage 2 DBPR at specific locations, particularly those with high water age. Because of the uncertainty regarding the long-term prospect for compliance, DEP evaluated alternatives to ensure compliance, including operational changes (reducing chlorine dose, changing flow configurations to minimize water age, altering pH, altering source water withdrawals); changing the residual disinfectant from free chlorine to chloramines; and engineered treatment alternatives. This paper will discuss the potential for using DEP's Operations Support Tool (OST) and enhanced reservoir monitoring to support optimization of source water withdrawals to minimize finished water DBP levels. The OST is a state-of-the-art decision support system (DSS) to provide computational and predictive support for water supply operations and planning. It incorporates a water supply system
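    The compliance change described above is purely arithmetic: under Stage 2, each monitoring site's last four quarterly results are averaged on their own (the locational running annual average, LRAA) and compared with the MCL, instead of averaging across all sites. A small sketch with invented values shows how a high-water-age site can exceed the 80 μg/L TTHM MCL even when the system-wide average would have passed:

    ```python
    # Stage 1 system-wide averaging vs. Stage 2 locational running annual
    # average (LRAA). Quarterly TTHM values (ug/L) are invented.
    TTHM_MCL = 80.0

    quarterly_tthm = {                      # last four quarters per site
        "low_water_age_site":  [40.0, 45.0, 55.0, 48.0],
        "high_water_age_site": [70.0, 85.0, 95.0, 78.0],
    }

    all_values = [v for vals in quarterly_tthm.values() for v in vals]
    print(f"system-wide RAA: {sum(all_values) / len(all_values):.1f} ug/L")  # 64.5, passes

    for site, values in quarterly_tthm.items():
        lraa = sum(values) / len(values)
        status = "OK" if lraa <= TTHM_MCL else "EXCEEDS MCL"
        print(f"{site}: LRAA {lraa:.1f} ug/L -> {status}")  # high-age site: 82.0, exceeds
    ```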

  12. Speech-language therapy for adolescents with written-language difficulties: The South African context

    Directory of Open Access Journals (Sweden)

    Danel Erasmus

    2013-11-01

    Method: A survey study was conducted, using a self-administered questionnaire. Twenty-two currently practising speech-language therapists who are registered members of the South African Speech-Language-Hearing Association (SASLHA) participated in the study. Results: The respondents indicated that they are aware of their role regarding adolescents with written-language difficulties. However, they feel that South African speech-language therapists are not fulfilling this role. Existing assessment tools and interventions for written-language difficulties are described as inadequate, and culturally and age inappropriate. Yet, the majority of the respondents feel that they are adequately equipped to work with adolescents with written-language difficulties, based on their own experience, self-study and secondary training. The respondents feel that training regarding effective collaboration with teachers is necessary to establish specific roles, and to promote speech-language therapy for adolescents among teachers. Conclusion: Further research is needed in developing appropriate assessment and intervention tools as well as improvement of training at an undergraduate level.

  13. Teaching Written Communication Strategies: A Training to Improve Writing

    Directory of Open Access Journals (Sweden)

    Hanane Benali Taouis

    2018-03-01

    Full Text Available This research can be described as an experimental quantitative study including: a strategy training; two homogeneous experimental groups with different levels of proficiency; and two homogeneous control groups. The subjects are 60 Spanish high school students, who were selected after taking the Oxford Quick Placement Test. The study aims at investigating the possible relationship between the effect of the strategy training and the subjects' level of proficiency. It is also designed to analyze the effect of the training on the use of communication strategies in the written medium. It is meant to study the effect of the strategy training on the subjects' writing skill in English. The results show that the students' level of proficiency exerts a strong effect on the subjects' use of written communication strategies (CSs) and on their strategy preference in written production. They also demonstrate how strategy training improves the subjects' written communication ability.

  14. 42 CFR 456.180 - Individual written plan of care.

    Science.gov (United States)

    2010-10-01

    ... SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Mental Hospitals Plan of Care § 456.180 Individual written plan of care. (a) Before admission to a mental hospital or...

  15. A Theory of Developing Competence with Written Mathematical Symbols.

    Science.gov (United States)

    Hiebert, James

    1988-01-01

    Presented is a theory of how competence with written mathematical symbols develops, tracing a succession of cognitive processes that cumulate to yield competence. Arguments supporting the theory are drawn from the history, philosophy, and psychology of mathematics. (MNS)

  16. Improving Written Language Performance of Adolescents with Asperger Syndrome

    Science.gov (United States)

    Delano, Monica E

    2007-01-01

    The effects of a multicomponent intervention involving self-regulated strategy development delivered via video self-modeling on the written language performance of 3 students with Asperger syndrome were examined. During intervention sessions, each student watched a video of himself performing strategies for increasing the number of words written and the number of functional essay elements. He then wrote a persuasive essay. The number of words written and number of functional essay elements included in each essay were measured. Each student demonstrated gains in the number of words written and number of functional essay elements. Maintenance of treatment effects at follow-up varied across targets and participants. Implications for future research are suggested. PMID:17624076

  17. Picante: R tools for integrating phylogenies and ecology.

    Science.gov (United States)

    Kembel, Steven W; Cowan, Peter D; Helmus, Matthew R; Cornwell, William K; Morlon, Helene; Ackerly, David D; Blomberg, Simon P; Webb, Campbell O

    2010-06-01

    Picante is a software package that provides a comprehensive set of tools for analyzing the phylogenetic and trait diversity of ecological communities. The package calculates phylogenetic diversity metrics, performs trait comparative analyses, manipulates phenotypic and phylogenetic data, and performs tests for phylogenetic signal in trait distributions, community structure and species interactions. Picante is a package for the R statistical language and environment written in R and C, released under a GPL v2 open-source license, and freely available on the web (http://picante.r-forge.r-project.org) and from CRAN (http://cran.r-project.org).
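    Picante itself is an R package, so the idiomatic way to use it is from R (for example, its pd() function computes Faith's phylogenetic diversity). Purely as a language-neutral illustration of what such a metric measures, here is a small Python sketch of Faith's PD, the summed branch length of the subtree spanning a community's taxa; the toy tree encoding (child mapped to parent and branch length) is an assumption made for the sketch, not Picante's data structure.

    ```python
    # Faith's phylogenetic diversity on a toy rooted tree encoded as
    # child -> (parent, branch_length). Internal nodes are n1, n2.
    tree = {
        "A": ("n1", 1.0), "B": ("n1", 1.0),
        "C": ("n2", 2.0), "n1": ("n2", 0.5),
        "n2": ("root", 1.5),
    }

    def faith_pd(community, tree):
        used = set()
        for taxon in community:
            node = taxon
            while node in tree:        # walk to the root, collecting each edge once
                used.add(node)
                node = tree[node][0]
        return sum(tree[n][1] for n in used)

    print(faith_pd({"A", "B"}, tree))  # 1.0 + 1.0 + 0.5 + 1.5 = 4.0
    print(faith_pd({"A", "C"}, tree))  # 1.0 + 0.5 + 2.0 + 1.5 = 5.0
    ```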

  18. Enhancing the Benefits of Written Emotional Disclosure through Response Training

    OpenAIRE

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R.; Vrana, Scott R.

    2013-01-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n = 113) or a neutral writing condit...

  19. Written Cultural Heritage in the Context of Adopted Legal Regulations

    Directory of Open Access Journals (Sweden)

    Eva Kodrič-Dačić

    2013-09-01

    Full Text Available ABSTRACT Purpose: Libraries collect written cultural heritage, which is not only the most valuable part of their collections but also a part of library materials which, due to digitalization projects in the last decade, is becoming more and more interesting to librarians and library users. The main goal of the study is theoretical research on library materials acknowledged as Slovenian heritage. By defining the basic terms, it highlights the attributes which are immanent to library materials, derived from the context of their origin or later destiny. Slovenian library legislation concerning protection of written cultural heritage is also critically analysed. Methodology/approach: Comparative analyses of European and Slovenian legislation concerning librarianship and written cultural heritage. Research limitation: Research was mainly limited to professional literature and resources dealing with written cultural heritage. Originality/practical implications: Results of the research serve as formal criteria for definition of library materials as written heritage and suggest how to improve legislation in the field of protection of written heritage in libraries.

  20. Written cohesion in children with and without language learning disabilities.

    Science.gov (United States)

    Koutsoftas, Anthony D; Petersen, Victoria

    2017-09-01

    Cohesion refers to the linguistic elements of discourse that contribute to its continuity and is an important element to consider as part of written language intervention, especially in children with language learning disabilities (LLD). There is substantial evidence that children with LLD perform more poorly than typically developing (TD) peers on measures of cohesion in spoken language and on written transcription measures; however, there is far less research comparing groups on cohesion as a measure of written language across genres. The current study addresses this gap through the following two aims. First, to describe and compare cohesion in narrative and expository writing samples of children with and without language learning disabilities. Second, to relate measures of cohesion to written transcription and translation measures, oral language, and writing quality. Fifty intermediate-grade children produced one narrative and one expository writing sample from which measures of written cohesion were obtained. These included the frequency, adequacy and complexity of referential and conjunctive ties. Expository samples resulted in more complex cohesive ties and children with TD used more complex ties than peers with LLD. Different relationships among cohesion measures and writing were observed for narrative versus expository samples. Findings from this study demonstrate cohesion as a discourse-level measure of written transcription and how the use of cohesion can vary by genre and group (LLD, TD). Clinical implications for assessment, intervention, and future research are provided. © 2016 Royal College of Speech and Language Therapists.

  1. Examining Elementary Students' Development of Oral and Written Argumentation Practices Through Argument-Based Inquiry

    Science.gov (United States)

    Chen, Ying-Chih; Hand, Brian; Park, Soonhye

    2016-05-01

    Argumentation, and the production of scientific arguments, are critical elements of inquiry that are necessary for helping students become scientifically literate through engaging them in constructing and critiquing ideas. This case study employed a mixed methods research design to examine the development in 5th grade students' practices of oral and written argumentation from one unit to another over 16 weeks utilizing the science writing heuristic approach. Data sources included five rounds of whole-class discussion focused on group presentations of arguments that occurred over eleven class periods; students' group writings; interviews with six target students and the teacher; and the researcher's field notes. The results revealed five salient trends in students' development of oral and written argumentative practices over time: (1) Students came to use more critique components as they participated in more rounds of whole-class discussion focused on group presentations of arguments; (2) by challenging each other's arguments, students came to focus on the coherence of the argument and the quality of evidence; (3) students came to use evidence to defend, support, and reject arguments; (4) the quality of students' writing continuously improved over time; and (5) students connected oral argument skills to written argument skills as they had opportunities to revise their writing after debating and developed awareness of the usefulness of critique from peers. Given the development in oral argumentative practices and the quality of written arguments over time, this study indicates that the development of students' oral and written argumentative practices is mutually reinforcing. This study suggests that argumentative practices should be framed through both a social and epistemic understanding of argument, utilizing talk and writing as vehicles to create norms of these complex practices.

  2. Requirements for UML and OWL Integration Tool for User Data Consistency Modeling and Testing

    DEFF Research Database (Denmark)

    Nytun, J. P.; Jensen, Christian Søndergaard; Oleshchuk, V. A.

    2003-01-01

    The amount of data available on the Internet is continuously increasing, and consequently there is a growing need for tools that help to analyse the data. Testing of consistency among data received from different sources is made difficult by the number of different languages and schemas being used. In this paper we analyze requirements for a tool that supports integration of UML models and ontologies written in languages like the W3C Web Ontology Language (OWL). The tool can be used in the following way: after loading two legacy models into the tool, the tool user connects them by inserting modeling…; an important part of this technique is the attaching of OCL expressions to special boolean class attributes that we call consistency attributes. The resulting integration model can be used for automatic consistency testing of two instances of the legacy models by automatically instantiating the whole integration…

  3. IB: A Monte Carlo simulation tool for neutron scattering instrument design under PVM and MPI

    International Nuclear Information System (INIS)

    Zhao Jinkui

    2011-01-01

    Design of modern neutron scattering instruments relies heavily on Monte Carlo simulation tools for optimization. IB is one such tool written in C++ and implemented under Parallel Virtual Machine and the Message Passing Interface. The program was initially written for the design and optimization of the EQ-SANS instrument at the Spallation Neutron Source. One of its features is the ability to group simple instrument components into more complex ones at the user input level, e.g. grouping neutron mirrors into neutron guides and curved benders. The simulation engine manages the grouped components such that neutrons entering a group are properly operated upon by all components, multiple times if needed, before exiting the group. Thus, only a few basic optical modules are needed at the programming level. For simulations that require higher computer speeds, the program can be compiled and run in parallel modes using either the PVM or the MPI architectures.
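    The grouping feature described above (simple components composed into compound ones, with a neutron possibly operated on multiple times before it exits the group) can be sketched conceptually. This is a Python illustration of the design idea only; IB itself is written in C++, and the classes and statistical-weight bookkeeping here are invented for the sketch.

    ```python
    # Conceptual sketch: a Group keeps operating on a neutron with its member
    # components until an exit condition holds, so guides and benders can be
    # assembled from simple mirror modules at the input level.
    class Mirror:
        def __init__(self, reflectivity):
            self.reflectivity = reflectivity

        def interact(self, neutron):
            neutron["weight"] *= self.reflectivity  # statistical-weight bookkeeping
            neutron["bounces"] += 1

    class Group:
        """A compound component, e.g. a guide assembled from mirrors."""
        def __init__(self, components, max_bounces):
            self.components = components
            self.max_bounces = max_bounces

        def trace(self, neutron):
            # The neutron may interact with member components several times
            # before it exits the group.
            while neutron["bounces"] < self.max_bounces and neutron["weight"] > 1e-6:
                for c in self.components:
                    c.interact(neutron)
            return neutron

    guide = Group([Mirror(0.98), Mirror(0.98)], max_bounces=10)
    print(guide.trace({"weight": 1.0, "bounces": 0}))
    ```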

  4. Font size matters--emotion and attention in cortical responses to written words.

    Science.gov (United States)

    Bayer, Mareike; Sommer, Werner; Schacht, Annekathrin

    2012-01-01

    For emotional pictures with fear-, disgust-, or sex-related contents, stimulus size has been shown to increase emotion effects in attention-related event-related potentials (ERPs), presumably reflecting the enhanced biological impact of larger emotion-inducing pictures. If this is true, size should not enhance emotion effects for written words with symbolic and acquired meaning. Here, we investigated ERP effects of font size for emotional and neutral words. While P1 and N1 amplitudes were not affected by emotion, the early posterior negativity started earlier and lasted longer for large relative to small words. These results suggest that emotion-driven facilitation of attention is not necessarily based on biological relevance, but might generalize to stimuli with arbitrary perceptual features. This finding points to the high relevance of written language in today's society as an important source of emotional meaning.

  5. Perception and Assessment of Verbal and Written Information on Sex and Relationships after Hematopoietic Stem Cell Transplantation.

    Science.gov (United States)

    Wendt, Christel

    2017-12-01

    This study aimed to investigate experiences of verbal and written information about sex and relationships among men and women treated with hematopoietic stem cell transplantation. The study also aimed to investigate the demand for information and assessment of the quality of written patient information material entitled "Sex and relationships in the treatment of blood diseases." Few studies exist that shed any light on the demand for information about sex and relationships on the part of patients with hematological diseases before, during, and after their treatment. A total of 216 patients undergoing treatment for malignant blood diseases between 2000 and 2010 participated in this study. Patients' experiences of information about sex and relationships, and their opinions about the written patient information, were assessed using a questionnaire created specifically for this study. Most patients (81%) had not received information about sex and relationships from a healthcare professional. Almost 90% of men felt that verbal information was important, compared with 82% of women. The majority also held that written information was important. These results indicate that patients, regardless of gender, age, and treatment, consider oral and written information about sex and relationships to be important and that the healthcare system should provide the information. The written patient information was considered to play an important role in creating an opening for a conversation about a sensitive topic such as sexuality, and also as a source of reference and support for the patient and his/her partner.

  6. The Influence of Group Formation on Learner Participation, Language Complexity, and Corrective Behaviour in Synchronous Written Chat as Part of Academic German Studies

    Science.gov (United States)

    Fredriksson, Christine

    2015-01-01

    Synchronous written chat and instant messaging are tools which have been used and explored in online language learning settings for at least two decades. Research literature has shown that such tools give second language (L2) learners opportunities for language learning, e.g., the interaction in real time with peers and native speakers, the…

  7. Application of an integrated Weather Research and Forecasting (WRF)/CALPUFF modeling tool for source apportionment of atmospheric pollutants for air quality management: A case study in the urban area of Benxi, China.

    Science.gov (United States)

    Wu, Hao; Zhang, Yan; Yu, Qi; Ma, Weichun

    2018-04-01

    In this study, the authors endeavored to develop an effective framework for improving local urban air quality on meso-micro scales in cities in China that are experiencing rapid urbanization. Within this framework, the integrated Weather Research and Forecasting (WRF)/CALPUFF modeling system was applied to simulate the concentration distributions of typical pollutants, such as particulate matter, emitted from sources that affected air quality to different degrees. According to the type-based classification, which categorized the pollution sources as belonging to the Bengang Group, large point sources, small point sources, and area sources, the source apportionment showed that the Bengang Group, the large point sources, and the area sources had considerable impacts on urban air quality. Finally, combined with the industrial characteristics, detailed control measures were proposed with which local policy makers could improve the urban air quality in Benxi. In summary, the results of this study showed that this framework can credibly improve urban air quality based on the source apportionment of atmospheric pollutants. The authors endeavored to build an effective framework based on the integrated WRF/CALPUFF system to improve the air quality of cities in China on meso-micro scales. Within this framework, the integrated modeling tool is used to study the characteristics of meteorological fields, concentration fields, and source apportionments of pollutants in the target area. The impacts of classified sources on air quality, together with the industrial characteristics, can provide more effective control measures for improving air quality. Through the case study, the technical framework developed in this study, particularly the source apportionment, could provide important data and technical support for policy makers assessing air pollution at the scale of a city in China or elsewhere.
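
    The apportionment step described above reduces, in essence, to comparing each source category's modeled contribution at a receptor against the summed total. The following Python sketch illustrates that bookkeeping with invented concentration values; the category names are taken from the abstract, but the numbers and code are not from the study.

        # Illustrative sketch (not from the paper): type-based source apportionment.
        # Given modeled receptor concentrations per source category (e.g. from
        # separate CALPUFF runs), compute each category's percentage contribution.
        # All numbers below are hypothetical.

        modeled = {            # ug/m3 at one receptor, one run per source group
            "Bengang Group": 18.4,
            "large point sources": 9.7,
            "small point sources": 2.1,
            "area sources": 12.3,
        }

        total = sum(modeled.values())
        shares = {src: 100.0 * c / total for src, c in modeled.items()}

        for src, pct in sorted(shares.items(), key=lambda kv: -kv[1]):
            print(f"{src:>20s}: {pct:5.1f} %")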

  8. The determinants of spoken and written picture naming latencies.

    Science.gov (United States)

    Bonin, Patrick; Chalard, Marylène; Méot, Alain; Fayol, Michel

    2002-02-01

    The influence of nine variables on the latencies to write down or to speak aloud the names of pictures taken from Snodgrass and Vanderwart (1980) was investigated in French adults. The major determinants of both written and spoken picture naming latencies were image variability, image agreement and age of acquisition. To a lesser extent, name agreement was also found to have an impact in both production modes. The implications of the findings for theoretical views of both spoken and written picture naming are discussed.

  9. Critical evaluation of the use of bioinformatics as a theoretical tool to find high-potential sources of ACE inhibitory peptides

    NARCIS (Netherlands)

    Vercruysse, L.; Smagghe, G.; Bent, van der A.; Amerongen, van A.; Ongenaert, M.; Camp, van J.

    2009-01-01

    A bioinformatics analysis to screen for high-potential sources of angiotensin converting enzyme (ACE) inhibitory peptides was conducted in the area of insect muscle proteins. Vertebrate muscle proteins are reported as good sources of ACE inhibitory peptides, while the research on invertebrate muscle
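
    As a rough illustration of the kind of in silico screening such a bioinformatics analysis involves, the Python sketch below counts known ACE-inhibitory dipeptides in a protein sequence. The dipeptide set and the sequence are placeholder assumptions, not the authors' actual pipeline.

        # Illustrative sketch (assumptions, not the paper's method): count known
        # ACE-inhibitory dipeptides occurring in a muscle protein sequence.
        # The dipeptide set and the sequence below are invented examples.

        KNOWN_ACE_DIPEPTIDES = {"VW", "IW", "AW", "VY", "IY"}  # literature examples

        def dipeptide_hits(protein: str) -> dict:
            """Count occurrences of known inhibitory dipeptides in a sequence."""
            counts = {}
            for i in range(len(protein) - 1):
                dp = protein[i:i + 2]
                if dp in KNOWN_ACE_DIPEPTIDES:
                    counts[dp] = counts.get(dp, 0) + 1
            return counts

        seq = "MVWKAIYDEIWAARVY"  # hypothetical fragment
        print(dipeptide_hits(seq))  # -> {'VW': 1, 'IY': 1, 'IW': 1, 'VY': 1}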

  10. Comic Books: A Learning Tool for Meaningful Acquisition of Written Sign Language

    Science.gov (United States)

    Guimarães, Cayley; Oliveira Machado, Milton César; Fernandes, Sueli F.

    2018-01-01

    Deaf people use Sign Language (SL) for intellectual development, communications and other human activities that are mediated by language--such as the expression of complex and abstract thoughts and feelings; and for literature, culture and knowledge. The Brazilian Sign Language (Libras) is a complete linguistic system of visual-spatial manner,…

  11. The 87Sr/86Sr aquatic isoscape of the Danube catchment from the source to the mouth as tool for studying fish migrations

    Science.gov (United States)

    Zitek, Andreas; Tchaikovsky, Anastassiya; Irrgeher, Johanna; Waidbacher, Herwig; Prohaska, Thomas

    2014-05-01

    Isoscapes - spatially distributed isotope patterns across landscapes - are increasingly used as an important basis for ecological studies. The natural variation of the isotopic abundances in a studied area bears the potential to be used as a natural tracer for studying, e.g., migrations of animals or prey-predator relations. The 87Sr/86Sr ratio is one important tracer, since it is known to provide a direct relation of biological samples to geologically distinct regions, as Sr isotopes are incorporated into living tissues as a proxy for calcium and taken up from the environment without any significant fractionation. Although until now the focus has mainly been set on terrestrial systems, maps for aquatic systems are increasingly being established. Here we present the first 87Sr/86Sr aquatic isoscape of the Danube catchment, the second largest river catchment in Europe, from near its source starting at river km 2581 in Germany down to its mouth at river km 107 in Romania. The total length of the river Danube is 2780 km, draining a catchment area of 801 463 km2 (10 % of the European continent). The major purpose of this study was to assess the potential of the 87Sr/86Sr isotope ratio as a tool for studying fish migrations at different scales in the entire Danube catchment. Within the Joint Danube Survey 3 (JDS 3), the world's biggest scientific multi-disciplinary river expedition in 2013, aiming at the assessment of the ecological status and degree of human alterations along the river Danube, water samples were taken at 68 pre-defined sites along the course of the river Danube, including the major tributaries, as a basis to create the so-called 'Isoscape of the Danube catchment'. The determination of the 87Sr/86Sr isotope ratio in river water was performed by multicollector-sector field-inductively coupled plasma-mass spectrometry (MC-SF-ICP-MS). The JDS 3 data were combined with existing data from prior studies conducted within the Austrian part of the Danube catchment
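
    The basic use of such an isoscape for migration studies is a lookup: match a sample's measured 87Sr/86Sr against the ratios mapped along the river. The Python sketch below shows that nearest-match idea with invented ratios; it is not the project's actual workflow.

        # Minimal sketch (hypothetical values): assign a fish tissue 87Sr/86Sr
        # measurement to the river stretch with the closest water ratio, the
        # basic idea behind using an aquatic isoscape to trace provenance.

        isoscape = {               # river-km -> water 87Sr/86Sr (made-up numbers)
            2581: 0.7089,
            1800: 0.7092,
            1000: 0.7096,
            107:  0.7091,
        }

        def assign_origin(sample_ratio: float) -> int:
            """Return the river-km whose water ratio best matches the sample."""
            return min(isoscape, key=lambda km: abs(isoscape[km] - sample_ratio))

        print(assign_origin(0.7095))  # -> 1000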

  12. Toward better Alzheimer's research information sources for the public.

    Science.gov (United States)

    Payne, Perry W

    2013-03-01

    The National Plan to Address Alzheimer's Disease calls for a new relationship between researchers and members of the public. This relationship is one that provides research information to patients and allows patients to provide ideas to researchers. One way to describe it is a "bidirectional translational relationship." Despite the numerous sources of online and offline information about Alzheimer's disease, there is no information source which currently provides this interaction. This article proposes the creation of an Alzheimer's research information source dedicated to monitoring Alzheimer's research literature and providing user friendly, publicly accessible summaries of data written specifically for a lay audience. This information source should contain comprehensive, updated, user friendly, publicly available reviews of Alzheimer's research and utilize existing online multimedia/social networking tools to provide information in useful formats that help patients, caregivers, and researchers learn rapidly from one another.

  13. Development and Application of Assessment Standards to Advanced Written Assignments

    Science.gov (United States)

    Miihkinen, Antti; Virtanen, Tuija

    2018-01-01

    This study describes the results of a project that focused on developing an assessment rubric to be used as the assessment criteria for the written thesis of accounting majors and the quality of the coursework during the seminar. We used descriptive analysis and the survey method to collect information for the development work and to examine the…

  14. Distribution of Articles in Written Composition among Malaysian ESL Learners

    Science.gov (United States)

    Rahim, Mia Emily Abdul; Rahim, Emma Marini Abdul; Ning, Chia Han

    2013-01-01

    The study aimed to investigate the distribution patterns of the English grammar articles (a, an, and the) as well as the distributions of their colligation patterns in written compositions of English among Malaysian ESL learners. This paper reports the results of a corpus-based study on articles used by these learners. The method used in this…

  15. Oral and Written Picture Description in Individuals with Aphasia

    Science.gov (United States)

    Vandenborre, Dorien; Visch-Brink, Evy; van Dun, Kim; Verhoeven, Jo; Mariën, Peter

    2018-01-01

    Background: Aphasia is characterized by difficulties in connected speech/writing. Aims: To explore the differences between the oral and written description of a picture in individuals with chronic aphasia (IWA) and healthy controls. Descriptions were controlled for productivity, efficiency, grammatical organization, substitution behaviour and…

  16. A Comparison between Written and Spoken Narratives in Aphasia

    Science.gov (United States)

    Behrns, Ingrid; Wengelin, Asa; Broberg, Malin; Hartelius, Lena

    2009-01-01

    The aim of the present study was to explore how a personal narrative told by a group of eight persons with aphasia differed between written and spoken language, and to compare this with findings from 10 participants in a reference group. The stories were analysed through holistic assessments made by 60 participants without experience of aphasia…

  17. A History of Oral and Written Storytelling in Nigeria

    Science.gov (United States)

    Edosomwan, Simeon; Peterson, Claudette M.

    2016-01-01

    Storytelling is a powerful process in adult education as a useful instructional approach in facilitating adult instruction and learning, especially during preliterate eras. What began as oral tradition has evolved to include written literature. A popular Eurocentric perspective in the early 19th century was that before the arrival of Europeans…

  18. The Written Literacy Forum: Combining Research and Practice.

    Science.gov (United States)

    Clark, Christopher M.; Florio, Susan

    1983-01-01

    Writing teachers and researchers collaborate in the Written Literacy Forum at Michigan State University to: (1) heighten teachers' awareness of the complexity of writing; (2) stimulate discussion across grade levels; and (3) focus research on areas concerning teachers. Discussion formats and inservice activities are described, and materials…

  19. Language Parameters in Written Compositions of Nine Year Old Children.

    Science.gov (United States)

    Rubin, Rosalyn; Buium, Nissan

    The purpose of this study was to develop a foundation for reliable and effective measurement of significant parameters in the development of written language skills in school age children. The subjects for the study were 25 nine-year-old children, 12 boys and 13 girls, who were randomly selected from among 1,559 participants. The findings…

  20. Concreteness Effects and Syntactic Modification in Written Composition.

    Science.gov (United States)

    Sadoski, Mark; Goetz, Ernest T.

    1998-01-01

    Investigates whether concreteness was related to a key characteristic of written composition--the cumulative sentence with a final modifier--which has been consistently associated with higher quality writing. Supports the conceptual-peg hypothesis of dual coding theory, with concrete verbs providing the pegs on which cumulative sentences are…

  1. Written Composition Process, Evaluation Difficulties and Modalities: An Experimental Study

    Science.gov (United States)

    Rodriguez, Celestino; Garci, Jesus Nicasio; Gonzalez-Castro, Paloma; Alvarez, David; Cerezo, Rebeca; Bernardo, Ana

    2011-01-01

    The underlying processes used in written compositions are currently a very interesting subject. Participants in this study were 326 people between 10 and 16 years of age, divided into two groups and compared by means of a "writing log". One group was provided assistance in the writing task by means of a graphic organiser, whilst the other was not…

  2. Written Corrective Feedback: The Perception of Korean EFL Learners

    Science.gov (United States)

    Chung, Bohyon

    2015-01-01

    This paper reports on the perception of Korean EFL learners toward feedback types on their written errors. The survey was administered using a questionnaire adopted from previous studies (Ishii, 2011; Leki, 1991). This further allows a comparison of Korean EFL learners' attitudes with the responses to an identical questionnaire by Japanese EFL…

  3. Dynamic Written Corrective Feedback in Developmental Multilingual Writing Classes

    Science.gov (United States)

    Kurzer, Kendon

    2018-01-01

    This study investigated the role of dynamic written corrective feedback (DWCF; Evans, Hartshorn, McCollum, & Wolfersberger, 2010; Hartshorn & Evans, 2015; Hartshorn et al., 2010), a mode of providing specific, targeted, and individualized grammar feedback in developmental English as a second language (ESL) writing classes (pre-first year…

  4. Validating a Written Instrument for Assessing Students' Fractions Schemes and

    Science.gov (United States)

    Wilkins, Jesse L. M.; Norton, Anderson; Boyce, Steven J.

    2013-01-01

    Previous research has documented schemes and operations that undergird students' understanding of fractions. This prior research was based, in large part, on small-group teaching experiments. However, written assessments are needed in order for teachers and researchers to assess students' ways of operating on a whole-class scale. In this study,…

  5. 22 CFR 208.50 - How is this part written?

    Science.gov (United States)

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false How is this part written? 208.50 Section 208.50 Foreign Relations AGENCY FOR INTERNATIONAL DEVELOPMENT GOVERNMENTWIDE DEBARMENT AND SUSPENSION... for the general public and business community to use. The section headings and text, often in the form...

  6. Short message service (SMS) language and written language skills ...

    African Journals Online (AJOL)

    SMS language is English language slang, used as a means of mobile phone text messaging. This practice may impact on the written language skills of learners at school. The main aim of this study was to determine the perspectives of Grade 8 and 9 English (as Home Language) educators in Gauteng regarding the ...

  7. 19 CFR 210.4 - Written submissions; representations; sanctions.

    Science.gov (United States)

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Written submissions; representations; sanctions. 210.4 Section 210.4 Customs Duties UNITED STATES INTERNATIONAL TRADE COMMISSION INVESTIGATIONS OF UNFAIR PRACTICES IN IMPORT TRADE ADJUDICATION AND ENFORCEMENT Rules of General Applicability § 210.4...

  8. Cracking the code: residents' interpretations of written assessment comments

    NARCIS (Netherlands)

    Ginsburg, S.; Vleuten, C.P.M. van der; Eva, K.W.; Lingard, L.

    2017-01-01

    CONTEXT: Interest is growing in the use of qualitative data for assessment. Written comments on residents' in-training evaluation reports (ITERs) can be reliably rank-ordered by faculty attendings, who are adept at interpreting these narratives. However, if residents do not interpret assessment

  9. 9 CFR 202.113 - Rule 13: Written hearing.

    Science.gov (United States)

    2010-01-01

    ... 9 Animals and Animal Products 2 2010-01-01 2010-01-01 false Rule 13: Written hearing. 202.113 Section 202.113 Animals and Animal Products GRAIN INSPECTION, PACKERS AND STOCKYARDS ADMINISTRATION... waiver of the right to file such evidence. (g) Extension of time for depositions. If any party timely...

  10. Argumentation Schema and the Myside Bias in Written Argumentation

    Science.gov (United States)

    Wolfe, Christopher R.; Britt, M. Anne; Butler, Jodie A.

    2009-01-01

    This article describes a cognitive argumentation schema for written arguments and presents three empirical studies on the "myside" bias--the tendency to ignore or exclude evidence against one's position. Study 1 examined the consequences of conceding, rebutting, and denying other-side information. Rebuttal led to higher ratings of…

  11. RECOGNITION METHOD FOR CURSIVE JAPANESE WORD WRITTEN IN LATIN CHARACTERS

    NARCIS (Netherlands)

    Maruyama, K.; Nakano, Y.

    2004-01-01

    This paper proposes a recognition method for cursive Japanese words written in Latin characters. The method integrates multiple classifiers, using candidates duplicated across classifiers and the order of classifiers to improve the word recognition rate by combining their results. In experiments
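
    One standard way to integrate ranked candidate lists from multiple classifiers is a Borda-style vote in which duplicated candidates accumulate points across classifiers. The Python sketch below shows that generic scheme with invented candidate words; the paper's exact integration rule may differ.

        # Hedged sketch of a common classifier-combination scheme (Borda count);
        # the candidate words are invented, not from the paper's experiments.

        from collections import defaultdict

        def combine(rankings):
            """Fuse ranked candidate lists; higher rank earns more points."""
            scores = defaultdict(int)
            for ranking in rankings:
                n = len(ranking)
                for pos, word in enumerate(ranking):
                    scores[word] += n - pos    # duplicated candidates accumulate
            return sorted(scores, key=scores.get, reverse=True)

        clf_a = ["sakura", "sakana", "sakaya"]
        clf_b = ["sakana", "sakura", "takana"]
        print(combine([clf_a, clf_b]))
        # -> ['sakura', 'sakana', 'sakaya', 'takana'] (top two tie at 5 points)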

  12. THE PHONOLOGICAL BASIS OF MISSPELLINGS IN THE WRITTEN ...

    African Journals Online (AJOL)

    Misspellings have been a common error in the written English of non-native speakers. ... The study was done with a view to investigating whether the phonology of Kikuyu as a learner's first language and pronunciation of words in English as the second language, based on the influence of the phonology of Kikuyu affects ...

  13. Optimizing the efficiency of femtosecond-laser-written holograms

    DEFF Research Database (Denmark)

    Wædegaard, Kristian Juncher; Hansen, Henrik Dueholm; Balling, Peter

    2013-01-01

    Computer-generated binary holograms are written on a polished copper surface using single 800-nm, 120-fs pulses from a 1-kHz-repetition-rate laser system. The hologram efficiency (i.e. the power in the holographic reconstructed image relative to the incoming laser power) is investigated...

  14. Written Emotional Expression as an Intervention for Asthma

    Science.gov (United States)

    Bray, Melissa A.; Theodore, Lea A.; Patwa, Shamim S.; Margiano, Suzanne G.; Alric, Jolie M.; Peck, Heather L.

    2003-01-01

    This investigation employed a multiple baseline design across five participants to examine written emotional expression as an intervention to improve lung function in high school-aged students, college students, and adults with asthma. The predicted forced expiratory volume in 1 second (FEV1, a measure of large airway functioning) and…

  15. Students' Written Arguments in General Chemistry Laboratory Investigations

    Science.gov (United States)

    Choi, Aeran; Hand, Brian; Greenbowe, Thomas

    2013-01-01

    This study aimed to examine the written arguments developed by college freshman students using the Science Writing Heuristic approach in inquiry-based general chemistry laboratory classrooms and its relationships with students' achievement in chemistry courses. Fourteen freshman students participated in the first year of the study while 19…

  16. Written Formative Assessment and Silence in the Classroom

    Science.gov (United States)

    Lee Hang, Desmond Mene; Bell, Beverley

    2015-01-01

    In this commentary, we build on Xinying Yin and Gayle Buck's discussion by exploring the cultural practices which are integral to formative assessment, when it is viewed as a sociocultural practice. First we discuss the role of assessment and in particular oral and written formative assessments in both western and Samoan cultures, building on the…

  17. 2 CFR 182.100 - How is this part written?

    Science.gov (United States)

    2010-01-01

    ... 2 Grants and Agreements 1 2010-01-01 2010-01-01 false How is this part written? 182.100 Section 182.100 Grants and Agreements OFFICE OF MANAGEMENT AND BUDGET GOVERNMENTWIDE GUIDANCE FOR GRANTS AND AGREEMENTS Reserved GOVERNMENTWIDE REQUIREMENTS FOR DRUG-FREE WORKPLACE (FINANCIAL ASSISTANCE) Purpose and...

  18. 17 CFR 230.437a - Written consents.

    Science.gov (United States)

    2010-04-01

    ...) Are filing a registration statement containing financial statements in which Arthur Andersen LLP (or a foreign affiliate of Arthur Andersen LLP) had been acting as the independent public accountant. (b... dispense with the requirement for the registrant to file the written consent of Arthur Andersen LLP (or a...

  19. Shortcomings of the written survey questionnaire for discovering ...

    African Journals Online (AJOL)

    In this article I describe my reflections on using a written survey questionnaire to investigate, on a large-scale, students' perceptions of studying Xhosa as a first language in high schools. I describe the aims of the project, how the questionnaire was designed, and the problems I encountered with the analysis of the data.

  20. Comparing Written Competency in Core French and French Immersion Graduates

    Science.gov (United States)

    Lappin-Fortin, Kerry

    2014-01-01

    Few studies have compared the written competency of French immersion students and their core French peers, and research on these learners at a postsecondary level is even scarcer. My corpus consists of writing samples from 255 students from both backgrounds beginning a university course in French language. The writing proficiency of core French…

  1. 42 CFR 456.80 - Individual written plan of care.

    Science.gov (United States)

    2010-10-01

    ... (CONTINUED) MEDICAL ASSISTANCE PROGRAMS UTILIZATION CONTROL Utilization Control: Hospitals Plan of Care § 456.80 Individual written plan of care. (a) Before admission to a hospital or before authorization for... and rehabilitative services; (iv) Activities; (v) Social services; (vi) Diet; (4) Plans for continuing...

  2. Enhancing Management Tools: Molecular Genetic Tracking to Target Microbial Pollution Sources in South Florida Coral Reefs, Year 1 - CRCP project #1114

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — Both coastal inlets and treated wastewater outfalls are recognized as major pathways for microbial contaminants from Land-Based Sources of Pollution (LBSP) to enter...

  3. Integrated site-specific quantification of faecal bacteria and detection of DNA markers in faecal contamination source tracking as a microbial risk tracking tool in urban Lake ecosystems

    Science.gov (United States)

    Donde, Oscar Omondi; Tian, Cuicui; Xiao, Bangding

    2017-11-01

    The presence of faecal-derived pathogens in water is responsible for several infectious diseases and deaths worldwide. As a solution, sources of faecal pollution in waters must be accurately assessed, properly determined and strictly controlled. However, the exercise has remained challenging due to the overlapping characteristics of different members of the faecal coliform bacteria and the inadequacy of information pertaining to the contribution of seasonality and weather conditions to tracking the possible sources of pollution. There are continued efforts to improve Faecal Contamination Source Tracking (FCST) techniques such as Microbial Source Tracking (MST). This study aimed to contribute to MST by evaluating the efficacy of combining site-specific quantification of faecal contamination indicator bacteria with detection of DNA markers, while accounting for the effects of seasonality and weather conditions, in tracking the major sources of faecal contamination in a freshwater system (Donghu Lake, China). The results showed that the use of the cyd gene in addition to the lacZ and uidA genes differentiates E. coli from other closely related faecal bacteria. The use of selective media increases the pollution source tracking accuracy. BSA addition boosts PCR detection and increases FCST efficiency. Seasonality and weather variability also influence the detection limit for DNA markers.
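
    The marker logic reported above can be pictured as a simple set test: an isolate is confirmed as E. coli only when all three genes are detected. The Python sketch below is an assumed, simplified rendering of that idea, not the study's assay code.

        # Illustrative sketch (assumed marker logic, not the study's protocol):
        # flag an isolate as E. coli only when all three gene markers named in
        # the abstract (lacZ, uidA, cyd) are detected by PCR.

        def is_e_coli(markers):
            return {"lacZ", "uidA", "cyd"}.issubset(markers)

        isolates = {
            "site1-01": {"lacZ", "uidA", "cyd"},
            "site1-02": {"lacZ", "uidA"},      # other faecal coliform
            "site2-07": {"uidA", "cyd"},
        }

        for name, genes in isolates.items():
            print(name, "E. coli" if is_e_coli(genes) else "not confirmed")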

  4. [Written and pictorial content in magazines and their possible relationship to eating disorders].

    Science.gov (United States)

    Szabó, Kornélia; Túry, Ferenc

    2012-02-01

    In the current study we reviewed the literature on magazine reading frequency, written and pictorial contents appearing in magazines, and their connection to eating disorders. Reading different fashion and fitness magazines has an effect on readers through several indirect and direct factors and through both trustworthy and false information. They affect readers' body satisfaction, self-esteem, eating habits and, more generally, their health behavior. Different theories have been proposed to account for these associations, and several other studies have examined empirically the connection between the frequency of magazine reading and eating disorders, as well as the symptoms leading to eating disorders. We analyzed and summarized articles between 1975 and 2009 from online databases. We used the following sources: Science Direct (http://www.sciencedirect.com/), Springer-Verlag GmbH (http://www.springerlink.com/) and SAGE Publications Ltd (http://online.sagepub.com/). The pictorial and written magazine contents were associated with the development and maintenance of eating disorders or with symptoms that might lead to eating disorders. Compared to previous years, the publications featured an increased number of advertisements for unhealthy foods and for unhealthy radical diet plans and exercise programs. Furthermore, the magazines contained conflicting messages about nutrition, body functions and eating disorders. Written and pictorial magazine contents and messages might increase the risk for the development of eating disorders, especially in vulnerable individuals.

  5. Alternate Reality Games as an Informal Learning Tool for Generating STEM Engagement among Underrepresented Youth: a Qualitative Evaluation of the Source

    Science.gov (United States)

    Gilliam, Melissa; Jagoda, Patrick; Fabiyi, Camille; Lyman, Phoebe; Wilson, Claire; Hill, Brandon; Bouris, Alida

    2017-06-01

    This project developed and studied The Source, an alternate reality game (ARG) designed to foster interest and knowledge related to science, technology, engineering, and math (STEM) among youth from populations underrepresented in STEM fields. ARGs are multiplayer games that engage participants across several media such as shared websites, social media, personal communications, and real-world settings to complete activities and collaborate with team members. The Source was a five-week summer program with 144 participants from Chicago aged 13 to 18 years. The Source incorporated six socio-contextual factors derived from three frameworks: Chang's (ERIC Digest, 2002) recommendations for engaging underrepresented populations in STEM careers, Lave and Wenger's (Cambridge University Press, 1991) situated learning model, and Barron's (Human Development, 49(4); 193-224, 2006) learning ecology perspective. These factors aligned with the program's aims of promoting (1) social community and peer support, (2) collaboration and teamwork, (3) real-world relevance and investigative learning, (4) mentoring and exposure to STEM professionals, (5) hands-on activities to foster transferable skill building, and (6) interface with technology. This paper presents results from 10 focus groups and 10 individual interviews conducted with a subset of the 144 youth participants who completed the game. It describes how these six factors were realized through The Source and uses them as a lens for considering how The Source functioned pedagogically. Qualitative findings describe youth's perception of The Source's potential influence on STEM interest, engagement, and identity formation. Despite limitations, study results indicate that underrepresented youth can engage in an immersive, narrative, and game-based experience as a potential mechanism for piquing and developing STEM interest and skills, particularly among underrepresented youth.

  6. Thermal Insulation System Analysis Tool (TISTool) User's Manual. Version 1.0.0

    Science.gov (United States)

    Johnson, Wesley; Fesmire, James; Leucht, Kurt; Demko, Jonathan

    2010-01-01

    The Thermal Insulation System Analysis Tool (TISTool) was developed starting in 2004 by Jonathan Demko and James Fesmire. The first edition was written in Excel and Visual Basic as macros. It included the basic shapes such as a flat plate, cylinder, dished head, and sphere. The data came from several KSC tests that were already in the public literature as well as data from NIST and other highly respectable sources. More recently, the tool has been updated with more test data from the Cryogenics Test Laboratory, and the tank shape was added. Additionally, the tool was converted to FORTRAN 95 to allow for easier distribution of the material and tool. This document reviews the user instructions for the operation of this system.
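
    A minimal sketch of the kind of shape-based calculation such a tool performs is steady-state radial conduction through a cylindrical insulation layer, Q = 2*pi*k*L*(T_warm - T_cold)/ln(r_outer/r_inner). The Python version below uses placeholder geometry and material values; it is not TISTool source code (which is Excel/VBA and FORTRAN 95).

        # A minimal sketch, assuming simple steady-state conduction, of a
        # cylindrical-shape heat-leak calculation; values are placeholders.

        import math

        def cylinder_heat_leak(k, length, r_inner, r_outer, t_warm, t_cold):
            """Radial heat leak (W) through a cylindrical insulation layer:
            Q = 2*pi*k*L*(T_warm - T_cold) / ln(r_outer/r_inner)."""
            return (2 * math.pi * k * length * (t_warm - t_cold)
                    / math.log(r_outer / r_inner))

        # 2 m long pipe, 10 cm -> 15 cm radius, k = 0.02 W/(m*K), 300 K to 77 K
        print(f"{cylinder_heat_leak(0.02, 2.0, 0.10, 0.15, 300.0, 77.0):.1f} W")
        # ~138.2 W for these made-up values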

  7. Source SDK development essentials

    CERN Document Server

    Bernier, Brett

    2014-01-01

    The Source Authoring Tools are the pieces of software used to create custom content for games made with Valve's Source engine. Creating mods and maps for your games without any programming knowledge can be time consuming. These tools allow you to create your own maps and levels without the need for any coding knowledge. All the tools that you need to start creating your own levels are built-in and ready to go! This book will teach you how to use the Authoring Tools provided with Source games and will guide you in creating your first maps and mods (modifications) using Source. You will learn ho

  8. Micro-simulation as a tool to assess policy concerning non-point source pollution: the case of ammonia in Dutch agriculture

    NARCIS (Netherlands)

    Kruseman, G.; Blokland, P.W.; Bouma, F.; Luesink, H.H.; Vrolijk, H.C.J.

    2008-01-01

    Non-point source pollution is notoriously difficult to assess. A relevant example is ammonia emissions in the Netherlands. Since the mid 1980s the Dutch government has sought to reduce emissions through a wide variety of measures, the effect of which in turn is monitored using modeling techniques.

  9. Synergistic relationships between Analytical Chemistry and written standards

    International Nuclear Information System (INIS)

    Valcárcel, Miguel; Lucena, Rafael

    2013-01-01

    Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  10. Synergistic relationships between Analytical Chemistry and written standards

    Energy Technology Data Exchange (ETDEWEB)

    Valcárcel, Miguel, E-mail: qa1vacam@uco.es; Lucena, Rafael

    2013-07-25

    Highlights: •Analytical Chemistry is influenced by international written standards. •Different relationships can be established between them. •Synergies can be generated when these standards are conveniently managed. Abstract: This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived.

  11. THE WRITTEN DISCOURSE OF INTERVIEWING STYLE FOR A MAGAZINE INTERVIEW

    Directory of Open Access Journals (Sweden)

    Jessie Barrot

    2012-07-01

    Abstract: This paper examines the written discourse of interviewing style for the purpose of print publication. Specifically, this paper sought to describe and explain the phases of interviewing procedures, the typology of the questions, and the transitional strategies executed by Oprah Winfrey during her interviews for O Magazine. One hundred and ten (110) response-soliciting statements were subjected to a discourse analytic procedure to determine the features of such utterances. The results showed that her interview procedure follows a certain pattern that contributes to her ability to maintain the intimacy, familiarity, and dynamics of conversation. Further, results revealed that the interviewer employs a variety of response-soliciting strategies and transitional strategies that unconsciously hand control and authority in the conversation to the interviewees. Finally, some pedagogical implications were also presented for classroom use. Keywords: discourse analysis, interviewing style, interview questions, written discourse

  12. Characterization of UV written waveguides with luminescence microscopy

    DEFF Research Database (Denmark)

    Svalgaard, Mikael; Harpøth, Anders; Rosbirk, Tue

    2005-01-01

    Luminescence microscopy is used to measure the refractive index profile and molecular defect distribution of UV written waveguides with a spatial resolution of ~0.4 mm and high signal-to-noise ratio. The measurements reveal complex waveguide formation dynamics with significant topological changes in the core profile. In addition, it is observed that the waveguide formation process requires several milliseconds of UV exposure before starting.

  13. Polish Phoneme Statistics Obtained On Large Set Of Written Texts

    Directory of Open Access Journals (Sweden)

    Bartosz Ziółko

    2009-01-01

    The phonetic statistics were collected from several Polish corpora. The paper is a summary of the data, which are phoneme n-grams, and of some phenomena in the statistics. Triphone statistics employ context-dependent speech units, which have an important role in speech recognition systems and were never before calculated for a large set of Polish written texts. The standard phonetic alphabet for Polish, SAMPA, and methods of providing phonetic transcriptions are described.
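
    The core of any phoneme n-gram statistic is a sliding-window count over transcribed utterances. The Python sketch below shows that counting step for triphones; the SAMPA-like transcriptions are invented, not corpus data.

        # Sketch of the basic counting step behind phoneme n-gram statistics;
        # the SAMPA-like phoneme strings below are invented examples.

        from collections import Counter

        def ngrams(phones, n):
            """Yield all length-n windows over a phoneme sequence."""
            return zip(*(phones[i:] for i in range(n)))

        corpus = [
            ["t", "S", "e", "S", "tS"],   # hypothetical transcriptions
            ["k", "o", "t", "e", "k"],
        ]

        triphones = Counter(tri for utt in corpus for tri in ngrams(utt, 3))
        for tri, freq in triphones.most_common(3):
            print(" ".join(tri), freq)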

  14. Enhancing the benefits of written emotional disclosure through response training.

    Science.gov (United States)

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R; Vrana, Scott R

    2014-05-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n=113) or a neutral writing condition (n=133). Participants in each condition wrote for 20 minutes on 3 occasions and received response training (n=79), stimulus training (n=84) or no training (n=83). Heart rate and skin conductance were recorded throughout a 10-minute baseline, 20-minute writing, and a 10-minute recovery period. Self-reported emotion was assessed in each session. One month after completing the sessions, participants completed follow-up assessments of psychological and physical health outcomes. Emotional disclosure elicited greater physiological reactivity and self-reported emotion than neutral writing. Response training amplified physiological reactivity to emotional disclosure. Greater heart rate during emotional disclosure was associated with the greatest reductions in event-related distress, depression, and physical illness symptoms at follow-up, especially among response trained participants. Results support an exposure explanation of emotional disclosure effects and are the first to demonstrate that response training facilitates emotional processing and may be a beneficial adjunct to written emotional disclosure. Copyright © 2014. Published by Elsevier Ltd.

  15. Prosodic Parallelism – comparing spoken and written language

    Directory of Open Access Journals (Sweden)

    Richard Wiese

    2016-10-01

    The Prosodic Parallelism hypothesis claims that adjacent prosodic categories prefer identical branching of internal adjacent constituents. According to Wiese and Speyer (2015), this preference implies that feet contained in the same phonological phrase display either binary or unary branching, but not different types of branching. The seemingly free schwa-zero alternations at the end of some words in German make it possible to test this hypothesis. The hypothesis was successfully tested by conducting a corpus study which used large-scale bodies of written German. As some open questions remain, and as it is unclear whether Prosodic Parallelism is valid for the spoken modality as well, the present study extends this inquiry to spoken German. As in the previous study, the results of a corpus analysis recruiting a variety of linguistic constructions are presented. The Prosodic Parallelism hypothesis can be demonstrated to be valid for spoken German as well as for written German. The paper thus contributes to the question of whether prosodic preferences are similar between the spoken and written modes of a language. Some consequences of the results for the production of language are discussed.

  16. Enhancing the Benefits of Written Emotional Disclosure through Response Training

    Science.gov (United States)

    Konig, Andrea; Eonta, Alison; Dyal, Stephanie R.; Vrana, Scott R.

    2014-01-01

    Writing about a personal stressful event has been found to have psychological and physical health benefits, especially when physiological response increases during writing. Response training was developed to amplify appropriate physiological reactivity in imagery exposure. The present study examined whether response training enhances the benefits of written emotional disclosure. Participants were assigned to either a written emotional disclosure condition (n = 113) or a neutral writing condition (n = 133). Participants in each condition wrote for 20 minutes on three occasions and received response training (n = 79), stimulus training (n = 84) or no training (n = 83). Heart rate and skin conductance were recorded throughout a 10-minute baseline, 20-minute writing, and a 10-minute recovery period. Self-reported emotion was assessed in each session. One month after completing the sessions, participants completed follow-up assessments of psychological and physical health outcomes. Emotional disclosure elicited greater physiological reactivity and self-reported emotion than neutral writing. Response training amplified physiological reactivity to emotional disclosure. Greater heart rate during emotional disclosure was associated with the greatest reductions in event-related distress, depression, and physical illness symptoms at follow-up, especially among response trained participants. Results support an exposure explanation of emotional disclosure effects and are the first to demonstrate that response training facilitates emotional processing and may be a beneficial adjunct to written emotional disclosure. PMID:24680230

  17. [Alcohol advertising in written mass media in Spain].

    Science.gov (United States)

    Montes-Santiago, J; Alvarez Muñiz, M L; Baz Lomba, A

    2007-03-01

    Alcohol advertising is a powerful factor inciting consumption. We analyzed alcohol advertising, especially advertising aimed at young people, in written mass media in Spain during the period 2002-2006. We conducted an annual cross-sectional study of advertisements in 41 widely diffused written mass media (average readership: 10.1 million). Overall, 29% of the media (2.9 million readers on average, 29% of total readers) admitted alcohol advertising. Alcohol advertising constituted 3.8% of global advertising and 8.6% of the advertising in media admitting alcohol advertisements. In this period only 4% of the media (2.4% of total readers) inserted antidrug campaigns. In brief, three out of 10 total readers and one out of 12 people older than 15 years were exposed to alcohol advertising. Young people were included in 33% of alcohol advertisements, and 3 out of 6 youth-oriented magazines permitted such advertising. Alcohol advertising remains high in written mass media in Spain. By contrast, few people received informative antidrug campaigns. Advertising was preferentially directed at young people.

  18. The study on force, surface integrity, tool life and chip on laser assisted machining of inconel 718 using Nd:YAG laser source.

    Science.gov (United States)

    Venkatesan, K

    2017-07-01

    Inconel 718, a high-temperature alloy, is a promising material for high-performance aerospace gas turbine engine components. However, the machining of the alloy is difficult owing to its immense shear strength, rapid work hardening rate during turning, and low thermal conductivity. Hence, like ceramics and composites, this alloy is considered a difficult-to-turn material. Laser assisted turning has become a promising solution in recent years to lessen cutting stress when difficult-to-turn materials such as Inconel 718 are employed. This study investigated the influence of the input variables of laser assisted machining on the machinability of Inconel 718. The comparison of machining characteristics was carried out to analyze the process benefits with the variation of laser machining variables. The laser assisted machining variables were cutting speeds of 60-150 m/min and feed rates of 0.05-0.125 mm/rev, with a laser power between 1200 W and 1300 W. The various output characteristics, such as force, roughness, tool life and the geometrical characteristics of the chip, were investigated and compared with conventional machining without the application of laser power. From the experimental results, at a laser power of 1200 W, laser assisted turning outperforms conventional machining with a 2.10-fold reduction in cutting force, a 46% reduction in surface roughness, and a 66% improvement in tool life. With the application of the laser, the usable cutting condition of the carbide tool increased to 150 m/min and 0.125 mm/rev. Microstructural analysis shows no damage to the subsurface of the workpiece.

  19. The study on force, surface integrity, tool life and chip on laser assisted machining of inconel 718 using Nd:YAG laser source

    Directory of Open Access Journals (Sweden)

    K. Venkatesan

    2017-07-01

    Inconel 718, a high-temperature alloy, is a promising material for high-performance aerospace gas turbine engine components. However, the machining of the alloy is difficult owing to its immense shear strength, rapid work hardening rate during turning, and low thermal conductivity. Hence, like ceramics and composites, this alloy is considered a difficult-to-turn material. Laser assisted turning has become a promising solution in recent years to lessen cutting stress when difficult-to-turn materials such as Inconel 718 are employed. This study investigated the influence of the input variables of laser assisted machining on the machinability of Inconel 718. The comparison of machining characteristics was carried out to analyze the process benefits with the variation of laser machining variables. The laser assisted machining variables were cutting speeds of 60–150 m/min and feed rates of 0.05–0.125 mm/rev, with a laser power between 1200 W and 1300 W. The various output characteristics, such as force, roughness, tool life and the geometrical characteristics of the chip, were investigated and compared with conventional machining without the application of laser power. From the experimental results, at a laser power of 1200 W, laser assisted turning outperforms conventional machining with a 2.10-fold reduction in cutting force, a 46% reduction in surface roughness, and a 66% improvement in tool life. With the application of the laser, the usable cutting condition of the carbide tool increased to 150 m/min and 0.125 mm/rev. Microstructural analysis shows no damage to the subsurface of the workpiece.

  20. SPPTOOLS: Programming tools for the IRAF SPP language

    Science.gov (United States)

    Fitzpatrick, M.

    1992-01-01

    An IRAF package to assist in SPP code development and debugging is described. SPP is the machine-independent programming language used by virtually all IRAF tasks. Tools have been written to aid both novice and advanced SPP programmers with development and debugging by providing tasks to check the code for the number and type of arguments in all calls to IRAF VOS library procedures, list the calling sequences of IRAF tasks, create a database of identifiers for quick access, check for memory which is not freed, and format source code. Debugging is simplified since the programmer is able to get a better understanding of the structure of his/her code, and IRAF library procedure calls (probably the most common source of errors) are automatically checked for correctness.
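
    The argument-count check described above can be approximated by scanning source text for calls to known procedures and comparing them with a signature table. The Python toy below illustrates the idea; real SPP parsing is more involved, and the signatures shown are invented rather than actual IRAF VOS definitions.

        # Toy sketch of an argument-count check: find procedure calls and
        # compare argument counts against a small signature table. The
        # signatures are invented placeholders, not real VOS definitions.

        import re

        SIGNATURES = {"strcpy": 3, "malloc": 2}   # procedure -> expected args

        CALL = re.compile(r"\b(\w+)\s*\(([^()]*)\)")

        def check(source):
            for name, arglist in CALL.findall(source):
                if name in SIGNATURES:
                    nargs = len([a for a in arglist.split(",") if a.strip()])
                    if nargs != SIGNATURES[name]:
                        print(f"{name}: expected {SIGNATURES[name]} args, "
                              f"got {nargs}")

        check("call strcpy (out, in)\ncall malloc (ptr, 10)")
        # -> strcpy: expected 3 args, got 2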

  1. Crowdfunding, an alternative source of financing construction and real estate projects. Guideline for Developers on how to use this tool in medium size projects.

    OpenAIRE

    Sierra Mercado, David

    2017-01-01

    Real estate crowdfunding comprises the process of investing in real estate projects using online platforms, specialized websites that can reach a large number of potential investors, changing in just a few years the traditional approach of the real estate industry. This phenomenon has become a trend among small and medium project developers, which nowadays have this additional source of financing. However, many people are still unfamiliar with this new business model. Therefore, it is relevant t...

  2. Combining chemometric tools for assessing hazard sources and factors acting simultaneously in contaminated areas. Case study: "Mar Piccolo" Taranto (South Italy).

    Science.gov (United States)

    Mali, Matilda; Dell'Anna, Maria Michela; Notarnicola, Michele; Damiani, Leonardo; Mastrorilli, Piero

    2017-10-01

    Almost all marine coastal ecosystems possess complex structural and dynamic characteristics, which are influenced by both anthropogenic causes and natural processes. Revealing the impact of the sources and factors controlling the spatial distributions of contaminants within highly polluted areas is a fundamental propaedeutic step in their quality evaluation. The combination of different pattern recognition techniques, applied to one of the most polluted Mediterranean coastal basins, resulted in a more reliable hazard assessment. PCA/CA and factorial ANOVA were exploited as complementary techniques for apprehending the impact of multiple sources and factors acting simultaneously and leading to similarities or differences in the spatial contamination pattern. The combination of PCA/CA and factorial ANOVA allowed, on the one hand, determination of the main processes and factors controlling the contamination trend within different layers and different basins and, on the other hand, assessment of possible synergistic effects. This approach showed the significance of a spatially representative overview given by the combination of PCA-CA/ANOVA in inferring the historical anthropogenic sources loading on the area. Copyright © 2017 Elsevier Ltd. All rights reserved.
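
    A hedged sketch of the pattern recognition side of such a workflow, standardizing a station-by-analyte matrix, projecting it with PCA, and clustering the scores, is given below in Python with scikit-learn and made-up concentrations; it is not the authors' data or code.

        # Hedged sketch of a PCA-plus-clustering step (rows = stations,
        # columns = analytes); all concentration values are hypothetical.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA
        from sklearn.cluster import KMeans

        X = np.array([                 # hypothetical concentrations
            [12.0, 0.8, 110.0],        # station 1: e.g. Pb, Hg, PCBs
            [35.0, 2.1, 300.0],
            [11.5, 0.7, 120.0],
            [33.0, 2.4, 280.0],
        ])

        Z = StandardScaler().fit_transform(X)          # autoscale analytes
        scores = PCA(n_components=2).fit_transform(Z)  # project onto 2 PCs
        labels = KMeans(n_clusters=2, n_init=10).fit_predict(scores)
        print(labels)                                  # e.g. [0 1 0 1]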

  3. Pilot scale digestion of source-sorted household waste as a tool for evaluation of different pre-sorting and pre-treatment strategies

    DEFF Research Database (Denmark)

    Svärd, Å; Gruvberger, C.; Aspegren, H.

    2002-01-01

    Pilot scale digestion of the organic fraction of source-sorted household waste from Sweden and Denmark was performed during one year. The study includes 17 waste types with differences in originating municipality, housing type, kitchen wrapping, sack type, pre-treatment method and season. The pilot-scale digestion has been carried out in systems with a 35-litre digester connected to a 77-litre gas tank. Four rounds of digestion were performed including start-up periods, full operation periods for evaluation and post-digestion periods without feeding. Different pre-sorting and pre-treatment

  4. Marginalia as the beginning of written culture: The Glosas Emilianensis

    Directory of Open Access Journals (Sweden)

    Maja Šabec

    2010-12-01

    The Glosas emilianenses are notes in Latin and in a Romance language dating from the eleventh century, written by an anonymous monk between the lines and in the margins of a Latin manuscript known as Codex Aemilianensis 60 to explicate syntactic, morphological, and semantic difficulties in understanding the original. The document was named after its place of origin, a monastery in the village of San Millán de la Cogolla, known as “the cradle of Castilian.” The non-Latin Romance glosses are believed to be the first written accounts of the language that later evolved into present-day Castilian or Spanish; they are therefore invaluable historical, linguistic, literary, and cultural material. The place and time of the origin of the glosses are not a coincidence, but a consequence of particular historical circumstances in the Iberian Peninsula. The Moorish invasion in 711 AD destroyed the Visigothic Kingdom and constrained the development of Christian culture, confining it to two independent cores in the north. The ninth century therefore saw the establishment of the County of Castile emerging from the two cores as the predecessor of the Kingdom of Castile (1065). Due to turbulent historical events, the place was populated by people from various adjacent and rather distant countries, thus making the spoken language a mixture of several varieties of Vulgar Latin, Mozarabic, and Navarrian (Basque) elements. All of these features are reflected in the glosses in the San Millán manuscript. Therefore, it is difficult for linguists to name the variant of the Romance language the glosses were written in: “the Riojan dialect,” “a vernacular Castilian-Riojan dialect of the second half of the eleventh century displaying tendencies towards learned Latin,” or “a Riojan dialect with elements more common to neighboring dialects (Aragon, Navarrian, Léon, and Mozarabic) than to Castilian.” However, because the San Millán glosses also include elements

  5. Renewable Energy Sources Act and Trading of Emission Certificates: A national and a supranational tool direct energy turnover to renewable electricity-supply in Germany

    International Nuclear Information System (INIS)

    Kirsten, Selder

    2014-01-01

    Aim: After the nuclear disaster at Fukushima in 2011, Germany decided to phase out atomic energy without producing new CO2 emissions. The article discusses the promotion systems that are used. Scope: The percentage of renewable energies in Germany's electricity consumption increased from 3% in 1990 to 23% in 2012. This development was introduced and guided by a law called the Renewable Energy Sources Act. It guarantees a privileged acceptance of electricity and a fixed gratification for 20 years to the operators of regenerative power plants. It allows the operators to install regenerative power plants at a reduced risk. By contrast, the international means for CO2 reduction is the trading of emission certificates, which is also valid for Germany. The article discusses how the promotion of the Erneuerbare-Energien-Gesetz (EEG) and other plant-based promotion systems fit into this condition. It also elucidates the actual decline of promotion, its problems for the country’s environmental economy, and the approach of decentralized photovoltaic (PV) energy plants towards economic efficiency. Conclusions: Germany’s energy turnaround to a regenerative energy supply is characterized by a strong and differentiated promotion system. Substantial efforts still have to be made, as the percentage of renewable energy sources has significantly increased but is still under 25%.

  6. Understanding Extraordinary Architectural Experiences through Content Analysis of Written Narratives

    Directory of Open Access Journals (Sweden)

    Brandon Richard Ro

    2015-12-01

    This study (a) identifies how people describe, characterize, and communicate in written form Extraordinary Architectural Experiences (EAEs), and (b) expands the traditional qualitative approach to architectural phenomenology by demonstrating a quantitative method to analyze written narratives. Specifically, this study reports on the content analysis of 718 personal accounts of EAEs. Using a deductive, ‘theory-driven’ approach, these narratives were read, coded, and statistically analyzed to identify storyline structure, convincing power, and the relationship between subjective and objective experiential qualities used in the story-telling process. Statistical intercoder agreement tests were conducted to verify the reliability of the interpretations and to approach the hard problem of “extraordinary aesthetics” in architecture empirically. The results of this study confirm the aesthetic nature of EAE narratives (and of told experiences) by showing their higher dependence on external objective content (e.g., a building’s features and location) rather than its internal subjective counterpart (e.g., emotions and sensations), which makes them more outwardly focused. The strong interrelationships and intercoder agreement between the thematic realms provide a unique aesthetic construct revealing EAE narratives as memorable, embodied, emotional events mapped by the externally focused content of place, social setting, time, and building features. A majority of EAE narratives were found to possess plot-structure along with significant relationships to objective-subjective content that further grounded their storylines. This study concludes that content analysis provides not only a valid method to understand written narratives about extraordinary architectural experiences quantitatively, but also a view as to how to map the unique nature of aesthetic phenomenology empirically.
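
    Intercoder agreement of the kind mentioned above is often quantified with Cohen's kappa, kappa = (p_o - p_e)/(1 - p_e), where p_o is observed and p_e chance agreement. The Python sketch below computes it for two hypothetical coders; the study may have used a different agreement statistic.

        # Minimal sketch of two-coder Cohen's kappa; the coded labels below
        # are invented, not from the study's coding scheme.

        from collections import Counter

        def cohens_kappa(coder1, coder2):
            n = len(coder1)
            p_o = sum(a == b for a, b in zip(coder1, coder2)) / n  # observed
            c1, c2 = Counter(coder1), Counter(coder2)
            p_e = sum(c1[k] * c2[k] for k in c1) / (n * n)         # by chance
            return (p_o - p_e) / (1 - p_e)

        a = ["place", "emotion", "place", "social", "place"]
        b = ["place", "emotion", "social", "social", "place"]
        print(round(cohens_kappa(a, b), 2))  # -> 0.69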

  7. The aeromagnetic method as a tool to identify Cenozoic magmatism in the West Antarctic Rift System beneath the West Antarctic Ice Sheet: a review; Thiel subglacial volcano as possible source of the ash layer in the WAISCOR

    Science.gov (United States)

    Behrendt, John C.

    2013-01-01

    The West Antarctic Ice Sheet (WAIS) flows through the volcanically active West Antarctic Rift System (WARS). The aeromagnetic method has been the most useful geophysical tool for identification of subglacial volcanic rocks since the 1959–64 surveys, particularly combined with 1978 radar ice-sounding. The unique 1991–97 Central West Antarctica (CWA) aerogeophysical survey covering 354,000 km2 over the WAIS (5-km line-spaced, orthogonal lines of aeromagnetic, radar ice-sounding, and aerogravity measurements) still provides invaluable information on subglacial volcanic rocks, particularly combined with the older aeromagnetic profiles. These data indicate numerous 100 to >1000 nT, 5–50-km width, shallow-source magnetic anomalies over an area greater than 1.2 × 106 km2, mostly from subglacial volcanic sources. I interpreted the CWA anomalies as defining about 1000 “volcanic centers” requiring high remanent normal magnetizations in the present field direction. About 400 anomaly sources correlate with bed topography. At least 80% of these sources have less than 200 m relief at the WAIS bed. They appear modified by moving ice, requiring a younger age than the WAIS (about 25 Ma). Exposed volcanoes in the WARS are … The present rapid changes resulting from global warming could be accelerated by subglacial volcanism.

  8. The beginnings of the written culture in Antiquity

    Directory of Open Access Journals (Sweden)

    M. Isabel Panosa

    2004-04-01

    Full Text Available This paper proposes an analysis of writing as a system for communication, since its origins, in terms of its uses and socio-cultural context. We shall also look to review and comment on the way in which it has evolved in time and space and its primordial domains for expression. Likewise, we shall look at the current state of affairs with respect to graphic communication, which includes the alphabet, logographic systems and symbols. From a more global point of view, the relationship between the concept of writing and the concept of civilisation is studied and two dimensions are set out: the oral culture and the written culture.

  9. THE ORTHOGRAPHIC NORM IN SECONDARY SCHOOL STUDENTS’ WRITTEN ASSIGNMENTS

    Directory of Open Access Journals (Sweden)

    Ivana Đorđev

    2016-06-01

    Full Text Available This paper presents the results of research conducted with the primary objective to determine in which areas secondary school students usually make orthographic mistakes when writing (official) written assignments. Starting from the hypothesis that punctuation and the writing of whole and split words are the areas in which secondary school students (regardless of age and school orientation) achieve the weakest results, (exploratory) research was conducted on a corpus of 3,135 written assignments from the 2010/11 school year. The research sample was intentional; descriptive and analytical methods were used for the description and the analysis of the results. The results showed the following: (1) secondary school students usually make mistakes in the punctuation of written assignments - we recorded 4,487 errors in the use of signs to denote intonation and meaning of a text (errors of this type make up 53.93% of the total number of spelling errors recorded in the research corpus); by frequency of errors, the second are errors related to the writing of whole and split words (11.02%), and the third are errors in the use of the capital letter (9.34%); (2) second grade students have the most problems with orthography; the number of mistakes is almost the same for first graders and seniors, but in all grades the most frequent errors are in punctuation, the writing of whole and split words and the use of capital letters; (3) although school orientation affects the spelling skills of pupils, the weakest orthographic achievements are also recorded in punctuation, the writing of whole and split words and capitalization, so those are areas that need to be thoroughly addressed in teaching and methodology literature. The results are, on the one hand, a picture of the current status of teaching orthography and grammar knowledge of secondary school students. On the other hand, the research results can be applied in all phases of methodical practical work in teaching orthography, the upgrading the …

  10. Participation in Written Government Consultations in Denmark and the UK

    DEFF Research Database (Denmark)

    Rasmussen, Anne

    2015-01-01

    Despite the proliferation of instruments of public consultation in liberal democracies, little is known of how the design and use of these instruments affect stakeholder participation in practice. The article examines participation in written government consultations in an analysis of approximately...... 5,000 responses to consultations in Denmark and the UK in the first half of 2008. It shows that participation is highly conditional upon system- and actor-level characteristics in practice. Our findings indicate that, even if liberal democracies have adopted similar procedures for actor consultation...

  11. DOUBLE PARTICIPLES: STANDARD LANGUAGE, EVALUATION AND WRITTEN DATA

    OpenAIRE

    Fernanda Lima Jardim Miara; Izete Lehmkuhl Coelho

    2016-01-01

    This research aims to investigate the variation in the regular and irregular past participle forms, in active sentences and in passive sentences, based on a synchronic analysis of four verbs: salvar (to save), pegar (to take), abrir (to open) and chegar (to arrive). The sample is formed by a corpus extracted from the online newspaper Diário Catarinense. The results of this work show that irregular participles are the best evaluated forms and they are the most frequent forms in written dat...

  12. DOUBLE PARTICIPLES: STANDARD LANGUAGE, EVALUATION AND WRITTEN DATA

    Directory of Open Access Journals (Sweden)

    Fernanda Lima Jardim Miara

    2016-01-01

    Full Text Available This research aims to investigate the variation in the regular and irregular past participle forms, in active sentences and in passive sentences, based on a synchronic analysis of four verbs: salvar (to save), pegar (to take), abrir (to open) and chegar (to arrive). The sample is formed by a corpus extracted from the online newspaper Diário Catarinense. The results of this work show that irregular participles are the best evaluated forms and they are the most frequent forms in written data in PB (cf. MIARA, 2013).

  13. Problems of culture of written expression in primary school

    Directory of Open Access Journals (Sweden)

    Zlatić Marina V.

    2014-01-01

    Full Text Available This paper investigates the issue of the culture of written expression in primary school students. Starting from the fact that teaching practice increasingly points to knowledge of the rules of writing among primary school students as the weakest link in teaching the Serbian language, we sought to describe the problem, point to the possible causes, propose measures and illustrate all this on concrete examples of students' essays. Our microinvestigation showed that primary school students display considerably poorer mastery of the rules of writing than previously thought, to the extent that it presents a serious obstacle in language teaching as well as in other areas of the educational process.

  14. Transforming Biology Assessment with Machine Learning: Automated Scoring of Written Evolutionary Explanations

    Science.gov (United States)

    Nehm, Ross H.; Ha, Minsu; Mayfield, Elijah

    2012-02-01

    This study explored the use of machine learning to automatically evaluate the accuracy of students' written explanations of evolutionary change. Performance of the Summarization Integrated Development Environment (SIDE) program was compared to human expert scoring using a corpus of 2,260 evolutionary explanations written by 565 undergraduate students in response to two different evolution instruments (the EGALT-F and EGALT-P) that contained prompts that differed in various surface features (such as species and traits). We tested human-SIDE scoring correspondence under a series of different training and testing conditions, using Kappa inter-rater agreement values of greater than 0.80 as a performance benchmark. In addition, we examined the effects of response length on scoring success; that is, whether SIDE scoring models functioned with comparable success on short and long responses. We found that SIDE performance was most effective when scoring models were built and tested at the individual item level and that performance degraded when suites of items or entire instruments were used to build and test scoring models. Overall, SIDE was found to be a powerful and cost-effective tool for assessing student knowledge and performance in a complex science domain.
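
    The Kappa benchmark used above is simple to reproduce. The following is a minimal sketch, assuming scikit-learn is available; the labels are synthetic placeholders, not EGALT data or SIDE output.

```python
# Minimal sketch of the human-machine agreement check described above,
# using scikit-learn's Cohen's kappa. Labels are synthetic placeholders,
# not EGALT data or SIDE output.
from sklearn.metrics import cohen_kappa_score

human   = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1]  # expert codes per explanation
machine = [1, 0, 1, 1, 0, 1, 0, 1, 1, 1, 0, 1]  # automated scores

kappa = cohen_kappa_score(human, machine)
print(f"kappa = {kappa:.2f}; meets 0.80 benchmark: {kappa > 0.80}")
```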

  15. Uses of the Word “Macula” in Written English, 1400-Present

    Science.gov (United States)

    Schwartz, Stephen G.; Leffler, Christopher T.

    2014-01-01

    We compiled uses of the word “macula” in written English by searching multiple databases, including the Early English Books Online Text Creation Partnership, America’s Historical Newspapers, the Gale Cengage Collections, and others. “Macula” has been used: as a non-medical “spot” or “stain”, literal or figurative, including in astronomy and in Shakespeare; as a medical skin lesion, occasionally with a following descriptive adjective, such as a color (macula alba); as a corneal lesion, including the earliest identified use in English, circa 1400; and to describe the center of the retina. Francesco Buzzi described a yellow color in the posterior pole (“retina tinta di un color giallo”) in 1782, but did not use the word “macula”. “Macula lutea” was published by Samuel Thomas von Sömmering by 1799, and subsequently used in 1818 by James Wardrop, which appears to be the first known use in English. The Google n-gram database shows a marked increase in the frequencies of both “macula” and “macula lutea” following the introduction of the ophthalmoscope in 1850. “Macula” has been used in multiple contexts in written English. Modern databases provide powerful tools to explore historical uses of this word, which may be underappreciated by contemporary ophthalmologists. PMID:24913329

  16. Java Power Tools

    CERN Document Server

    Smart, John

    2008-01-01

    All true craftsmen need the best tools to do their finest work, and programmers are no different. Java Power Tools delivers 30 open source tools designed to improve the development practices of Java developers in any size team or organization. Each chapter includes a series of short articles about one particular tool -- whether it's for build systems, version control, or other aspects of the development process -- giving you the equivalent of 30 short reference books in one package. No matter which development method your team chooses, whether it's Agile, RUP, XP, SCRUM, or one of many other

  17. Identification of abiotic and biotic reductive dechlorination in a chlorinated ethene plume after thermal source remediation by means of isotopic and molecular biology tools

    DEFF Research Database (Denmark)

    Badin, Alice; Broholm, Mette Martina; Jacobsen, Carsten S.

    2016-01-01

    Thermal tetrachloroethene (PCE) remediation by steam injection in a sandy aquifer led to the release of dissolved organic carbon (DOC) from aquifer sediments resulting in more reduced redox conditions, accelerated PCE biodegradation, and changes in microbial populations. These changes were...... documented by comparing data collected prior to the remediation event and eight years later. Based on the premise that dual C-Cl isotope slopes reflect ongoing degradation pathways, the slopes associated with PCE and TCE suggest the predominance of biotic reductive dechlorination near the source area. PCE...... is supported by the relative lack of Dhc in the downgradient part of the plume. The results of this study show that thermal remediation can enhance the biodegradation of chlorinated ethenes, and that this effect can be traced to the mobilisation of DOC due to steam injection. This, in turn, results in more...

  18. Comparative implementation of Handwritten and Machine written Gurmukhi text utilizing appropriate parameters

    Science.gov (United States)

    Kaur, Jaswinder; Jagdev, Gagandeep, Dr.

    2018-01-01

    Optical character recognition is concerned with the recognition of optically processed characters. The recognition is done offline after the writing or printing has been completed, unlike online recognition where the computer has to recognize the characters instantly as they are drawn. The performance of character recognition depends upon the quality of the scanned documents. The preprocessing steps are used for removing low-frequency background noise and normalizing the intensity of individual scanned documents. Several filters are used for reducing certain image details and enabling an easier or faster evaluation. The primary aim of the research work is to recognize handwritten and machine-written characters and differentiate them. The language chosen for the research work is Punjabi (Gurmukhi script) and the tool utilized is MATLAB.

  19. [A workshop to improve written communication skills of medical students].

    Science.gov (United States)

    Bitran, Marcela; Zúñiga, Denisse; Flotts, Paulina; Padilla, Oslando; Moreno, Rodrigo

    2009-05-01

    Despite being among the best academically prepared students in the country, many medical students have difficulties communicating in writing. In 2005, the School of Medicine at the Pontificia Universidad Católica de Chile introduced a writing workshop into the undergraduate curriculum to enhance the students' writing skills. To describe the workshop and its impact on the writing skills of 3 cohorts of students. This 30-h workshop used a participative methodology with emphasis on deliberate practice and feedback. Students worked in small groups with a faculty member specially trained in writing. The qualities of the essays written before and after the workshop were compared. Essays were rated by a professional team that used an analytic rubric to measure formal aspects of text writing as well as more complex thinking processes. There was a significant improvement in the quality of the texts written after the workshop; the main changes occurred in argumentation, and in paragraph and text structure. This improvement was inversely proportional to the initial level of performance, and independent of gender. A writing workshop based on deliberate practice and personalized feedback is effective to enhance the writing proficiency of medical students. Due to its design, this workshop could be useful for students of other careers and universities.

  20. Clarity Versus Accuracy and Objectivity in Written Legal English

    Directory of Open Access Journals (Sweden)

    Violeta Janulevičienė

    2011-12-01

    Full Text Available This paper is an attempt to analyse the most important grammatical and, specifically, syntactic features, and to point out some prominent lexical ones, which aim at accuracy and objectivity of a written legal document, and to discuss how these features influence the clarity and transparency of legal documents. The study covers the analysis of some EU, UK and US legislative acts alongside some extracts from contract samples. The analysis reveals that written legal English is distinguished by long compound sentences, often with inverted word order and numerous embeddings, passive constructions and nominalisations, specific use of personal pronouns and collocations of synonyms (doublets and triplets), etc. These means make it possible to achieve the greatest accuracy and objectivity in legal texts but make them complicated and difficult to comprehend at once. Formality, achieved by the mentioned means, makes legal English distant from everyday language and often becomes a reason for criticism. Plain English supporters encourage simplifying legal language; however, the long traditions of legal English make changes slow and difficult. Therefore, comprehension and usage of legal English still requires special knowledge of its lexical and grammatical features.

  1. Fundaments for the study of orality in written language

    Directory of Open Access Journals (Sweden)

    José Gaston Hilgert

    2015-02-01

    Full Text Available In this paper, we put forth some reflections upon the production of effects of orality in written texts in light of the fundaments of enunciation. In this theoretical context, we show that the study of orality in written language should not depart from the random identification of lexical and syntactic, figurative and thematic, stylistic or rhetoric resources. What matters is the identification of the interactive scenario in which these linguistic resources are manifested. The interactive scenario is configured by the relationship between narrator/narratee revealed in the text. If this relation takes place by means of the interaction between an I (narrator) and a you (narratee), either explicit or implicit, then the basic principle of dialogue, of conversation, is instituted in this scenario, which defines the proximity condition of the interlocutors and, therefore, an interactive scenario favorable to the use of orality resources. When this relation, however, takes place in the form of a third-person narrator who addresses him/herself to an implicit reader, a scenario of distancing is installed, in which orality resources may be out of place or, if they occur, may have specific functions. This text pays special attention to the interactive scenario set by the interaction between I/you, showing, in different examples, traits of orality determined by such interaction, and also the various degrees of proximity that this interaction may reveal in its various manifestations.

  2. ELLIPT2D: A Flexible Finite Element Code Written in Python

    International Nuclear Information System (INIS)

    Pletzer, A.; Mollis, J.C.

    2001-01-01

    The use of the Python scripting language for scientific applications, and in particular to solve partial differential equations, is explored. It is shown that Python's rich data structures and object-oriented features can be exploited to write programs that are not only significantly more concise than their counterparts written in Fortran, C or C++, but are also numerically efficient. To illustrate this, a two-dimensional finite element code (ELLIPT2D) has been written. ELLIPT2D provides a flexible and easy-to-use framework for solving a large class of second-order elliptic problems. The program allows for structured or unstructured meshes. All functions defining the elliptic operator are user supplied, and so are the boundary conditions, which can be of Dirichlet, Neumann or Robbins type. ELLIPT2D makes extensive use of dictionaries (hash tables) as a way to represent sparse matrices. Other key features of the Python language that have been widely used include: operator overloading, error handling, array slicing, and the Tkinter module for building graphical user interfaces. As an example of the utility of ELLIPT2D, a nonlinear solution of the Grad-Shafranov equation is computed using a Newton iterative scheme. A second application focuses on a solution of the toroidal Laplace equation coupled to a magnetohydrodynamic stability code, a problem arising in the context of magnetic fusion research.
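
    To illustrate the dictionary-based sparse-matrix idea mentioned in the abstract, here is a minimal dictionary-of-keys sketch; the class and method names are illustrative, not ELLIPT2D's actual API.

```python
# Minimal sketch of a dictionary-of-keys (DOK) sparse matrix, the data
# structure the abstract describes; names are illustrative, not ELLIPT2D's.

class DokMatrix:
    """Sparse matrix storing only nonzero entries in a dict keyed by (row, col)."""

    def __init__(self, shape):
        self.shape = shape
        self.data = {}  # (i, j) -> value; absent keys are implicit zeros

    def __setitem__(self, key, value):
        if value != 0.0:
            self.data[key] = value
        else:
            self.data.pop(key, None)

    def __getitem__(self, key):
        return self.data.get(key, 0.0)

    def matvec(self, x):
        """Multiply by a dense vector, touching only stored entries."""
        y = [0.0] * self.shape[0]
        for (i, j), a_ij in self.data.items():
            y[i] += a_ij * x[j]
        return y

# Usage example: assemble a 1D Laplacian stencil.
n = 5
A = DokMatrix((n, n))
for i in range(n):
    A[i, i] = 2.0
    if i > 0:
        A[i, i - 1] = -1.0
    if i < n - 1:
        A[i, i + 1] = -1.0
print(A.matvec([1.0] * n))
```

    A dict keyed by (row, col) makes finite element assembly convenient: entries can be created or incremented in any order, and only nonzeros are stored.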

  3. Oral and Written Expression in Children With Reading Comprehension Difficulties.

    Science.gov (United States)

    Carretti, Barbara; Motta, Eleonora; Re, Anna Maria

    2016-01-01

    Several studies have highlighted that children with reading comprehension difficulties also have problems in tasks that involve telling a story, in writing or verbally. The main differences identified regard poor comprehenders' lower level of coherence in their productions by comparison with good comprehenders. Only one study has compared poor and good comprehenders' performance in both modalities (oral and written), however, to see whether these modalities differently influence poor comprehenders' performance. We qualitatively and quantitatively compared the performance of good and poor comprehenders in oral and written narrative tasks with the aim of shedding light on this issue. Regression analyses were also used to explore the role of working memory and vocabulary in explaining individual differences. Our results showed that the two groups produced narratives of comparable length, with similar percentages of spelling mistakes, whereas they differed in terms of the quality of their narratives, regardless of the modality. These differences were qualified by analyzing the children's use of connective devices, and poor comprehenders were found to use a higher proportion of additive devices than good comprehenders. Regression analyses showed that working memory (particularly the intrusion errors measure) explained a modest part of the qualitative differences in narrative production. Implications for our theoretical understanding of poor comprehenders' profiles and education are discussed. © Hammill Institute on Disabilities 2014.

  4. Assessment of Written Expression Skills of University Students in Terms of Text Completion Technique

    Directory of Open Access Journals (Sweden)

    Abdulkadir KIRBAŞ

    2017-12-01

    Full Text Available Writing is to transfer visualised ideas onto paper. Writing, one of the language skills, is a significant tool of communication which provides the permanency of information conveying emotions and thoughts. Since writing has both cognitive and physical aspects, it is the hardest and the last language skill to improve. Studies show that writing is the skill with which students have the most difficulty. In higher education, in order to improve students' writing skills and convey basic information and skills about writing, lessons in written expression, composition and writing education are taught both in the department of Turkish Language and Literature and in the departments of Turkish Language in the Faculties of Education. One of the aims of these lessons is to teach students written expression techniques together with their purposes and practices. One of the written expression techniques is text completion, a skill that improves students' creativity and enhances their imaginary world. The purpose of this study is to assess students' skills in using the text completion technique with reference to the writing studies of students in higher education. The sample of the study consists of 85 college students studying in the department of Turkish Language and Literature at Gümüşhane University in the 2016-2017 academic year. The data of the study were obtained from the written expression studies of the students. The introduction part of the article ‘On Reading’ by F. Bacon was given to the students and they were required to complete the text. A ‘Text Completion Rating Scale in Written Expression’ was developed to assess the data of the study by taking the opinions of lecturers and Turkish education experts. The data of the study were presented with percentage and frequency rates. At the end of the study, it was concluded that students had weaknesses in some skills such as writing an effective body part about the topic given

  5. Visualization and analysis of atomistic simulation data with OVITO–the Open Visualization Tool

    International Nuclear Information System (INIS)

    Stukowski, Alexander

    2010-01-01

    The Open Visualization Tool (OVITO) is a new 3D visualization software designed for post-processing atomistic data obtained from molecular dynamics or Monte Carlo simulations. Unique analysis, editing and animation functions are integrated into its easy-to-use graphical user interface. The software is written in object-oriented C++, controllable via Python scripts and easily extendable through a plug-in interface. It is distributed as open-source software and can be downloaded from the website http://ovito.sourceforge.net/
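
    Since the abstract highlights control via Python scripts, a minimal batch-processing sketch follows. It assumes the present-day ovito Python package, whose scripting interface postdates the 2010 paper; the trajectory file name is a placeholder.

```python
# Minimal sketch, assuming the modern "ovito" Python package; the scripting
# interface shown here postdates the 2010 paper. File name is a placeholder.
from ovito.io import import_file
from ovito.modifiers import CommonNeighborAnalysisModifier

pipeline = import_file("dump.lammpstrj")                     # load an MD trajectory
pipeline.modifiers.append(CommonNeighborAnalysisModifier())  # classify local structure

data = pipeline.compute(frame=0)  # run the pipeline on the first frame
print(data.attributes)            # per-frame results, e.g. CNA structure counts
```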

  6. UV spectral fingerprinting and analysis of variance-principal component analysis: a useful tool for characterizing sources of variance in plant materials.

    Science.gov (United States)

    Luthria, Devanand L; Mukhopadhyay, Sudarsan; Robbins, Rebecca J; Finley, John W; Banuelos, Gary S; Harnly, James M

    2008-07-23

    UV spectral fingerprints, in combination with analysis of variance-principal components analysis (ANOVA-PCA), can differentiate between cultivars and growing conditions (or treatments) and can be used to identify sources of variance. Broccoli samples, composed of two cultivars, were grown under seven different conditions or treatments (four levels of Se-enriched irrigation waters, organic farming, and conventional farming with 100 and 80% irrigation based on crop evaporation and transpiration rate). Freeze-dried powdered samples were extracted with methanol-water (60:40, v/v) and analyzed with no prior separation. Spectral fingerprints were acquired for the UV region (220-380 nm) using a 50-fold dilution of the extract. ANOVA-PCA was used to construct subset matrices that permitted easy verification of the hypothesis that cultivar and treatment contributed to a difference in the chemical expression of the broccoli. The sums of the squares of the same matrices were used to show that cultivar, treatment, and analytical repeatability contributed 30.5, 68.3, and 1.2% of the variance, respectively.
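
    A minimal numerical sketch of the ANOVA-PCA idea follows: mean-center the spectra, build per-factor mean matrices, attribute sums of squares to each source of variance, and run PCA on a factor-plus-residual matrix. All data and names below are synthetic placeholders, not the broccoli dataset.

```python
# Hedged sketch of ANOVA-PCA: partition a spectral data matrix into factor
# mean matrices (cultivar, treatment) plus residuals, then run PCA.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
n_samples, n_wavelengths = 60, 160           # stand-in for 220-380 nm fingerprints
X = rng.normal(size=(n_samples, n_wavelengths))
cultivar = rng.integers(0, 2, n_samples)     # two cultivars
treatment = rng.integers(0, 7, n_samples)    # seven growing conditions

Xc = X - X.mean(axis=0)                      # remove the grand mean

def factor_means(Xc, labels):
    """Per-level mean spectra, broadcast back onto the sample rows."""
    M = np.zeros_like(Xc)
    for level in np.unique(labels):
        idx = labels == level
        M[idx] = Xc[idx].mean(axis=0)
    return M

M_cult = factor_means(Xc, cultivar)
M_treat = factor_means(Xc - M_cult, treatment)
residual = Xc - M_cult - M_treat

# Percent of total (mean-centered) variance attributed to each source,
# analogous to the 30.5/68.3/1.2% split reported in the abstract.
total_ss = (Xc ** 2).sum()
for name, M in [("cultivar", M_cult), ("treatment", M_treat), ("residual", residual)]:
    print(name, 100 * (M ** 2).sum() / total_ss)

# PCA on a factor matrix plus residuals shows whether the levels separate.
scores = PCA(n_components=2).fit_transform(M_treat + residual)
print(scores.shape)
```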

  7. STUDY OF SOURCE FINDING TOOLS IN GASKAP SURVEY%GASKAP大规模巡天天区中目标源查找工具的比较研究

    Institute of Scientific and Technical Information of China (English)

    吴丹; 田文武; 朱辉

    2014-01-01

    The SKA project and SKA precursor projects such as MeerKAT, APERTIF, ASKAP and China's FAST will lead radio astronomy into a new era of vast amounts of data. These vast amounts of data require researchers to collect, manage, analyze and store them, so computer-controlled automatic classification and filtering as a pretreatment step is imminent. Work is in progress on simulating large amounts of data to select candidate sources. The source finding tools currently used in GASKAP (the ASKAP Galactic plane spectral-line survey) include Duchamp, Clumpfind and Selavy. We compare the performance of these tools in order to improve the efficiency of source finding.

  8. A Coupling Tool for Parallel Molecular Dynamics-Continuum Simulations

    KAUST Repository

    Neumann, Philipp; Tchipev, Nikola

    2012-01-01

    We present a tool for coupling Molecular Dynamics and continuum solvers. It is written in C++ and is meant to support the developers of hybrid molecular - continuum simulations in terms of both realisation of the respective coupling algorithm

  9. A Feynman graph selection tool in GRACE system

    International Nuclear Information System (INIS)

    Yuasa, Fukuko; Ishikawa, Tadashi; Kaneko, Toshiaki

    2001-01-01

    We present a Feynman graph selection tool grcsel, which is an interpreter written in C language. In the framework of GRACE, it enables us to get a subset of Feynman graphs according to given conditions

  10. Electronic circuit design with HEP computational tools

    International Nuclear Information System (INIS)

    Vaz, Mario

    1996-01-01

    CPSPICE is an electronic circuit statistical simulation program developed to run in a parallel environment under the UNIX operating system and TCP/IP communications protocol, using CPS - Cooperative Processes Software, the SPICE program and the CERNLIB software package. It is part of a set of tools being developed, intended to help electronic engineers to design, model and simulate complex systems and circuits for High Energy Physics detectors, based on statistical methods, using the same software and methodology used by HEP physicists for data analysis. CPSPICE simulates electronic circuits by the Monte Carlo method, through several different processes running SPICE simultaneously on UNIX parallel computers or workstation farms. Data transfer between CPS processes for a modified version of SPICE2G6 is done through RAM memory, but can also be done through hard disk files if no source files are available for the simulator, and for bigger simulation output files. Simulation results are written to an HBOOK file as an NTUPLE, to be examined by HBOOK in batch mode or graphically, and analyzed by the statistical procedures available. The HBOOK file can be stored on hard disk for small amounts of data, or in an Exabyte tape file for large amounts of data. HEP tools also help circuit or component modeling, like the MINUIT program from CERNLIB, which implements the Nelder and Mead Simplex and Gradient (with or without derivatives) algorithms, and can be used for design optimization. This paper presents the CPSPICE program implementation. The scheme adopted is suitable for parallelizing other electronic circuit simulators. (author)

  11. BioSimplify: an open source sentence simplification engine to improve recall in automatic biomedical information extraction.

    Science.gov (United States)

    Jonnalagadda, Siddhartha; Gonzalez, Graciela

    2010-11-13

    BioSimplify is an open source tool written in Java that introduces and facilitates the use of a novel model for sentence simplification tuned for automatic discourse analysis and information extraction (as opposed to sentence simplification for improving human readability). The model is based on a "shot-gun" approach that produces many different (simpler) versions of the original sentence by combining variants of its constituent elements. This tool is optimized for processing biomedical scientific literature such as the abstracts indexed in PubMed. We tested our tool on its impact on the task of PPI extraction and it improved the f-score of the PPI tool by around 7%, with an improvement in recall of around 20%. The BioSimplify tool and test corpus can be downloaded from https://biosimplify.sourceforge.net.
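
    The "shot-gun" idea, generating many simpler variants of one sentence, can be illustrated with a toy sketch; this is a conceptual illustration only, not BioSimplify's actual algorithm or API (BioSimplify itself is written in Java).

```python
# Toy illustration of the "shot-gun" simplification idea: emit the core
# clause joined with every subset of optional modifiers. Not BioSimplify's
# actual algorithm; sentence fragments are placeholders.
from itertools import combinations

def shotgun_variants(core, optional):
    """Yield the core clause combined with every subset of optional modifiers."""
    for r in range(len(optional) + 1):
        for subset in combinations(optional, r):
            yield " ".join([core, *subset]).strip()

core = "BRCA1 interacts with BARD1"
optional = ["in human cells,", "as shown by co-immunoprecipitation,"]
for variant in shotgun_variants(core, optional):
    print(variant)
```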

  12. Testing tool for software concerning nuclear power plant safety

    International Nuclear Information System (INIS)

    Boulc'h, J.; Le Meur, M.; Collart, J.M.; Segalard, J.; Uberschlag, J.

    1984-11-01

    In the present case, the software to be analyzed is all written in assembler language. This paper presents the study and the realization of a tool to analyze software which has an important role in nuclear reactor protection and safeguard systems: principles of the tool design, working principle, realization and evolution of the dynamic analysis tool [fr

  13. Documents written by the heads of the Catechetical School in Alexandria: From Mark to Clement

    Directory of Open Access Journals (Sweden)

    Willem H. Oliver

    2017-01-01

    Full Text Available The Catechetical School in Alexandria delivered a number of prolific scholars and writers during the first centuries of the Common Era, up to its demise by the end of the 4th century. These scholars produced an extensive collection of documents, of which not many are extant. Fortunately, there are many references to these documents, supplying us with an idea of their content. As the author could not find one single source containing all the documents written by the heads of the School, he deemed it necessary to list these documents, together with a short discussion where possible. This article only discusses the writings of the following heads: Mark the Evangelist, Athenagoras, Pantaenus and Clement, covering the period between approximately 40 CE and the end of the 2nd century. The follow-up article discusses the documents of the heads who succeeded them. Intradisciplinary and/or interdisciplinary implications: The potential result of the proposed research is a full, detailed list of all the documents written by the heads of the School in Alexandria. The disciplines involved are (Church) History, Theology and Antiquity. These results will make it easier for future researchers to work on these writers.

  14. GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom

    Science.gov (United States)

    Rector, Travis A.; Vogt, Nicole P.

    2018-06-01

    Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes).Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database.We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).

  15. In situ quantification of Br and Cl in minerals and fluid inclusions by LA-ICP-MS: a powerful tool to identify fluid sources

    Science.gov (United States)

    Hammerli, Johannes; Rusk, Brian; Spandler, Carl; Emsbo, Poul; Oliver, Nicholas H.S.

    2013-01-01

    Bromine and chlorine are important halogens for fluid source identification in the Earth's crust, but until recently we lacked routine analytical techniques to determine the concentration of these elements in situ on a micrometer scale in minerals and fluid inclusions. In this study, we evaluate the potential of in situ Cl and Br measurements by LA-ICP-MS through analysis of a range of scapolite grains with known Cl and Br concentrations. We assess the effects of varying spot sizes and variable plasma energy, and resolve the contribution of polyatomic interferences on Br measurements. Using well-characterised natural scapolite standards, we show that LA-ICP-MS analysis allows measurement of Br and Cl concentrations in scapolite and fluid inclusions as small as 16 μm in diameter, and potentially in sodalite and a variety of other minerals, such as apatite, biotite, and amphibole. As a demonstration of the accuracy and potential of Cl and Br analyses by LA-ICP-MS, we analysed natural fluid inclusions hosted in sphalerite and compared them to crush-and-leach ion chromatography Cl/Br analyses. The limit of detection for Br is ~8 μg g⁻¹, whereas relatively high Cl concentrations (>500 μg g⁻¹) are required for quantification by LA-ICP-MS. In general, our LA-ICP-MS fluid inclusion results agree well with ion chromatography (IC) data. Additionally, combined cathodoluminescence and LA-ICP-MS analyses on natural scapolites within a well-studied regional metamorphic suite in South Australia demonstrate that Cl and Br can be quantified with a ~25 μm resolution in natural minerals. This technique can be applied to resolve a range of hydrothermal geology problems, including determining the origins of ore-forming brines and ore deposition processes, mapping metamorphic and hydrothermal fluid provinces and pathways, and constraining the effects of fluid–rock reactions and fluid mixing.

  16. User Friendly Open GIS Tool for Large Scale Data Assimilation - a Case Study of Hydrological Modelling

    Science.gov (United States)

    Gupta, P. K.

    2012-08-01

    Open source software (OSS) coding has tremendous advantages over proprietary software. These are primarily fuelled by high-level programming languages (JAVA, C++, Python etc.) and open source geospatial libraries (GDAL/OGR, GEOS, GeoTools etc.). Quantum GIS (QGIS) is a popular open source GIS package, which is licensed under the GNU GPL and is written in C++. It allows users to perform specialised tasks by creating plugins in C++ and Python. This research article emphasises exploiting this capability of QGIS to build and implement plugins across multiple platforms using the easy-to-learn Python programming language. In the present study, a tool has been developed to assimilate large spatio-temporal datasets such as national-level gridded rainfall, temperature, topographic (digital elevation model, slope, aspect), landuse/landcover and multi-layer soil data for input into hydrological models. At present this tool has been developed for the Indian sub-continent. An attempt is also made to use popular scientific and numerical libraries to create custom applications for digital inclusion. In hydrological modelling, calibration and validation are important steps which are repetitively carried out for the same study region. The developed tool is user friendly and can be used efficiently for these repetitive processes by reducing the time required for data management and handling. Moreover, it was found that the developed tool can easily assimilate large datasets in an organised manner.
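
    As a hedged illustration of the plugin mechanism the abstract builds on, here is a minimal QGIS Python plugin skeleton using the standard classFactory/initGui/unload entry points; the class name and run() body are placeholders, not the authors' tool.

```python
# Minimal QGIS Python plugin skeleton (QGIS 3 plugin conventions).
# The plugin class and the run() body are illustrative placeholders.
from qgis.PyQt.QtWidgets import QAction


def classFactory(iface):
    """QGIS calls this with the QgisInterface when the plugin loads."""
    return DataAssimilationPlugin(iface)


class DataAssimilationPlugin:
    def __init__(self, iface):
        self.iface = iface
        self.action = None

    def initGui(self):
        self.action = QAction("Assimilate gridded data", self.iface.mainWindow())
        self.action.triggered.connect(self.run)
        self.iface.addToolBarIcon(self.action)

    def unload(self):
        self.iface.removeToolBarIcon(self.action)

    def run(self):
        # Placeholder: load rainfall/temperature grids, clip to the basin,
        # and export them in the hydrological model's input format.
        self.iface.messageBar().pushMessage("Data assimilation started")
```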

  17. Introduction of CLIL approach in Sociological Doctoral Programmes: the Ethnolinguistic Focus on Theses Written in Russian or in English

    Directory of Open Access Journals (Sweden)

    Maria Pavenkova

    2016-06-01

    Full Text Available The paper examines some limits of the introduction of the SFL-based CLIL approach in non-western sociological doctoral programmes. The author focuses specifically on pragmatic markers as tools for the structuring of written scientific discourse and uses this approach to identify the differences between Russian and English academic genres. Data was collected from doctoral theses in Russian and in English from the field of sociology of management. It is shown that the average number of pragmatic markers per 1,000 words is 3.81 in Russian theses and 2.27 in doctoral theses written in English. The author suggests that these variations are associated with the structure and goals of a scholarly paper. English academic genres are more empirical, whereas Russian ones focus on the development of theory.

  18. Content validation applied to job simulation and written examinations

    International Nuclear Information System (INIS)

    Saari, L.M.; McCutchen, M.A.; White, A.S.; Huenefeld, J.C.

    1984-08-01

    The application of content validation strategies in work settings have become increasingly popular over the last few years, perhaps spurred by an acknowledgment in the courts of content validation as a method for validating employee selection procedures (e.g., Bridgeport Guardians v. Bridgeport Police Dept., 1977). Since criterion-related validation is often difficult to conduct, content validation methods should be investigated as an alternative for determining job related selection procedures. However, there is not yet consensus among scientists and professionals concerning how content validation should be conducted. This may be because there is a lack of clear cut operations for conducting content validation for different types of selection procedures. The purpose of this paper is to discuss two content validation approaches being used for the development of a licensing examination that involves a job simulation exam and a written exam. These represent variations in methods for applying content validation. 12 references

  19. Optical Music Recognition for Scores Written in White Mensural Notation

    Directory of Open Access Journals (Sweden)

    Lorenzo J. Tardón

    2009-01-01

    Full Text Available An Optical Music Recognition (OMR) system especially adapted for handwritten musical scores of the XVII-th and the early XVIII-th centuries written in white mensural notation is presented. The system performs a complete sequence of analysis stages: the input is the RGB image of the score to be analyzed and, after a preprocessing that returns a black and white image with corrected rotation, the staves are processed to return a score without staff lines; then, a music symbol processing stage isolates the music symbols contained in the score and, finally, the classification process starts to obtain the transcription in a suitable electronic format so that it can be stored or played. This work will help to preserve our cultural heritage by keeping the musical information of the scores in a digital format that also gives the possibility to perform and distribute the original music contained in those scores.

  20. Synergistic relationships between Analytical Chemistry and written standards.

    Science.gov (United States)

    Valcárcel, Miguel; Lucena, Rafael

    2013-07-25

    This paper describes the mutual impact of Analytical Chemistry and several international written standards (norms and guides) related to knowledge management (CEN-CWA 14924:2004), social responsibility (ISO 26000:2010), management of occupational health and safety (OHSAS 18001/2), environmental management (ISO 14001:2004), quality management systems (ISO 9001:2008) and requirements of the competence of testing and calibration laboratories (ISO 17025:2004). The intensity of this impact, based on a two-way influence, is quite different depending on the standard considered. In any case, a new and fruitful approach to Analytical Chemistry based on these relationships can be derived. Copyright © 2013 Elsevier B.V. All rights reserved.

  1. How diaries written for critically ill influence the relatives

    DEFF Research Database (Denmark)

    Nielsen, Anne Højager; Angel, Sanne

    2016-01-01

    BACKGROUND: Diaries written by nurses for the critically ill patient help relatives cope and support the patient. When relatives participate in writing a diary for the critically ill, patients appreciate it. Furthermore, the diary may reduce post-traumatic stress disorder, anxiety and depression......-selected articles. Finally, 10 articles were included in this review, structured by the Matrix method. INCLUSION CRITERIA: (a) Original scientific work, (b) relatives' participation in and experience of the diary as subject and (c) diaries studied in an intensive care unit setting. FINDINGS: Relatives were given...... instructions on how to write in the diary. They expressed strong feelings in the diary in a very different way than health care staff. The relatives used the diary themselves to gain understanding and to cope. The diary has been shown to prevent post-traumatic stress symptoms. CONCLUSION: The relatives express......

  2. Effect of written presentation on performance in introductory physics

    Directory of Open Access Journals (Sweden)

    Shawn Ballard

    2010-10-01

    Full Text Available This study examined the written work of students in the introductory calculus-based electricity and magnetism course at the University of Arkansas. The students’ solutions to hourly exams were divided into a small set of countable features organized into three major categories: mathematics, language, and graphics. Each category was further divided into subfeatures. The total number of features alone explained more than 30% of the variance in exam scores and from 9% to 15% of the variance in conceptual posttest scores. If all features and subfeatures are used, between 44% and 49% of the variance in exam scores is explained and between 22% and 28% of the variance in conceptual posttest scores. The use of language is consistently positively correlated with both exam performance and conceptual understanding.
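
    A minimal sketch of this kind of analysis, regressing exam scores on counted solution features and reading off the variance explained (R^2), is shown below; the data are synthetic placeholders, not the study's.

```python
# Hedged sketch: regress exam scores on counted solution features and report
# the variance explained. Synthetic placeholder data, not the study's.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(1)
n = 200
# Columns: counts of mathematics, language, and graphics features per exam.
features = rng.poisson(lam=[12, 8, 3], size=(n, 3)).astype(float)
exam_score = 40 + 1.5 * features.sum(axis=1) + rng.normal(0, 10, n)

model = LinearRegression().fit(features, exam_score)
print("variance explained (R^2):", model.score(features, exam_score))
```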

  3. How Primary Education students organize the planning of the written

    Directory of Open Access Journals (Sweden)

    José Luis Gallego Ortega

    2013-05-01

    Full Text Available The present paper deals with a study aimed at providing an insight into how Primary Education students plan their written expression. The exploratory and qualitative investigation resorts to the “collective case study”; the “cognitive interview” is used to obtain the data, and “content analysis” is applied in their interpretation. This research has allowed us to identify the process of content organization, how students carry out these operations and the main difficulties found in it. The results of the research show that, although these students normally carry out the operations of this textual organization process, they have important difficulties regarding the general planning of the ideas in a text. Finally, this article gives some guidelines for the teaching of writing structure to Primary Education students.

  4. Thermomagnetically written domains in TbFeCo thin films

    International Nuclear Information System (INIS)

    Reim, W.; Weller, D.

    1988-01-01

    Characteristic features of thermomagnetically written domains in amorphous Tb_x(Fe_90Co_10)_(100-x) alloy thin films having different magnetic properties are reported. In particular, the writing process in materials with low Tb content (x ≤ 21), dominated by the demagnetizing field, is compared to the bias-field-dominated process in Tb-rich samples (22 ≤ x ≤ 25). Domain wall movement over lateral dimensions of the bit size is found for Tb-poor materials, while for x ≥ 22 domain boundaries are primarily determined by the area heated up to the Curie temperature. The importance of mechanical stress on domain formation and irreversible changes of the storage medium due to overheating in the writing process are reported

  5. Source reliability in auditory health persuasion : Its antecedents and consequences

    NARCIS (Netherlands)

    Elbert, Sarah P.; Dijkstra, Arie

    2015-01-01

    Persuasive health messages can be presented through an auditory channel, thereby enhancing the salience of the source, making it fundamentally different from written or pictorial information. We focused on the determinants of perceived source reliability in auditory health persuasion by

  6. DEVELOPMENT OF COMPLEXITY, ACCURACY, AND FLUENCY IN HIGH SCHOOL STUDENTS’ WRITTEN FOREIGN LANGUAGE PRODUCTION

    Directory of Open Access Journals (Sweden)

    Bouchaib Benzehaf

    2016-11-01

    Full Text Available The present study aims to longitudinally depict the dynamic and interactive development of Complexity, Accuracy, and Fluency (CAF) in multilingual learners’ L2 and L3 writing. The data sources include free writing tasks written in L2 French and L3 English by 45 high school participants over a period of four semesters. CAF dimensions are measured using a variation of Hunt’s T-units (1964). Analysis of the quantitative data obtained suggests that CAF measures develop differently for learners’ L2 French and L3 English. They increase more persistently in L3 English, and they display the characteristics of a dynamic, non-linear system characterized by ups and downs, particularly in L2 French. In light of the results, we suggest more and denser longitudinal data to explore the nature of interactions between these dimensions in foreign language development, particularly at the individual level.

  7. Written Mathematical Traditions in Ancient Mesopotamia: Knowledge, ignorance, and reasonable guesses

    DEFF Research Database (Denmark)

    Høyrup, Jens

    Writing, as well as various mathematical techniques, was created in proto-literate Uruk in order to serve accounting, and Mesopotamian mathematics as we know it was always expressed in writing. In so far, mathematics generically regarded was always part of the generic written tradition. However......, once we move away from the generic perspective, things become much less easy. If we look at basic numeracy from Uruk IV until Ur III, it is possible to point to continuity and thus to a “tradition”, and also if we look at place-value practical computation from Ur III onward – but already the relation...... of the latter tradition to type of writing after the Old Babylonian period is not well elucidated by the sources. Much worse, however, is the situation if we consider the sophisticated mathematics created during the Old Babylonian period. Its connection to the school institution and the new literate style......

  8. Mobile devices tools and technologies

    CERN Document Server

    Collins, Lauren

    2015-01-01

    Mobile Devices: Tools and Technologies provides readers with an understanding of the mobile landscape available to app developers, system and network engineers, and the avid techie. As the trend of mobile technology has enabled the continuous development of ubiquitous applications, this book offers insights into tools and technologies critical to evaluating and implementing mobile strategies.The book is organized into four parts of 18 contributed chapters written by engineers in the areas of application and database development, mobile enterprise strategy, and networking and security. Througho

  9. Developmental perspectives in written language and literacy: In honor of Ludo Verhoeven

    NARCIS (Netherlands)

    Segers, P.C.J.; Broek, P.W. van den

    2017-01-01

    Research on the development on written language and literacy is inherently multidisciplinary. In this book, leading researchers studying brain, cognition and behavior, come together in revealing how children develop written language and literacy, why they may experience difficulties, and which

  10. APT: Aperture Photometry Tool

    Science.gov (United States)

    Laher, Russ

    2012-08-01

    Aperture Photometry Tool (APT) is software for astronomers and students interested in manually exploring the photometric qualities of astronomical images. It has a graphical user interface (GUI) which allows the image data associated with aperture photometry calculations for point and extended sources to be visualized and, therefore, more effectively analyzed. Mouse-clicking on a source in the displayed image draws a circular or elliptical aperture and sky annulus around the source and computes the source intensity and its uncertainty, along with several commonly used measures of the local sky background and its variability. The results are displayed and can be optionally saved to an aperture-photometry-table file and plotted on graphs in various ways using functions available in the software. APT is geared toward processing sources in a small number of images and is not suitable for bulk processing a large number of images, unlike other aperture photometry packages (e.g., SExtractor). However, APT does have a convenient source-list tool that enables calculations for a large number of detections in a given image. The source-list tool can be run either in automatic mode to generate an aperture photometry table quickly or in manual mode to permit inspection and adjustment of the calculation for each individual detection. APT displays a variety of useful graphs, including the image histogram, aperture slices, source scatter plot, sky scatter plot, sky histogram, radial profile, curve of growth, and aperture-photometry-table scatter plots and histograms. APT has functions for customizing calculations, including outlier rejection, pixel “picking” and “zapping,” and a selection of source and sky models. The radial-profile-interpolation source model, accessed via the radial-profile-plot panel, allows recovery of source intensity from pixels with missing data and can be especially beneficial in crowded fields.
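
    The aperture-plus-annulus measurement APT performs interactively can also be sketched programmatically. The example below uses astropy and photutils, which are not part of APT; the file name and coordinates are placeholders.

```python
# Hedged sketch: circular-aperture photometry with a sky annulus, done with
# astropy/photutils (not APT's code). File and positions are placeholders.
from astropy.io import fits
from photutils.aperture import (CircularAperture, CircularAnnulus,
                                aperture_photometry)

image = fits.getdata("field.fits")  # placeholder FITS image

position = (245.3, 188.7)                                  # source (x, y) in pixels
aperture = CircularAperture(position, r=6.0)               # source aperture
annulus = CircularAnnulus(position, r_in=10.0, r_out=15.0) # sky annulus

phot = aperture_photometry(image, [aperture, annulus])
sky_per_pixel = phot["aperture_sum_1"][0] / annulus.area
source_counts = phot["aperture_sum_0"][0] - sky_per_pixel * aperture.area
print("background-subtracted source counts:", source_counts)
```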

  11. Written Expression Performance in Adolescents with Attention-Deficit/Hyperactivity Disorder (ADHD)

    Science.gov (United States)

    DeBono, Tony; Hosseini, Armita; Cairo, Cassandra; Ghelani, Karen; Tannock, Rosemary; Toplak, Maggie E.

    2012-01-01

    We examined written expression performance in a sample of adolescents with ADHD and subthreshold ADHD using two different strategies: examining performance on standardized measures of written expression and using other indicators of written expression developed in this study. We examined associations between standardized measures of written…

  12. How Does Dissociation between Written and Oral Forms Affect Reading: Evidence from Auxiliary Verbs in Arabic

    Science.gov (United States)

    Ibrahim, Raphiq

    2011-01-01

    In Arabic, auxiliary verbs are necessary in the written language, but absent from the oral language. This is contrary to languages such as English and French in which auxiliary verbs are mandatory in both written and oral languages. This fact was exploited to examine if dissociation between written and oral forms affects reading measures like…

  13. A prospective international cooperative information technology platform built using open-source tools for improving the access to and safety of bone marrow transplantation in low- and middle-income countries.

    Science.gov (United States)

    Agarwal, Rajat Kumar; Sedai, Amit; Dhimal, Sunil; Ankita, Kumari; Clemente, Luigi; Siddique, Sulman; Yaqub, Naila; Khalid, Sadaf; Itrat, Fatima; Khan, Anwar; Gilani, Sarah Khan; Marwah, Priya; Soni, Rajpreet; Missiry, Mohamed El; Hussain, Mohamed Hamed; Uderzo, Cornelio; Faulkner, Lawrence

    2014-01-01

    Jagriti Innovations developed a collaboration tool in partnership with the Cure2Children Foundation that has been used by health professionals in Italy, Pakistan, and India for the collaborative management of patients undergoing bone marrow transplantation (BMT) for thalassemia major since August 2008. This online open-access database covers data recording, analysis, and reporting, besides enabling knowledge exchange, telemedicine, capacity building, and quality assurance. As of February 2014, over 2400 patients have been registered and 112 BMTs have been performed with outcomes comparable to international standards, but at a fraction of the cost. This approach avoids medical emigration and contributes to local healthcare strengthening and competitiveness. This paper presents the experience and clinical outcomes associated with the use of this platform built using open-source tools and focusing on a locally pertinent tertiary care procedure-BMT. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  14. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
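
    A minimal sketch of the similarity measure described above, the peak of the normalized cross-correlation between event waveforms, is given below; the synthetic traces stand in for real seismograms, and this is not the Dendro Tool's code.

```python
# Hedged sketch of waveform similarity via normalized cross-correlation,
# the measure the abstract describes. Synthetic traces, not Dendro Tool code.
import numpy as np

def max_normalized_xcorr(a, b):
    """Peak of the normalized cross-correlation between two traces (1 = identical)."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return np.abs(np.correlate(a, b, mode="full")).max()

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 500)
source_a = np.sin(40 * t) * np.exp(-3 * t)                       # repeating source A
events = [source_a + 0.1 * rng.normal(size=t.size) for _ in range(3)]
events.append(rng.normal(size=t.size))                           # unrelated event

# Pairwise similarity matrix; values near 1 suggest a common source,
# so thresholding it groups events into clusters of similar events.
n = len(events)
sim = np.array([[max_normalized_xcorr(events[i], events[j]) for j in range(n)]
                for i in range(n)])
print(np.round(sim, 2))
```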

  15. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    Science.gov (United States)

    Docktor, Jennifer L.; Dornfeld, Jay; Frodermann, Evan; Heller, Kenneth; Hsu, Leonardo; Jackson, Koblar Alan; Mason, Andrew; Ryan, Qing X.; Yang, Jie

    2016-06-01

    Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).

  16. Assessing student written problem solutions: A problem-solving rubric with application to introductory physics

    Directory of Open Access Journals (Sweden)

    Jennifer L. Docktor

    2016-05-01

    Full Text Available Problem solving is a complex process valuable in everyday life and crucial for learning in the STEM fields. To support the development of problem-solving skills it is important for researchers and curriculum developers to have practical tools that can measure the difference between novice and expert problem-solving performance in authentic classroom work. It is also useful if such tools can be employed by instructors to guide their pedagogy. We describe the design, development, and testing of a simple rubric to assess written solutions to problems given in undergraduate introductory physics courses. In particular, we present evidence for the validity, reliability, and utility of the instrument. The rubric identifies five general problem-solving processes and defines the criteria to attain a score in each: organizing problem information into a Useful Description, selecting appropriate principles (Physics Approach), applying those principles to the specific conditions in the problem (Specific Application of Physics), using Mathematical Procedures appropriately, and displaying evidence of an organized reasoning pattern (Logical Progression).

  17. Simulation tools

    CERN Document Server

    Jenni, F

    2006-01-01

    In the last two decades, simulation tools have made a significant contribution to the great progress in the development of power electronics. Time to market was shortened and development costs were reduced drastically. Falling costs, as well as improved speed and precision, opened new fields of application. Today, continuous and switched circuits can be mixed. A comfortable number of powerful simulation tools is available, and users have to choose the one best suited to their application. Here a simple rule applies: the best available simulation tool is the one the user is already accustomed to (provided it can solve the task). Abilities, speed, user friendliness and other features are continuously being improved, even though the tools are already powerful and comfortable. This paper aims at giving the reader an insight into the simulation of power electronics. Starting with a short description of the fundamentals of a simulation tool as well as the properties of tools, several tools are presented. Starting with simplified models ...
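
    To make the point about mixing continuous and switched circuits concrete, here is a minimal fixed-step simulation of an ideal synchronous buck converter in Python. The component values, the forward-Euler integrator, and the ideal-switch model are illustrative choices, not taken from the paper.

        import numpy as np

        def simulate_buck(vin=12.0, duty=0.5, fsw=50e3, L=100e-6, C=100e-6,
                          R=10.0, dt=1e-7, t_end=2e-3):
            """Ideal synchronous buck converter: the PWM switch position
            selects between two continuous LC-filter state equations."""
            n = int(t_end / dt)
            period = 1.0 / fsw
            iL, vC = 0.0, 0.0
            vout = np.empty(n)
            for k in range(n):
                on = (k * dt) % period < duty * period  # PWM switch state
                v_sw = vin if on else 0.0
                iL += (v_sw - vC) / L * dt    # inductor current update
                vC += (iL - vC / R) / C * dt  # capacitor voltage update
                vout[k] = vC
            return vout  # settles near duty * vin = 6 V

    Dedicated simulators locate the switching instants and handle stiffness far more robustly; the sketch only shows that the switch state selects between two continuous models.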

  18. Electricity of machine tool

    International Nuclear Information System (INIS)

    Gijeon media editorial department

    1977-10-01

    This book is divided into three parts. The first part deals with electrical machines, ranging from generators to motors; the motor as the power source of a machine tool; and the electrical equipment of machine tools, such as main-circuit switches, automatic switches, knife switches and push buttons, snap switches, protection devices, timers, solenoids, and rectifiers. The second part covers wiring diagrams, including the basic electrical circuits of machine tools and the wiring diagrams of machines such as milling machines, planers, and grinding machines. The third part introduces fault diagnosis, giving practical solutions for each diagnosis and describing diagnostic methods based on voltage and resistance measurements with a tester.

  19. Effects of Written and Auditory Language-Processing Skills on Written Passage Comprehension in Middle and High School Students

    Science.gov (United States)

    Caplan, David; Waters, Gloria; Bertram, Julia; Ostrowski, Adam; Michaud, Jennifer

    2016-01-01

    The authors assessed 4,865 middle and high school students for the ability to recognize and understand written and spoken morphologically simple words, morphologically complex words, and the syntactic structure of sentences and for the ability to answer questions about facts presented in a written passage and to make inferences based on those…

  20. A plug-in to Eclipse for VHDL source codes: functionalities

    Science.gov (United States)

    Niton, B.; Poźniak, K. T.; Romaniuk, R. S.

    The paper presents an original application, written by the authors, which supports the writing and editing of source code in the VHDL language. It is a step towards fully automatic, augmented code writing for photonic and electronic systems, including systems based on FPGAs and/or DSP processors. The implementation described is based on VEditor, a free-license program; the work presented in this paper thus supplements and extends that free software. The introduction briefly characterizes the tools available on the market for aiding the design of electronic systems in VHDL, with particular attention paid to plug-ins for the Eclipse environment and the Emacs program. Detailed properties of the plug-in are then presented, including the concept behind the programming extension and the results of the formatter, refactorer, code hider, and other new additions to the VEditor program.
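
    To give a flavor of what such a formatter does, the toy Python sketch below uppercases a small, assumed subset of VHDL reserved words. The real formatter in the plug-in operates on the Eclipse editor's document model and covers the full language.

        import re

        # A handful of VHDL reserved words; the real plug-in covers many more.
        VHDL_KEYWORDS = {"entity", "architecture", "begin", "end", "signal",
                         "process", "if", "then", "else", "port", "is", "of"}

        def format_keywords(line):
            """Uppercase VHDL reserved words in one line of source code
            (toy example: ignores string literals and comments)."""
            def repl(match):
                word = match.group(0)
                return word.upper() if word.lower() in VHDL_KEYWORDS else word
            return re.sub(r"[A-Za-z_]\w*", repl, line)

        print(format_keywords("entity counter is"))  # ENTITY counter IS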