WorldWideScience

Sample records for research tools allowing

  1. Narratives and Activity Theory as Reflective Tools in Action Research

    Science.gov (United States)

    Stuart, Kaz

    2012-01-01

    Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…

  2. Spec Tool; an online education and research resource

    Science.gov (United States)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geophysics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. Because suitable software and data are neither freely available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students with initial spectral analysis. The tool is used both in academic courses and in outreach education programs, and it supports a better understanding of the theory of spectroscopy and imaging spectroscopy through 'hands-on' activity. It is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. The tool can visualize spectral signatures from the USGS spectral library as well as additional spectra collected at the EPIF, such as spectra of dunes in southern Israel and Turkmenistan. For researchers and educators, the tool allows loading locally collected samples for further analysis.
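
Among the analysis algorithms named above, spectral angle mapping is simple to sketch. The following is an illustrative Python/NumPy implementation of the spectral angle measure, not the Spec tool's actual code; the function name and sample spectra are invented for the example.

```python
import numpy as np

def spectral_angle(reference, target):
    """Spectral angle (radians) between two spectra of equal length.

    Smaller angles indicate more similar spectral shapes; the measure
    is insensitive to overall brightness scaling.
    """
    reference = np.asarray(reference, dtype=float)
    target = np.asarray(target, dtype=float)
    cos_theta = np.dot(reference, target) / (
        np.linalg.norm(reference) * np.linalg.norm(target)
    )
    # Clip to guard against floating-point values just outside [-1, 1].
    return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

# A spectrum compared with a brightness-scaled copy of itself has
# (numerically near-) zero angle, which is the point of the measure.
ref = np.array([0.2, 0.4, 0.6, 0.5])
print(spectral_angle(ref, 3.0 * ref))
```

Because only the angle between the two vectors matters, a brighter or dimmer copy of the same material maps to the same point, which is why the technique is popular for matching image pixels against library spectra.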

  3. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
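As a minimal illustration of least squares as a research tool (this sketch is not an excerpt from the book, and the data values are invented), an ordinary least-squares line fit in Python/NumPy might look like this:

```python
import numpy as np

# Fit y = b0 + b1*x by least squares; np.linalg.lstsq solves the
# least-squares problem in a numerically stable way.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 7.8, 10.1])

X = np.column_stack([np.ones_like(x), x])   # design matrix with intercept column
coef, residuals, rank, _ = np.linalg.lstsq(X, y, rcond=None)
b0, b1 = coef
print(b0, b1)  # roughly 0.0 and 2.0 for this nearly-linear data
```

Interpreting the fitted coefficients, their standard errors, and the residuals appropriately is exactly the kind of understanding the book aims to build.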

  4. "Research Tools": Tools for supporting research and publications

    OpenAIRE

    Ebrahim, Nader Ale

    2014-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with greater accuracy and efficiency. The collection is assembled as an interactive web-based mind map, titled “Research Tools”, which is updated ...

  5. Tool post modification allows easy turret lathe cutting-tool alignment

    Science.gov (United States)

    Fouts, L.

    1966-01-01

    Modified tool holder and tool post permit alignment of turret lathe cutting tools on the center of the spindle. The tool is aligned with the spindle by the holder, which is held in position by a hydraulic lock-in feature of the tool post. The tool post is used on horizontal and vertical turret lathes and other engine lathes.

  6. Scientific Visualization Tools for Enhancement of Undergraduate Research

    Science.gov (United States)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially require mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Researchers are typically required to fully understand the intricacies of the dataset and write computer programs, or to rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate scientific visualization tools are imperative in allowing students to have a meaningful and pleasant research experience while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions, where the researcher can interactively change the scales in the three dimensions, the color tables and the degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable

  7. INTELLECTUAL PROPERTY RIGHTS ISSUES FOR RESEARCH TOOLS IN BIOTECHNOLOGY RESEARCH

    Directory of Open Access Journals (Sweden)

    Rekha Chaturvedi

    2015-09-01

    Research tools are the resources researchers need to use in experimental work. In biotechnology, these can include cell lines, monoclonal antibodies, reagents, animal models, growth factors, combinatorial chemistry libraries, drugs and drug targets, clones and cloning tools (such as PCR), methods, laboratory equipment and machines, databases and computer software. Research tools therefore serve as the basis for upstream research to improve present products or processes. There are several challenges in the way of using patented research tools. IP issues with regard to research tools are important and may sometimes pose a hindrance for researchers; hence, in the case of patented research tools, IPR issues can constitute a major hurdle for technology development. In the majority of instances, research tools are made available through MTAs for academic research and for imparting education. TRIPS provides for an exception to patent rights for experimental use of patented technology in scientific research, and several countries, including India, have included this provision in their patent legislation. For commercially important work, licensing of research tools can be based on royalties or a one-time lump-sum payment. Some patent owners of important high-end research tools for the development of platform technologies create problems in licensing, which can impede research. Usually the cost of a commercially available research tool is built into its price.

  8. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation-funded PermaData project, led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P), aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly reduce the manual processing time needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage these data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps, or operations, called widgets. Each widget performs a specific operation, such as reading, multiplying by a constant, sorting, plotting, or writing data. DIT allows users to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative data manipulation and quality-control process: visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open-source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open-source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager.
    DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
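
The widget-based workflow described above can be sketched in a few lines of Python. This is a hypothetical illustration of the idea, not the actual DIT API; all names and values are invented.

```python
# Each "widget" is one operation; the user chains widgets in any order,
# mirroring DIT's read / multiply-by-constant / sort / write operations.

def read_widget(values):
    """Stand-in for a file-reading widget."""
    return list(values)

def scale_widget(data, constant):
    """Multiply every value by a constant (e.g. a unit conversion)."""
    return [v * constant for v in data]

def sort_widget(data):
    return sorted(data)

def run_pipeline(data, steps):
    """Apply a user-selected, ordered list of widget callables."""
    for step in steps:
        data = step(data)
    return data

raw = read_widget([3.5, 1.5, 2.5])
result = run_pipeline(raw, [lambda d: scale_widget(d, 10), sort_widget])
print(result)  # [15.0, 25.0, 35.0]
```

The appeal of the design is that each widget stays small and testable, while the pipeline ordering captures a scientist's iterative cleanup process as an explicit, rerunnable artifact.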

  9. Research tools | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...

  10. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools.

    Directory of Open Access Journals (Sweden)

    Meridith Blevins

    To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing observation of group-level dynamics; and (3) heat maps of levels of indicators changing over time, allowing observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration, and we hope that these tools further participation in open data standards like HICDEP by the HIV research community.

  11. Medical Community of Inquiry: A Diagnostic Tool for Learning, Assessment, and Research

    Directory of Open Access Journals (Sweden)

    Rakefet Ackerman

    2017-01-01

    Aim/Purpose: These days educators are expected to integrate technological tools into classes. Although they acquire relevant skills, they are often reluctant to use these tools. Background: We incorporated online forums for generating a Community of Inquiry (CoI) in a faculty development program. Extending the Technology, Pedagogy, and Content Knowledge (TPACK) model with Assessment Knowledge, and using content analysis of forum discourse and reflection after each CoI, we offer the Diagnostic Tool for Learning, Assessment, and Research (DTLAR). Methodology: This study spanned two cycles of a development program for medical faculty. Contribution: This study demonstrates how the DTLAR supports in-depth examination of the benefits and challenges of using CoIs for learning and teaching. Findings: Before the program, participants had little experience with, and were reluctant to use, CoIs in classes. At program completion, many were willing to adopt CoIs and appreciated the method's contribution. Both CoI discourse and reflections included positive attitudes regarding cognitive and teacher-awareness categories. However, negative attitudes regarding affective and time-consuming aspects of CoIs were exposed. Participants who experienced facilitating a CoI gained additional insights into its usefulness. Recommendations for Practitioners: The DTLAR supports analyzing the adoption of online forums for learning and teaching. Recommendations for Researchers: The DTLAR supports analyzing factors that affect the acceptance of online forums for learning and teaching. Impact on Society: While the tool was implemented in the context of medical education, it can be readily applied in other adult learning programs. Future Research: The study includes several design aspects that probably affected the improvements and challenges we found. Future research is called for providing guidelines for identifying boundary conditions and potential for further

  12. Accounting Research as a didactic tool for accounting teaching

    Directory of Open Access Journals (Sweden)

    Valeria Gisela Perez

    2016-06-01

    This paper develops a reflection on the importance of research into accounting subjects in the training of professional accountants. This importance stems from research's capacity to increase the wealth of the discipline under investigation, which can be converted into a skill and/or competence that accountants are required to demonstrate in their professional practice. Furthermore, accounting is recognized by the authors as a science in constant development, open to investigation. This constant change in knowledge motivates professionals to stay up to date, and constant updating is precisely the skill and competence that research can bring to professional training in university classrooms. The reflection is based on the study of documents developed by prestigious authors in accounting theory, teaching and research. This paper therefore concludes that research is a useful tool for professional accounting training and reinforces skills and competencies important for professional practice; it can also be conceived as a strategy for technical and educational activities that allows students to recreate knowledge, enabling the future updating that their professional practice will require. Key words: accounting research, university teaching, accounting education.

  13. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest, a novel middleware available on a Web server that allows end users to conduct miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited bioinformatics experience. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: (i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and (ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as the input set for analysis and comparison. All the tools were selected on the basis of a survey of the literature on available miRNA prediction tools. Three use cases for the tools are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.
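
Since miRQuest takes FASTA-format sequences as input, a minimal sketch of handling such input in Python may be useful. This parser is illustrative only and is not part of miRQuest; the function name and sample sequences are invented.

```python
def parse_fasta(text):
    """Parse FASTA-formatted text into a {header: sequence} dict.

    Headers start with '>'; sequence data may span multiple lines,
    which are concatenated.
    """
    records, header, parts = {}, None, []
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith(">"):
            if header is not None:
                records[header] = "".join(parts)
            header, parts = line[1:], []
        else:
            parts.append(line)
    if header is not None:
        records[header] = "".join(parts)
    return records

sample = ">seq1\nUGAGGUAGUAGGUUGUAUAGUU\n>seq2\nACGU\nACGU\n"
print(parse_fasta(sample))
# {'seq1': 'UGAGGUAGUAGGUUGUAUAGUU', 'seq2': 'ACGUACGU'}
```

In practice users would paste or upload such sequences through the miRQuest web interface rather than parse them locally.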

  14. VoiceThread as a Peer Review and Dissemination Tool for Undergraduate Research

    Science.gov (United States)

    Guertin, L. A.

    2012-12-01

    VoiceThread has been utilized in an undergraduate research methods course for peer review and final research project dissemination. VoiceThread (http://www.voicethread.com) can be considered a social media tool, as it is a web-based technology with the capacity to enable interactive dialogue. VoiceThread is an application that allows a user to place a media collection online containing images, audio, videos, documents, and/or presentations in an interface that facilitates asynchronous communication. Participants in a VoiceThread can be passive viewers of the online content or engaged commenters via text, audio, or video, with slide annotations via a doodle tool. The VoiceThread, which runs across browsers and operating systems, can be public or private for viewing and commenting, and can be embedded into any website. Although few university students are aware of the VoiceThread platform (only 10% of the students surveyed by Ng (2012)), the 2009 K-12 edition of The Horizon Report (Johnson et al., 2009) lists VoiceThread as a tool to watch because of the opportunities it provides as a collaborative learning environment. In Fall 2011, eleven students enrolled in an undergraduate research methods course at Penn State Brandywine each conducted their own small-scale research project. Upon conclusion of the projects, students were required to create a poster summarizing their work for peer review. To facilitate the peer review process outside of class, each student-created PowerPoint file was placed in a VoiceThread with private access limited to class members and the instructor. Each student was assigned to peer review five different student posters (i.e., VoiceThread images), using the audio and doodle tools to comment on formatting, clarity of content, etc. After the peer reviews were complete, the students were allowed to edit their PowerPoint poster files for a new VoiceThread.
In the new VoiceThread, students were required to video record themselves describing their research

  15. Augmented reality as a tool for linguistic research: Intercepting and manipulating multimodal interaction

    OpenAIRE

    Pitsch, Karola; Neumann, Alexander; Schnier, Christian; Hermann, Thomas

    2013-01-01

    We suggest that an Augmented Reality (AR) system for coupled interaction partners provides a new tool for linguistic research, allowing researchers to manipulate the co-participants' real-time perception and action. It encompasses novel facilities for recording heterogeneous, sensor-rich data sets that can be accessed in parallel with qualitative/manual and quantitative/computational methods.

  16. In silico regenerative medicine: how computational tools allow regulatory and financial challenges to be addressed in a volatile market.

    Science.gov (United States)

    Geris, L; Guyot, Y; Schrooten, J; Papantoniou, I

    2016-04-06

    The cell therapy market is a highly volatile one, due to the use of disruptive technologies, the current economic situation and the small size of the market. In such a market, companies as well as academic research institutes are in need of tools to advance their understanding and, at the same time, reduce their R&D costs, increase product quality and productivity, and reduce the time to market. An additional difficulty is the regulatory path that needs to be followed, which is challenging in the case of cell-based therapeutic products and should rely on the implementation of quality by design (QbD) principles. In silico modelling is a tool that allows the above-mentioned challenges to be addressed in the field of regenerative medicine. This review discusses such in silico models and focuses more specifically on the bioprocess. Three (clusters of) examples related to this subject are discussed. The first example comes from the pharmaceutical engineering field where QbD principles and their implementation through the use of in silico models are both a regulatory and economic necessity. The second example is related to the production of red blood cells. The described in silico model is mainly used to investigate the manufacturing process of the cell-therapeutic product, and pays special attention to the economic viability of the process. Finally, we describe the set-up of a model capturing essential events in the development of a tissue-engineered combination product in the context of bone tissue engineering. For each of the examples, a short introduction to some economic aspects is given, followed by a description of the in silico tool or tools that have been developed to allow the implementation of QbD principles and optimal design.

  17. Forum Theater’s potential as a Research Tool

    Directory of Open Access Journals (Sweden)

    Andrea Calsamiglia Madurga

    2016-03-01

    We present a theoretical and epistemological reflection on Forum Theater's potential as a research tool. Our involvement in social action and research has led us to a double reflection: on the limitations of qualitative research in the study of affects, and on Forum Theater's potential as a tool for research about affects. After specific experiences in action research (qualitative research on romantic love and gender violence, and the creation process of the Forum Theater piece “Is it a joke?”), we explore Forum Theater's possibilities as a research tool within the feminist epistemology framework.

  18. Moving research tools into practice: the successes and challenges in promoting uptake of classification tools.

    Science.gov (United States)

    Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter

    2018-05-01

    In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from implementation science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 speech-language pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation: Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.

  19. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  1. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity-modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented in a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open-beam radiation and multileaf collimator (MLC)-transmitted radiation. The processes performed by the software tool include: (i) reading the MLC file and the PDIP from the TPS; (ii) calculating the fraction of beam-on time during which each point in the IMRT beam is shielded by MLC leaves; (iii) interpolating correction factors from look-up tables; (iv) creating a corrected PDIP image from the product of the original PDIP and the correction factors, and writing the corrected image to file; and (v) displaying, analysing, and exporting various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
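
The correction step, multiplying the original PDIP by per-pixel correction factors interpolated from a look-up table, can be sketched in Python/NumPy. The look-up-table values and image sizes below are invented for illustration; the actual tool was written in C# and uses measured correction data.

```python
import numpy as np

# Predicted EPID image (arbitrary units) and, for each pixel, the
# fraction of beam-on time that point spends shielded by MLC leaves.
pdip = np.array([[1.00, 0.95],
                 [0.90, 0.80]])
shielded_fraction = np.array([[0.0, 0.2],
                              [0.5, 1.0]])

# Hypothetical look-up table: correction factor vs. shielded fraction.
lut_fraction = np.array([0.0, 0.5, 1.0])
lut_factor = np.array([1.00, 0.97, 0.92])

# Interpolate a correction factor per pixel, then form the corrected
# image as the element-wise product, as in step (iv) above.
factors = np.interp(shielded_fraction, lut_fraction, lut_factor)
corrected = pdip * factors
print(corrected)
```

The point of the sketch is that the correction is purely element-wise once the shielded-time fraction map is known, so it composes cleanly with whatever prediction model produced the PDIP.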

  2. Rapidly Re-Configurable Flight Simulator Tools for Crew Vehicle Integration Research and Design

    Science.gov (United States)

    Pritchett, Amy R.

    2002-01-01

    While simulation is a valuable research and design tool, the time and difficulty required to create new simulations (or re-use existing simulations) often limits their application. This report describes the design of the software architecture for the Reconfigurable Flight Simulator (RFS), which provides a robust simulation framework that allows the simulator to fulfill multiple research and development goals. The core of the architecture provides the interface standards for simulation components, registers and initializes components, and handles the communication between simulation components. The simulation components are each a pre-compiled library 'plugin' module. This modularity allows independent development and sharing of individual simulation components. Additional interfaces can be provided through the use of Object Data/Method Extensions (OD/ME). RFS provides a programmable run-time environment for real-time access and manipulation, and has networking capabilities using the High Level Architecture (HLA).
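
The component-registration core described above can be sketched as follows. This is a hypothetical Python illustration of the plug-in pattern, not RFS code (RFS components are pre-compiled library plug-ins behind standard interfaces); all names here are invented.

```python
class SimulationComponent:
    """Interface standard that every plug-in component implements."""
    def update(self, dt):
        raise NotImplementedError

class Aircraft(SimulationComponent):
    """Toy component: climbs at a constant 5 units per second."""
    def __init__(self):
        self.altitude = 0.0
    def update(self, dt):
        self.altitude += 5.0 * dt

class SimulatorCore:
    """Registers components and drives them each simulation step."""
    def __init__(self):
        self.components = []
    def register(self, component):
        self.components.append(component)
    def step(self, dt):
        for c in self.components:
            c.update(dt)

core = SimulatorCore()
plane = Aircraft()
core.register(plane)
for _ in range(10):          # 10 steps of 0.1 s = 1 s of simulated time
    core.step(0.1)
print(plane.altitude)        # ~5.0
```

Because the core only ever sees the common interface, components can be developed, shared, and swapped independently, which is the modularity the RFS architecture aims for.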

  3. Research and Development of the Solidification of Slab Ingots from Special Tool Steels

    Directory of Open Access Journals (Sweden)

    Tkadlečková M.

    2017-09-01

    The paper describes the research and development of casting and solidification of slab ingots from special tool steels by means of numerical modelling using the finite element method. The pre-processing, processing and post-processing phases of numerical modelling are outlined. Problems with determining the thermophysical properties of materials and the heat transfer between individual parts of the casting system are also discussed. Based on the grade of tool steel, the risk of final porosity is predicted. The results allowed the production technology of slab ingots to be improved, and also allowed verification of the ratio, the chamfer and the external/internal shape of the wall of the newly designed slab ingots.

  4. Development of processes allowing near real-time refinement and validation of triage tools during the early stage of an outbreak in readiness for surge: the FLU-CATs Study.

    Science.gov (United States)

    Venkatesan, Sudhir; Myles, Puja R; McCann, Gerard; Kousoulis, Antonis A; Hashmi, Maimoona; Belatri, Rabah; Boyle, Emma; Barcroft, Alan; van Staa, Tjeerd Pieter; Kirkham, Jamie J; Nguyen Van Tam, Jonathan S; Williams, Timothy J; Semple, Malcolm G

    2015-10-01

    During pandemics of novel influenza and outbreaks of emerging infections, surge in health-care demand can exceed capacity to provide normal standards of care. In such exceptional circumstances, triage tools may aid decisions in identifying people who are most likely to benefit from higher levels of care. Rapid research during the early phase of an outbreak should allow refinement and validation of triage tools so that in the event of surge a valid tool is available. The overarching study aim is to conduct a prospective near real-time analysis of structured clinical assessments of influenza-like illness (ILI) using primary care electronic health records (EHRs) during a pandemic. This abstract summarises the preparatory work, infrastructure development, user testing and proof-of-concept study. (1) In preparation for conducting rapid research in the early phase of a future outbreak, to develop processes that allow near real-time analysis of general practitioner (GP) assessments of people presenting with ILI, management decisions and patient outcomes. (2) As proof of concept: conduct a pilot study evaluating the performance of the triage tools 'Community Assessment Tools' and 'Pandemic Medical Early Warning Score' to predict hospital admission and death in patients presenting with ILI to GPs during inter-pandemic winter seasons. Prospective near real-time analysis of structured clinical assessments and anonymised linkage to data from EHRs. User experience was evaluated by semistructured interviews with participating GPs. Thirty GPs in England, Wales and Scotland, participating in the Clinical Practice Research Datalink. All people presenting with ILI. None. Study outcome is proof of concept through demonstration of data capture and near real-time analysis. Primary patient outcomes were hospital admission within 24 hours and death (all causes) within 30 days of GP assessment. Secondary patient outcomes included GP decision to prescribe antibiotics and/or influenza

  5. Evaluating research impact: the development of a ‘RESEARCH for IMPACT’ TOOL

    Directory of Open Access Journals (Sweden)

    Komla Tsey

    2016-08-01

    Full Text Available Introduction: This paper describes the development of a ‘Research for Impact’ Tool against a background of concerns about the over-researching of Aboriginal and Torres Strait Islander people’s issues without demonstrable benefits. Material and Methods: A combination of literature reviews, workshops with researchers and reflections by project team members and partners using participatory snowball techniques. Results: Assessing research impact is difficult, akin to a so-called ‘wicked problem’, but not impossible. A heuristic and collaborative approach to research that takes in the expectations of research users, those being researched and the funders of research offers a pragmatic solution to evaluating research impact. The proposed ‘Research for Impact’ Tool is based on the understanding that the value of research is to create evidence and/or products that support smarter decisions so as to improve the human condition. Research is of limited value unless the evidence produced is used to inform smarter decisions. A practical way of approaching research impact is therefore to start with the decisions confronting decision makers, whether they are government policymakers, professional practitioners or households, and the extent to which the research supports smarter decisions and the knock-on consequences of such smart decisions. Embedded at each step in the impact planning, monitoring and evaluation process is the need for Indigenous leadership and participation, capacity enhancement, collaborative partnerships and participatory learning-by-doing approaches across partners. Discussion: The tool is designed in the context of Indigenous research, but the basic idea that the way to assess research impact is to start upfront by defining the users of research and their information needs, the decisions confronting them and the extent to which research informs smarter decisions is equally applicable to research in other settings, both applied and

  6. Some tooling for manufacturing research reactor fuel plates

    International Nuclear Information System (INIS)

    Knight, R.W.

    1999-01-01

    This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces; the only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment. (author)

  8. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    Science.gov (United States)

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  9. Extending the XNAT archive tool for image and analysis management in ophthalmology research

    Science.gov (United States)

    Wahle, Andreas; Lee, Kyungmoo; Harding, Adam T.; Garvin, Mona K.; Niemeijer, Meindert; Sonka, Milan; Abràmoff, Michael D.

    2013-03-01

    In ophthalmology, various modalities and tests are utilized to obtain vital information on the eye's structure and function. For example, optical coherence tomography (OCT) is utilized to diagnose, screen, and aid treatment of eye diseases like macular degeneration or glaucoma. Such data are complemented by photographic retinal fundus images and functional tests on the visual field. However, DICOM is not yet widely used in ophthalmology, and images are frequently encoded in proprietary formats. The eXtensible Neuroimaging Archive Tool (XNAT) is an open-source, NIH-funded framework for research PACS and is in use at the University of Iowa for neurological research applications. Its use for ophthalmology was hence desirable, but posed new challenges due to data types not previously considered and the lack of standardized formats. We developed custom tools for data types not natively recognized by XNAT itself using XNAT's low-level REST API. Vendor-provided tools can be included as necessary to convert proprietary data sets into valid DICOM. Clients can access the data in a standardized format while still retaining the original format if needed by specific analysis tools. With respective project-specific permissions, results like segmentations or quantitative evaluations can be stored as additional resources to previously uploaded datasets. Applications can use our abstract-level Python or C/C++ API to communicate with the XNAT instance. This paper describes concepts and details of the designed upload script templates, which can be customized to the needs of specific projects, and the novel client-side communication API which allows integration into new or existing research applications.
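
The pattern of attaching results (e.g. segmentations) as additional resources to previously uploaded datasets maps onto XNAT's hierarchical REST path layout. A minimal sketch of building such a path; the server URL and the project, subject, experiment and resource names are invented for illustration, and a real client would PUT the file to this URL with authentication (e.g. via the `requests` library):

```python
# Illustrative sketch of XNAT-style REST paths for derived resources.
# All identifiers below are hypothetical.

def xnat_resource_url(base, project, subject, experiment, resource, filename):
    """Return the REST URL under which a derived file would be stored."""
    return (f"{base}/data/projects/{project}/subjects/{subject}"
            f"/experiments/{experiment}/resources/{resource}/files/{filename}")

url = xnat_resource_url("https://xnat.example.org", "EYE01", "S001",
                        "OCT_001", "SEG", "layers.xml")
```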

  10. Script Towards Research 2.0: The Influence of Digital and Online Tools in Academic Research

    Directory of Open Access Journals (Sweden)

    Gabriela Grosseck

    2016-07-01

    Full Text Available The new Internet technologies have infiltrated the academic environment in a stunning way, both at the individual and the institutional level. More and more teachers have started educational blogs, librarians are active on Twitter, other educational actors curate web content, students post on Instagram or Flickr, and university departments have Facebook pages and/or YouTube accounts. Today, the use of web technology has become “a legitimate activity in many areas of higher education” (Waycott, 2010) and a considerable shift to digital academic research has gradually occurred. Teachers are encouraging students to take up digital tools for research and writing, thus revealing new ways of using information and communication technologies for academic purposes and not just for socializing. The main objective of this paper is to investigate the effects of integrating diverse digital and Web 2.0 tools, resources, and OERs/MOOCs in research and in the construction of students’ academic texts. We aim to stress the increasing influence of digital and online tools in academic research and writing. Teachers, specialists, and students alike are affected by this process. In order to show how, we explore the following issues: What is Research 2.0? Which digital/online tools have we used to assist our students? What are the challenges for academic research using digital/Web 2.0 tools? And how do digital tools shape academic research?

  11. A PART OF RESEARCH METHODOLOGY COURSE: Introduction to the Research Tools

    OpenAIRE

    Ebrahim, Nader Ale

    2016-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...

  12. The Ark: a customizable web-based data management tool for health and medical research.

    Science.gov (United States)

    Bickerstaffe, Adrian; Ranaweera, Thilina; Endersby, Travis; Ellis, Christopher; Maddumarachchi, Sanjaya; Gooden, George E; White, Paul; Moses, Eric K; Hewitt, Alex W; Hopper, John L

    2017-02-15

    The Ark is an open-source web-based tool that allows researchers to manage health and medical research data for humans and animals without specialized database skills or programming expertise. The system provides data management for core research information including demographic, phenotype, biospecimen and pedigree data, in addition to supporting typical investigator requirements such as tracking participant consent and correspondence, whilst also being able to generate custom data exports and reports. The Ark is 'study generic' by design and highly configurable via its web interface, allowing researchers to tailor the system to the specific data management requirements of their study. Source code for The Ark can be obtained freely from the website https://github.com/The-Ark-Informatics/ark/ . The source code can be modified and redistributed under the terms of the GNU GPL v3 license. Documentation and a pre-configured virtual appliance can be found at the website http://sphinx.org.au/the-ark/ . adrianb@unimelb.edu.au. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  13. Using smartphones in survey research: a multifunctional tool

    OpenAIRE

    Nathalie Sonck; Henk Fernee

    2013-01-01

    Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it feasible to use smartphones as a data collection tool for social research? What are the effects on data quality? Can we also incorporate reality mining tools in the smartphone app to replace traditional...

  14. The GATO gene annotation tool for research laboratories

    Directory of Open Access Journals (Sweden)

    A. Fujita

    2005-11-01

    Full Text Available Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying several Web-accessible resources and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet or may be run locally if a large number of sequences are going to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Usually, installation and application of annotation systems require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to access it without any special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum computer free space required is 2 MB.
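
The local database described above acts as a cache of all previous annotation results, so the same sequence is only annotated remotely once. A minimal sketch of that caching pattern using SQLite; the table layout and fetch function are hypothetical, not GATO's actual schema:

```python
import sqlite3

# Sketch of a "query remote once, then serve locally" annotation cache.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE annotation (seq_id TEXT PRIMARY KEY, result TEXT)")

def annotate(seq_id, fetch_remote):
    """Return a cached annotation, querying the remote resource only once."""
    row = db.execute("SELECT result FROM annotation WHERE seq_id = ?",
                     (seq_id,)).fetchone()
    if row:
        return row[0]
    result = fetch_remote(seq_id)
    db.execute("INSERT INTO annotation VALUES (?, ?)", (seq_id, result))
    return result

calls = []
def fake_remote(seq_id):
    """Stand-in for a Web-accessible annotation resource."""
    calls.append(seq_id)
    return f"annotation-for-{seq_id}"

first = annotate("AB000123", fake_remote)
second = annotate("AB000123", fake_remote)  # served from the local cache
```

The second call never reaches `fake_remote`, mirroring how a record of previous results spares repeated queries to external services.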

  15. New tools for Content Innovation and data sharing: Enhancing reproducibility and rigor in biomechanics research.

    Science.gov (United States)

    Guilak, Farshid

    2017-03-21

    We are currently in one of the most exciting times for science and engineering as we witness unprecedented growth in our computational and experimental capabilities to generate new data and models. To facilitate data and model sharing, and to enhance reproducibility and rigor in biomechanics research, the Journal of Biomechanics has introduced a number of tools for Content Innovation to allow presentation, sharing, and archiving of methods, models, and data in our articles. The tools include an Interactive Plot Viewer, 3D Geometric Shape and Model Viewer, Virtual Microscope, Interactive MATLAB Figure Viewer, and Audioslides. Authors are highly encouraged to make use of these in upcoming journal submissions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....

  17. A Visualization Tool for Integrating Research Results at an Underground Mine

    Science.gov (United States)

    Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.

    2016-12-01

    Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.

  18. Integrating the hospital library with patient care, teaching and research: model and Web 2.0 tools to create a social and collaborative community of clinical research in a hospital setting.

    Science.gov (United States)

    Montano, Blanca San José; Garcia Carretero, Rafael; Varela Entrecanales, Manuel; Pozuelo, Paz Martin

    2010-09-01

    Research in hospital settings faces several difficulties. Information technologies and certain Web 2.0 tools may provide new models to tackle these problems, allowing for a collaborative approach and bridging the gap between clinical practice, teaching and research. We aim to gather a community of researchers involved in the development of a network of learning and investigation resources in a hospital setting. A multi-disciplinary work group analysed the needs of the research community. We studied the opportunities provided by Web 2.0 tools and finally we defined the spaces that would be developed, describing their elements, members and different access levels. WIKINVESTIGACION is a collaborative web space with the aim of integrating the management of all the hospital's teaching and research resources. It is composed of five spaces, with different access privileges. The spaces are: Research Group Space 'wiki for each individual research group', Learning Resources Centre devoted to the Library, News Space, Forum and Repositories. The Internet, and most notably the Web 2.0 movement, is introducing some overwhelming changes in our society. Research and teaching in the hospital setting will join this current and take advantage of these tools to socialise and improve knowledge management.

  19. The SPARK Tool to prioritise questions for systematic reviews in health policy and systems research: development and initial validation.

    Science.gov (United States)

    Akl, Elie A; Fadlallah, Racha; Ghandour, Lilian; Kdouh, Ola; Langlois, Etienne; Lavis, John N; Schünemann, Holger; El-Jardali, Fadi

    2017-09-04

    Groups or institutions funding or conducting systematic reviews in health policy and systems research (HPSR) should prioritise topics according to the needs of policymakers and stakeholders. The aim of this study was to develop and validate a tool to prioritise questions for systematic reviews in HPSR. We developed the tool following a four-step approach consisting of (1) the definition of the purpose and scope of the tool, (2) item generation and reduction, (3) testing for content and face validity, and (4) pilot testing of the tool. The research team involved international experts in HPSR, systematic review methodology and tool development, led by the Center for Systematic Reviews on Health Policy and Systems Research (SPARK). We followed an inclusive approach in determining the final selection of items to allow customisation to the user's needs. The purpose of the SPARK tool was to prioritise questions in HPSR in order to address them in systematic reviews. In the item generation and reduction phase, an extensive literature search yielded 40 relevant articles, which were reviewed by the research team to create a preliminary list of 19 candidate items for inclusion in the tool. As part of testing for content and face validity, input from international experts led to the refining, changing, merging and addition of new items, and to organisation of the tool into two modules. Following pilot testing, we finalised the tool, with 22 items organised in two modules - the first module including 13 items to be rated by policymakers and stakeholders, and the second including 9 items to be rated by systematic review teams. Users can customise the tool to their needs by omitting items that may not be applicable to their settings. We also developed a user manual that provides guidance on how to use the SPARK tool, along with signaling questions. We have developed and conducted initial validation of the SPARK tool to prioritise questions for systematic reviews in HPSR, along with

  20. Validation of a new assessment tool for qualitative research articles

    DEFF Research Database (Denmark)

    Schou, Lone; Høstrup, Helle; Lyngsø, Elin

    2012-01-01

    Schou L., Høstrup H., Lyngsø E.E., Larsen S. & Poulsen I. (2011) Validation of a new assessment tool for qualitative research articles. Journal of Advanced Nursing 00(0), 000-000. doi: 10.1111/j.1365-2648.2011.05898.x ABSTRACT: Aim. This paper presents the development and validation of a new...... assessment tool for qualitative research articles, which could assess trustworthiness of qualitative research articles as defined by Guba and at the same time aid clinicians in their assessment. Background. There are more than 100 sets of proposals for quality criteria for qualitative research. However, we...... is the Danish acronym for Appraisal of Qualitative Studies. Phase 1 was to develop the tool based on a literature review and on consultation with qualitative researchers. Phase 2 was an inter-rater reliability test in which 40 health professionals participated. Phase 3 was an inter-rater reliability test among...
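
Phases 2 and 3 rest on inter-rater reliability testing. A common statistic for two raters giving categorical judgements is Cohen's kappa; a minimal sketch with invented toy ratings (the study's own analysis may use a different statistic for its 40 raters):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Toy ratings of 6 articles on a yes/no quality criterion.
a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "no", "no", "no", "yes", "no"]
kappa = cohens_kappa(a, b)
```

Kappa near 0 means agreement no better than chance; values above roughly 0.6 are conventionally read as substantial agreement.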

  1. Clean Air Markets - Allowances Query Wizard

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Allowances Query Wizard is part of a suite of Clean Air Markets-related tools that are accessible at http://camddataandmaps.epa.gov/gdm/index.cfm. The Allowances...

  2. Serious Games are a Serious Tool for Team Research

    Directory of Open Access Journals (Sweden)

    Michael D. Coovert

    2017-03-01

    Full Text Available Serious games are an attractive tool for education and training, but their utility is even broader. We argue serious games provide a unique opportunity for research as well, particularly in areas where multiple players (groups or teams are involved. In our paper we provide background in several substantive areas. First, we outline major constructs and challenges found in team research. Secondly, we discuss serious games, providing an overview and description of their role in education, training, and research. Thirdly, we describe necessary characteristics for game engines utilized in team research, followed by a discussion of the value added by utilizing serious games. Our goal in this paper is to argue serious games are an effective tool with demonstrated reliability and validity and should be part of a research program for those engaged in team research. Both team researchers and those involved in serious game development can benefit from a mutual partnership which is research focused.

  3. FOSS Tools for Research Data Management

    Science.gov (United States)

    Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2017-04-01

    Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing and accessing data, in collaborating and exchanging information, and in publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires support by appropriate tools integrated in these infrastructures. Different projects provide arbitrary solutions to manage research data. Within two projects, however - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing and documenting research data, as well as for discovering, visualizing and downloading these data on the basis of standardized services, stimulated considerably by the enhanced data management approaches of Spatial Data Infrastructures (SDI). In order to fully exploit the potential of these developments for enhancing data management in the Geosciences, the publication of software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the Geosciences developed in two projects. A sort of construction kit with FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can be set up much faster, customized to individual needs, and tools can be added at run-time.

  4. Seismicity map tools for earthquake studies

    Science.gov (United States)

    Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos

    2014-05-01

    We report on the development of a new online set of tools for use within Google Maps, for earthquake research. We demonstrate this server-based, online platform (developed with PHP, Javascript, MySQL) with the new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis on earthquake data using Google Maps and to plot various seismicity graphs. The tool box has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows studying earthquake clustering and earthquake cluster shift within the segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the 'b' value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time the link between Fibonacci Numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as the platform allows calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
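
One standard way to compute the Gutenberg-Richter 'b' value mentioned above, though not necessarily the platform's own method, is Aki's maximum-likelihood estimator b = log10(e) / (mean(M) - Mc), where Mc is the magnitude of completeness. A minimal sketch with a toy catalogue:

```python
import math

def b_value(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value for events at or above m_c."""
    above = [m for m in magnitudes if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

# Toy catalogue of magnitudes; real analyses first estimate m_c from the
# catalogue itself. A b-value near 1 is typical for tectonic regions.
mags = [2.1, 2.3, 2.2, 2.8, 3.1, 2.5, 2.4, 2.0, 2.6, 3.4]
b = b_value(mags, m_c=2.0)
```

For binned magnitudes, a common refinement replaces m_c with m_c - (bin width)/2 to reduce bias.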

  5. Impact of design research on industrial practice tools, technology, and training

    CERN Document Server

    Lindemann, Udo

    2016-01-01

    Showcasing exemplars of how various aspects of design research were successfully transitioned into, and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the impact of design research on industrial practice. Chapters written by internationally acclaimed researchers of design analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform the industrial practice of engineering design, whilst the chapters written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice, as well as guidelines and platforms for successf...

  6. A Clinical Reasoning Tool for Virtual Patients: Design-Based Research Study.

    Science.gov (United States)

    Hege, Inga; Kononowicz, Andrzej A; Adler, Martin

    2017-11-02

    Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VPs) are a technology-enhanced learning method for teaching clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome rather than the process of clinical reasoning. Keeping our concept grounded in a previous qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map while they are working on a VP scenario. The student's map is compared with the expert's reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings (MeSH), a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports clinical reasoning skills acquisition. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
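
Once both maps are coded with a controlled vocabulary, comparing the learner's map with the expert's at a given stage reduces to comparing two sets of MeSH-coded nodes. A minimal sketch using Jaccard overlap; the scoring scheme and the descriptor IDs below are illustrative, not the tool's actual algorithm:

```python
def map_overlap(learner_nodes, expert_nodes):
    """Jaccard overlap between learner and expert concept-map node sets."""
    learner, expert = set(learner_nodes), set(expert_nodes)
    union = learner | expert
    return len(learner & expert) / len(union) if union else 1.0

# Nodes identified by MeSH-style descriptor IDs (hypothetical selection).
learner = {"D003920", "D006973", "D007333"}   # learner's illness script
expert  = {"D003920", "D006973", "D050171"}   # expert's reference map
score = map_overlap(learner, expert)
```

A controlled vocabulary makes such comparisons robust to wording differences: two synonymous node labels resolve to the same descriptor before the sets are intersected.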

  7. 7 CFR 3402.6 - Overview of the special international study and/or thesis/dissertation research travel allowance.

    Science.gov (United States)

    2010-01-01

    ... thesis/dissertation research travel allowance. 3402.6 Section 3402.6 Agriculture Regulations of the... GRANTS PROGRAM Program Description § 3402.6 Overview of the special international study and/or thesis... special international study or thesis/dissertation research travel allowance, the Project Director must...

  8. Drawing as a user experience research tool

    DEFF Research Database (Denmark)

    Fleury, Alexandre

    2011-01-01

    such previous work, two case studies are presented, in which drawings helped investigate the relationship between media technology users and two specific devices, namely television and mobile phones. The experiment generated useful data and opened for further consideration of the method as an appropriate HCI...... research tool....

  9. Primer on consumer marketing research : procedures, methods, and tools

    Science.gov (United States)

    1994-03-01

    The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...

  10. Three novel software tools for ASDEX Upgrade

    International Nuclear Information System (INIS)

    Martinov, S.; Löbhard, T.; Lunt, T.; Behler, K.; Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A.; Lüddecke, K.; Merkel, R.; Neu, G.; ASDEX Upgrade Team; MPCDF Garching

    2016-01-01

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third, smaller tool allows web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, with time and growing completeness they have shaped up as valuable helpers to visualize acquired data in fusion research. A short overview of the goals, the features, and the design as well as the operation of these tools is given in this paper.

  11. Three novel software tools for ASDEX Upgrade

    Energy Technology Data Exchange (ETDEWEB)

    Martinov, S. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Löbhard, T. [Conovum GmbH & Co. KG, Nymphenburger Straße 13, D-80335 München (Germany); Lunt, T. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Behler, K., E-mail: karl.behler@ipp.mpg.de [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Drube, R.; Eixenberger, H.; Herrmann, A.; Lohs, A. [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); Lüddecke, K. [Unlimited Computer Systems GmbH, Seeshaupterstr. 15, D-82393 Iffeldorf (Germany); Merkel, R.; Neu, G.; ASDEX Upgrade Team [Max-Planck-Institut für Plasmaphysik, Boltzmannstr. 2, D-85748 Garching bei München (Germany); MPCDF Garching [Max Planck Computing and Data Facility, Boltzmannstr. 2, D-85748 Garching (Germany)

    2016-11-15

    Highlights: • Key features of innovative software tools for data visualization and inspection are presented to the nuclear fusion research community. • 3D animation of experiment geometry together with diagnostic data and images allows a better understanding of measurements and of the influence of machine construction details behind them. • A multi-video viewer with fusion-relevant image manipulation abilities and event database features allows faster and better decision making from video streams coming from various plasma and machine diagnostics. • Platform-independent Web technologies enable the inspection of diagnostic raw signals with virtually any kind of display device. - Abstract: Visualization of measurements together with experimental settings is a general subject in experiment analysis. The complex engineering design, 3D geometry, and manifold of diagnostics in larger fusion research experiments justify the development of special analysis and visualization programs. Novel ASDEX Upgrade (AUG) software tools bring together virtual navigation through 3D device models and advanced play-back and interpretation of video streams from plasma discharges. A third, lightweight tool allows web-based, platform-independent observation of real-time diagnostic signals. While all three tools stem from spontaneous development ideas and are not considered mission critical for the operation of a fusion device, over time and with growing completeness they have shaped up as valuable helpers for visualizing acquired data in fusion research. A short overview of the goals, the features, the design, and the operation of these tools is given in this paper.

  12. [The use of interviews in participative intervention and research: the GAM tool as a collective interview].

    Science.gov (United States)

    Sade, Christian; de Barros, Leticia Maria Renault; Melo, Jorge José Maciel; Passos, Eduardo

    2013-10-01

    This paper seeks to assess a way of conducting interviews in line with the ideology of Brazilian Psychiatric Reform. In the methodology of participative intervention and research in mental health, the interview is less a data collection than a data harvesting procedure. It is designed to apply the principles of psychosocial care: autonomy as the basis for treatment, the predominance of the users and of their social networks, and civic participation. Inspired by the Explicitation Interview technique, we contend that the handling of the interview presupposes an open attitude able to promote and embrace different viewpoints. This attitude makes the interview a collective experience of sharing and belonging, allowing participants to reposition themselves subjectively in treatment through the emergence of groupality. As an example of using the interview as a methodological tool in mental health research, we examine research on the adaptation of the Autonomous Medication Management (GAM) tool, an interventionist approach guided by principles that foster the autonomy and protagonist status of users of psychotropic medication, their quality of life, their rights, and recognition of the multiple significances of medication. The GAM tool is understood here as a collective interview technique.

  13. Simple Tools to Facilitate Project Management of a Nursing Research Project.

    Science.gov (United States)

    Aycock, Dawn M; Clark, Patricia C; Thomas-Seaton, LaTeshia; Lee, Shih-Yu; Moloney, Margaret

    2016-07-01

    Highly organized project management facilitates rigorous study implementation. Research involves gathering large amounts of information that can be overwhelming when organizational strategies are not used. We describe a variety of project management and organizational tools used in different studies that may be particularly useful for novice researchers. The studies were a multisite study of caregivers of stroke survivors, an Internet-based diary study of women with migraines, and a pilot study testing a sleep intervention in mothers of low-birth-weight infants. Project management tools were used to facilitate enrollment, data collection, and access to results. The tools included protocol and eligibility checklists, event calendars, screening and enrollment logs, instrument scoring tables, and data summary sheets. These tools created efficiency, promoted a positive image, minimized errors, and provided researchers with a sense of control. For the studies described, there were no protocol violations, there were minimal missing data, and the integrity of data collection was maintained. © The Author(s) 2016.

  14. Are EM's communication tools effective? Evaluation research of two EM publications

    International Nuclear Information System (INIS)

    Wight, Evelyn; Gardner, Gene; Harvey, Tony

    1992-01-01

    As a reflection of its growing culture of openness, and in response to the public's need for accurate information about its activities, the U.S. Department of Energy (DOE) Office of the Assistant Secretary for Environmental Restoration and Waste Management (EM) has increased the amount of information available to the public through communication tools such as brochures, fact sheets, and a travelling exhibit with an interactive computer display. Our involvement in this effort has been to design, develop, and critique booklets, brochures, fact sheets and other communication tools for EM. This paper presents an evaluation of the effectiveness of two communication tools we developed: the EM Booklet and the EM Fact Sheets. We measured effectiveness using non-parametric testing. This paper describes DOE's culture change, EM's communication tools and their context within DOE's new open culture, our research, test methods and results, the significance of our research, and our plans for future research. (author)

  15. Analyzing HT-SELEX data with the Galaxy Project tools--A web based bioinformatics platform for biomedical research.

    Science.gov (United States)

    Thiel, William H; Giangrande, Paloma H

    2016-03-15

    The development of DNA and RNA aptamers for research as well as diagnostic and therapeutic applications is a rapidly growing field. In the past decade, the process of identifying aptamers has been revolutionized with the advent of high-throughput sequencing (HTS). However, bioinformatics tools that enable the average molecular biologist to analyze these large datasets and expedite the identification of candidate aptamer sequences have been lagging behind the HTS revolution. The Galaxy Project was developed in order to efficiently analyze genome, exome, and transcriptome HTS data, and we have now applied these tools to aptamer HTS data. The Galaxy Project's public webserver is an open source collection of bioinformatics tools that are powerful, flexible, dynamic, and user friendly. The online nature of the Galaxy webserver and its graphical interface allow users to analyze HTS data without compiling code or installing multiple programs. Herein we describe how tools within the Galaxy webserver can be adapted to pre-process, compile, filter and analyze aptamer HTS data from multiple rounds of selection. Copyright © 2015 Elsevier Inc. All rights reserved.
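    The core of aptamer HTS analysis across selection rounds is counting unique sequences and ranking them by fold-enrichment. A minimal sketch of that idea outside Galaxy, using toy reads and an assumed pseudocount for sequences unseen in the earlier round (none of this reflects the published Galaxy workflow itself):

```python
from collections import Counter

def enrichment(round_a, round_b, min_count=2):
    """Rank candidate aptamers by fold-enrichment between an early
    and a late selection round, given lists of sequenced reads."""
    a, b = Counter(round_a), Counter(round_b)
    total_a, total_b = sum(a.values()), sum(b.values())
    scores = {}
    for seq, n in b.items():
        if n < min_count:
            continue  # drop singletons (likely sequencing errors)
        freq_b = n / total_b
        freq_a = a.get(seq, 0.5) / total_a  # pseudocount for unseen reads
        scores[seq] = freq_b / freq_a
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

early = ["ACGT", "ACGT", "TTGC", "GGCC", "TTGC", "AATT"]
late  = ["ACGT", "TTGC", "TTGC", "TTGC", "TTGC", "GGCC"]
print(enrichment(early, late)[0][0])  # most enriched sequence: TTGC
```

Real pipelines add quality filtering, adapter trimming, and clustering of near-identical variants before this ranking step.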

  16. Equity Audit: A Teacher Leadership Tool for Nurturing Teacher Research

    Science.gov (United States)

    View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy

    2016-01-01

    This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…

  17. OpenPrescribing: normalised data and software tool to research trends in English NHS primary care prescribing 1998-2016.

    Science.gov (United States)

    Curtis, Helen J; Goldacre, Ben

    2018-02-23

    We aimed to compile and normalise England's national prescribing data for 1998-2016 to facilitate research on long-term time trends and create an open-data exploration tool for wider use. We compiled data from each individual year's national statistical publications and normalised them by mapping each drug to its current classification within the national formulary where possible. We created a freely accessible, interactive web tool to allow anyone to interact with the processed data. We downloaded all available annual prescription cost analysis datasets, which include cost and quantity for all prescription items dispensed in the community in England. Medical devices and appliances were excluded. We measured the extent of normalisation of data and aimed to produce a functioning accessible analysis tool. All data were imported successfully. 87.5% of drugs were matched exactly on name to the current formulary and a further 6.5% to similar drug names. All drugs in core clinical chapters were reconciled to their current location in the data schema, with only 1.26% of drugs not assigned a current chemical code. We created an openly accessible interactive tool to facilitate wider use of these data. Publicly available data can be made accessible through interactive online tools to help researchers and policy-makers explore time trends in prescribing. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2018. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
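    The exact-then-similar name matching described above can be sketched in a few lines; the drug names and chemical codes below are hypothetical placeholders, not real formulary entries, and the real mapping is far more involved:

```python
import difflib

# Hypothetical current formulary: drug name -> chemical code (placeholders)
formulary = {
    "Paracetamol": "0407010H0",
    "Ibuprofen": "1001010J0",
    "Amoxicillin": "0501013B0",
}

def normalise(historic_name, cutoff=0.8):
    """Map a historic drug name to its current formulary code:
    exact name match first, then the closest similar name, else unmatched."""
    if historic_name in formulary:
        return formulary[historic_name], "exact"
    close = difflib.get_close_matches(historic_name, list(formulary), n=1, cutoff=cutoff)
    if close:
        return formulary[close[0]], "similar"
    return None, "unmatched"

print(normalise("Paracetamol"))  # exact match
print(normalise("Paracetomol"))  # misspelling resolved as a similar match
```

Matching on similar names is exactly the step that covered the further 6.5% of drugs in the study; the remainder stays unmatched rather than being guessed.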

  18. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  19. Enabling laboratory EUV research with a compact exposure tool

    Science.gov (United States)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small batch production, where the fabrication of high resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and evaluation of pre- and post-exposure processing. The tool utilizes a partially coherent discharge-produced plasma (DPP) source, follows the Talbot lithography approach, and minimizes the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between the components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that only a 1:1 copy of the mask structure can be fully resolved with amplitude masks, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.
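    As a rough illustration of the geometry involved: in the paraxial approximation a grating of period p illuminated at wavelength λ self-images at the Talbot distance z_T = 2p²/λ, and for a source of finite spectral width Δλ the individual Talbot planes wash out into a stationary (achromatic) pattern beyond roughly 2p²/Δλ. The numbers below are purely illustrative, not the parameters of the EUVLET setup:

```python
def talbot_length(period_nm, wavelength_nm):
    """Paraxial Talbot self-imaging distance z_T = 2 * p**2 / wavelength."""
    return 2 * period_nm**2 / wavelength_nm

def achromatic_distance(period_nm, bandwidth_nm):
    """Approximate distance beyond which a source of spectral width
    d(lambda) smears out individual Talbot planes: ~2 * p**2 / d(lambda)."""
    return 2 * period_nm**2 / bandwidth_nm

# Illustrative numbers: 100 nm grating period, 13.5 nm EUV line,
# assumed 0.5 nm source bandwidth.
print(talbot_length(100, 13.5))       # ~1481 nm between Talbot planes
print(achromatic_distance(100, 0.5))  # stationary pattern beyond ~40 um
```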

  20. Research on the tool holder mode in high speed machining

    Science.gov (United States)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while also reducing processing cost, and is therefore widely regarded in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of tool holders are used in high speed precision machining, including the heat-shrinkage tool holder, the high-precision spring chuck, the hydraulic tool holder, and the three-rib deformation chuck. Among them, the heat-shrinkage tool holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is widely used. It is therefore of great significance to research the new requirements on the machining tool system. In order to meet the requirements of high speed precision machining, this paper reviews the common tool holder technologies for high precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics and existing problems of these tool clamping systems are also analyzed.

  1. New evaluation tool now available to assess research quality | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-25

    Apr 25, 2016 ... New evaluation tool now available to assess research quality ... Ratings on a scale defined by rubrics, to indicate the level at which a project ... Report: The value-for-money discourse: risks and opportunities for research for development ...

  2. LITERATURE REVIEWING WITH RESEARCH TOOLS, Part 3: Writing Literature Review

    OpenAIRE

    Ebrahim, Nader Ale

    2017-01-01

    “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...

  3. LITERATURE REVIEWING WITH RESEARCH TOOLS, Part 2: Finding proper articles

    OpenAIRE

    Ebrahim, Nader Ale

    2017-01-01

    “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...

  4. Overview of Simulation Tools for Smart Grids

    DEFF Research Database (Denmark)

    The aim of this report, “D2.1 – Overview of Simulation Tools for Smart Grids”, is to provide an overview of the different simulation tools available, i.e. developed and in use, at the different research centres. Required new tool capabilities are identified and extensions to the existing packages are indicated. An analysis of the emerging power systems challenges together with a review of the main topics regarding smart grids is provided in Chapter 1. The requirements for the simulation tools and the list of available tools in the different research centres and their main characteristics are reported in Chapter 2. The main aspects of the different tools and their purpose of analysis are listed in Chapter 3, along with the main topics concerning the new requirements for tools in order to allow a proper study in the smart grid context, together with the capability gaps and model consolidation of the analysed tools...

  5. SIMULATION TOOLS FOR ELECTRICAL MACHINES MODELLING ...

    African Journals Online (AJOL)

    Dr Obe

    ABSTRACT. Simulation tools are used both for research and teaching to allow a good ... The solution provides an easy way of determining the dynamic ... incorporate an in-built numerical algorithm, ... to learn, versatile in application, enhanced.

  6. FOSS Tools for Research Infrastructures - A Success Story?

    Science.gov (United States)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. The management of research data in particular is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective of establishing concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization, as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased maintenance effort is justified. Finally, it should help to answer the question of whether the use of FOSS for Research Infrastructures is a success story.

  7. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  8. Research and Development of Powder Brazing Filler Metals for Diamond Tools: A Review

    Directory of Open Access Journals (Sweden)

    Fei Long

    2018-05-01

    Full Text Available Powder brazing filler metals (PBFMs) feature a number of comparative advantages. Among others, these include low energy consumption, accurate dosage, good brazeability, short production time, and high production efficiency. These filler metals have been used in the aerospace, automobile, and electric appliance industries. The PBFMs are especially suitable for bonding diamond tools, which involves complex workpiece shapes and requires accurate dosage. Recent research on PBFMs for diamond tools is reviewed in this paper, and current applications are discussed. The CuSnTi and Ni-Cr-based PBFMs have been the two commonly used monolayer PBFMs, so the bonding mechanism at the interface between these monolayer PBFMs and a diamond tool is summarized first, and ways to improve the performance of monolayer PBFMs for diamond tools are analyzed. Next, research on PBFMs for impregnated diamond tools is reviewed, and the technical problems that urgently need solutions are discussed. Finally, the challenges and opportunities involved in PBFM research and development for diamond tools are summarized, and corresponding prospects are suggested.

  9. GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom

    Science.gov (United States)

    Rector, Travis A.; Vogt, Nicole P.

    2018-06-01

    Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes).Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database.We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).
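    A minimal sketch of the equivalent-width measurement such a stellar tool performs: integrate the line depth relative to the continuum over wavelength with the trapezoidal rule. The toy absorption line below stands in for a real spectrum:

```python
def equivalent_width(wavelengths, flux, continuum):
    """Equivalent width of a spectral line: integrate (1 - F/Fc) over
    wavelength (trapezoidal rule); positive for absorption lines."""
    depth = [1 - f / c for f, c in zip(flux, continuum)]
    ew = 0.0
    for i in range(len(wavelengths) - 1):
        dw = wavelengths[i + 1] - wavelengths[i]
        ew += 0.5 * (depth[i] + depth[i + 1]) * dw
    return ew

# Toy absorption line near H-alpha on a flat continuum of 1.0
wl = [6560.0, 6561.0, 6562.0, 6563.0, 6564.0]
fx = [1.0, 0.8, 0.4, 0.8, 1.0]
cont = [1.0] * 5
print(round(equivalent_width(wl, fx, cont), 2))  # 1.0 (angstroms)
```

In the GEAS tools the continuum itself comes from a blackbody or polynomial fit rather than being assumed flat as here.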

  10. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.

  11. Molecular tools for bathing water assessment in Europe: Balancing social science research with a rapidly developing environmental science evidence-base.

    Science.gov (United States)

    Oliver, David M; Hanley, Nick D; van Niekerk, Melanie; Kay, David; Heathwaite, A Louise; Rabinovici, Sharyl J M; Kinzelman, Julie L; Fleming, Lora E; Porter, Jonathan; Shaikh, Sabina; Fish, Rob; Chilton, Sue; Hewitt, Julie; Connolly, Elaine; Cummins, Andy; Glenk, Klaus; McPhail, Calum; McRory, Eric; McVittie, Alistair; Giles, Amanna; Roberts, Suzanne; Simpson, Katherine; Tinch, Dugald; Thairs, Ted; Avery, Lisa M; Vinten, Andy J A; Watts, Bill D; Quilliam, Richard S

    2016-02-01

    The use of molecular tools, principally qPCR, versus traditional culture-based methods for quantifying microbial parameters (e.g., Fecal Indicator Organisms) in bathing waters generates considerable ongoing debate at the science-policy interface. Advances in science have allowed the development and application of molecular biological methods for rapid (~2 h) quantification of microbial pollution in bathing and recreational waters. In contrast, culture-based methods can take between 18 and 96 h for sample processing. Thus, molecular tools offer an opportunity to provide a more meaningful statement of microbial risk to water-users by providing near-real-time information enabling potentially more informed decision-making with regard to water-based activities. However, complementary studies concerning the potential costs and benefits of adopting rapid methods as a regulatory tool are in short supply. We report on findings from an international Working Group that examined the breadth of social impacts, challenges, and research opportunities associated with the application of molecular tools to bathing water regulations.
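    The near-real-time character of qPCR rests on converting a measured quantification cycle (Cq) to a starting concentration via a standard curve. A minimal sketch of that conversion, with a hypothetical slope and intercept rather than values from any regulatory method:

```python
def quantify(cq, slope, intercept):
    """Convert a qPCR quantification cycle (Cq) to a starting quantity
    using a standard curve of the form Cq = slope * log10(C) + intercept."""
    return 10 ** ((cq - intercept) / slope)

# Hypothetical calibration: slope -3.32 (roughly 100% PCR efficiency),
# intercept 38.0 cycles.
print(round(quantify(28.04, -3.32, 38.0)))  # ~1000 starting copies
```

Because the whole measurement is instrument read-out plus this arithmetic, results arrive in about two hours, versus 18-96 h for culture.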

  12. Reflective Drawing as a Tool for Reflection in Design Research

    Science.gov (United States)

    Calvo, Mirian

    2017-01-01

    This article explores the role of drawing as a tool for reflection. It reports on a PhD research project that aims to identify and analyse the value that co-design processes can bring to participants and their communities. The research is associated with Leapfrog, a three-year project funded by the UK Arts and Humanities Research Council (AHRC).…

  13. A QUALITATIVE RESEARCH REGARDING THE MARKETING COMMUNICATION TOOLS USED IN THE ONLINE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    POP Nicolae Al.

    2011-07-01

    Full Text Available Starting from the meaning of the communication process in marketing, the authors try to identify its role in ensuring the continuity of the management process with regard to the long-term relationships between all the partners of the company. An emphasis is placed on the role of online communication and its tools in relationship marketing. In order to validate some of these ideas, the authors undertook a qualitative marketing research study among the managers of Romanian tourism companies. The qualitative part of the study aimed to identify the main tools which form the basis of communication with the beneficiaries of touristic services, and the way in which companies use online communication tools for attracting, keeping and developing long-term relationships with their customers in the virtual environment. The following tools were analyzed: websites, email marketing campaigns, e-newsletters, online advertising, search engines, sponsored links, blogs, RSS feeds, social networks, forums, online discussion groups, portals, infomediaries and instant messaging. The chosen investigation method was the selective survey, the research technique explorative interrogation, and the research instrument a semi-structured in-depth interview based on a conversation guide. A particularly important outcome is the ranking that resulted when respondents were asked to name the most efficient tools for attracting customers and for maintaining relationships with them. Although the notoriety of online marketing tools is high, some tools are known by definition but not used at all or not used correctly, while others are not known by definition but are used in practice.
    The authors contributed by validating a performing methodology of qualitative research, a study which will open new ways and means for making the online communication tools used for touristic services in

  14. Action Research on a WebQuest as an Instructional Tool for Writing Abstracts of Research Articles

    Directory of Open Access Journals (Sweden)

    Krismiyati Latuperissa

    2012-08-01

    Full Text Available The massive growth of and access to information technology (IT) has enabled the integration of technology into classrooms. One such integration is the use of WebQuests as an instructional tool in teaching targeted learning activities such as writing abstracts of research articles in English for English as a Foreign Language (EFL) learners. In the academic world, writing an abstract of a research paper or final project in English can be challenging for EFL students. This article presents an action research project on the process and outcomes of using a WebQuest designed to help 20 Indonesian university IT students write a research article’s abstract in English. Findings reveal that despite positive feedback, changes need to be made to make the WebQuest a more effective instructional tool for the purpose it was designed for.

  15. Genephony: a knowledge management tool for genome-wide research

    Directory of Open Access Journals (Sweden)

    Riva Alberto

    2009-09-01

    Full Text Available Abstract Background One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase in the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated with a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
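
    The annotation task described above, attaching genes to SNPs from a genotyping experiment, reduces at its core to an interval lookup by chromosomal position. The sketch below illustrates that idea generically; it is not Genephony's code, and the gene and SNP records are invented.

    ```python
    # Illustrative sketch (not Genephony's actual API): annotate SNPs with the
    # genes whose chromosomal span contains them.

    def annotate_snps(snps, genes):
        """Attach to each SNP the names of genes whose span contains it.

        snps  -- list of (snp_id, chrom, pos)
        genes -- list of (gene_name, chrom, start, end), inclusive coordinates
        """
        # Index genes by chromosome so each SNP only scans its own chromosome.
        by_chrom = {}
        for name, chrom, start, end in genes:
            by_chrom.setdefault(chrom, []).append((start, end, name))

        annotated = []
        for snp_id, chrom, pos in snps:
            hits = [name for start, end, name in by_chrom.get(chrom, ())
                    if start <= pos <= end]
            annotated.append((snp_id, hits))
        return annotated

    # Hypothetical example data.
    genes = [("GENE_A", "chr1", 100, 500), ("GENE_B", "chr1", 450, 900)]
    snps = [("rs1", "chr1", 120), ("rs2", "chr1", 470), ("rs3", "chr2", 10)]
    result = annotate_snps(snps, genes)
    ```

    A production tool would use an interval tree or a database index rather than a linear scan, but the mapping from positions to annotations is the same.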

  16. Software Tools | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine.  Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  17. An interactive visualization tool for multi-channel confocal microscopy data in neurobiology research

    KAUST Repository

    Yong Wan,

    2009-11-01

    Confocal microscopy is widely used in neurobiology for studying the three-dimensional structure of the nervous system. Confocal image data are often multi-channel, with each channel resulting from a different fluorescent dye or fluorescent protein; one channel may have dense data, while another has sparse; and there are often structures at several spatial scales: subneuronal domains, neurons, and large groups of neurons (brain regions). Even qualitative analysis can therefore require visualization using techniques and parameters fine-tuned to a particular dataset. Despite the plethora of volume rendering techniques that have been available for many years, the techniques standardly used in neurobiological research are somewhat rudimentary, such as looking at image slices or maximal intensity projections. Thus there is a real demand from neurobiologists, and biologists in general, for a flexible visualization tool that allows interactive visualization of multi-channel confocal data, with rapid fine-tuning of parameters to reveal the three-dimensional relationships of structures of interest. Together with neurobiologists, we have designed such a tool, choosing visualization methods to suit the characteristics of confocal data and a typical biologist's workflow. We use interactive volume rendering with intuitive settings for multidimensional transfer functions, multiple render modes and multi-views for multi-channel volume data, and embedding of polygon data into volume data for rendering and editing. As an example, we apply this tool to visualize confocal microscopy datasets of the developing zebrafish visual system.

  18. The DEDUCE Guided Query tool: providing simplified access to clinical data for research and quality improvement.

    Science.gov (United States)

    Horvath, Monica M; Winfield, Stephanie; Evans, Steve; Slopek, Steve; Shang, Howard; Ferranti, Jeffrey

    2011-04-01

    In many healthcare organizations, comparative effectiveness research and quality improvement (QI) investigations are hampered by a lack of access to data created as a byproduct of patient care. Data collection often hinges upon either manual chart review or ad hoc requests to technical experts who support legacy clinical systems. In order to facilitate this needed capacity for data exploration at our institution (Duke University Health System), we have designed and deployed a robust Web application for cohort identification and data extraction--the Duke Enterprise Data Unified Content Explorer (DEDUCE). DEDUCE is envisioned as a simple, web-based environment that allows investigators access to administrative, financial, and clinical information generated during patient care. By using business intelligence tools to create a view into Duke Medicine's enterprise data warehouse, DEDUCE provides a Guided Query functionality using a wizard-like interface that lets users filter through millions of clinical records, explore aggregate reports, and export extracts. Researchers and QI specialists can obtain detailed patient- and observation-level extracts without needing to understand structured query language or the underlying database model. Developers designing such tools must provide sufficient training and develop application safeguards to ensure that patient-centered clinical researchers understand when observation-level extracts should be used. This may mitigate the risk of data being misunderstood and consequently used in an improper fashion. Copyright © 2010 Elsevier Inc. All rights reserved.
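
    The guided-query idea, in which wizard selections are translated into a database query so users never write SQL themselves, can be sketched generically as follows. This is an illustration only, not DEDUCE's code; the table and column names are hypothetical.

    ```python
    # Hedged sketch of a guided-query wizard: user-chosen filters become a
    # parameterized SQL query. Schema names ("encounters", "patient_id") are
    # invented for illustration.

    def build_cohort_query(filters):
        """Turn {column: value or (low, high)} filters into SQL plus parameters."""
        clauses, params = [], []
        for column, cond in filters.items():
            if isinstance(cond, tuple):              # range filter, e.g. age band
                clauses.append(f"{column} BETWEEN ? AND ?")
                params.extend(cond)
            else:                                    # exact-match filter
                clauses.append(f"{column} = ?")
                params.append(cond)
        sql = "SELECT patient_id FROM encounters"
        if clauses:
            sql += " WHERE " + " AND ".join(clauses)
        return sql, params

    # A user picks a diagnosis code and an age band in the wizard.
    sql, params = build_cohort_query({"diagnosis_code": "E11.9", "age": (40, 65)})
    ```

    Keeping the values in `params` rather than pasting them into the SQL string is what lets such a tool hand the query safely to the database driver.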

  19. Research Tools and Materials | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.

  20. CORE SIM: A multi-purpose neutronic tool for research and education

    International Nuclear Information System (INIS)

    Demaziere, Christophe

    2011-01-01

    Highlights: → A highly flexible neutronic core simulator was developed. → The tool estimates the static neutron flux, the eigenmodes, and the neutron noise. → The tool was successfully validated via many benchmark cases. → The tool can be used for research and education. → The tool is freely available. - Abstract: This paper deals with the development, validation, and demonstration of an innovative neutronic tool. The novelty of the tool resides in its versatility, since many different systems can be investigated and different kinds of calculations can be performed. More precisely, both critical systems and subcritical systems with an external neutron source can be studied, and static and dynamic cases in the frequency domain (i.e. for stationary fluctuations) can be considered. In addition, the tool has the ability to determine the different eigenfunctions of any nuclear core. For each situation, the static neutron flux, the different eigenmodes and eigenvalues, the first-order neutron noise, and their adjoint functions are estimated, as well as the effective multiplication factor of the system. The main advantages of the tool, which is entirely MatLab based, lie with the robustness of the implemented numerical algorithms, its high portability between different computer platforms and operating systems, and finally its ease of use since no input deck writing is required. The present version of the tool, which is based on two-group diffusion theory, is mostly suited to investigate thermal systems. The definition of both the static and dynamic core configurations directly from the static macroscopic cross-sections and their fluctuations, respectively, makes the tool particularly well suited for research and education. Some of the many benchmark cases used to validate the tool are briefly reported. The static and dynamic capabilities of the tool are also demonstrated for the following configurations: a vibrating control rod, a perturbation traveling upwards
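
    The effective-multiplication-factor calculation such a simulator performs can be illustrated with a deliberately simplified sketch: a one-group, one-dimensional diffusion k-eigenvalue problem solved by inverse power iteration. CORE SIM itself is a two-group, MatLab-based code; the cross-section values below are invented for illustration.

    ```python
    import numpy as np

    # Hedged illustration only: solve -D*phi'' + Sigma_a*phi = (1/k)*nu*Sigma_f*phi
    # on a bare slab with zero-flux boundaries, by inverse power iteration.

    def keff_1d(D, sig_a, nu_sig_f, length, n=50, iters=200):
        h = length / (n + 1)                       # mesh spacing, n interior nodes
        # Tridiagonal finite-difference operator for leakage + absorption.
        main = 2.0 * D / h**2 + sig_a
        off = -D / h**2
        A = (np.diag(np.full(n, main))
             + np.diag(np.full(n - 1, off), 1)
             + np.diag(np.full(n - 1, off), -1))
        F = nu_sig_f * np.eye(n)                   # fission production operator

        phi = np.ones(n)
        k = 1.0
        for _ in range(iters):
            src = F @ phi                          # fission source
            phi_new = np.linalg.solve(A, src / k)  # solve A*phi_new = F*phi/k
            k *= phi_new.sum() / phi.sum()         # update eigenvalue estimate
            phi = phi_new / np.linalg.norm(phi_new)
        return k, phi

    # Invented one-group constants for a bare 100 cm slab.
    k, flux = keff_1d(D=1.0, sig_a=0.05, nu_sig_f=0.06, length=100.0)
    ```

    For these invented constants the analytic one-group result k = νΣf / (Σa + DB²), with buckling B = π/L, is about 1.18, which the iteration reproduces; the converged flux is the fundamental cosine-like mode.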

  1. The CATS Service: An Astrophysical Research Tool

    Directory of Open Access Journals (Sweden)

    O V Verkhodanov

    2009-03-01

    Full Text Available We describe the current status of CATS (astrophysical CATalogs Support system), a publicly accessible tool maintained at the Special Astrophysical Observatory of the Russian Academy of Sciences (SAO RAS; http://cats.sao.ru) allowing one to search hundreds of catalogs of astronomical objects discovered all along the electromagnetic spectrum. Our emphasis is mainly on catalogs of radio continuum sources observed from 10 MHz to 245 GHz, and secondly on catalogs of objects such as radio and active stars, X-ray binaries, planetary nebulae, HII regions, supernova remnants, pulsars, nearby and radio galaxies, AGN and quasars. CATS also includes the catalogs from the largest extragalactic surveys with non-radio waves. In 2008 CATS comprised a total of about 10^9 records from over 400 catalogs in the radio, IR, optical and X-ray windows, including most source catalogs deriving from observations with the Russian radio telescope RATAN-600. CATS offers several search tools through different ways of access, e.g. via Web-interface and e-mail. Since its creation in 1997 CATS has managed about 10^5 requests. Currently CATS is used by external users about 1500 times per day and since its opening to the public in 1997 has received about 4000 requests for its selection and matching tasks.
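
    The "matching tasks" mentioned above, cross-identifying sources between catalogs, are typically positional: two entries match when their angular separation falls below a search radius. A minimal sketch of that idea (not the actual CATS implementation; the source positions are invented):

    ```python
    import math

    # Illustrative positional cross-match between two small catalogs.

    def ang_sep_deg(ra1, dec1, ra2, dec2):
        """Angular separation in degrees between two sky positions in degrees."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        d_ra = ra2 - ra1
        # Vincenty formula: numerically stable for both small and large angles.
        num = math.hypot(math.cos(dec2) * math.sin(d_ra),
                         math.cos(dec1) * math.sin(dec2)
                         - math.sin(dec1) * math.cos(dec2) * math.cos(d_ra))
        den = (math.sin(dec1) * math.sin(dec2)
               + math.cos(dec1) * math.cos(dec2) * math.cos(d_ra))
        return math.degrees(math.atan2(num, den))

    def cross_match(cat_a, cat_b, radius_deg):
        """Return (name_a, name_b) pairs separated by less than radius_deg.

        Each catalog is a list of (name, ra_deg, dec_deg) tuples; a real
        service would use a spatial index instead of this O(N*M) double loop.
        """
        return [(na, nb)
                for na, ra_a, dec_a in cat_a
                for nb, ra_b, dec_b in cat_b
                if ang_sep_deg(ra_a, dec_a, ra_b, dec_b) <= radius_deg]

    # Invented source lists.
    cat_a = [("A1", 10.0, 41.2), ("A2", 150.0, -30.0)]
    cat_b = [("B1", 10.001, 41.2005), ("B2", 200.0, 5.0)]
    matches = cross_match(cat_a, cat_b, radius_deg=0.01)
    ```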

  2. Basic Research Tools for Earthworm Ecology

    Directory of Open Access Journals (Sweden)

    Kevin R. Butt

    2010-01-01

    Full Text Available Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.

  3. Basic Research Tools for Earthworm Ecology

    International Nuclear Information System (INIS)

    Butt, K.R.; Grigoropoulou, N.

    2010-01-01

    Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.

  4. Integrating information technologies as tools for surgical research.

    Science.gov (United States)

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  5. Metal Vapor Arcing Risk Assessment Tool

    Science.gov (United States)

    Hill, Monika C.; Leidecker, Henning W.

    2010-01-01

    The Tin Whisker Metal Vapor Arcing Risk Assessment Tool has been designed to evaluate the risk of metal vapor arcing and to help facilitate a decision toward a researched risk disposition. Users can evaluate a system without having to open up the hardware. This process allows for investigating components at risk rather than spending time and money analyzing every component. The tool points to a risk level and provides direction for appropriate action and documentation.

  6. Build your own social network laboratory with Social Lab: a tool for research in social media.

    Science.gov (United States)

    Garaizar, Pablo; Reips, Ulf-Dietrich

    2014-06-01

    Social networking has surpassed e-mail and instant messaging as the dominant form of online communication (Meeker, Devitt, & Wu, 2010). Currently, all large social networks are proprietary, making it difficult or impossible for researchers to make changes to such networks for the purpose of study design and access to user-generated data from the networks. To address this issue, the authors have developed and present Social Lab, an Internet-based free and open-source social network software system available from http://www.sociallab.es. Having full availability of navigation and communication data in Social Lab allows researchers to investigate behavior in social media on an individual and group level. Automated artificial users ("bots") are available to the researcher to simulate and stimulate social networking situations. These bots respond dynamically to situations as they unfold. The bots can easily be configured with scripts and can be used to experimentally manipulate social networking situations in Social Lab. Examples for setting up, configuring, and using Social Lab as a tool for research in social media are provided.

  7. CUAHSI Data Services: Tools and Cyberinfrastructure for Water Data Discovery, Research and Collaboration

    Science.gov (United States)

    Seul, M.; Brazil, L.; Castronova, A. M.

    2017-12-01

    Enabling research surrounding interdisciplinary topics often requires a combination of finding, managing, and analyzing large data sets and models from multiple sources. This challenge has led the National Science Foundation to make strategic investments in developing community data tools and cyberinfrastructure that focus on water data, a central need for many of these research topics. CUAHSI (The Consortium of Universities for the Advancement of Hydrologic Science, Inc.) is a non-profit organization funded by the National Science Foundation to aid students, researchers, and educators in using and managing data and models to support research and education in the water sciences. This presentation will focus on open-source CUAHSI-supported tools that enable enhanced data discovery online using advanced searching capabilities and computational analysis run in virtual environments pre-designed for educators and scientists so they can focus their efforts on data analysis rather than IT set-up.

  8. Design of Scalable and Effective Earth Science Collaboration Tool

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Kuo, K. S.; Lynnes, C.; Niamsuwan, N.; Chidambaram, C.

    2014-12-01

    Collaborative research is growing rapidly. Many tools, including IDEs, are now beginning to incorporate new collaborative features. Software engineering research has shown the effectiveness of collaborative programming and analysis. In particular, drastic reductions in software development time, and hence cost, have been highlighted. Recently, we have witnessed the rise of applications that allow users to share their content. Most of these applications scale such collaboration using cloud technologies. Earth science research needs to adopt collaboration technologies to reduce redundancy, cut cost, expand its knowledge base, and scale research experiments. To address these needs, we developed the Earth science collaboration workbench (CWB). CWB provides researchers with various collaboration features by augmenting their existing analysis tools to minimize the learning curve. During the development of the CWB, we understood that Earth science collaboration tasks are varied, and we concluded that it is not possible to design a tool that serves all collaboration purposes. We adopted a mix of synchronous and asynchronous sharing methods that can be used to perform collaboration across time and location dimensions. We have used cloud technology for scaling the collaboration. Cloud has been a highly utilized and valuable tool for Earth science researchers. Among other usages, cloud is used for sharing research results, Earth science data, and virtual machine images; allowing CWB to create and maintain research environments and networks to enhance collaboration between researchers. Furthermore, the collaborative versioning tool Git is integrated into CWB for versioning of science artifacts. In this paper, we present our experience in designing and implementing the CWB. We will also discuss the integration of collaborative code development use cases for data search and discovery using NASA DAAC and simulation of satellite observations using NASA Earth Observing System Simulation

  9. CoC GIS Tools (GIS Tool)

    Data.gov (United States)

    Department of Housing and Urban Development — This tool provides a no-cost downloadable software tool that allows users to interact with professional quality GIS maps. Users access pre-compiled projects through...

  10. Research on Key Technologies of Unit-Based CNC Machine Tool Assembly Design

    OpenAIRE

    Zhongqi Sheng; Lei Zhang; Hualong Xie; Changchun Liu

    2014-01-01

    Assembly is the part of the product design and manufacturing process that produces the maximum workload and consumes the most time. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connection between parts, semantic information of transmission, and geometric constraint information were quantified to as...

  11. Claire, a simulation and testing tool for critical software

    International Nuclear Information System (INIS)

    Gassino, J.; Henry, J.Y.

    1996-01-01

    The needs of the CEA and IPSN (Institute of Nuclear Protection and Safety) concerning the testing of critical software have led to the development of the CLAIRE tool, which is able to test software without modification. This tool allows one to graphically model the system and its environment, and to include in the model components which observe but do not modify the behaviour of the system under test. The executable codes are integrated in the model. The tool uses target machine (microprocessor) simulators. The technique used (event simulation) makes it possible to associate actions with events such as the execution of an instruction, access to a variable, etc. The simulation results are exploited using graphical display, state-search and test coverage measurement tools. In particular, this tool can help in the evaluation of critical software built from pre-existing components. (J.S.)

  12. iSRAP - a one-touch research tool for rapid profiling of small RNA-seq data.

    Science.gov (United States)

    Quek, Camelia; Jung, Chol-Hee; Bellingham, Shayne A; Lonie, Andrew; Hill, Andrew F

    2015-01-01

    Small non-coding RNAs have been significantly recognized as the key modulators in many biological processes, and are emerging as promising biomarkers for several diseases. These RNA species are transcribed in cells and can be packaged in extracellular vesicles, which are small vesicles released from many biotypes, and are involved in intercellular communication. Currently, the advent of next-generation sequencing (NGS) technology for high-throughput profiling has further advanced the biological insights of non-coding RNA on a genome-wide scale and has become the preferred approach for the discovery and quantification of non-coding RNA species. Despite the routine practice of NGS, the processing of large data sets poses difficulty for analysis before conducting downstream experiments. Often, the current analysis tools are designed for specific RNA species, such as microRNA, and are limited in flexibility for modifying parameters for optimization. An analysis tool that allows for maximum control of different software is essential for drawing concrete conclusions about differentially expressed transcripts. Here, we developed a one-touch integrated small RNA analysis pipeline (iSRAP) research tool that is composed of widely used tools for rapid profiling of small RNAs. Performance tests of iSRAP using public and in-house data sets demonstrate its ability to comprehensively profile small RNAs of various classes and to analyse differentially expressed small RNAs. iSRAP offers comprehensive analysis of small RNA sequencing data, supporting informed decisions on the downstream analyses of small RNA studies, including extracellular vesicles such as exosomes.

  13. Welfare assessment in porcine biomedical research – Suggestion for an operational tool

    DEFF Research Database (Denmark)

    Søndergaard, Lene Vammen; Dagnæs-Hansen, Frederik; Herskin, Mette S

    2011-01-01

    of the extent of welfare assessment in pigs used in biomedical research and to suggest a welfare assessment standard for research facilities based on an exposition of ethological considerations relevant for the welfare of pigs in biomedical research. The tools for porcine welfare assessment presented suggest...

  14. Developing a research and practice tool to measure walkability: a demonstration project.

    Science.gov (United States)

    Giles-Corti, Billie; Macaulay, Gus; Middleton, Nick; Boruff, Bryan; Bull, Fiona; Butterworth, Iain; Badland, Hannah; Mavoa, Suzanne; Roberts, Rebecca; Christian, Hayley

    2014-12-01

    Growing evidence shows that higher-density, mixed-use, pedestrian-friendly neighbourhoods encourage active transport, including transport-related walking. Despite widespread recognition of the benefits of creating more walkable neighbourhoods, there remains a gap between the rhetoric of the need for walkability and the creation of walkable neighbourhoods. Moreover, there is little objective data to benchmark the walkability of neighbourhoods within and between Australian cities in order to monitor planning and design intervention progress and to assess built environment and urban policy interventions required to achieve increased walkability. This paper describes a demonstration project that aimed to develop, trial and validate a 'Walkability Index Tool' that could be used by policy makers and practitioners to assess the walkability of local areas; or by researchers to access geospatial data assessing walkability. The overall aim of the project was to develop an automated geospatial tool capable of creating walkability indices for neighbourhoods at user-specified scales. The tool is based on open-source software architecture, within the Australian Urban Research Infrastructure Network (AURIN) framework, and incorporates key sub-component spatial measures of walkability (street connectivity, density and land use mix). Using state-based data, we demonstrated it was possible to create an automated walkability index. However, due to the lack of consistent national data measuring land use mix, at this stage it has not been possible to create a national walkability measure. The next stage of the project is to increase the usability of the tool within the AURIN portal and to explore options for alternative spatial data sources that will enable the development of a valid national walkability index. AURIN's open-source Walkability Index Tool is a first step in demonstrating the potential benefit of a tool that could measure walkability across Australia.
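
    The index construction described above (combining street connectivity, density and land-use mix) is commonly implemented as a sum of z-scores, with land-use mix measured by normalised entropy. The sketch below is a generic illustration of that widely used form, not AURIN's actual implementation; the neighbourhood figures are invented.

    ```python
    import math

    # Generic composite walkability index: sum of z-scores of the three
    # sub-measures named above. Invented data throughout.

    def z_scores(values):
        """Population z-scores of a list of values."""
        mean = sum(values) / len(values)
        sd = math.sqrt(sum((v - mean) ** 2 for v in values) / len(values))
        return [(v - mean) / sd for v in values]

    def land_use_entropy(proportions):
        """Land-use mix as normalised entropy: 0 = single use, 1 = even mix."""
        k = len(proportions)
        if k < 2:
            return 0.0
        return -sum(p * math.log(p) for p in proportions if p > 0) / math.log(k)

    def walkability(connectivity, density, mix):
        """Composite index per neighbourhood: sum of the three z-scores."""
        return [c + d + m for c, d, m in
                zip(z_scores(connectivity), z_scores(density), z_scores(mix))]

    # Three invented neighbourhoods.
    conn = [60, 90, 120]                    # intersections per km^2
    dens = [15, 35, 55]                     # dwellings per hectare
    mix = [land_use_entropy(p) for p in
           [(1.0,), (0.5, 0.5), (1/3, 1/3, 1/3)]]
    index = walkability(conn, dens, mix)    # higher = more walkable
    ```

    Because each sub-measure is standardised before summing, the index ranks neighbourhoods relative to the study region rather than on an absolute scale.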

  15. Building genetic tools in Drosophila research: an interview with Gerald Rubin

    Directory of Open Access Journals (Sweden)

    2016-04-01

    Full Text Available Gerald (Gerry Rubin, pioneer in Drosophila genetics, is Founding Director of the HHMI-funded Janelia Research Campus. In this interview, Gerry recounts key events and collaborations that have shaped his unique approach to scientific exploration, decision-making, management and mentorship – an approach that forms the cornerstone of the model adopted at Janelia to tackle problems in interdisciplinary biomedical research. Gerry describes his remarkable journey from newcomer to internationally renowned leader in the fly field, highlighting his contributions to the tools and resources that have helped establish Drosophila as an important model in translational research. Describing himself as a ‘tool builder’, his current focus is on developing approaches for in-depth study of the fly nervous system, in order to understand key principles in neurobiology. Gerry was interviewed by Ross Cagan, Senior Editor of Disease Models & Mechanisms.

  16. Straightforward statistics understanding the tools of research

    CERN Document Server

    Geher, Glenn

    2014-01-01

    Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.

  17. TRANSPORTATION RESEARCH IMPLEMENTATION MANAGEMENT : DEVELOPMENT OF PERFORMANCE BASED PROCESSES, METRICS, AND TOOLS

    Science.gov (United States)

    2018-02-02

    The objective of this study is to develop an evidence-based research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT). A review was conducted drawing from the (1) implementati...

  18. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g. attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
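
    The risk definition quoted above (probability of a successful attack times the value of the resulting loss) can be illustrated with a minimal expected-loss sketch comparing a baseline with a mitigated system. The scenarios, probabilities and losses below are invented, and this is not the prototype tool's actual model.

    ```python
    # Minimal illustration: risk = sum over scenarios of probability * loss,
    # and risk reduction = baseline risk - mitigated risk. All numbers invented.

    def annual_risk(scenarios):
        """Expected annual loss: sum of probability * loss over scenarios."""
        return sum(p * loss for _, p, loss in scenarios)

    baseline = [
        # (scenario, annual probability of success, loss in dollars)
        ("remote HMI compromise", 0.10, 2_000_000),
        ("insider misuse",        0.02, 5_000_000),
    ]
    # Hypothetical mitigation (e.g. network segmentation) lowers probabilities.
    mitigated = [
        ("remote HMI compromise", 0.02, 2_000_000),
        ("insider misuse",        0.01, 5_000_000),
    ]

    risk_before = annual_risk(baseline)
    risk_after = annual_risk(mitigated)
    risk_reduction = risk_before - risk_after   # dollars per year saved
    ```

    Comparing `risk_reduction` against the annualised cost of the mitigation is exactly the cost-benefit analysis the abstract says qualitative techniques cannot support.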

  19. 48 CFR 235.015-70 - Special use allowances for research facilities acquired by educational institutions.

    Science.gov (United States)

    2010-10-01

    ... performance of DoD contracts; (2) Existing facilities, either Government or nongovernment, cannot meet program... effort which results in the special use allowance being excessive compared to the Government research... Defense contracts. FAR 31.3 governs how much the Government will reimburse the institution for the...

  20. How should we assess knowledge translation in research organizations; designing a knowledge translation self-assessment tool for research institutes (SATORI).

    Science.gov (United States)

    Gholami, Jaleh; Majdzadeh, Reza; Nedjat, Saharnaz; Nedjat, Sima; Maleki, Katayoun; Ashoorkhani, Mahnaz; Yazdizadeh, Bahareh

    2011-02-22

    The knowledge translation self-assessment tool for research institutes (SATORI) was designed to assess the status of knowledge translation in research institutes. The objective was to identify the weaknesses and strengths of knowledge translation in research centres and faculties associated with Tehran University of Medical Sciences (TUMS). The tool, consisting of 50 statements in four main domains, was used in 20 TUMS-affiliated research centres and departments after its reliability was established. It was completed in a group discussion by the members of the research council, researchers and research users' representatives from each centre and/or department. The mean scores obtained in the four domains of 'The question of research', 'Knowledge production', 'Knowledge transfer' and 'Promoting the use of evidence' were 2.26, 2.92, 2 and 1.89 (out of 5) respectively. Nine out of 12 interventional priorities with the lowest quartile score were related to knowledge transfer resources and strategies, whereas eight of them were in the highest quartile and related to 'The question of research' and 'Knowledge production'. The self-assessment tool identifies the gaps in capacity and infrastructure of knowledge translation support within research organizations. Assessment of research institutes using SATORI pointed out that strengthening knowledge translation through provision of financial support for knowledge translation activities, creating supportive and facilitating infrastructures, and facilitating interactions between researchers and target audiences to exchange questions and research findings are among the priorities of research centres and/or departments.

  1. Raising Reliability of Web Search Tool Research through Replication and Chaos Theory

    OpenAIRE

    Nicholson, Scott

    1999-01-01

    Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to the Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...

  2. The "Metaphorical Collage" as a Research Tool in the Field of Education

    Science.gov (United States)

    Russo-Zimet, Gila

    2016-01-01

    The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…

  3. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    Science.gov (United States)

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  4. ARM Climate Research Facility: Outreach Tools and Strategies

    Science.gov (United States)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.

  5. The use of web2 tools in action research

    DEFF Research Database (Denmark)

    Kolbæk, Raymond; Steensgaard, Randi; Angel, Sanne

    2017-01-01

    Abstract Content: Major challenges occur when trying to implement research in clinical practice. In the West Danish Center for Spinal Cord Injury, we are conducting a practice-based PhD project that involves the practice field's own members as co-researchers. In the management of the project we use... Furthermore, we try to provide an evidence base for the concept of "Sample handlings" and examine whether this concept can be used as a flexible methodological tool for developing workflows that promote patient participation in their own rehabilitation. We use an action research design to identify actual problems, develop..., test, evaluate and implement specific actions to promote patient participation in rehabilitation. Four nurses and four social and health assistants have an active "co-researcher" role. The interaction with the researchers creates a reflexive and dynamic process with a learning and competence...

  6. A survey of tools for the analysis of quantitative PCR (qPCR) data

    Directory of Open Access Journals (Sweden)

    Stephan Pabinger

    2014-09-01

    Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
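
    RDML is an XML-based exchange format, which is part of why a shared standard eases interoperability between instrument software and analysis tools. As a hedged illustration only, the sketch below parses a simplified, hypothetical RDML-like fragment; real RDML files use a versioned namespace and a much richer schema:

```python
import xml.etree.ElementTree as ET

# A minimal, hypothetical RDML-like fragment. Element and attribute
# names here are simplified illustrations, not the actual RDML schema.
rdml_fragment = """
<rdml>
  <target id="GAPDH"><type>ref</type></target>
  <target id="IL6"><type>toi</type></target>
</rdml>
"""

def list_targets(xml_text):
    """Return (id, type) pairs for every <target> element."""
    root = ET.fromstring(xml_text)
    return [(t.get("id"), t.findtext("type")) for t in root.findall("target")]

print(list_targets(rdml_fragment))  # [('GAPDH', 'ref'), ('IL6', 'toi')]
```

    A tool that reads targets through a shared schema like this, rather than a vendor-specific export, can exchange data with any other compliant tool.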

  7. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  8. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and integrating them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command-line and its output goes to the standard output, then in a few minutes one can m...

  9. Adding tools to the open source toolbox: The Internet

    Science.gov (United States)

    Porth, Tricia

    1994-01-01

    The Internet offers researchers additional sources of information not easily available from traditional sources such as print volumes or commercial data bases. Internet tools such as e-mail and file transfer protocol (ftp) speed up the way researchers communicate and transmit data. Mosaic, one of the newest additions to the Internet toolbox, allows users to combine tools such as ftp, gopher, wide area information server, and the world wide web with multimedia capabilities. Mosaic has quickly become a popular means of making information available on the Internet because it is versatile and easily customizable.

  10. iSRAP – a one-touch research tool for rapid profiling of small RNA-seq data

    Science.gov (United States)

    Quek, Camelia; Jung, Chol-hee; Bellingham, Shayne A.; Lonie, Andrew; Hill, Andrew F.

    2015-01-01

    Small non-coding RNAs are increasingly recognized as key modulators in many biological processes, and are emerging as promising biomarkers for several diseases. These RNA species are transcribed in cells and can be packaged in extracellular vesicles, which are small vesicles released from many cell types, and are involved in intercellular communication. Currently, the advent of next-generation sequencing (NGS) technology for high-throughput profiling has further advanced the biological insights of non-coding RNA on a genome-wide scale and has become the preferred approach for the discovery and quantification of non-coding RNA species. Despite the routine practice of NGS, the processing of large data sets poses difficulty for analysis before conducting downstream experiments. Often, the current analysis tools are designed for specific RNA species, such as microRNA, and are limited in flexibility for modifying parameters for optimization. An analysis tool that allows for maximum control of different software is essential for drawing concrete conclusions for differentially expressed transcripts. Here, we developed a one-touch integrated small RNA analysis pipeline (iSRAP) research tool that is composed of widely used tools for rapid profiling of small RNAs. The performance test of iSRAP using publicly and in-house available data sets shows its ability to comprehensively profile small RNAs of various classes, and to analyse differentially expressed small RNAs. iSRAP offers comprehensive analysis of small RNA sequencing data that supports informed decisions on the downstream analyses of small RNA studies, including extracellular vesicles such as exosomes. PMID:26561006

  11. What’s Ketso? A Tool for Researchers, Educators, and Practitioners

    OpenAIRE

    James S. Bates

    2016-01-01

    Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesot...

  12. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
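
    As a hedged illustration of the measures of central tendency the article outlines, the sketch below uses Python's standard statistics module on a made-up sample; the values are hypothetical and chosen only to show how an outlier pulls the mean away from the median:

```python
import statistics

# Toy sample; the outlier (120) inflates the mean but barely moves the median.
sample = [4, 5, 5, 6, 7, 120]

mean = statistics.mean(sample)      # arithmetic mean
median = statistics.median(sample)  # middle value of the sorted data
mode = statistics.mode(sample)      # most frequent value

print(mean, median, mode)  # 24.5 5.5 5
```

    This is one reason the choice between parametric tests (built on means) and non-parametric tests (built on ranks) matters for skewed data.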

  13. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    Science.gov (United States)

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  14. Social networks, web-based tools and diseases: implications for biomedical research.

    Science.gov (United States)

    Costa, Fabricio F

    2013-03-01

    Advances in information technology have improved our ability to gather, collect and analyze information from individuals online. Social networks can be seen as a nonlinear superposition of a multitude of complex connections between people where the nodes represent individuals and the links between them capture a variety of different social interactions. The emergence of different types of social networks has fostered connections between individuals, thus facilitating data exchange in a variety of fields. Therefore, the question posed now is "can these same tools be applied to life sciences in order to improve scientific and medical research?" In this article, I will review how social networks and other web-based tools are changing the way we approach and track diseases in biomedical research. Copyright © 2012 Elsevier Ltd. All rights reserved.
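
    The article's picture of a network, nodes representing individuals and links capturing interactions, maps directly onto an adjacency list. The following is a minimal sketch with made-up names, not any real data set:

```python
# A social network as an adjacency list: keys are individuals (nodes),
# values are the sets of people they interact with (links). Names invented.
network = {
    "ana":  {"ben", "chen"},
    "ben":  {"ana"},
    "chen": {"ana", "dee"},
    "dee":  {"chen"},
}

def degree(graph, node):
    """Number of direct connections a node has."""
    return len(graph[node])

print(degree(network, "ana"))  # 2
```

    Simple node-level measures like degree are the starting point for the kind of network analyses the article discusses for tracking diseases.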

  15. A Mentoring Toolkit: Tips and Tools for Mentoring Early-Career Researchers

    Science.gov (United States)

    Flint, Kathleen

    2010-01-01

    Effective mentoring is a critical component in the training of early-career researchers, cultivating more independent, productive and satisfied scientists. For example, mentoring has been shown by the 2005 Sigma Xi National Postdoc Survey to be a key indicator for a successful postdoctoral outcome. Mentoring takes many forms and can include support for maximizing research skills and productivity as well as assistance in preparing for a chosen career path. Yet, because there is no "one-size-fits-all" approach, mentoring can be an activity that is hard to define. In this presentation, a series of tips and tools will be offered to aid mentors in developing a plan for their mentoring activities. This will include: suggestions for how to get started; opportunities for mentoring activities within the research group, within the institution, and outside the institution; tools for communicating and assessing professional milestones; and resources for fostering the professional and career development of mentees. Special considerations will also be presented for mentoring international scholars and women. These strategies will be helpful to the PI responding to the new NSF mentoring plan requirement for postdocs as well as to the student, postdoc, researcher or professor overseeing the research and training of others.

  16. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    Science.gov (United States)

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/

  17. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  18. DataUp: A tool to help researchers describe and share tabular data.

    Science.gov (United States)

    Strasser, Carly; Kunze, John; Abrams, Stephen; Cruse, Patricia

    2014-01-01

    Scientific datasets have immeasurable value, but they lose their value over time without proper documentation, long-term storage, and easy discovery and access. Across disciplines as diverse as astronomy, demography, archeology, and ecology, large numbers of small heterogeneous datasets (i.e., the long tail of data) are especially at risk unless they are properly documented, saved, and shared. One unifying factor for many of these at-risk datasets is that they reside in spreadsheets. In response to this need, the California Digital Library (CDL) partnered with Microsoft Research Connections and the Gordon and Betty Moore Foundation to create the DataUp data management tool for Microsoft Excel. Many researchers creating these small, heterogeneous datasets use Excel at some point in their data collection and analysis workflow, so we were interested in developing a data management tool that fits easily into those work flows and minimizes the learning curve for researchers. The DataUp project began in August 2011. We first formally assessed the needs of researchers by conducting surveys and interviews of our target research groups: earth, environmental, and ecological scientists. We found that, on average, researchers had very poor data management practices, were not aware of data centers or metadata standards, and did not understand the benefits of data management or sharing. Based on our survey results, we composed a list of desirable components and requirements and solicited feedback from the community to prioritize potential features of the DataUp tool. These requirements were then relayed to the software developers, and DataUp was successfully launched in October 2012.

  19. Single Molecule Analysis Research Tool (SMART): an integrated approach for analyzing single molecule data.

    Directory of Open Access Journals (Sweden)

    Max Greenfeld

    Full Text Available Single molecule studies have expanded rapidly over the past decade and have the ability to provide an unprecedented level of understanding of biological systems. A common challenge upon introduction of novel, data-rich approaches is the management, processing, and analysis of the complex data sets that are generated. We provide a standardized approach for analyzing these data in the freely available software package SMART: Single Molecule Analysis Research Tool. SMART provides a format for organizing and easily accessing single molecule data, a general hidden Markov modeling algorithm for fitting an array of possible models specified by the user, a standardized data structure and graphical user interfaces to streamline the analysis and visualization of data. This approach guides experimental design, facilitating acquisition of the maximal information from single molecule experiments. SMART also provides a standardized format to allow dissemination of single molecule data and transparency in the analysis of reported data.
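
    SMART's core fitting step is a hidden Markov model. As a hedged illustration of the likelihood computation such fitting relies on, the sketch below implements the forward algorithm for a toy two-state HMM; all parameters are invented for the example and are not SMART's actual model:

```python
# Forward algorithm for a 2-state HMM over a discrete observation sequence.
# All probabilities below are toy values for illustration only.

states = [0, 1]                      # e.g. two conformational states
start = [0.5, 0.5]                   # initial state probabilities
trans = [[0.9, 0.1], [0.2, 0.8]]     # transition matrix trans[from][to]
emit = [[0.8, 0.2], [0.3, 0.7]]      # emit[state][symbol], symbols 0/1

def forward_likelihood(obs):
    """Total probability of the observation sequence under the HMM."""
    alpha = [start[s] * emit[s][obs[0]] for s in states]
    for o in obs[1:]:
        alpha = [sum(alpha[p] * trans[p][s] for p in states) * emit[s][o]
                 for s in states]
    return sum(alpha)

print(round(forward_likelihood([0, 0, 1, 1]), 6))  # 0.043752
```

    Model fitting then amounts to choosing the transition and emission parameters that maximize this likelihood over the recorded traces.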

  20. MitoBamAnnotator: A web-based tool for detecting and annotating heteroplasmy in human mitochondrial DNA sequences.

    Science.gov (United States)

    Zhidkov, Ilia; Nagar, Tal; Mishmar, Dan; Rubin, Eitan

    2011-11-01

    The use of Next-Generation Sequencing of mitochondrial DNA is becoming widespread in biological and clinical research. This, in turn, creates a need for a convenient tool that detects and analyzes heteroplasmy. Here we present MitoBamAnnotator, a user friendly web-based tool that allows maximum flexibility and control in heteroplasmy research. MitoBamAnnotator provides the user with a comprehensively annotated overview of mitochondrial genetic variation, allowing for an in-depth analysis with no prior knowledge in programming. Copyright © 2011 Elsevier B.V. and Mitochondria Research Society. All rights reserved.

  1. Big Data as a Revolutionary Tool in Finance

    Directory of Open Access Journals (Sweden)

    Aureliano Angel Bressan

    2015-08-01

    Full Text Available A data driven culture is arising as a research field and analytic tool in Finance and Management since the advent of structured, semi-structured and unstructured socio-economic and demographic information from social media, mobile devices, blogs and product reviews from consumers. Big Data, the expression that encompasses this revolution, involves the usage of new tools for financial professionals and academic researchers due to the size of data involved, which require more powerful manipulation tools. In this sense, Machine Learning techniques can allow more effective ways to model complex relationships that arise from the interaction of different types of data, regarding issues such as Operational and Reputational Risk, Portfolio Management, Business Intelligence and Predictive Analytics. The following books can be a good start for those interested in this new field.

  2. Air Traffic Control Tools Assessment

    Directory of Open Access Journals (Sweden)

    Tomáš Noskievič

    2017-04-01

    Full Text Available Undoubtedly, air transport in today's world could not exist without air traffic control services. As air transport has been undergoing major changes and expanding, its volume is expected to double in the next 15 years. Air traffic control uses strictly organised procedures to ensure the safe course of air operations. With the skies covered by more airplanes every year, new tools must be introduced to allow controllers to manage the rising number of flying aircraft and to keep air transport safe. This paper provides comprehensive and organised material describing the newest tools and systems used by air traffic control officers. It proposes improvements for further research and development of ATC tools.

  3. Telerehabilitation: Policy Issues and Research Tools

    Directory of Open Access Journals (Sweden)

    Katherine D. Seelman

    2009-09-01

    Full Text Available The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review literature on telerehabilitation (TR) policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into main topics of telerehabilitation, policy, population of users, and policy-specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification of Functioning, Disability and Health (ICF). Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are paramount to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality of life items such as assistive technology and environmental factors. Keywords: Telerehabilitation, Telehomecare, Telepsychiatry, Telepractice

  4. [Managing a health research institute: towards research excellence through continuous improvement].

    Science.gov (United States)

    Olmedo, Carmen; Buño, Ismael; Plá, Rosa; Lomba, Irene; Bardinet, Thierry; Bañares, Rafael

    2015-01-01

    Health research institutes are a strategic commitment considered the ideal environment to develop excellence in translational research. Achieving quality research requires not only a powerful scientific and research structure but also the quality and integrity of management systems that support it. The essential instruments in our institution were solid strategic planning integrated into and consistent with the system of quality management, systematic evaluation through periodic indicators, measurement of key user satisfaction and internal audits, and implementation of an innovative information management tool. The implemented management tools have provided a strategic thrust to our institute while ensuring a level of quality and efficiency in the development and management of research that allows progress towards excellence in biomedical research. Copyright © 2015 SESPAS. Published by Elsevier Espana. All rights reserved.

  5. Noncontact Atomic Force Microscopy: An Emerging Tool for Fundamental Catalysis Research.

    Science.gov (United States)

    Altman, Eric I; Baykara, Mehmet Z; Schwarz, Udo D

    2015-09-15

    Although atomic force microscopy (AFM) was rapidly adopted as a routine surface imaging apparatus after its introduction in 1986, it has not been widely used in catalysis research. The reason is that common AFM operating modes do not provide the atomic resolution required to follow catalytic processes; rather the more complex noncontact (NC) mode is needed. Thus, scanning tunneling microscopy has been the principal tool for atomic scale catalysis research. In this Account, recent developments in NC-AFM will be presented that offer significant advantages for gaining a complete atomic level view of catalysis. The main advantage of NC-AFM is that the image contrast is due to the very short-range chemical forces that are of interest in catalysis. This motivated our development of 3D-AFM, a method that yields quantitative atomic resolution images of the potential energy surfaces that govern how molecules approach, stick, diffuse, and rebound from surfaces. A variation of 3D-AFM allows the determination of forces required to push atoms and molecules on surfaces, from which diffusion barriers and variations in adsorption strength may be obtained. Pushing molecules towards each other provides access to intermolecular interactions between reaction partners. Following reaction, NC-AFM with CO-terminated tips yields textbook images of intramolecular structure that can be used to identify reaction intermediates and products. Because NC-AFM and STM contrast mechanisms are distinct, combining the two methods can produce unique insight. It is demonstrated for surface-oxidized Cu(100) that simultaneous 3D-AFM/STM yields resolution of both the Cu and O atoms. Moreover, atomic defects in the Cu sublattice lead to variations in the reactivity of the neighboring O atoms. It is shown that NC-AFM also allows straightforward imaging of work function variations, which has been used to identify defect charge states on catalytic surfaces and to map charge transfer within an individual

  6. Google Tools in the Classroom

    Science.gov (United States)

    Albee, E. M.; Koons, P. O.; Schauffler, M.; Zhu, Y.; Segee, B. E.

    2009-12-01

    The Maine Learning Technology Initiative provides every seventh and eighth grade student in the state with MacBook laptop computers. Limitless education possibilities exist with the inclusion of Google Tools and laptops as learning tools in our modern classrooms. Google Applications allow students to create documents, spreadsheets, charts, graphs, forms, and presentations and easily allows the sharing of information with their fellow classmates and teachers. These applications invite the use of inquiry and critical thinking skills, collaboration among peers, and subject integration to teach students crucial concepts. The benefits for teachers extend into the realm of using Google sites to easily create a teacher website and blog to upload classroom information and create a communication connection for parents and students as well as collaborations between the teachers and University researchers and educators. Google Applications further enhances the possibilities for learning, sharing a wealth of information, and enhancing communication inside and outside of the classroom.

  7. Nucleic acids-based tools for ballast water surveillance, monitoring, and research

    Science.gov (United States)

    Darling, John A.; Frederick, Raymond M.

    2018-03-01

    Understanding the risks of biological invasion posed by ballast water-whether in the context of compliance testing, routine monitoring, or basic research-is fundamentally an exercise in biodiversity assessment, and as such should take advantage of the best tools available for tackling that problem. The past several decades have seen growing application of genetic methods for the study of biodiversity, driven in large part by dramatic technological advances in nucleic acids analysis. Monitoring approaches based on such methods have the potential to increase dramatically sampling throughput for biodiversity assessments, and to improve on the sensitivity, specificity, and taxonomic accuracy of traditional approaches. The application of targeted detection tools (largely focused on PCR but increasingly incorporating novel probe-based methodologies) has led to a paradigm shift in rare species monitoring, and such tools have already been applied for early detection in the context of ballast water surveillance. Rapid improvements in community profiling approaches based on high throughput sequencing (HTS) could similarly impact broader efforts to catalogue biodiversity present in ballast tanks, and could provide novel opportunities to better understand the risks of biotic exchange posed by ballast water transport-and the effectiveness of attempts to mitigate those risks. These various approaches still face considerable challenges to effective implementation, depending on particular management or research needs. Compliance testing, for instance, remains dependent on accurate quantification of viable target organisms; while tools based on RNA detection show promise in this context, the demands of such testing require considerable additional investment in methods development. In general surveillance and research contexts, both targeted and community-based approaches are still limited by various factors: quantification remains a challenge (especially for taxa in larger size

  8. Experimental research on the durability of cutting tools for cutting-off steel profiles

    Directory of Open Access Journals (Sweden)

    Cristea Alexandru

    2017-01-01

    Full Text Available The production lines used for manufacturing U-shaped profiles are very complex and they must have high productivity. One of the most important stages of the fabrication process is the cutting-off. This paper presents the experimental research and analysis of the durability of the cutting tools used for cutting-off U-shaped metal steel profiles. The results of this work can be used to predict the durability of the cutting tools.

  9. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  10. Critical Care Health Informatics Collaborative (CCHIC): Data, tools and methods for reproducible research: A multi-centre UK intensive care database.

    Science.gov (United States)

    Harris, Steve; Shi, Sinan; Brealey, David; MacCallum, Niall S; Denaxas, Spiros; Perez-Suarez, David; Ercole, Ari; Watkinson, Peter; Jones, Andrew; Ashworth, Simon; Beale, Richard; Young, Duncan; Brett, Stephen; Singer, Mervyn

    2018-04-01

    To build and curate a linkable multi-centre database of high-resolution longitudinal electronic health records (EHR) from adult Intensive Care Units (ICU). To develop a set of open-source tools to make these data 'research ready' while protecting patients' privacy, with a particular focus on anonymisation. We developed a scalable EHR processing pipeline for extracting, linking, normalising, curating and anonymising EHR data. Patient and public involvement was sought from the outset, and approval to hold these data was granted by the NHS Health Research Authority's Confidentiality Advisory Group (CAG). The data are held in a certified Data Safe Haven. We followed sustainable software development principles throughout, and defined and populated a common data model that links to other clinical areas. Longitudinal EHR data were loaded into the CCHIC database from eleven adult ICUs at 5 UK teaching hospitals. From January 2014 to January 2017, this amounted to 21,930 admissions (18,074 unique patients). A typical admission has 70 data items pertaining to admission and discharge, and a median of 1030 (IQR 481-2335) time-varying measures. Training datasets were made available through virtual machine images emulating the data processing environment. An open-source R package, cleanEHR, was developed and released that transforms the data into a square table readily analysable by most statistical packages. A simple, language-agnostic configuration file allows the user to select and clean variables, and impute missing data. An audit trail makes clear the provenance of the data at all times. Making health care data available for research is problematic. CCHIC is a unique multi-centre, longitudinal and linkable resource that prioritises patient privacy through the highest standards of data security, but also provides tools to clean, organise, and anonymise the data. We believe the development of such tools is essential if we are to meet the twin requirements of
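    The "square table" that cleanEHR produces can be illustrated in miniature. The sketch below uses pandas rather than the package's actual R API, with made-up admission data, to show the long-to-wide pivot that makes time-varying EHR measures analysable by standard statistical packages:

```python
import pandas as pd

# Hypothetical long-format EHR extract: one row per (admission, time, item).
long_ehr = pd.DataFrame({
    "admission_id": [1, 1, 1, 2, 2],
    "time_h":       [0, 0, 4, 0, 4],
    "item":         ["hr", "sbp", "hr", "hr", "hr"],
    "value":        [82, 110, 90, 75, 78],
})

# Pivot to a "square" table: one row per admission/time point,
# one column per measured item; unmeasured items become NaN.
square = (long_ehr
          .pivot_table(index=["admission_id", "time_h"],
                       columns="item", values="value")
          .reset_index())
print(square)
```

A real pipeline would add the cleaning, imputation and audit-trail steps the paper describes; the pivot alone is what yields the analysable table.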

  11. A Tool for Measuring NASA's Aeronautics Research Progress Toward Planned Strategic Community Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    A description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.

  12. Estimating Tool-Tissue Forces Using a 3-Degree-of-Freedom Robotic Surgical Tool.

    Science.gov (United States)

    Zhao, Baoliang; Nelson, Carl A

    2016-10-01

    Robot-assisted minimally invasive surgery (MIS) has gained popularity due to its high dexterity and reduced invasiveness to the patient; however, due to the loss of direct touch at the surgical site, surgeons may be prone to exerting larger forces and causing tissue damage. To quantify tool-tissue interaction forces, researchers have tried to attach different kinds of sensors to the surgical tools. Such sensor attachment generally makes the tools bulky and/or unduly expensive and may hinder their normal function; it is also unlikely that these sensors can survive harsh sterilization processes. This paper investigates an alternative method that estimates tool-tissue interaction forces from the driving motors' currents, and validates this sensorless force estimation method on a 3-degree-of-freedom (DOF) robotic surgical grasper prototype. The results show that the performance of this method is acceptable with regard to latency and accuracy. With this tool-tissue interaction force estimation method, it is possible to implement force feedback on existing robotic surgical systems without any sensors. This may allow a haptic surgical robot that is compatible with existing sterilization methods and surgical procedures, so that the surgeon can obtain tool-tissue interaction forces in real time, increasing surgical efficiency and safety.
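    The sensorless estimation idea admits a simple sketch: motor current maps to motor torque through the torque constant, and torque maps to jaw force through the transmission geometry. Every parameter below is invented for illustration and is not the authors' calibration:

```python
K_T = 0.021    # motor torque constant [N*m/A] (assumed)
GEAR = 10.0    # gear ratio, motor to jaw (assumed)
ETA = 0.85     # transmission efficiency (assumed)
TAU_F = 0.004  # Coulomb friction torque at the jaw [N*m] (assumed)
ARM = 0.012    # moment arm from jaw pivot to grasp point [m] (assumed)

def tip_force(current_a: float) -> float:
    """Estimate tool-tissue force [N] from measured motor current [A]."""
    tau_jaw = K_T * current_a * GEAR * ETA  # torque delivered to the jaw
    tau_net = max(tau_jaw - TAU_F, 0.0)     # friction swallows small currents
    return tau_net / ARM                    # torque -> force at the grasp point

print(round(tip_force(0.5), 2))  # -> 7.1
```

In practice the mapping is identified experimentally against a reference force sensor, which is where the paper's latency and accuracy figures come from.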

  13. The virtual supermarket: an innovative research tool to study consumer food purchasing behaviour.

    Science.gov (United States)

    Waterlander, Wilma E; Scarpa, Michael; Lentz, Daisy; Steenhuis, Ingrid H M

    2011-07-25

    Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers to configurable questionnaires. All data are digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. The Virtual Supermarket is an innovative research tool with great potential to assist in gaining insight into food purchasing behaviour. The application can be obtained via a URL.

  14. The virtual supermarket: An innovative research tool to study consumer food purchasing behaviour

    Directory of Open Access Journals (Sweden)

    Steenhuis Ingrid HM

    2011-07-01

    Background Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers to configurable questionnaires. All data are digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions The Virtual Supermarket is an innovative research tool with great potential to assist in gaining insight into food

  15. The virtual supermarket: An innovative research tool to study consumer food purchasing behaviour

    Science.gov (United States)

    2011-01-01

    Background Economic interventions in the food environment are expected to effectively promote healthier food choices. However, before introducing them on a large scale, it is important to gain insight into the effectiveness of economic interventions and people's genuine reactions to price changes. Nonetheless, because of complex implementation issues, studies on price interventions are virtually non-existent. This is especially true for experiments undertaken in a retail setting. We have developed a research tool to study the effects of retail price interventions in a virtual-reality setting: the Virtual Supermarket. This paper aims to inform researchers about the features and utilization of this new software application. Results The Virtual Supermarket is a Dutch-developed three-dimensional software application in which study participants can shop in a manner comparable to a real supermarket. The tool can be used to study several food pricing and labelling strategies. The application base can be used to build future extensions and could be translated into, for example, an English-language version. The Virtual Supermarket contains a front-end which is seen by the participants, and a back-end that enables researchers to easily manipulate research conditions. The application keeps track of time spent shopping, number of products purchased, shopping budget, total expenditures and answers to configurable questionnaires. All data are digitally stored and automatically sent to a web server. A pilot study among Dutch consumers (n = 66) revealed that the application accurately collected and stored all data. Participant feedback revealed that 83% of the respondents considered the Virtual Supermarket easy to understand and 79% found that their virtual grocery purchases resembled their regular groceries. Conclusions The Virtual Supermarket is an innovative research tool with great potential to assist in gaining insight into food purchasing behaviour. The

  16. Studying mechanism of radical reactions: From radiation to nitroxides as research tools

    Science.gov (United States)

    Maimon, Eric; Samuni, Uri; Goldstein, Sara

    2018-02-01

    Radicals are part of the chemistry of life, and ionizing radiation chemistry serves as an indispensable research tool for elucidating the mechanism(s) underlying their reactions. The ever-increasing understanding of their involvement in diverse physiological and pathological processes has expanded the search for compounds that can diminish radical-induced damage. This review surveys the areas of research focusing on radical reactions, particularly those of stable cyclic nitroxide radicals, which demonstrate unique antioxidative activities. Unlike common antioxidants, which are progressively depleted under oxidative stress and yield secondary radicals, nitroxides are efficient radical scavengers yielding in most cases their respective oxoammonium cations, which are readily reduced back to the nitroxide in the tissue and thus continuously recycled. Nitroxides, which not only protect enzymes, cells, and laboratory animals from diverse kinds of biological injury, but also modify the catalytic activity of heme enzymes, could be utilized in chemical and biological systems as a research tool for elucidating the mechanisms underlying complex chemical and biochemical processes.

  17. Web-based management of research groups - using the right tools and an adequate integration strategy

    Energy Technology Data Exchange (ETDEWEB)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de, E-mail: barroso@ipen.b, E-mail: mario@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear

    2011-07-01

    Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group highly diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group and the diversified nature of their relations with the group and with IPEN. The technical competencies and current research lines (or programs) are then described, as well as the research projects and the management scheme of the group. Subsequently, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, such as graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)

  18. Web-based management of research groups - using the right tools and an adequate integration strategy

    International Nuclear Information System (INIS)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de

    2011-01-01

    Nowadays, broad interest in a couple of interlinked subject areas can make the configuration of a research group highly diversified, both in terms of its components and of the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity born 7 years ago that has sustainably attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present components of the group and the diversified nature of their relations with the group and with IPEN. The technical competencies and current research lines (or programs) are then described, as well as the research projects and the management scheme of the group. Subsequently, the web-based management and collaboration tools are described, as well as our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, such as graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)

  19. Simulation Tools for Electrical Machines Modelling: Teaching and ...

    African Journals Online (AJOL)

    Simulation tools are used both for research and teaching to allow a good comprehension of the systems under study before practical implementation. This paper illustrates how MATLAB is used to model non-linearities in a synchronous machine. The machine is modeled in the rotor reference frame with currents as state ...

  20. SIRSALE: integrated video database management tools

    Science.gov (United States)

    Brunie, Lionel; Favory, Loic; Gelas, J. P.; Lefevre, Laurent; Mostefaoui, Ahmed; Nait-Abdesselam, F.

    2002-07-01

    Video databases became an active field of research during the last decade. The main objective of such systems is to provide users with capabilities to search, access and play back distributed stored video data in the same way as they do for traditional distributed databases. Hence, such systems must deal with hard issues: (a) video documents generate huge volumes of data and are time sensitive (streams must be delivered at a specific bitrate); (b) the content of video data is very hard to extract automatically and needs to be annotated by humans. To cope with these issues, many approaches have been proposed in the literature, including data models, query languages, video indexing, etc. In this paper, we present SIRSALE: a set of video database management tools that allow users to manipulate video documents and streams stored in large distributed repositories. All the proposed tools are based on generic models that can be customized for specific applications using ad hoc adaptation modules. More precisely, SIRSALE allows users to: (a) browse video documents by structure (sequences, scenes, shots) and (b) query the video database content using a graphical tool adapted to the nature of the target video documents. This paper also presents an annotation interface that allows archivists to describe the content of video documents. All these tools are coupled to a video player integrating remote VCR functionalities and are based on active network technology. We present how dedicated active services enable optimized transport of video streams (with Tamanoir active nodes). We then describe experiments using SIRSALE on an archive of news videos and soccer matches. The system has been demonstrated to professionals with positive feedback. Finally, we discuss open issues and present some perspectives.

  1. Modelling as an indispensible research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

    Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than producing clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils, both for actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence.
Only clear results in the end are convincing proof for the impact of science, requiring therefore continued involvement of scientists up to the very end of projects. To

  2. Scalable Combinatorial Tools for Health Disparities Research

    Directory of Open Access Journals (Sweden)

    Michael A. Langston

    2014-10-01

    Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual's genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complex systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene × environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.

  3. Shape: A 3D Modeling Tool for Astrophysics.

    Science.gov (United States)

    Steffen, Wolfgang; Koning, Nicholas; Wenger, Stephan; Morisset, Christophe; Magnor, Marcus

    2011-04-01

    We present a flexible, interactive 3D morpho-kinematical modeling application for astrophysics. Compared to other systems, our application reduces the restrictions on the physical assumptions, data type, and amount of data required for reconstruction of an object's morphology. It is one of the first publicly available tools to apply interactive graphics to astrophysical modeling. The tool allows astrophysicists to provide a priori knowledge about the object by interactively defining 3D structural elements. By direct comparison of model predictions with observational data, model parameters can then be automatically optimized to fit the observation. The tool has already been used successfully in a number of astrophysical research projects.

  4. Rapid development of image analysis research tools: Bridging the gap between researcher and clinician with pyOsiriX.

    Science.gov (United States)

    Blackledge, Matthew D; Collins, David J; Koh, Dow-Mu; Leach, Martin O

    2016-02-01

    We present pyOsiriX, a plugin built for the already popular DICOM viewer OsiriX that provides users the ability to extend the functionality of OsiriX through simple Python scripts. This approach allows users to integrate the many cutting-edge scientific/image-processing libraries created for Python into a powerful DICOM visualisation package that is intuitive to use and already familiar to many clinical researchers. Using pyOsiriX we hope to bridge the apparent gap between basic imaging scientists and clinical practice in a research setting and thus accelerate the development of advanced clinical image processing. We provide arguments for the use of Python as a robust scripting language for incorporation into larger software solutions, outline the structure of pyOsiriX and how it may be used to extend the functionality of OsiriX, and provide three case studies that exemplify its utility. For our first case study we use pyOsiriX to provide a tool for smooth histogram display of voxel values within a user-defined region of interest (ROI) in OsiriX. We used a kernel density estimation (KDE) method available in Python through the scikit-learn library, where the total number of lines of Python code required to generate this tool was 22. Our second example presents a scheme for segmentation of the skeleton from CT datasets. We have demonstrated that good segmentation can be achieved for two example CT studies by using a combination of Python libraries including scikit-learn, scikit-image, SimpleITK and matplotlib. Furthermore, this segmentation method was incorporated into an automatic analysis of quantitative PET-CT in a patient with bone metastases from primary prostate cancer. This enabled repeatable statistical evaluation of PET uptake values for each lesion, before and after treatment, providing estimates of the maximum and median standardised uptake values (SUVmax and SUVmed, respectively). 
Following treatment we observed a reduction in lesion volume, SUVmax and SUVmed for
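    The kernel density estimate in the first case study can be sketched without OsiriX or scikit-learn at all; the plain-Python toy below (not the plugin's actual 22-line script) shows how Gaussian kernels turn ROI voxel values into a smooth histogram:

```python
import math

def gaussian_kde(samples, bandwidth):
    """Return a density estimate f(x) built from one Gaussian per sample."""
    n = len(samples)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))
    def f(x):
        return norm * sum(math.exp(-0.5 * ((x - s) / bandwidth) ** 2)
                          for s in samples)
    return f

# Hypothetical ROI voxel values clustering near 11 and 30.
roi_values = [10, 12, 11, 30, 31, 29, 30]
density = gaussian_kde(roi_values, bandwidth=2.0)
print(density(30) > density(20))  # -> True: density peaks at the clusters
```

The scikit-learn `KernelDensity` class the paper references does the same smoothing with optimised kernels and neighbour trees; the bandwidth plays the role of the histogram bin width.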

  5. Paediatric Automatic Phonological Analysis Tools (APAT).

    Science.gov (United States)

    Saraiva, Daniela; Lousada, Marisa; Hall, Andreia; Jesus, Luis M T

    2017-12-01

    To develop the paediatric Automatic Phonological Analysis Tools (APAT) and to estimate inter- and intrajudge reliability, content validity, and concurrent validity. The APAT were constructed using Excel spreadsheets with formulas. The tools were presented to an expert panel for content validation. The corpus used in the Portuguese standardized test Teste Fonético-Fonológico - ALPE, produced by 24 children with phonological delay or phonological disorder, was recorded, transcribed, and then inserted into the APAT. Reliability and validity of the APAT were analyzed. The APAT present strong inter- and intrajudge reliability (>97%). Content validity was also analyzed (ICC = 0.71), and concurrent validity revealed strong correlations between the computerized and manual (traditional) methods. The development of these tools helps fill existing gaps in clinical practice and research, since previously there were no valid and reliable tools/instruments for automatic phonological analysis that allow the analysis of different corpora.
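    The inter- and intrajudge reliability figures (>97%) are agreement percentages between transcriptions; a minimal sketch of that computation, with hypothetical judge data:

```python
def percent_agreement(judge_a, judge_b):
    """Point-by-point agreement between two transcription lists, in %."""
    if len(judge_a) != len(judge_b):
        raise ValueError("judges must rate the same items")
    hits = sum(a == b for a, b in zip(judge_a, judge_b))
    return 100.0 * hits / len(judge_a)

# Hypothetical transcriptions of five target words by two judges.
judge_1 = ["pato", "kasa", "sapo", "bola", "gato"]
judge_2 = ["pato", "kasa", "sapu", "bola", "gato"]
print(percent_agreement(judge_1, judge_2))  # -> 80.0
```

Intrajudge reliability applies the same formula to one judge's transcriptions made on two occasions.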

  6. Integrating Contemplative Tools into Biomedical Science Education and Research Training Programs

    Directory of Open Access Journals (Sweden)

    Rodney R. Dietert

    2014-01-01

    Academic preparation of science researchers and/or human or veterinary medicine clinicians through the science, technology, engineering, and mathematics (STEM) curriculum has usually focused on the students (1) acquiring increased disciplinary expertise, (2) learning needed methodologies and protocols, and (3) expanding their capacity for intense, persistent focus. Such educational training is effective until roadblocks or problems arise with this highly-learned approach. Then, the health science trainee may have few tools available for effective problem solving. Training to achieve flexibility, adaptability, and broadened perspectives using contemplative practices has been rare among biomedical education programs. To address this gap, a Cornell University-based program involving formal biomedical science coursework and health science workshops has been developed to offer science students, researchers and health professionals a broader array of personal, contemplation-based, problem-solving tools. This STEM educational initiative includes first-person exercises designed to broaden perceptual awareness, decrease emotional drama, and mobilize whole-body strategies for creative problem solving. Self-calibration and journaling are used for students to evaluate the personal utility of each exercise. The educational goals are to increase student self-awareness and self-regulation and to provide trainees with value-added tools for career-long problem solving. Basic elements of this educational initiative are discussed using the framework of the Tree of Contemplative Practices.

  7. A Monte Carlo-based treatment-planning tool for ion beam therapy

    CERN Document Server

    Böhlen, T T; Dosanjh, M; Ferrari, A; Haberer, T; Parodi, K; Patera, V; Mairan, A

    2013-01-01

    Ion beam therapy, as an emerging radiation therapy modality, requires continuous efforts to develop and improve tools for patient treatment planning (TP) and research applications. Dose and fluence computation algorithms using the Monte Carlo (MC) technique have served for decades as reference tools for accurate dose computations for radiotherapy. In this work, a novel MC-based treatment-planning (MCTP) tool for ion beam therapy using the pencil beam scanning technique is presented. It allows single-field and simultaneous multiple-field optimization for realistic patient treatment conditions and for dosimetric quality assurance under irradiation conditions at state-of-the-art ion beam therapy facilities. It employs iterative procedures that allow for the optimization of absorbed dose and relative biological effectiveness (RBE)-weighted dose using radiobiological input tables generated by external RBE models. Using a re-implementation of the local effect model (LEM), the MCTP tool is able to perform TP studies u...

  8. Engaging stakeholders: lessons from the use of participatory tools for improving maternal and child care health services.

    Science.gov (United States)

    Ekirapa-Kiracho, Elizabeth; Ghosh, Upasona; Brahmachari, Rittika; Paina, Ligia

    2017-12-28

    Effective stakeholder engagement in research and implementation is important for improving the development and implementation of policies and programmes. A variety of tools has been employed for stakeholder engagement. In this paper, we discuss two participatory methods for engaging with stakeholders - participatory social network analysis (PSNA) and participatory impact pathways analysis (PIPA). Based on our experience, we derive lessons about when and how to apply these tools. This paper was informed by a review of project reports and documents, in addition to reflection meetings with the researchers who applied the tools. These reports were synthesised and used to produce thick descriptions of the applications of the methods, highlighting key lessons. PSNA and PIPA both allowed a deep understanding of how the system actors are interconnected and how they influence maternal health and maternal healthcare services. The findings from the PSNA provided guidance on how stakeholders of a health system are interconnected and how they can stimulate more positive interaction between stakeholders by exposing existing gaps. The PIPA meeting enabled the participants to envision how they could expand their networks and resources by thinking about the contributions they could make to the project. The processes considered critical for successful application of the tools and achievement of outcomes included training of facilitators, the language used during facilitation, the number of times a tool is applied, the length of the tools, pretesting of the tools, and the use of quantitative and qualitative methods. Whereas both tools allowed the identification of stakeholders and provided a deeper understanding of the type of networks and dynamics within the network, PIPA had a higher potential for promoting collaboration between stakeholders, likely because it allows interaction between them. 
Additionally, it was implemented within a participatory action

  9. Jane: a new tool for the cophylogeny reconstruction problem.

    Science.gov (United States)

    Conow, Chris; Fielder, Daniel; Ovadia, Yaniv; Libeskind-Hadas, Ran

    2010-02-03

    This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. The Jane software tool uses a polynomial time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition, the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.
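The reconciliation objective Jane minimizes can be illustrated with a toy cost model: every candidate solution is scored by summing per-event costs. The sketch below is illustrative only; the event names follow the standard cophylogeny event types, but the cost values and function names are assumptions, not Jane's actual defaults or API.

```python
# Hypothetical event-cost model for tree reconciliation.
# The four event types are standard in cophylogeny; the numeric
# costs here are invented defaults for illustration.
EVENT_COSTS = {
    "cospeciation": 0,   # host and parasite speciate together
    "duplication": 1,    # parasite speciates independently of the host
    "host_switch": 2,    # parasite lineage jumps to a new host lineage
    "loss": 1,           # parasite lineage absent from a host lineage
}

def reconciliation_cost(event_counts, costs=EVENT_COSTS):
    """Total cost of a candidate reconciliation: event counts weighted
    by their per-event costs. A search like Jane's minimizes this."""
    return sum(costs[event] * n for event, n in event_counts.items())

# A candidate with 4 cospeciations, 1 duplication, 2 switches, 3 losses:
cost = reconciliation_cost(
    {"cospeciation": 4, "duplication": 1, "host_switch": 2, "loss": 3}
)
print(cost)  # 0*4 + 1*1 + 2*2 + 1*3 = 8
```

Allowing region-dependent host switch costs, as Jane does, would amount to replacing the flat `host_switch` entry with a lookup keyed on (source region, target region).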

  10. [Analysis of researchers' implication in a research-intervention in the Stork Network: a tool for institutional analysis].

    Science.gov (United States)

    Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles

    2016-09-19

    This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.

  11. Knowledge Translation Tools are Emerging to Move Neck Pain Research into Practice.

    Science.gov (United States)

    Macdermid, Joy C; Miller, Jordan; Gross, Anita R

    2013-01-01

    Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in management of neck pain-a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.

  12. Mutated genes as research tool

    International Nuclear Information System (INIS)

    1981-01-01

    Green plants are the ultimate source of all resources required for man's life, his food, his clothes, and almost all his energy requirements. Primitive prehistoric man could live from the abundance of nature surrounding him. Man today, dominating nature in terms of numbers and exploiting its limited resources, cannot exist without employing his intelligence to direct natural evolution. Plant sciences, therefore, are not a matter of curiosity but an essential requirement. From such considerations, the IAEA and FAO jointly organized a symposium to assess the value of mutation research for various kinds of plant science, which directly or indirectly might contribute to sustaining and improving crop production. The benefit through developing better cultivars that plant breeders can derive from using the additional genetic resources resulting from mutation induction has been assessed before at other FAO/IAEA meetings (Rome 1964, Pullman 1969, Bari 1974, Ibadan 1978) and is also monitored in the Mutation Breeding Newsletter, published by IAEA twice a year. Several hundred plant cultivars which carry economically important characters because their genes have been altered by ionizing radiation or other mutagens, are grown by farmers and horticulturists in many parts of the world. But the benefit derived from such mutant varieties is without any doubt surpassed by the contribution which mutation research has made towards the advancement of genetics. For this reason, a major part of the papers and discussions at the symposium dealt with the role induced-mutation research played in providing insight into gene action and gene interaction, the organization of genes in plant chromosomes in view of homology and homoeology, the evolutionary role of gene duplication and polyploidy, the relevance of gene blocks, the possibilities for chromosome engineering, the functioning of cytoplasmic inheritance and the genetic dynamics of populations. In discussing the evolutionary role of

  13. Soil and Water Assessment Tool: Historical Development, Applications, and Future Research Directions, The

    OpenAIRE

    Philip W. Gassman; Manuel R. Reyes; Colleen H. Green; Jeffrey G. Arnold

    2007-01-01

    The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been ad...

  14. Easily configured real-time CPOE Pick Off Tool supporting focused clinical research and quality improvement.

    Science.gov (United States)

    Rosenbaum, Benjamin P; Silkin, Nikolay; Miller, Randolph A

    2014-01-01

    Real-time alerting systems typically warn providers about abnormal laboratory results or medication interactions. For more complex tasks, institutions create site-wide 'data warehouses' to support quality audits and longitudinal research. Sophisticated systems like i2b2 or Stanford's STRIDE utilize data warehouses to identify cohorts for research and quality monitoring. However, substantial resources are required to install and maintain such systems. For more modest goals, an organization desiring merely to identify patients with 'isolation' orders, or to determine patients' eligibility for clinical trials, may adopt a simpler, limited approach based on processing the output of one clinical system, and not a data warehouse. We describe a limited, order-entry-based, real-time 'pick off' tool, utilizing public domain software (PHP, MySQL). Through a web interface the tool assists users in constructing complex order-related queries and auto-generates corresponding database queries that can be executed at recurring intervals. We describe successful application of the tool for research and quality monitoring.
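The query auto-generation step described above can be sketched in a few lines. This is a hypothetical Python illustration only (the actual tool is written in PHP against MySQL), and the table and column names are invented for the example.

```python
# Hypothetical sketch of auto-generating a parameterized order query,
# in the spirit of the "pick off" tool; the `orders` table and its
# columns are invented for illustration.

def build_order_query(order_names, since):
    """Build a parameterized SQL query matching any of the given
    order names placed on or after a cutoff timestamp."""
    placeholders = ", ".join(["%s"] * len(order_names))
    sql = (
        "SELECT patient_id, order_name, order_time "
        "FROM orders "
        f"WHERE order_name IN ({placeholders}) AND order_time >= %s"
    )
    return sql, [*order_names, since]

# A query for 'isolation'-type orders, to be re-executed at intervals:
sql, params = build_order_query(
    ["ISOLATION", "CONTACT PRECAUTIONS"], "2014-01-01"
)
```

Keeping the SQL parameterized (placeholders plus a separate parameter list) rather than interpolating user input is what makes a web-facing query builder like this safe to expose.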

  15. Low field pulsed NMR- a mass screening tool in agricultural research

    International Nuclear Information System (INIS)

    Tiwari, P.N.

    1994-01-01

    One of the main requirements in agricultural research is to analyse a large number of samples for one or more of their chemical constituents and physical properties. In plant breeding programmes and germplasm evaluation, the analysis must be fast, as many samples are to be analysed. Pulsed nuclear magnetic resonance (NMR) is a potential tool for developing rapid and nondestructive methods of analysis. Various applications of low resolution pulsed NMR in agricultural research, which are generally used as screening methods, are briefly described. 25 refs., 2 figs., 2 tabs

  16. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research.

    Science.gov (United States)

    Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka

    2016-05-05

    A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.

  17. MEDICAL INFORMATICS: AN ESSENTIAL TOOL FOR HEALTH SCIENCES RESEARCH IN ACUTE CARE

    OpenAIRE

    Li, Man; Pickering, Brian W.; Smith, Vernon D.; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly

    2009-01-01

    Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and adminis...

  18. Medical Informatics: An Essential Tool for Health Sciences Research in Acute Care

    OpenAIRE

    Man Li; Brian W. Pickering; Vernon D. Smith; Mirsad Hadzikadic; Ognjen Gajic; Vitaly Herasevich

    2009-01-01

    Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and adminis...

  19. 10th International Workshop on Parallel Tools for High Performance Computing

    CERN Document Server

    Gracia, José; Hilbrich, Tobias; Knüpfer, Andreas; Resch, Michael; Nagel, Wolfgang

    2017-01-01

    This book presents the proceedings of the 10th International Parallel Tools Workshop, held October 4-5, 2016 in Stuttgart, Germany – a forum to discuss the latest advances in parallel tools. High-performance computing plays an increasingly important role for numerical simulation and modelling in academic and industrial research. At the same time, using large-scale parallel systems efficiently is becoming more difficult. A number of tools addressing parallel program development and analysis have emerged from the high-performance computing community over the last decade, and what may have started as a collection of small helper scripts has now matured into production-grade frameworks. Powerful user interfaces and an extensive body of documentation allow easy usage by non-specialists. Some tools have been commercialized, while others are operated as open source by a growing research community.

  20. VAO Tools Enhance CANDELS Research Productivity

    Science.gov (United States)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.

  1. Decision support frameworks and tools for conservation

    Science.gov (United States)

    Schwartz, Mark W.; Cook, Carly N.; Pressey, Robert L.; Pullin, Andrew S.; Runge, Michael C.; Salafsky, Nick; Sutherland, William J.; Williamson, Matthew A.

    2018-01-01

    The practice of conservation occurs within complex socioecological systems fraught with challenges that require transparent, defensible, and often socially engaged project planning and management. Planning and decision support frameworks are designed to help conservation practitioners increase planning rigor, project accountability, stakeholder participation, transparency in decisions, and learning. We describe and contrast five common frameworks within the context of six fundamental questions (why, who, what, where, when, how) at each of three planning stages of adaptive management (project scoping, operational planning, learning). We demonstrate that decision support frameworks provide varied and extensive tools for conservation planning and management. However, using any framework in isolation risks diminishing potential benefits since no one framework covers the full spectrum of potential conservation planning and decision challenges. We describe two case studies that have effectively deployed tools from across conservation frameworks to improve conservation actions and outcomes. Attention to the critical questions for conservation project planning should allow practitioners to operate within any framework and adapt tools to suit their specific management context. We call on conservation researchers and practitioners to regularly use decision support tools as standard practice for framing both practice and research.

  2. Medical informatics: an essential tool for health sciences research in acute care.

    Science.gov (United States)

    Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly

    2009-10-01

    Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms -- "sniffers", administrative reports, decision support and clinical research applications are presented.
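A "sniffer"-style intelligent alarm of the kind mentioned above can be sketched minimally. The field names and threshold below are illustrative assumptions, not the actual ICU DataMart schema or alert rules.

```python
# Minimal sketch of a "sniffer"-style rule: scan warehouse records and
# flag patients whose value crosses a clinical threshold. Field names
# and the lactate threshold are invented for illustration.

def lactate_sniffer(records, threshold=4.0):
    """Return the patient ids whose lactate exceeds the threshold."""
    return [r["patient_id"] for r in records if r["lactate"] > threshold]

alerts = lactate_sniffer([
    {"patient_id": 1, "lactate": 5.1},
    {"patient_id": 2, "lactate": 1.2},
])
print(alerts)  # [1]
```

In a warehouse like the one described, the same rule would typically live as a scheduled SQL query; the Python form just makes the logic explicit.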

  3. Medical Informatics: An Essential Tool for Health Sciences Research in Acute Care

    Directory of Open Access Journals (Sweden)

    Man Li

    2009-10-01

    Full Text Available Medical Informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance and application of a customized critical care, Microsoft SQL based, research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms – “sniffers”, administrative reports, decision support and clinical research applications are presented.

  4. A web-based tool to engage stakeholders in informing research planning for future decisions on emerging materials

    International Nuclear Information System (INIS)

    Powers, Christina M.; Grieger, Khara D.; Hendren, Christine Ogilvie; Meacham, Connie A.; Gurevich, Gerald; Lassiter, Meredith Gooding; Money, Eric S.; Lloyd, Jennifer M.; Beaulieu, Stephen M.

    2014-01-01

    Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb

  5. Grooved stone tools from Calabria region (Italy): Archaeological evidence and research perspectives

    Directory of Open Access Journals (Sweden)

    Felice Larocca

    2016-10-01

    Full Text Available Since the end of the 19th century the Calabria region in southern Italy has been known for an abundance of grooved stone axes and hammers used during late prehistory. These artefacts are characterized by a wide and often pronounced groove in the middle of the implement, thought to have aided securing the head to a wooden haft. Their widespread presence is known both in prehistoric archaeological literature and in the archaeological collections of various regional and extra-regional museums. At first, scholars did not relate these tools to the rich Calabrian ore deposits and to possible ancient mining activities; they were regarded simply as a variant of ground lithic industry of Neolithic tradition. However, between 1997 and 2012, about 50 tools were discovered in the prehistoric mine of Grotta della Monaca in northern Calabria where there are outcrops of copper and iron ore. This allowed us to recognize their specific mining value and to consider them as a sort of “guide fossil” for the identification of ancient mining districts. This paper presents the results of a study involving over 150 tools from the entire region, effectively demonstrating an almost perfect co-occurrence of grooved axes and hammers with areas rich in mineral resources, especially metalliferous ores.

  6. Extension of an Object-Oriented Optimization Tool: User's Reference Manual

    Science.gov (United States)

    Pak, Chan-Gi; Truong, Samson S.

    2015-01-01

    The National Aeronautics and Space Administration Armstrong Flight Research Center has developed a cost-effective and flexible object-oriented optimization (O³) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. This object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the O³ tool and the discipline modules, or both. Six different sample mathematical problems are presented to demonstrate the performance of the O³ tool. Instructions for preparing input data for the O³ tool are detailed in this user's manual.
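The idea of optimizing in a loop between the optimization tool and separate discipline modules can be sketched as a toy. This is illustrative only, not the O³ tool's actual interface: the greedy search and the two discipline objectives are assumptions made for the example.

```python
# Illustrative loop between an optimizer and independent discipline
# modules: each module scores the shared design variable on its own,
# and the optimizer minimizes the combined objective.

def optimize(disciplines, x0, step=0.1, iters=100):
    """Greedy coordinate search over one design variable; each
    discipline is a callable contributing to the total objective."""
    x = x0
    for _ in range(iters):
        best = min((x - step, x, x + step),
                   key=lambda v: sum(d(v) for d in disciplines))
        if best == x:
            break  # no neighbouring design is better; stop
        x = best
    return x

def aero(x):      # toy "aerodynamics" objective, minimum at x = 1.0
    return (x - 1.0) ** 2

def struct(x):    # toy "structures" objective, minimum at x = 2.0
    return (x - 2.0) ** 2

# The combined minimum sits between the two disciplines, at x = 1.5:
x_opt = optimize([aero, struct], x0=0.0)
```

Each discipline here is a plain function, mirroring the framework's idea that analysis codes stay separate modules while the optimizer only sees their outputs.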

  7. Analyst Tools and Quality Control Software for the ARM Data System

    Energy Technology Data Exchange (ETDEWEB)

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
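One simple way to flag outliers in a data stream, in the spirit of the automated quality control checks described above (the specific ARM metrics are not given here, so this is an assumed stand-in), is a standard-deviation threshold:

```python
# Flag points more than k standard deviations from the mean; a basic
# outlier check of the kind an automated QC pass might apply to a
# data stream. The choice of k = 3 is a common convention, not an
# ARM-specified value.
import statistics

def flag_outliers(values, k=3.0):
    """Return the indices of values lying more than k standard
    deviations from the sample mean."""
    mean = statistics.fmean(values)
    sd = statistics.stdev(values)
    return [i for i, v in enumerate(values) if abs(v - mean) > k * sd]

# A flat stream with one spike: only the spike is flagged.
print(flag_outliers([10.0] * 20 + [100.0]))  # [20]
```

Production QC would add per-instrument valid ranges and missing-value handling on top of a statistical check like this, but the flagged-index output is the same shape of result that feeds a health-and-status display.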

  8. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov (United States)

    Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to help battery designers. Knowledge of the interplay of multi-physics at varied scales is imperative when using the CAEBAT software tools.

  9. Metaphors and Drawings as Research Tools of Head Teachers' Perceptions on Their Management and Leadership Roles and Responsibilities

    Science.gov (United States)

    Argyropoulou, Eleftheria; Hatira, Kalliopi

    2014-01-01

    This article introduces an alternative qualitative research tool: metaphor and drawing, as projections of personality features, to explore underlying concepts and values, thoughts and beliefs, fears and hesitations, aspirations and ambitions of the research subjects. These two projective tools are used to explore Greek state kindergarten head…

  10. Research on Key Technologies of Unit-Based CNC Machine Tool Assembly Design

    Directory of Open Access Journals (Sweden)

    Zhongqi Sheng

    2014-01-01

    Full Text Available Assembly is the part of the product design and manufacturing process that produces the maximum workload and consumes the most time. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connections between parts, semantic information of transmission, and geometric constraint information were quantified into an assembly connection strength to depict the level of assembling difficulty. Transmissibility based on trust relationships was applied to the assembly connection strength. Assembly unit partition based on assembly connection strength was conducted, and interferential assembly units were identified and revised. The assembly sequence planning and optimization of parts within each assembly unit and between assembly units was conducted using a genetic algorithm. Taking a certain type of high-speed CNC turning center as an example, this paper explores assembly modeling, assembly unit partition, and assembly sequence planning and optimization, and realizes an optimized assembly sequence for the headstock of a CNC machine tool.

  11. Jane: a new tool for the cophylogeny reconstruction problem

    Directory of Open Access Journals (Sweden)

    Ovadia Yaniv

    2010-02-01

    Full Text Available Abstract Background This paper describes the theory and implementation of a new software tool, called Jane, for the study of historical associations. This problem arises in parasitology (associations of hosts and parasites), molecular systematics (associations of organisms and genes), and biogeography (associations of regions and organisms). The underlying problem is that of reconciling pairs of trees subject to biologically plausible events and costs associated with these events. Existing software tools for this problem have strengths and limitations, and the new Jane tool described here provides functionality that complements existing tools. Results The Jane software tool uses a polynomial time dynamic programming algorithm in conjunction with a genetic algorithm to find very good, and often optimal, solutions even for relatively large pairs of trees. The tool allows the user to provide rich timing information on both the host and parasite trees. In addition the user can limit host switch distance and specify multiple host switch costs by specifying regions in the host tree and costs for host switches between pairs of regions. Jane also provides a graphical user interface that allows the user to interactively experiment with modifications to the solutions found by the program. Conclusions Jane is shown to be a useful tool for cophylogenetic reconstruction. Its functionality complements existing tools and it is therefore likely to be of use to researchers in the areas of parasitology, molecular systematics, and biogeography.

  12. The Shared Health Research Information Network (SHRINE): a prototype federated query tool for clinical data repositories.

    Science.gov (United States)

    Weber, Griffin M; Murphy, Shawn N; McMurry, Andrew J; Macfadden, Douglas; Nigrin, Daniel J; Churchill, Susanne; Kohane, Isaac S

    2009-01-01

    The authors developed a prototype Shared Health Research Information Network (SHRINE) to identify the technical, regulatory, and political challenges of creating a federated query tool for clinical data repositories. Separate Institutional Review Boards (IRBs) at Harvard's three largest affiliated health centers approved use of their data, and the Harvard Medical School IRB approved building a Query Aggregator Interface that can simultaneously send queries to each hospital and display aggregate counts of the number of matching patients. Our experience creating three local repositories using the open source Informatics for Integrating Biology and the Bedside (i2b2) platform can be used as a road map for other institutions. The authors are actively working with the IRBs and regulatory groups to develop procedures that will ultimately allow investigators to obtain identified patient data and biomaterials through SHRINE. This will guide us in creating a future technical architecture that is scalable to a national level, compliant with ethical guidelines, and protective of the interests of the participating hospitals.
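The aggregate-count pattern described above, in which each site answers a query with only the number of matching patients, can be sketched in a few lines. The site names and function below are invented for illustration; SHRINE's actual Query Aggregator Interface is not shown here.

```python
# Toy sketch of the federated aggregate-count pattern: sites return
# only match counts (no patient-level data), and the aggregator
# combines them. Site names are invented.

def aggregate_counts(site_results):
    """Combine per-site match counts into a total while keeping the
    per-site breakdown for display."""
    total = sum(site_results.values())
    return {"per_site": site_results, "total": total}

result = aggregate_counts({"Site A": 120, "Site B": 95, "Site C": 40})
print(result["total"])  # 255
```

The privacy property of the design falls out of the data flow: the aggregator only ever sees integers per site, which is why the IRB-approved prototype could display aggregate counts before procedures for identified data were in place.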

  13. Simultaneous Scheduling of Jobs, AGVs and Tools Considering Tool Transfer Times in Multi Machine FMS By SOS Algorithm

    Science.gov (United States)

    Sivarami Reddy, N.; Ramamurthy, D. V., Dr.; Prahlada Rao, K., Dr.

    2017-08-01

    This article addresses the simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools between machines, to generate the best optimal sequences that minimize makespan in a multi-machine Flexible Manufacturing System (FMS). The performance of an FMS is expected to improve through effective utilization of its resources, by proper integration and synchronization of their scheduling. The Symbiotic Organisms Search (SOS) algorithm is a potent tool that has proven itself a strong alternative for solving optimization problems such as scheduling. The proposed SOS algorithm is tested on 22 job sets with makespan as the objective for scheduling of machines and tools where machines are allowed to share tools without considering transfer times of jobs and tools, and the results are compared with those of existing methods. The results show that SOS outperforms the existing methods. The same SOS algorithm is then used for simultaneous scheduling of machines, AGVs and tools, where machines are allowed to share tools, considering transfer times of jobs and tools, to determine the best optimal sequences that minimize makespan.
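The makespan objective that the SOS search minimizes can be computed for a fixed operation sequence as below. The job, machine, and duration data are invented for illustration, and AGV and tool transfer times are omitted for brevity.

```python
# Makespan of a fixed operation sequence: each operation starts as
# soon as both its job and its machine are free. A metaheuristic such
# as SOS searches over sequences to minimize this value.

def makespan(operations):
    """operations: list of (job, machine, duration) in processing
    order; returns the completion time of the last operation."""
    job_free, machine_free = {}, {}
    end = 0
    for job, machine, dur in operations:
        start = max(job_free.get(job, 0), machine_free.get(machine, 0))
        finish = start + dur
        job_free[job] = machine_free[machine] = finish
        end = max(end, finish)
    return end

# J1 runs on M1 (0-3), J2 waits for M1 (3-5), J1 moves to M2 (3-7):
ops = [("J1", "M1", 3), ("J2", "M1", 2), ("J1", "M2", 4)]
print(makespan(ops))  # 7
```

Adding the paper's transfer times would mean delaying `start` further by the time needed to move the job (or a shared tool) to the target machine; the evaluation structure stays the same.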

  14. Information Technology Research Services: Powerful Tools to Keep Up with a Rapidly Moving Field

    Science.gov (United States)

    Hunter, Paul

    2010-01-01

Many firms offer Information Technology research reports, analyst calls, conferences, seminars, tools, leadership development, etc. These entities include Gartner, Forrester Research, IDC, The Burton Group, the Society for Information Management, Info-Tech Research, The Corporate Executive Board, and so on. This talk will cover how a number of such services are being used at the Goddard Space Flight Center to improve our IT management practices, workforce skills, approach to innovation, and service delivery. These tools and services are used across the workforce, from the executive leadership to the IT worker. The presentation will cover the types of services each vendor provides and their primary engagement model. The use of these services at other NASA Centers and Headquarters will be included. In addition, I will explain how two of these services are available now to the entire NASA IT workforce through enterprise-wide subscriptions.

  15. Cost-effective evolution of research prototypes into end-user tools: The MACH case study

    DEFF Research Database (Denmark)

    Störrle, Harald

    2017-01-01

's claim by fellow scientists, and (3) demonstrate the utility and value of the research contribution to any interested parties. However, turning an exploratory prototype into a "proper" tool for end-users often entails great effort. Heavyweight mainstream frameworks such as Eclipse do not address this issue; their steep learning curves constitute substantial entry barriers to such ecosystems. In this paper, we present the Model Analyzer/Checker (MACH), a stand-alone tool with a command-line interpreter. MACH integrates a set of research prototypes for analyzing UML models. By choosing a simple command-line interpreter rather than a (costly) graphical user interface, we achieved the core goal of quickly deploying research results to a broader audience while keeping the required effort to an absolute minimum. We analyze MACH as a case study of how requirements and constraints in an academic...

  16. Communications tools in research projects to support Semi and Non Structured Information

    Directory of Open Access Journals (Sweden)

    Astrid Jaime

    2005-06-01

Full Text Available Innovation, and thus the production of knowledge, has become a factor of competitiveness. In this context, quality management can be complemented by knowledge management, with the aim of improving the production of knowledge by research activities. To this end, after describing knowledge and information typologies in engineering activities, a knowledge management system is proposed. The goal is to support: (1) Semi-Structured Information (e.g., reports) through the BASIC-Lab tool functions, which are based on attaching points of view and annotations to documents and document zones, and (2) Non-Structured Information (such as mail, dialogues, etc.) through the MICA-Graph approach, which is intended to support the exchange of technical messages concerning the common resolution of research problems within project teams and to capitalise relevant knowledge. For both approaches, prototype tools have been developed and evaluated, primarily to feed back manufacturing knowledge in the EADS industrial environment.

  17. Co-authorship Network Analysis: A Powerful Tool for Strategic Planning of Research, Development and Capacity Building Programs on Neglected Diseases

    Science.gov (United States)

    Morel, Carlos Medicis; Serruya, Suzanne Jacob; Penna, Gerson Oliveira; Guimarães, Reinaldo

    2009-01-01

    Background New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. Methodology/Principal Findings Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize author- and institution's names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. Conclusions/Significance Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building objectives and a more
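The hub and cut-point analysis the abstract describes can be sketched on a toy co-authorship graph. The author names and papers below are invented, and the original study used dedicated social network analysis software rather than this hand-rolled traversal:

```python
from itertools import combinations
from collections import defaultdict

# Invented author lists standing in for publications mined from databases.
papers = [
    ["Silva", "Souza", "Lima"],
    ["Silva", "Costa"],
    ["Costa", "Pereira"],
    ["Pereira", "Almeida"],
]

graph = defaultdict(set)
for authors in papers:
    for a, b in combinations(authors, 2):  # every co-author pair shares an edge
        graph[a].add(b)
        graph[b].add(a)

def reachable(start, removed=None):
    """Authors reachable from start when 'removed' is taken out of the network."""
    seen, stack = set(), [start]
    while stack:
        node = stack.pop()
        if node in seen or node == removed:
            continue
        seen.add(node)
        stack.extend(graph[node] - seen)
    return seen

nodes = set(graph)
hub = max(nodes, key=lambda n: len(graph[n]))  # author with the most co-authors
cut_points = sorted(                           # authors whose removal disconnects the network
    n for n in nodes
    if len(reachable(next(iter(nodes - {n})), removed=n)) < len(nodes) - 1
)
print(hub, cut_points)  # Silva ['Costa', 'Pereira', 'Silva']
```

Here "Silva" is the central hub (highest degree), and removing "Silva", "Costa", or "Pereira" splits the graph, which is the kind of structural information the Program's planners used to identify critical institutions and authors.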

  18. Co-authorship network analysis: a powerful tool for strategic planning of research, development and capacity building programs on neglected diseases.

    Directory of Open Access Journals (Sweden)

    Carlos Medicis Morel

    Full Text Available BACKGROUND: New approaches and tools were needed to support the strategic planning, implementation and management of a Program launched by the Brazilian Government to fund research, development and capacity building on neglected tropical diseases with strong focus on the North, Northeast and Center-West regions of the country where these diseases are prevalent. METHODOLOGY/PRINCIPAL FINDINGS: Based on demographic, epidemiological and burden of disease data, seven diseases were selected by the Ministry of Health as targets of the initiative. Publications on these diseases by Brazilian researchers were retrieved from international databases, analyzed and processed with text-mining tools in order to standardize author- and institution's names and addresses. Co-authorship networks based on these publications were assembled, visualized and analyzed with social network analysis software packages. Network visualization and analysis generated new information, allowing better design and strategic planning of the Program, enabling decision makers to characterize network components by area of work, identify institutions as well as authors playing major roles as central hubs or located at critical network cut-points and readily detect authors or institutions participating in large international scientific collaborating networks. CONCLUSIONS/SIGNIFICANCE: Traditional criteria used to monitor and evaluate research proposals or R&D Programs, such as researchers' productivity and impact factor of scientific publications, are of limited value when addressing research areas of low productivity or involving institutions from endemic regions where human resources are limited. Network analysis was found to generate new and valuable information relevant to the strategic planning, implementation and monitoring of the Program. It afforded a more proactive role of the funding agencies in relation to public health and equity goals, to scientific capacity building

  19. Managing research and surveillance projects in real-time with a novel open-source eManagement tool designed for under-resourced countries.

    Science.gov (United States)

    Steiner, Andreas; Hella, Jerry; Grüninger, Servan; Mhalu, Grace; Mhimbira, Francis; Cercamondi, Colin I; Doulla, Basra; Maire, Nicolas; Fenner, Lukas

    2016-09-01

A software tool was developed to facilitate data entry and to monitor research projects in under-resourced countries in real-time. The eManagement tool "odk_planner" is written in the scripting languages PHP and Python. The odk_planner is lightweight and uses minimal internet resources. It was designed to be used with the open source software Open Data Kit (ODK). Users can easily configure odk_planner to meet their needs, and the online interface displays data collected from ODK forms in a graphically informative way. The odk_planner also allows users to upload pictures and laboratory results and sends text messages automatically. User-defined access rights protect data and privacy. We present examples from four field applications in Tanzania that successfully used the eManagement tool: 1) a clinical trial; 2) a longitudinal tuberculosis (TB) cohort study with a complex visit schedule, where it was used to graphically display missing case report forms, upload digitized X-rays, and send text message reminders to patients; 3) an intervention study to improve TB case detection, carried out at pharmacies: a tablet-based electronic referral system monitored referred patients and sent automated messages to remind pharmacy clients to visit a TB clinic; and 4) TB retreatment case monitoring designed to improve drug resistance surveillance: clinicians at four public TB clinics and lab technicians at the TB reference laboratory used a smartphone-based application that tracked sputum samples and collected clinical and laboratory data. The user-friendly, open source odk_planner is a simple but multi-functional, web-based eManagement tool with add-ons that helps researchers conduct studies in under-resourced countries.

  20. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  1. Using Open Source Tools to Create a Mobile Optimized, Crowdsourced Translation Tool

    Directory of Open Access Journals (Sweden)

    Evviva Weinraub Lajoie

    2014-04-01

Full Text Available In late 2012, OSU Libraries and Press partnered with Maria's Libraries, an NGO in rural Kenya, to give users the ability to crowdsource translations of folk tales and existing children's books into a variety of African languages, sub-languages, and dialects. Together, these two organizations have been creating a mobile-optimized platform using open source libraries such as the Wink Toolkit (a library which provides mobile-friendly interaction from a website) and Globalize3 (to allow for multiple translations of database entries) in a Ruby on Rails application. Research regarding the successes of similar tools has been utilized to provide a consistent user interface. The OSU Libraries & Press team delivered a proof-of-concept tool that has the opportunity to promote technology exploration, improve early childhood literacy, change the way we approach foreign language learning, and provide opportunities for cost-effective, multi-language publishing.

  2. A web-based tool to engage stakeholders in informing research planning for future decisions on emerging materials

    Energy Technology Data Exchange (ETDEWEB)

    Powers, Christina M., E-mail: powers.christina@epa.gov [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Grieger, Khara D., E-mail: kgrieger@rti.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Hendren, Christine Ogilvie, E-mail: chendren@duke.edu [Center for the Environmental Implications of NanoTechnology, Duke University, Durham, NC 27708 (United States); Meacham, Connie A., E-mail: meacham.connie@epa.gov [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Gurevich, Gerald, E-mail: gurevich.gerald@epa.gov [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Lassiter, Meredith Gooding, E-mail: lassiter.meredith@epa.gov [National Center for Environmental Assessment, Office of Research and Development, U.S. Environmental Protection Agency, Research Triangle Park, NC 27711 (United States); Money, Eric S., E-mail: emoney@rti.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Lloyd, Jennifer M., E-mail: jml@rti.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States); Beaulieu, Stephen M., E-mail: steveb@rti.org [RTI International, 3040 Cornwallis Rd., Research Triangle Park, NC 27709 (United States)

    2014-02-01

    Prioritizing and assessing risks associated with chemicals, industrial materials, or emerging technologies is a complex problem that benefits from the involvement of multiple stakeholder groups. For example, in the case of engineered nanomaterials (ENMs), scientific uncertainties exist that hamper environmental, health, and safety (EHS) assessments. Therefore, alternative approaches to standard EHS assessment methods have gained increased attention. The objective of this paper is to describe the application of a web-based, interactive decision support tool developed by the U.S. Environmental Protection Agency (U.S. EPA) in a pilot study on ENMs. The piloted tool implements U.S. EPA's comprehensive environmental assessment (CEA) approach to prioritize research gaps. When pursued, such research priorities can result in data that subsequently improve the scientific robustness of risk assessments and inform future risk management decisions. Pilot results suggest that the tool was useful in facilitating multi-stakeholder prioritization of research gaps. Results also provide potential improvements for subsequent applications. The outcomes of future CEAWeb applications with larger stakeholder groups may inform the development of funding opportunities for emerging materials across the scientific community (e.g., National Science Foundation Science to Achieve Results [STAR] grants, National Institutes of Health Requests for Proposals). - Highlights: • A web-based, interactive decision support tool was piloted for emerging materials. • The tool (CEAWeb) was based on an established approach to prioritize research gaps. • CEAWeb facilitates multi-stakeholder prioritization of research gaps. • We provide recommendations for future versions and applications of CEAWeb.

  3. Theatre elicitation integrating a participatory research tool in a mixed-method study

    NARCIS (Netherlands)

    Roerig, S.; Evers, S.J.T.M.; Krabbendam, L.

    2015-01-01

The relation between theatre, or drama, and research is not novel, as is illustrated by concepts such as role theory, theatre for development, or distancing in drama therapy. In various scientific fields theatre is used as a communicative and/or educative tool; however, in the realm of childhood

  4. Electric gun: a new tool for ultrahigh-pressure research

    International Nuclear Information System (INIS)

    Weingart, R.C.; Chau, H.H.; Goosman, D.R.; Hofer, W.W.; Honodel, C.A.; Lee, R.S.; Steinberg, D.J.; Stroud, J.R.

    1979-01-01

    We have developed a new tool for ultrahigh-pressure research at LLL. This system, which we call the electric gun, has already achieved thin flyer plate velocities in excess of 20 km/s and pressures of the order of 2 TPa in tantalum. We believe that the electric gun is competitive with laser- and nuclear-driven methods of producing shocks in the 1-to-5 TPa range because of its precision and ease and economy of operation. Its development is recommended for shock initiation studies, dry runs for Site 300 hydroshots, and as a shock wave generator for surface studies

  5. Conceptualising the Use of Facebook in Ethnographic Research: As Tool, as Data and as Context

    Science.gov (United States)

    Baker, Sally

    2013-01-01

    This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…

  6. Does health intervention research have real world policy and practice impacts: testing a new impact assessment tool.

    Science.gov (United States)

    Cohen, Gillian; Schroeder, Jacqueline; Newson, Robyn; King, Lesley; Rychetnik, Lucie; Milat, Andrew J; Bauman, Adrian E; Redman, Sally; Chapman, Simon

    2015-01-01

There is a growing emphasis on the importance of research having demonstrable public benefit, so measurements of the impacts of research are needed. We applied a modified impact assessment process that builds on best practice to 5 years (2003-2007) of intervention research funded by Australia's National Health and Medical Research Council to determine whether these studies had post-research, real-world policy and practice impacts. We used a mixed-method sequential methodology whereby chief investigators of eligible intervention studies who completed two surveys and an interview were included in our final sample (n = 50), on which we conducted post-research impact assessments. Data from the surveys and interviews were triangulated with additional information obtained from documentary analysis to develop comprehensive case studies. These case studies were then summarized, and the reported impacts were scored by an expert panel using criteria for four impact dimensions: corroboration, attribution, reach, and importance. Nineteen (38%) of the cases in our final sample were found to have had policy and practice impacts, with an even distribution of high, medium, and low impact scores. While the tool facilitated a rigorous and explicit criterion-based assessment of post-research impacts, it was not always possible to obtain evidence through documentary analysis to corroborate the impacts reported in chief investigator interviews. While policy and practice are ideally informed by reviews of evidence, some intervention research can and does have real-world impacts that can be attributed to single studies. We recommend that impact assessments apply explicit criteria to consider the corroboration, attribution, reach, and importance of reported impacts on policy and practice. Impact assessments should also allow sufficient time between completion of the original research and impact data collection, and include mechanisms to obtain end-user input to corroborate claims and reduce biases.
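A minimal sketch of combining the four dimension scores into a high/medium/low impact band might look as follows. The 1-3 scale and the band cut-offs are invented for illustration; the study's actual rubric is not reproduced here:

```python
DIMENSIONS = ("corroboration", "attribution", "reach", "importance")

def impact_band(scores):
    """Average the four 1-3 dimension scores and band the result (hypothetical rubric)."""
    mean = sum(scores[d] for d in DIMENSIONS) / len(DIMENSIONS)
    if mean >= 2.5:
        return "high"
    if mean >= 1.5:
        return "medium"
    return "low"

# One invented case study rated by an expert panel.
case = {"corroboration": 3, "attribution": 2, "reach": 3, "importance": 3}
print(impact_band(case))  # high
```

In practice each dimension would carry its own criteria and the panel scores would be reconciled before banding; the point here is only that explicit criteria make the classification reproducible.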

  7. Intelligent tools for building a scientific information platform from research to implementation

    CERN Document Server

    Skonieczny, Łukasz; Rybiński, Henryk; Kryszkiewicz, Marzena; Niezgódka, Marek

    2014-01-01

This book is a selection of results obtained within three years of research performed under SYNAT, a nation-wide scientific project aiming to create an infrastructure for scientific content storage and sharing for academia, education and the open knowledge society in Poland. The book is intended to be the last of the series related to the SYNAT project. The previous books, titled “Intelligent Tools for Building a Scientific Information Platform” and “Intelligent Tools for Building a Scientific Information Platform: Advanced Architectures and Solutions”, were published as volumes 390 and 467 in Springer's Studies in Computational Intelligence. Its contents are based on the SYNAT 2013 Workshop held in Warsaw. The papers included in this volume present an overview and insight into information retrieval, repository systems, text processing, ontology-based systems, text mining, multimedia data processing and advanced software engineering, addressing the problems of implementing intelligent tools for building...

  8. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim to identify how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid objective data analysis protocols. This study reports on the attempt to further develop these areas of study and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn surface invasiveness, complicating the ability to investigate duration related textural characterisation.

  9. The new designs of diamond drill bits for composite polymers tooling

    Directory of Open Access Journals (Sweden)

    Ruslan Yu. Melentiev

    2015-12-01

Full Text Available The author explores the drilling of new engineering materials such as carbon fiber reinforced plastics (CFRP) and other polymers that have an anisotropic structure and high-strength, elastic properties combined with low heat endurance. This combination of properties makes it impossible to simply transfer existing machining technologies for classic materials to this new class. At the same time, existing tools cannot assure the specified quality of tooled products at the current productivity and tool life. Aim: The aim of this research is to increase the process efficiency of diamond drilling in composite polymers by developing new designs of diamond drill bits. Materials and Methods: One of the most promising directions for solving this problem is the diamond-coated abrasive-type tool. This paper addresses and classifies the existing types of diamond drill bits according to their application and operation. An analysis of literature data on known disadvantages during drilling, surface quality, and joining faces was performed. Results: The author's experimental research demonstrates the negative effect of an already known but neglected phenomenon: blocking of the drill core. The most important factors and structural features affecting the CFRP drilling process are revealed. Accounting for these factors allowed the creation of a set of unique designs of diamond drill bits for different purposes. The patented models presented have different allowance distribution schemes and cutting forces, and thus satisfy the requirements for quality, productivity, tool life, and hole geometry in the tooling of the specified material class.

  10. Creation of a personality garden--a tool for reflection and teacher development; an autoethnographical research paper.

    Science.gov (United States)

    O'Keeffe, Tracey

    2015-01-01

This paper focuses on the Creation of a Personality Garden as a development tool. The original concept of the Garden was born from an autoethnographical study on the effects of self-concept on the teaching and learning experience. The aim was to explore the effects of self-concept on the teaching and learning experience through an autoethnographical study undertaken in London, UK. The researcher was also the sole participant, in line with the autoethnographical approach. Data were collected by means of a reflective diary, personal memory data, interview, and other creative genres. A thematic analysis approach was then used to code and group core concepts. Three key areas were identified: emotional connection, growth, and resilience, with a fourth, the audience and act of teaching, as an over-arching driver for the study. These elements appeared to underpin a teaching philosophy which recognises the benefits of self-awareness in teachers and an ability and willingness to connect with learners and respond to individual needs. The Garden was one element of self-reflective data which was later re-designed to embrace the personal transformation of the researcher throughout the study. Educationalists must have a willingness to explore self-perception, as it can facilitate a sense of transparency and connection between the teacher and the learner. The Garden works as a dynamic tool and a sustainable model for confronting the ongoing challenges of embracing risk-taking and emotionally connecting with learners within the educational context. It allows exploration of the nuances of personality and how the uniqueness of self interacts with the role of the teacher; a sometimes uncomfortable, yet safe, place to sit and experience a virtual reality check questioning assumptions and the theories that the individual espouses to use.

  11. Japan. Human cloning ban allows some research.

    Science.gov (United States)

    Normile, D

    2000-12-08

    TOKYO--Japanese legislators last week approved a ban on human cloning that leaves room for the use of certain techniques in basic research. The action comes at the same time officials in two other countries--China and France--aired similar proposals that would prohibit so-called reproductive cloning while recognizing the possible importance of the technology in combating disease and improving human health.

  12. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

Full Text Available Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with criteria that are (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked.

  13. New Tools for Sea Ice Data Analysis and Visualization: NSIDC's Arctic Sea Ice News and Analysis

    Science.gov (United States)

    Vizcarra, N.; Stroeve, J.; Beam, K.; Beitler, J.; Brandt, M.; Kovarik, J.; Savoie, M. H.; Skaug, M.; Stafford, T.

    2017-12-01

    Arctic sea ice has long been recognized as a sensitive climate indicator and has undergone a dramatic decline over the past thirty years. Antarctic sea ice continues to be an intriguing and active field of research. The National Snow and Ice Data Center's Arctic Sea Ice News & Analysis (ASINA) offers researchers and the public a transparent view of sea ice data and analysis. We have released a new set of tools for sea ice analysis and visualization. In addition to Charctic, our interactive sea ice extent graph, the new Sea Ice Data and Analysis Tools page provides access to Arctic and Antarctic sea ice data organized in seven different data workbooks, updated daily or monthly. An interactive tool lets scientists, or the public, quickly compare changes in ice extent and location. Another tool allows users to map trends, anomalies, and means for user-defined time periods. Animations of September Arctic and Antarctic monthly average sea ice extent and concentration may also be accessed from this page. Our tools help the NSIDC scientists monitor and understand sea ice conditions in near real time. They also allow the public to easily interact with and explore sea ice data. Technical innovations in our data center helped NSIDC quickly build these tools and more easily maintain them. The tools were made publicly accessible to meet the desire from the public and members of the media to access the numbers and calculations that power our visualizations and analysis. This poster explores these tools and how other researchers, the media, and the general public are using them.
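The "trends, anomalies, and means" computations such tools expose reduce to elementary statistics on an extent time series. The September extent values below are invented for illustration and are not NSIDC Sea Ice Index data:

```python
# Invented September sea ice extent series, in millions of km^2.
years = list(range(2010, 2018))
extent = [4.9, 4.6, 3.6, 5.2, 5.3, 4.6, 4.7, 4.8]

# Mean over the user-defined period, and per-year anomalies relative to it.
mean = sum(extent) / len(extent)
anomalies = [e - mean for e in extent]

# Least-squares trend (millions of km^2 per year) via the closed-form slope.
n = len(years)
xm, ym = sum(years) / n, mean
slope = (sum((x - xm) * (y - ym) for x, y in zip(years, extent))
         / sum((x - xm) ** 2 for x in years))
print(round(mean, 2), round(slope, 3))
```

A mapping tool would apply the same anomaly and trend arithmetic per grid cell of a concentration field rather than to a single extent series.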

  14. An informatics supported web-based data annotation and query tool to expedite translational research for head and neck malignancies

    International Nuclear Information System (INIS)

    Amin, Waqas; Kang, Hyunseok P; Egloff, Ann Marie; Singh, Harpreet; Trent, Kerry; Ridge-Hetrick, Jennifer; Seethala, Raja R; Grandis, Jennifer; Parwani, Anil V

    2009-01-01

The Specialized Program of Research Excellence (SPORE) in Head and Neck Cancer neoplasm virtual biorepository is a bioinformatics-supported system that incorporates data from various clinical, pathological, and molecular systems into a single architecture based on a set of common data elements (CDEs) that provide semantic and syntactic interoperability of data sets. The components of this annotation tool include: the common data elements (CDEs), derived from the College of American Pathologists (CAP) checklist and North American Association of Central Cancer Registries (NAACCR) standards; the Data Entry Tool, a portable and flexible Oracle-based data entry device and an easily mastered web-based tool; and the Data Query Tool, which helps investigators and researchers search de-identified information within the warehouse/resource through a 'point and click' interface, enabling only the selected data elements to be copied into a data mart using a multidimensional model from the warehouse's relational structure. The SPORE Head and Neck Neoplasm Database contains multimodal datasets that are accessible to investigators via an easy-to-use query tool. The database currently holds 6553 cases and 10607 tumor accessions. Among these, there are 965 metastatic, 4227 primary, 1369 recurrent, and 483 new primary cases. Data disclosure is strictly regulated by the user's authorization. The SPORE Head and Neck Neoplasm Virtual Biorepository is a robust translational biomedical informatics tool that can facilitate basic science, clinical, and translational research. The Data Query Tool acts as a central source, providing a mechanism for researchers to efficiently find clinically annotated datasets and biospecimens relevant to their research areas. The tool protects patient privacy by revealing only de-identified data in accordance with regulations and the approvals of the IRB and scientific review committee.
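The "only selected data elements are copied into a data mart" step can be sketched as a filtered projection that refuses identified fields. All field names and records here are hypothetical, not the SPORE schema:

```python
# Invented warehouse records; 'mrn' and 'name' stand in for identified fields.
warehouse = [
    {"mrn": "12345", "name": "Doe, J.", "site": "larynx", "stage": "II"},
    {"mrn": "67890", "name": "Roe, R.", "site": "tongue", "stage": "III"},
]

IDENTIFIERS = {"mrn", "name"}  # never released to researchers

def build_mart(records, selected):
    """Copy only the selected, de-identified elements out of each record."""
    assert not (set(selected) & IDENTIFIERS), "identified fields are blocked"
    return [{field: rec[field] for field in selected} for rec in records]

mart = build_mart(warehouse, ["site", "stage"])
print(mart)  # [{'site': 'larynx', 'stage': 'II'}, {'site': 'tongue', 'stage': 'III'}]
```

A real implementation would enforce the block with access-control policy rather than an assertion, but the shape of the projection is the same.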

  15. Social networking site usage among childhood cancer survivors--a potential tool for research recruitment?

    Science.gov (United States)

    Seltzer, Erica D; Stolley, Melinda R; Mensah, Edward K; Sharp, Lisa K

    2014-09-01

The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace and their perceptions of using SNS for recruitment into survivorship research. Sixty White, Black, and Hispanic adult childhood cancer survivors (range 18-48 years of age) who were randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study, participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Seventy percent of participants reported SNS usage, of which 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for recruitment into research. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research.

  16. Social Networking Site Usage Among Childhood Cancer Survivors - A Potential Tool for Research Recruitment?

    Science.gov (United States)

    Seltzer, Erica D.; Stolley, Melinda R.; Mensah, Edward K.; Sharp, Lisa K.

    2014-01-01

Purpose The recent and rapid growth of social networking site (SNS) use presents a unique public health opportunity to develop effective strategies for the recruitment of hard-to-reach participants for cancer research studies. This survey investigated childhood cancer survivors' reported use of SNS such as Facebook or MySpace and their perceptions of using SNS for recruitment into survivorship research. Methods Sixty White, Black, and Hispanic adult childhood cancer survivors (range 18-48 years of age) who were randomly selected from a larger childhood cancer study, the Chicago Healthy Living Study (CHLS), participated in this pilot survey. Telephone surveys were conducted to understand current SNS activity and attitudes towards using SNS as a cancer research recruitment tool. Results Seventy percent of participants reported SNS usage, of which 80% were at least weekly users, and 79% reported positive attitudes towards the use of SNS as a recruitment tool for survivorship research. Conclusions and implications for cancer survivors The results of this pilot study revealed that SNS use was high and regular among the childhood cancer survivors sampled. Most had positive attitudes towards using SNS for recruitment into research. The results of this pilot survey suggest that SNS may offer an alternative approach for recruitment of childhood cancer survivors into research. PMID:24532046

  17. NASA System-Level Design, Analysis and Simulation Tools Research on NextGen

    Science.gov (United States)

    Bardina, Jorge

    2011-01-01

A review of the research accomplished in 2009 in the System-Level Design, Analysis and Simulation Tools (SLDAST) thrust of NASA's Airspace Systems Program is presented. This research thrust focuses on the integrated system-level assessment of component-level innovations, concepts and technologies of the Next Generation Air Traffic System (NextGen) under research in the ASP program to enable the development of revolutionary improvements and modernization of the National Airspace System. The review includes the accomplishments on baseline research and the advancements on design studies and system-level assessment, including the cluster analysis as an annualization standard of the air traffic in the U.S. National Airspace, and the ACES-Air MIDAS integration for human-in-the-loop analyses within the NAS air traffic simulation.

  18. A META-COMPOSITE SOFTWARE DEVELOPMENT APPROACH FOR TRANSLATIONAL RESEARCH

    Science.gov (United States)

    Sadasivam, Rajani S.; Tanik, Murat M.

    2013-01-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users’ needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements. PMID:23504436

  19. A meta-composite software development approach for translational research.

    Science.gov (United States)

    Sadasivam, Rajani S; Tanik, Murat M

    2013-06-01

    Translational researchers conduct research in a highly data-intensive and continuously changing environment and need to use multiple, disparate tools to achieve their goals. These researchers would greatly benefit from meta-composite software development or the ability to continuously compose and recompose tools together in response to their ever-changing needs. However, the available tools are largely disconnected, and current software approaches are inefficient and ineffective in their support for meta-composite software development. Building on the composite services development approach, the de facto standard for developing integrated software systems, we propose a concept-map and agent-based meta-composite software development approach. A crucial step in composite services development is the modeling of users' needs as processes, which can then be specified in an executable format for system composition. We have two key innovations. First, our approach allows researchers (who understand their needs best) instead of technicians to take a leadership role in the development of process models, reducing inefficiencies and errors. A second innovation is that our approach also allows for modeling of complex user interactions as part of the process, overcoming the technical limitations of current tools. We demonstrate the feasibility of our approach using a real-world translational research use case. We also present results of usability studies evaluating our approach for future refinements.

  20. What is WorldWide Telescope, and Why Should Researchers Care?

    Science.gov (United States)

    Goodman, Alyssa A.

    2016-01-01

As of 2015, about 20 million people have downloaded the computer program called "WorldWide Telescope," and even more have accessed it via the web, at http://worldwidetelescope.org. But, the vast majority of these millions are not professional astronomers. This talk will explain why WorldWide Telescope (WWT) is also a powerful tool for research astronomers. I will focus on how WWT can be, and is being, built into Journals and into day-to-day research environments. By way of example, I will show how WWT already: allows users to display images, including those in Journals, in the context of multi-wavelength full-sky imagery; allows for the display of which parts of the Sky have been studied, when, how, and for what reason (see http://adsass.org); allows, via right-click, immediate access to ADS, SIMBAD, and other professional research tools. I will also highlight new work, currently in development, that is using WWT as a tool for observation planning, and as a display mode for advanced high-dimensional data visualization tools, like glue (see http://glueviz.org). WWT is now well-known in the education community (see http://wwtambassadors.org), so the explicit goal of this talk will be to make researchers more aware of its full power. I will explain how WWT transitioned, over 8 years, from a Microsoft Research project to its current open-source state (see https://github.com/WorldWideTelescope), and I will conclude with comments on the future of WWT, and its relationship to how research should be carried out in the future (see http://tinyurl.com/aas-potf).

  1. Strategic Risk Assessment: A Decision Tool for Complex Decisions

    Energy Technology Data Exchange (ETDEWEB)

    Pollard, Simon; Duarte-Davidson, Raquel; Yearsley, Roger [Environment Agency, London (United Kingdom). National Centre for Risk Analysis and Options Appraisal; Kemp, Ray; Crawford, Mark [Galson Sciences Limited, Oakham (United Kingdom)

    2001-07-01

    Reporting on the state of the environment often requires policy makers and regulators to prioritise a range of diverse environmental issues for the purpose of directing future action on environmental protection and improvement. Information on environmental issues to inform this type of analysis can be disparate, it may be too voluminous or even absent. Data on a range of issues are rarely presented in a common format that allows easy comparison. Nevertheless, strategic judgements are required on the significance of impacts from various environmental pressures and on the inherent uncertainties. Prioritising issues forces a discussion among stakeholders of the relative significance of 'environmental harm' from pressures acting on various receptors in the environment. Discussions of this sort rapidly evolve into a discourse on risks and values. In an attempt to help systematise these discussions and provide practical tools for the analysis of environmental risks at a strategic level, the Environment Agency of England and Wales has initiated developmental research on strategic risk assessment. The tools developed under this research use the concept of 'environmental harm' as a common currency, viewed from technical, social and economic perspectives, to analyse impacts from a range of environmental pressures. Critical to an informed debate is an understanding and analysis both of the various characteristics of harm (spatial and temporal extent, reversibility, latency, etc.) and of the social response to the actual or potential environmental harm. Recent developments in this approach allow a presentation of the analysis in a structured fashion so as to better inform risk management decisions. Here, we present recent developments in the strategic risk assessment research tool, as tested by case studies from state of the environment reporting and the analysis of a regional environmental plan. We discuss its relative advantages and limitations and its

  2. Strategic Risk Assessment: A Decision Tool for Complex Decisions

    International Nuclear Information System (INIS)

    Pollard, Simon; Duarte-Davidson, Raquel; Yearsley, Roger

    2001-01-01

Reporting on the state of the environment often requires policy makers and regulators to prioritise a range of diverse environmental issues for the purpose of directing future action on environmental protection and improvement. Information on environmental issues to inform this type of analysis can be disparate, it may be too voluminous or even absent. Data on a range of issues are rarely presented in a common format that allows easy comparison. Nevertheless, strategic judgements are required on the significance of impacts from various environmental pressures and on the inherent uncertainties. Prioritising issues forces a discussion among stakeholders of the relative significance of 'environmental harm' from pressures acting on various receptors in the environment. Discussions of this sort rapidly evolve into a discourse on risks and values. In an attempt to help systematise these discussions and provide practical tools for the analysis of environmental risks at a strategic level, the Environment Agency of England and Wales has initiated developmental research on strategic risk assessment. The tools developed under this research use the concept of 'environmental harm' as a common currency, viewed from technical, social and economic perspectives, to analyse impacts from a range of environmental pressures. Critical to an informed debate is an understanding and analysis both of the various characteristics of harm (spatial and temporal extent, reversibility, latency, etc.) and of the social response to the actual or potential environmental harm. Recent developments in this approach allow a presentation of the analysis in a structured fashion so as to better inform risk management decisions. Here, we present recent developments in the strategic risk assessment research tool, as tested by case studies from state of the environment reporting and the analysis of a regional environmental plan. We discuss its relative advantages and limitations and its wider potential role.

  3. A Semantic Approach to Cross-Disciplinary Research Collaboration

    Directory of Open Access Journals (Sweden)

    Laurens De Vocht

    2012-11-01

    Full Text Available The latest developments in ICT, more specifically Social Media and Web 2.0 tools, facilitate the use of online services in research and education. This is also known as Research 2.0 and Technology Enhanced Learning. Web 2.0 tools are especially useful in cases where experts from different disciplines want to collaborate. We suggest an integrated method that embeds these services in research and learning processes, because it is a laborious task for researchers and learners to check and use all varying types of tools and services. We explain a flexible model that uses state-of-the-art semantic technologies to model both structured and unstructured research data. The research data is extracted from many online resources and Social Media. We implement learning objects as an abstraction of the semantically modeled research data. We propose an environment that improves the scientific research and learning process by allowing researchers to efficiently browse the information and concepts represented as learning objects.
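The core idea of modeling heterogeneous research data semantically and then exposing it as learning objects can be illustrated with a toy triple store. All resource names and predicates below are invented; a real implementation of the approach described would use RDF vocabularies and a SPARQL endpoint rather than Python sets.

```python
# Illustrative sketch (terms invented): heterogeneous research data modeled
# as subject-predicate-object triples, with a "learning object" derived by
# querying the graph, in the spirit of the semantic approach described.

triples = {
    ("paper:42", "dc:title", "Linked Research Data"),
    ("paper:42", "dc:creator", "author:dev"),
    ("paper:42", "ex:taggedWith", "topic:semantic-web"),
    ("slides:7", "ex:taggedWith", "topic:semantic-web"),
}

def query(s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

# A learning object for one topic = every resource tagged with that topic,
# regardless of whether it came from Social Media, a repository, or a journal.
learning_object = sorted(t[0] for t in query(p="ex:taggedWith", o="topic:semantic-web"))
```

Because every source is reduced to the same triple shape, structured and unstructured data can be browsed through one abstraction, which is the point the abstract makes about learning objects.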

  4. RSP Tooling Technology

    Energy Technology Data Exchange (ETDEWEB)

    None

    2001-11-20

RSP Tooling™ is a spray forming technology tailored for producing molds and dies. The approach combines rapid solidification processing and net-shape materials processing in a single step. The general concept involves converting a mold design described by a CAD file to a tooling master using a suitable rapid prototyping (RP) technology such as stereolithography. A pattern transfer is made to a castable ceramic, typically alumina or fused silica (Figure 1). This is followed by spray forming a thick deposit of a tooling alloy on the pattern to capture the desired shape, surface texture, and detail. The resultant metal block is cooled to room temperature and separated from the pattern. The deposit's exterior walls are machined square, allowing it to be used as an insert in a standard mold base. The overall turnaround time for tooling is about 3 to 5 days, starting with a master. Molds and dies produced in this way have been used in high-volume production runs in plastic injection molding and die casting. A Cooperative Research and Development Agreement (CRADA) between the Idaho National Engineering and Environmental Laboratory (INEEL) and Grupo Vitro has been established to evaluate the feasibility of using RSP Tooling technology for producing molds and dies of interest to Vitro. This report summarizes results from Phase I of this agreement, and describes work scope and budget for Phase II activities. The main objective in Phase I was to demonstrate the feasibility of applying the Rapid Solidification Process (RSP) Tooling method to produce molds for the manufacture of glass and other components of interest to Vitro. This objective was successfully achieved.

  5. Providing access to risk prediction tools via the HL7 XML-formatted risk web service.

    Science.gov (United States)

    Chipman, Jonathan; Drohan, Brian; Blackford, Amanda; Parmigiani, Giovanni; Hughes, Kevin; Bosinoff, Phil

    2013-07-01

Cancer risk prediction tools provide valuable information to clinicians but remain computationally challenging. Many clinics find that CaGene or HughesRiskApps fit their needs for easy- and ready-to-use software to obtain cancer risks; however, these resources may not fit all clinics' needs. The HughesRiskApps Group and BayesMendel Lab therefore developed a web service, called "Risk Service", which may be integrated into any client software to quickly obtain standardized and up-to-date risk predictions for BayesMendel tools (BRCAPRO, MMRpro, PancPRO, and MelaPRO), the Tyrer-Cuzick IBIS Breast Cancer Risk Evaluation Tool, and the Colorectal Cancer Risk Assessment Tool. Software clients that can convert their local structured data into the HL7 XML-formatted family and clinical patient history (Pedigree model) may integrate with the Risk Service. The Risk Service uses Apache Tomcat and Apache Axis2 technologies to provide an all-Java web service. The software client sends HL7 XML information containing anonymized family and clinical history to a Dana-Farber Cancer Institute (DFCI) server, where it is parsed, interpreted, and processed by multiple risk tools. The Risk Service then formats the results into an HL7-style message and returns the risk predictions to the originating software client. Upon consent, users may allow DFCI to maintain the data for future research. The Risk Service implementation is exemplified through HughesRiskApps. The Risk Service broadens the availability of valuable, up-to-date cancer risk tools and allows clinics and researchers to integrate risk prediction tools into their own software interface designed for their needs. Each software package can collect risk data using its own interface, and display the results using its own interface, while using a central, up-to-date risk calculator. This allows users to choose from multiple interfaces while always getting the latest risk calculations. Consenting users contribute their data for future
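A client exchange with such a service might look roughly like the sketch below. The element and attribute names here are invented stand-ins for the real HL7 v3 Pedigree message, the endpoint is not shown, and the server response is mocked locally rather than fetched from the DFCI server; the point is only the shape of the round trip: serialize anonymized family history to XML, then parse per-model risks out of the reply.

```python
# Sketch of a Risk Service exchange (tags and attributes are hypothetical;
# the real service uses HL7 v3 Pedigree messages). A client would POST
# anonymized family-history XML and parse the risks returned.
import xml.etree.ElementTree as ET

def build_request(relatives):
    root = ET.Element("pedigree")  # anonymized: no names or identifiers
    for rel in relatives:
        ET.SubElement(root, "relative",
                      relation=rel["relation"],
                      cancer=rel.get("cancer", "none"),
                      ageAtDx=str(rel.get("age_at_dx", "")))
    return ET.tostring(root, encoding="unicode")

def parse_response(xml_text):
    """Map each risk model's name to its predicted risk."""
    return {r.get("model"): float(r.get("value"))
            for r in ET.fromstring(xml_text).iter("risk")}

request = build_request([{"relation": "mother", "cancer": "breast", "age_at_dx": 45}])

# A response the server might return (model names real, values invented):
mock_response = ('<riskReport><risk model="BRCAPRO" value="0.12"/>'
                 '<risk model="MMRpro" value="0.03"/></riskReport>')
risks = parse_response(mock_response)
```

Keeping serialization and parsing in two small functions mirrors the design the abstract describes: each client keeps its own interface and only agrees with the server on the message format.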

  6. e-research: Changes and challenges in the use of digital tools in primary care research

    DEFF Research Database (Denmark)

    Bruun Larsen, Lars; Skonnord, Trygve; Gjelstad, Svein

in primary care research. Examples of this are online randomisation, electronic questionnaires, automatic email scheduling, mobile phone applications and data extraction tools. The amount of data can be increased at low cost, and this can help to reach adequate sample sizes. However, there are still...... challenges within the field. To secure a high response rate, you need to follow up manually or use another application. There are also practical and ethical problems, and data security for sensitive data has to be handled carefully. Session content Oral presentations about some technological

  7. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
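The capture pattern itself, wrapping an analysis step so that its inputs and outputs are logged automatically, can be sketched in a few lines of Python. The decorator and record fields below are invented simplifications; `recordr` and `matlab-dataone` record far richer, ProvONE-conformant metadata, but the "used / generated" structure is the same W3C PROV idea.

```python
# Minimal sketch of automatic provenance capture (function and field names
# invented). Each execution is logged as an activity that 'used' hashed
# input entities and 'generated' a hashed derived entity.
import hashlib
import json

TRACE = []  # stands in for a provenance store

def fingerprint(obj):
    """Stable short hash identifying a data object by content."""
    return hashlib.sha256(json.dumps(obj, sort_keys=True).encode()).hexdigest()[:12]

def record(func):
    """Decorator: run the step normally, but append a provenance record."""
    def wrapper(*args):
        result = func(*args)
        TRACE.append({
            "activity": func.__name__,
            "used": [fingerprint(a) for a in args],   # input entities
            "generated": fingerprint(result),         # derived entity
        })
        return result
    return wrapper

@record
def normalize(values):
    top = max(values)
    return [v / top for v in values]

derived = normalize([2, 4, 8])  # analysis proceeds unchanged; trace grows
```

Because the researcher's code is untouched apart from the decorator, this illustrates the "minimal changes to normal working procedures" claim in the abstract.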

  8. geneCBR: a translational tool for multiple-microarray analysis and integrative information retrieval for aiding diagnosis in cancer research

    Directory of Open Access Journals (Sweden)

    Fdez-Riverola Florentino

    2009-06-01

Full Text Available Abstract Background Bioinformatics and medical informatics are two research fields that serve the needs of different but related communities. Both domains share the common goal of providing new algorithms, methods and technological solutions to biomedical research, and contributing to the treatment and cure of diseases. Although different microarray techniques have been successfully used to investigate useful information for cancer diagnosis at the gene expression level, the true integration of existing methods into day-to-day clinical practice is still a long way off. Within this context, case-based reasoning emerges as a suitable paradigm especially intended for the development of biomedical informatics applications and decision support systems, given the support and collaboration involved in such a translational development. With the goals of removing barriers against multi-disciplinary collaboration and facilitating the dissemination and transfer of knowledge to real practice, case-based reasoning systems have the potential to be applied to translational research mainly because their computational reasoning paradigm is similar to the way clinicians gather, analyze and process information in their own practice of clinical medicine. Results In addressing the issue of bridging the existing gap between biomedical researchers and clinicians who work in the domain of cancer diagnosis, prognosis and treatment, we have developed and made accessible a common interactive framework. Our geneCBR system implements a freely available software tool that allows the use of combined techniques that can be applied to gene selection, clustering, knowledge extraction and prediction for aiding diagnosis in cancer research. For biomedical researchers, geneCBR expert mode offers a core workbench for designing and testing new techniques and experiments. For pathologists or oncologists, geneCBR diagnostic mode implements an effective and reliable system that can
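The "retrieve" step at the heart of any such case-based reasoning system can be shown with a toy example. The profiles, case names, and diagnoses below are invented, and real expression profiles have thousands of genes; geneCBR additionally performs gene selection and clustering before anything like this nearest-case lookup.

```python
# Toy sketch of the case-based reasoning idea behind a system like geneCBR:
# retrieve the stored case whose (tiny, made-up) expression profile is most
# similar to a new sample, and reuse its diagnosis.
import math

CASE_BASE = [
    {"id": "case-1", "profile": [0.9, 0.1, 0.8], "diagnosis": "subtype-A"},
    {"id": "case-2", "profile": [0.2, 0.9, 0.1], "diagnosis": "subtype-B"},
]

def distance(p, q):
    """Euclidean distance between two expression profiles."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def retrieve(profile):
    """The 'retrieve' step of CBR: return the most similar stored case."""
    return min(CASE_BASE, key=lambda c: distance(c["profile"], profile))

nearest = retrieve([0.8, 0.2, 0.7])
suggested = nearest["diagnosis"]
```

The appeal noted in the abstract is visible even at this scale: the reasoning step ("find the most similar past patient") matches how clinicians already think, so the system's suggestions are easy to inspect.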

  9. Latest Community Coordinated Modeling Center (CCMC) services and innovative tools supporting the space weather research and operational communities.

    Science.gov (United States)

    Mendoza, A. M. M.; Rastaetter, L.; Kuznetsova, M. M.; Mays, M. L.; Chulaki, A.; Shim, J. S.; MacNeice, P. J.; Taktakishvili, A.; Collado-Vega, Y. M.; Weigand, C.; Zheng, Y.; Mullinix, R.; Patel, K.; Pembroke, A. D.; Pulkkinen, A. A.; Boblitt, J. M.; Bakshi, S. S.; Tsui, T.

    2017-12-01

The Community Coordinated Modeling Center (CCMC), with the fundamental goal of aiding the transition of modern space science models into space weather forecasting while supporting space science research, has been serving as an integral hub for over 15 years, providing invaluable resources to both the space weather scientific and operational communities. CCMC has developed and provided innovative web-based point-of-access tools, ranging from the Runs-On-Request System, which provides unprecedented global access to the largest collection of state-of-the-art solar and space physics models; the Integrated Space Weather Analysis (iSWA) system, a powerful dissemination system for space weather information; advanced online visualization and analysis tools for more accurate interpretation of model results; and standard data formats for simulation data downloads; to mobile apps that let the scientific community view space weather data anywhere. In addition to supporting research and performing model evaluations, CCMC also supports space science education by hosting summer students through local universities. In this poster, we will showcase CCMC's latest innovative tools and services, including those that have revolutionized the way we do research and improved our operational space weather capabilities. CCMC's free tools and resources are all publicly available online (http://ccmc.gsfc.nasa.gov).

  10. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    Energy Technology Data Exchange (ETDEWEB)

    Magoulas, Virginia; Cercy, Michael [Savannah River National Laboratory, Savannah River Site, Aiken, SC 29808 (United States); Hall, Irin [Newport News Shipbuilding, 4101 Washington Ave., Newport News, VA 23607 (United States)

    2013-07-01

Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic-level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor-intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next-generation Planning Tool. The goal of this collaboration was to create a simulation-based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)
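The kind of strategic question such a tool answers, how long a candidate mission set takes and how heavily it loads a facility, can be caricatured in a few lines. The capacity figure, mission names, and `evaluate` helper below are all invented; the real Planning Tool is a full simulation capturing dependencies and activity flows, not a single throughput ratio.

```python
# Back-of-the-envelope sketch (all numbers and names invented) of scenario
# evaluation: given candidate missions and a fixed facility capacity, report
# each scenario's required duration and its utilization over a planning window.

CAPACITY_HOURS_PER_YEAR = 2000  # assumed single-facility processing capacity

def evaluate(scenario):
    demand = sum(m["hours"] for m in scenario["missions"])
    years_needed = demand / CAPACITY_HOURS_PER_YEAR
    utilization = min(1.0, demand / (scenario["window_years"] * CAPACITY_HOURS_PER_YEAR))
    return {"name": scenario["name"],
            "years_needed": years_needed,
            "utilization": utilization}

scenario = {
    "name": "baseline",
    "window_years": 2,
    "missions": [{"name": "mission-1", "hours": 1500},
                 {"name": "mission-2", "hours": 1000}],
}
report = evaluate(scenario)
```

Even this caricature shows why a simulation beats a paper schedule: scenarios can be re-evaluated instantly as missions are added or resized, which is the "quick evaluation of strategies" the abstract emphasizes.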

  11. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    International Nuclear Information System (INIS)

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-01-01

Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic-level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor-intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next-generation Planning Tool. The goal of this collaboration was to create a simulation-based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  12. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    Science.gov (United States)

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation, and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41) = 25.38). Conclusions The multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065

  13. Motor origins of tool use.

    Science.gov (United States)

    Kahrs, Björn A; Jung, Wendy P; Lockman, Jeffrey J

    2013-01-01

    The current study examines the developmental trajectory of banging movements and its implications for tool use development. Twenty (6- to 15-month-old) infants wore reflective markers while banging a handled cube; movements were recorded at 240 Hz. Results indicated that through the second half-year, banging movements undergo developmental changes making them ideally suited for instrumental hammering and pounding. Younger infants were inefficient and variable when banging the object: Their hands followed circuitous paths of great lengths at high velocities. By 1 year, infants showed consistent and efficient straight up-down hand trajectories of smaller magnitude and velocity, allowing for precise aiming and delivering dependable levels of force. The findings suggest that tool use develops gradually from infants' existing manual behaviors. © 2012 The Authors. Child Development © 2012 Society for Research in Child Development, Inc.

  14. What’s Ketso? A Tool for Researchers, Educators, and Practitioners

    Directory of Open Access Journals (Sweden)

    James S. Bates

    2016-06-01

    Researchers, educators, and practitioners utilize a range of tools and techniques to obtain data, input, feedback, and information from research participants, program learners, and stakeholders. Ketso is both an array of information gathering techniques and a toolkit (see www.ketso.com). It “can be used in any situation when people come together to share information, learn from each other, make decisions and plan actions” (Tippett & How, 2011, p. 4). The word ketso means “action” in the Sesotho language, spoken in the African nation of Lesotho where the concept for this instrument was conceived. Ketso techniques fall into the participatory action research family of social science research methods (Tippett, Handley, & Ravetz, 2007). Ohio State University Extension professionals have used the Ketso toolkit and its techniques in numerous settings, including professional development, community needs/interests assessments, brainstorming, and data collection. As a toolkit, Ketso uses tactile and colorful leaves, branches, and icons to organize and display participants’ contributions on felt mats. As an array of techniques, Ketso is effective in engaging audiences because it is inclusive and provides each participant a platform for sharing their perspective.

  15. DNA recovery from wild chimpanzee tools.

    Directory of Open Access Journals (Sweden)

    Fiona A Stewart

    Most of our knowledge of wild chimpanzee behaviour stems from fewer than 10 long-term field sites. This bias limits studies to a potentially unrepresentative set of communities known to show great behavioural diversity on small geographic scales. Here, we introduce a new genetic approach to bridge the gap between behavioural material evidence in unhabituated chimpanzees and genetic advances in the field of primatology. The use of DNA analyses has revolutionised the archaeological and primatological fields, whereby extraction of DNA from non-invasively collected samples allows researchers to reconstruct behaviour without ever directly observing individuals. We used commercially available forensic DNA kits to show that termite-fishing by wild chimpanzees (Pan troglodytes schweinfurthii) leaves behind detectable chimpanzee DNA evidence on tools. We then quantified the recovered DNA, compared the yield to that from faecal samples, and performed an initial assessment of mitochondrial and microsatellite markers to identify individuals. From 49 termite-fishing tools from the Issa Valley research site in western Tanzania, we recovered an average of 52 pg/μl chimpanzee DNA, compared to 376.2 pg/μl in faecal DNA extracts. Mitochondrial DNA haplotypes could be assigned to 41 of 49 tools (84%). Twenty-six tool DNA extracts yielded >25 pg/μl DNA and were selected for microsatellite analyses; genotypes were determined with confidence for 18 tools. These tools were used by a minimum of 11 individuals across the study period and termite mounds. These results demonstrate the utility of bio-molecular techniques and a primate archaeology approach in the non-invasive monitoring and behavioural reconstruction of unhabituated primate populations.

  16. Direct numerical control of machine tools in a nuclear research center by the CAMAC system

    International Nuclear Information System (INIS)

    Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.

    1977-01-01

    The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economical and flexible system for this purpose. The CAMAC system also facilitates the development of modular components that control the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments.

  17. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    Energy Technology Data Exchange (ETDEWEB)

    Mishra, P; Patankar, A; Etmektzoglou, A; Svatos, M [Varian Medical Systems, Palo Alto, CA (United States); Lewis, J [Brigham and Women’s Hospital, Boston, MA (United States)

    2015-06-15

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate the translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from a pre-existing DICOM-RT plan. The DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify its control points or add new ones. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are: (i) no software installation or maintenance needed, (ii) easy accessibility across all devices, (iii) seamless upgrades, and (iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.
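
The expected-vs-actual axis comparison the abstract describes can be sketched in a few lines. This is an illustrative sketch only, not Veritas code: the real TrueBeam trajectory logs are binary, so the sketch assumes the samples have already been decoded into (axis, expected, actual) tuples.

```python
# Illustrative sketch: summarize per-axis deviation from decoded trajectory
# log samples. Axis names and values below are invented for demonstration.
from collections import defaultdict

def max_deviation_per_axis(samples):
    """Return the largest |expected - actual| seen for each mechanical axis."""
    worst = defaultdict(float)
    for axis, expected, actual in samples:
        worst[axis] = max(worst[axis], abs(expected - actual))
    return dict(worst)

samples = [
    ("MLC_leaf_01", 10.0, 10.02),
    ("MLC_leaf_01", 12.0, 11.90),
    ("gantry", 90.0, 90.05),
]
print(max_deviation_per_axis(samples))
```

A real analyzer would plot the full time series per axis; reducing each axis to its worst-case deviation is the simplest QA summary of the same data.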

  18. TU-CD-304-11: Veritas 2.0: A Cloud-Based Tool to Facilitate Research and Innovation

    International Nuclear Information System (INIS)

    Mishra, P; Patankar, A; Etmektzoglou, A; Svatos, M; Lewis, J

    2015-01-01

    Purpose: We introduce Veritas 2.0, a cloud-based, non-clinical research portal, to facilitate the translation of radiotherapy research ideas into new delivery techniques. The ecosystem of research tools includes web apps for a research beam builder for TrueBeam Developer Mode, an image reader for compressed and uncompressed XIM files, and a trajectory-log-based QA/beam delivery analyzer. Methods: The research beam builder can generate a TrueBeam-readable XML file either from scratch or from a pre-existing DICOM-RT plan. The DICOM-RT plan is first converted to XML format, and the researcher can then interactively modify its control points or add new ones. The delivered beam can be verified by reading the generated images and analyzing the trajectory log files. The image reader can read both uncompressed and HND-compressed XIM images. The trajectory log analyzer lets researchers plot expected vs. actual values and deviations among 30 mechanical axes, and gives an animated view of MLC patterns for the beam delivery. Veritas 2.0 is freely available, and its advantages over standalone software are: (i) no software installation or maintenance needed, (ii) easy accessibility across all devices, (iii) seamless upgrades, and (iv) OS independence. Veritas is written using open-source tools such as Twitter Bootstrap, jQuery, Flask, and Python-based modules. Results: In the first experiment, an anonymized 7-beam DICOM-RT IMRT plan was converted to an XML beam containing 1400 control points. kV and MV imaging points were inserted into this XML beam. In another experiment, a binary log file was analyzed to compare actual vs. expected values and deviations among axes. Conclusions: Veritas 2.0 is a public cloud-based web app that hosts a pool of research tools for facilitating research from conceptualization to verification. It is aimed at providing a platform for facilitating research and collaboration. I am a full-time employee of Varian Medical Systems, Palo Alto.

  19. Qualitative and Quantitative Management Tools Used by Financial Officers in Public Research Universities

    Science.gov (United States)

    Trexler, Grant Lewis

    2012-01-01

    This dissertation set out to identify effective qualitative and quantitative management tools used by chief financial officers (CFOs) in carrying out their management functions of planning, decision making, organizing, staffing, communicating, motivating, leading and controlling at a public research university. In addition, impediments to the use of…

  20. An effective tool to manage the distribution of medicines and monitor the treatment in hospital pharmacies.

    Science.gov (United States)

    Franzoso, Gianpaolo

    2014-01-01

    Introduction The purpose of this article is to share a modus operandi and a tool that allows the recruitment and management of thousands of patients and their treatment, using simple software created by the author and made freely available to all colleague pharmacists. The author, a pharmacist, created this database because there were no tools on the market with all the features needed to manage the treatment of patients and the ordering of drugs to ensure continuity of care without waste of public money. Methods The software facilitates data collection and allows the monitoring of patients' treatment and their re-evaluation. The tool can create a table containing all the information needed to predict the demand for drugs and the timing of therapies and treatment plans. It is an effective instrument for calculating the optimal purchase of drugs and the delivery of therapies to patients. Conclusions This simple tool allows the management of many patients, reduces research time and facilitates the control of therapies. It allows us to optimize inventory and minimize the stock of drugs, and it allows the pharmacist to focus attention on the clinical management of the patient by helping him to follow therapy and respond to his needs.
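
The demand-prediction step described in this abstract reduces to simple arithmetic over active treatment plans. The sketch below is hypothetical — the field names, drugs, and regimens are invented, and the actual software is not reproduced here:

```python
# Hypothetical sketch of demand forecasting from active treatment plans,
# so drug orders match projected need over a planning horizon.
from collections import Counter

def forecast_demand(treatment_plans, horizon_days=30):
    """Units of each drug needed to cover all active plans for `horizon_days`."""
    demand = Counter()
    for plan in treatment_plans:
        days = min(horizon_days, plan["days_remaining"])
        demand[plan["drug"]] += plan["units_per_day"] * days
    return dict(demand)

plans = [
    {"drug": "imatinib 400 mg", "units_per_day": 1, "days_remaining": 90},
    {"drug": "imatinib 400 mg", "units_per_day": 1, "days_remaining": 10},
    {"drug": "filgrastim", "units_per_day": 2, "days_remaining": 5},
]
print(forecast_demand(plans))  # 40 units of imatinib, 10 of filgrastim
```

Capping each plan at the horizon is what keeps stock minimal: drugs for a patient whose plan ends in 10 days are ordered for 10 days, not 30.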

  1. The Development of a Communication Tool to Facilitate the Cancer Trial Recruitment Process and Increase Research Literacy among Underrepresented Populations.

    Science.gov (United States)

    Torres, Samantha; de la Riva, Erika E; Tom, Laura S; Clayman, Marla L; Taylor, Chirisse; Dong, Xinqi; Simon, Melissa A

    2015-12-01

    Despite increasing need to boost the recruitment of underrepresented populations into cancer trials and biobanking research, few tools exist for facilitating dialogue between researchers and potential research participants during the recruitment process. In this paper, we describe the initial processes of a user-centered design cycle to develop a standardized research communication tool prototype for enhancing research literacy among individuals from underrepresented populations considering enrollment in cancer research and biobanking studies. We present qualitative feedback and recommendations on the prototype's design and content from potential end users: five clinical trial recruiters and ten potential research participants recruited from an academic medical center. Participants were given the prototype (a set of laminated cards) and were asked to provide feedback about the tool's content, design elements, and word choices during semi-structured, in-person interviews. Results suggest that the prototype was well received by recruiters and patients alike. They favored the simplicity, lay language, and layout of the cards. They also noted areas for improvement, leading to card refinements that included the following: addressing additional topic areas, clarifying research processes, increasing the number of diverse images, and using alternative word choices. Our process for refining user interfaces and iterating content in early phases of design may inform future efforts to develop tools for use in clinical research or biobanking studies to increase research literacy.

  2. Justifying the design and selection of literacy and thinking tools

    Directory of Open Access Journals (Sweden)

    David Whitehead

    2008-10-01

    Criteria for the design and selection of literacy and thinking tools that allow educators to justify what they do are described within a wider framework of learning theory and research into best practice. Based on a meta-analysis of best practice, results from a three-year project designed to evaluate the effectiveness of a secondary school literacy initiative in New Zealand, together with recent research from cognitive and neuro-psychologists, it is argued that the design and selection of literacy and thinking tools used in elementary schools should be consistent with criteria that are (i) teaching focused, (ii) learner focused, (iii) thought linked, (iv) neurologically consistent, (v) subject specific, (vi) text linked, (vii) developmentally appropriate, and (viii) assessment linked.

  3. MoDOT pavement preservation research program volume IV, pavement evaluation tools-data collection methods.

    Science.gov (United States)

    2015-10-01

    The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools - Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to…

  4. Computational tool for simulation of power and refrigeration cycles

    Science.gov (United States)

    Córdoba Tuta, E.; Reyes Orozco, M.

    2016-07-01

    A small improvement in the thermal efficiency of power cycles brings huge cost savings in the production of electricity; for that reason, a tool for the simulation of power cycles makes it possible to model the optimal changes for best performance. There is also a big boom in research on the Organic Rankine Cycle (ORC), which aims to generate electricity at low power through cogeneration, in which the working fluid is usually a refrigerant. A tool to design the elements of an ORC cycle and select the working fluid would be helpful, because heat sources from cogeneration vary widely and each case calls for a custom design. This work presents the development of multiplatform software for the simulation of power and refrigeration cycles, implemented in the C++ language with a graphical interface developed using the multiplatform Qt environment; it runs on the Windows and Linux operating systems. The tool allows the design of custom power cycles and the selection of the type of fluid (thermodynamic properties are calculated through the CoolProp library), calculates the plant efficiency, identifies the flow fractions in each branch and, finally, generates a highly instructive report in PDF format via LaTeX.
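
The core efficiency calculation such a simulator performs can be illustrated with an ideal Rankine cycle. The sketch below is not taken from the software described, and the state-point enthalpies are illustrative round numbers rather than CoolProp output:

```python
# Minimal sketch of ideal Rankine cycle thermal efficiency from the four
# state-point specific enthalpies (kJ/kg). In a real tool these enthalpies
# would come from a property library such as CoolProp.

def rankine_efficiency(h1, h2, h3, h4):
    """h1: turbine inlet, h2: turbine outlet, h3: pump inlet, h4: pump outlet."""
    w_turbine = h1 - h2          # specific work extracted by the turbine
    w_pump = h4 - h3             # specific work consumed by the pump
    q_in = h1 - h4               # heat added in the boiler
    return (w_turbine - w_pump) / q_in

eta = rankine_efficiency(h1=3300.0, h2=2400.0, h3=190.0, h4=200.0)
print(f"{eta:.1%}")  # prints 28.7%
```

The same four-point bookkeeping generalizes to an ORC simply by evaluating the enthalpies for the chosen refrigerant instead of water.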

  5. Integrated Design Tools for Embedded Control Systems

    OpenAIRE

    Jovanovic, D.S.; Hilderink, G.H.; Broenink, Johannes F.; Karelse, F.

    2001-01-01

    Currently, computer-based control systems are still being implemented using the same techniques as 10 years ago. The purpose of this project is the development of a design framework, consisting of tools and libraries, which allows the designer to build highly reliable heterogeneous real-time embedded systems in a very short time at a fraction of present-day costs. The ultimate focus of current research is on transforming control laws into efficient concurrent algorithms, with concerns about...

  6. The Efficient Virtual Learning Environment: A Case study of Web 2.0 Tools and Windows Live Spaces

    Science.gov (United States)

    Uzunboylu, Huseyin; Bicen, Huseyin; Cavus, Nadire

    2011-01-01

    Technological developments have affected teachers' instructional techniques: technology has allowed the concept of education to be viewed from different perspectives. The aim of this research is to integrate Web 2.0 tools, which are sparsely found on the internet (each tool is on a different site), into education and see if it positively affects…

  7. Facebook Ethnography: The Poststructural Ontology of Transnational (Im)Migration Research

    Directory of Open Access Journals (Sweden)

    David Joseph Piacenti PhD

    2014-02-01

    This theoretical article discusses the creative utility of Facebook as a new ethnographic tool with which to study transnational (im)migration. Facebook ethnography allows the (im)migration researcher to transcend the four structural dualities that constrain transnational ethnographic research: (a) geographic constraints, (b) travel funding constraints, (c) travel time constraints, and (d) the logistical constraints of entrée into new ethnographic contexts. Facebook ethnography also allows the qualitative researcher to temporarily transcend the ontological structuralist dualities of traditional research methods, producing a new poststructural epistemological and ontological methodology.

  8. Using Web 2.0 tools to connect shore-based users to live science from the wide blue ocean

    Science.gov (United States)

    Cooper, S. K.; Peart, L.; Collins, J.

    2009-12-01

    The fast-expanding use of social networking tools, combined with improved connectivity available through satellite-provided internet on board the scientific ocean drilling vessel JOIDES Resolution (the JR), has allowed for a whole new kind of interaction. Unlike in the not-so-distant past, when non-participants were forced to wait for months to read about the results of ongoing research, web tools allow almost instantaneous participation in ship-based ocean science. Utilizing a brand new portal, joidesresolution.org, scientists and educators at sea can post daily blogs about their work and respond to questions and comments on those blogs, update the JR’s Facebook and Twitter pages, and post videos and photos to YouTube and Flickr regularly. Live video conferencing tools also allow for direct interaction with scientists and a view into the work being done on board in real time. These tools have allowed students, teachers and families, groups and individuals on shore to follow along with the expeditions of the ship and its exciting scientific explorations -- and become a part of them. Building this community provides a whole range of rich interactions and brings seafloor research and the real process of science to those who would never before have had access to it. This presentation will include an overview of the web portal and its associated social networking sites, as well as a discussion of the challenges and lessons learned over nearly a year of utilizing these new tools. The web portal joidesresolution.org home page.

  9. 40 CFR 35.2025 - Allowance and advance of allowance.

    Science.gov (United States)

    2010-07-01

    ... advance of allowance. (a) Allowance. Step 2+3 and Step 3 grant agreements will include an allowance for facilities planning and design of the project, and Step 7 agreements will include an allowance for facility...

  10. Final Report: Simulation Tools for Parallel Microwave Particle in Cell Modeling

    International Nuclear Information System (INIS)

    Stoltz, Peter H.

    2008-01-01

    Transport of high-power rf fields and the subsequent deposition of rf power into plasma is an important component of developing tokamak fusion energy. Two limitations on rf heating are: (i) breakdown of the metallic structures used to deliver rf power to the plasma, and (ii) a detailed understanding of how rf power couples into a plasma. Computer simulation is a main tool for helping solve both of these problems, but one of the premier tools, VORPAL, is traditionally too difficult to use for non-experts. During this Phase II project, we developed the VorpalView user interface tool. This tool allows Department of Energy researchers a fully graphical interface for analyzing VORPAL output to more easily model rf power delivery and deposition in plasmas.

  11. DDT: A Research Tool for Automatic Data Distribution in High Performance Fortran

    Directory of Open Access Journals (Sweden)

    Eduard Ayguadé

    1997-01-01

    This article describes the main features and implementation of our automatic data distribution research tool. The tool (DDT) accepts programs written in Fortran 77 and generates High Performance Fortran (HPF) directives to map arrays onto the memories of the processors and parallelize loops, as well as executable statements to remap these arrays. DDT works by identifying a set of computational phases (procedures and loops). The algorithm builds a search space of candidate solutions for these phases, which is explored looking for the combination that minimizes the overall cost; this cost includes data movement cost and computation cost. The movement cost reflects the cost of accessing remote data during the execution of a phase and the remapping costs that have to be paid in order to execute the phase with the selected mapping. The computation cost includes the cost of executing a phase in parallel according to the selected mapping and the owner-computes rule. The tool supports interprocedural analysis and uses control flow information to identify how phases are sequenced during the execution of the application.
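
The phase-by-phase cost minimization the abstract describes can be illustrated with a toy dynamic program. This is a simplified stand-in, not DDT's actual algorithm: each phase has candidate data mappings with computation costs, and switching mappings between consecutive phases incurs a remapping (data movement) cost.

```python
# Toy illustration of choosing one data mapping per phase so that
# computation cost plus remapping cost over the phase sequence is minimal.
# Mapping names and costs are invented for demonstration.

def best_mapping_sequence(phases, remap_cost):
    """phases: list of {mapping: computation_cost}; remap_cost(a, b) -> cost.
    Returns (total_cost, chosen mappings) via dynamic programming."""
    prev = {m: c for m, c in phases[0].items()}      # cheapest cost ending in m
    choice = {m: [m] for m in phases[0]}
    for phase in phases[1:]:
        cur, cur_choice = {}, {}
        for m, c in phase.items():
            best_prev = min(prev, key=lambda p: prev[p] + remap_cost(p, m))
            cur[m] = prev[best_prev] + remap_cost(best_prev, m) + c
            cur_choice[m] = choice[best_prev] + [m]
        prev, choice = cur, cur_choice
    best = min(prev, key=prev.get)
    return prev[best], choice[best]

phases = [{"block": 4, "cyclic": 6}, {"block": 9, "cyclic": 3}, {"block": 2, "cyclic": 8}]
cost, seq = best_mapping_sequence(phases, lambda a, b: 0 if a == b else 2)
print(cost, seq)  # 13 ['block', 'cyclic', 'block']
```

Note how the cheapest sequence pays two remappings: staying with one mapping throughout would cost 15 ("block") or 17 ("cyclic"), so the remapping cost is worth paying, which is exactly the trade-off DDT's search explores.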

  12. Virtual Globes and Glacier Research: Integrating research, collaboration, logistics, data archival, and outreach into a single tool

    Science.gov (United States)

    Nolan, M.

    2006-12-01

    Virtual Globes are a paradigm shift in the way earth sciences are conducted. With these tools, nearly all aspects of earth science can be integrated, from field science to remote sensing, remote collaboration, logistical planning, data archival/retrieval, PDF paper retrieval, and education and outreach. Here we present an example of how VGs can be fully exploited for field sciences, using research at McCall Glacier in Arctic Alaska.

  13. VRML and Collaborative Environments: New Tools for Networked Visualization

    Science.gov (United States)

    Crutcher, R. M.; Plante, R. L.; Rajlich, P.

    We present two new applications that engage the network as a tool for astronomical research and/or education. The first is a VRML server which allows users over the Web to interactively create three-dimensional visualizations of FITS images contained in the NCSA Astronomy Digital Image Library (ADIL). The server's Web interface allows users to select images from the ADIL, fill in processing parameters, and create renderings featuring isosurfaces, slices, contours, and annotations; the often extensive computations are carried out on an NCSA SGI supercomputer server without the user having an individual account on the system. The user can then download the 3D visualizations as VRML files, which may be rotated and manipulated locally on virtually any class of computer. The second application is the ADILBrowser, a part of the NCSA Horizon Image Data Browser Java package. ADILBrowser allows a group of participants to browse images from the ADIL within a collaborative session. The collaborative environment is provided by the NCSA Habanero package which includes text and audio chat tools and a white board. The ADILBrowser is just an example of a collaborative tool that can be built with the Horizon and Habanero packages. The classes provided by these packages can be assembled to create custom collaborative applications that visualize data either from local disk or from anywhere on the network.

  14. Proposal of a trigger tool to assess adverse events in dental care.

    Science.gov (United States)

    Corrêa, Claudia Dolores Trierweiler Sampaio de Oliveira; Mendes, Walter

    2017-11-21

    The aim of this study was to propose a trigger tool for research on adverse events in outpatient dentistry in Brazil. The tool was elaborated in two stages: (i) to build a preliminary set of triggers, a literature review was conducted to identify the composition of trigger tools used in other areas of health care and the principal adverse events found in dentistry; (ii) to validate the preliminarily constructed triggers, a panel of experts was organized using the modified Delphi method. Fourteen triggers were elaborated in a tool with explicit criteria to identify potential adverse events in dental care, essential for retrospective patient chart reviews. Studies on patient safety in dental care are still incipient when compared to other areas of health care. This study intended to contribute to research in this field. The contributions from the literature and guidance from the expert panel allowed the elaboration of a set of triggers to detect adverse events in dental care, but additional studies are needed to test the instrument's validity.

  15. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    Directory of Open Access Journals (Sweden)

    Lei Wang

    2013-11-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high-resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging data (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive data (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical data (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic data (20 polymorphisms), collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data, along with the associated meta-data and computational tools, publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as the lack of local organization and standard descriptions.

  16. A methodology and decision support tool for informing state-level bioenergy policymaking: New Jersey biofuels as a case study

    Science.gov (United States)

    Brennan-Tonetta, Margaret

    This dissertation seeks to provide key information and a decision support tool that states can use to support long-term goals of fossil fuel displacement and greenhouse gas reductions. The research yields three outcomes: (1) A methodology that allows for a comprehensive and consistent inventory and assessment of bioenergy feedstocks in terms of type, quantity, and energy potential. Development of a standardized methodology for consistent inventorying of biomass resources fosters research and business development of promising technologies that are compatible with the state's biomass resource base. (2) A unique interactive decision support tool that allows for systematic bioenergy analysis and evaluation of policy alternatives through the generation of biomass inventory and energy potential data for a wide variety of feedstocks and applicable technologies, using New Jersey as a case study. Development of a database that can assess the major components of a bioenergy system in one tool allows for easy evaluation of technology, feedstock and policy options. The methodology and decision support tool are applicable to other states and regions (with location-specific modifications), thus contributing to the achievement of state and federal goals of renewable energy utilization. (3) Development of policy recommendations, based on the results of the decision support tool, that will help to guide New Jersey into a sustainable renewable energy future. The database developed in this research represents the first-ever assessment of bioenergy potential for New Jersey. It can serve as a foundation for future research and modifications that could increase its power as a more robust policy analysis tool. As such, the current database is not able to perform analyses of tradeoffs across broad policy objectives, such as economic development vs. CO2 emissions, or energy independence vs. source reduction of solid waste. Instead, it operates one level below that, with comparisons of kWh or
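
The inventory arithmetic underlying such a decision support tool can be sketched as follows. The feedstock names, tonnages, and energy contents (GJ per dry tonne) below are placeholders, not New Jersey data:

```python
# Sketch of the core inventory-to-energy calculation: aggregate feedstock
# tonnage and convert it to total energy potential. All values are invented.

def energy_potential(feedstocks):
    """Total energy potential in GJ across all feedstocks."""
    return sum(f["dry_tonnes"] * f["gj_per_tonne"] for f in feedstocks)

inventory = [
    {"name": "food waste",     "dry_tonnes": 120_000, "gj_per_tonne": 15.0},
    {"name": "forest residue", "dry_tonnes": 40_000,  "gj_per_tonne": 19.0},
    {"name": "switchgrass",    "dry_tonnes": 25_000,  "gj_per_tonne": 17.4},
]
print(f"{energy_potential(inventory):,.0f} GJ")
```

Keeping quantity and energy content as separate fields is what makes the inventory reusable: a new conversion technology changes only the per-tonne figures, not the tonnage survey.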

  17. 30 CFR 206.157 - Determination of transportation allowances.

    Science.gov (United States)

    2010-07-01

    ... a series of outgoing pipelines; (5) Gas Research Institute (GRI) fees. The GRI conducts research... industry and gas customers. GRI fees are allowable provided such fees are mandatory in FERC-approved...

  18. Assessing Clinical Trial-Associated Workload in Community-Based Research Programs Using the ASCO Clinical Trial Workload Assessment Tool.

    Science.gov (United States)

    Good, Marjorie J; Hurley, Patricia; Woo, Kaitlin M; Szczepanek, Connie; Stewart, Teresa; Robert, Nicholas; Lyss, Alan; Gönen, Mithat; Lilenbaum, Rogerio

    2016-05-01

    Clinical research program managers are regularly faced with the quandary of determining how much of a workload research staff members can manage while they balance clinical practice and still achieve clinical trial accrual goals, maintain data quality and protocol compliance, and stay within budget. A tool was developed to measure clinical trial-associated workload, to apply objective metrics toward documentation of work, and to provide clearer insight to better meet clinical research program challenges and aid in balancing staff workloads. A project was conducted to assess the feasibility and utility of using this tool in diverse research settings. Community-based research programs were recruited to collect and enter clinical trial-associated monthly workload data into a web-based tool for 6 consecutive months. Descriptive statistics were computed for self-reported program characteristics and workload data, including staff acuity scores and number of patient encounters. Fifty-one research programs, representing 30 states, participated. Median staff acuity scores were highest for staff with patients enrolled in studies and receiving treatment, relative to staff with patients in follow-up status. Treatment trials typically resulted in higher median staff acuity, relative to cancer control, observational/registry, and prevention trials. Industry trials exhibited higher median staff acuity scores than trials sponsored by the National Institutes of Health/National Cancer Institute, academic institutions, or others. The results from this project demonstrate that trial-specific acuity measurement is a better measure of workload than simply counting the number of patients. The tool was shown to be feasible and usable in diverse community-based research settings. Copyright © 2016 by American Society of Clinical Oncology.
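
The idea of acuity-weighted workload, as opposed to raw patient counts, can be illustrated with a hypothetical weighting scheme. The weights and status categories below are invented for illustration and are not the ASCO tool's actual values:

```python
# Hypothetical illustration of trial-specific acuity scoring: weight each
# patient encounter by status, since on-treatment patients demand more staff
# effort than patients in follow-up.

ACUITY_WEIGHTS = {"treatment": 3.0, "active_followup": 1.5, "long_term_followup": 0.5}

def staff_acuity(encounters):
    """encounters: mapping of patient status -> number of encounters that month."""
    return sum(ACUITY_WEIGHTS[status] * n for status, n in encounters.items())

month = {"treatment": 6, "active_followup": 4, "long_term_followup": 10}
print(staff_acuity(month))  # 6*3.0 + 4*1.5 + 10*0.5 = 29.0
```

A raw patient count would rate this coordinator's month at 20; the weighted score shows most of the effort comes from the 6 on-treatment patients, which is the distinction the study found matters.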

  19. Research and technology management in the electricity industry methods, tools and case studies

    CERN Document Server

    Daim, Tugrul; Kim, Jisun

    2013-01-01

    Technologies such as renewable energy alternatives, including wind, solar and biomass, storage technologies and electric engines, are creating a different landscape for the electricity industry. Drawing on sources and ideas from these technologies, Research and Technology Management in the Electricity Industry explores this changing landscape and applies its methods to the electric industry, supported by real industry cases. Divided into three sections, Research and Technology Management in the Electricity Industry introduces a range of methods and tools includ…

  20. State Health Mapper: An Interactive, Web-Based Tool for Physician Workforce Planning, Recruitment, and Health Services Research.

    Science.gov (United States)

    Krause, Denise D

    2015-11-01

    Health rankings in Mississippi are abysmal. Mississippi also has fewer physicians to serve its population compared with all other states. Many residents of this predominately rural state do not have access to healthcare providers. To better understand the demographics and distribution of the current health workforce in Mississippi, the main objective of the study was to design a Web-based, spatial, interactive application to visualize and explore the physician workforce. A Web application was designed to assist in health workforce planning. Secondary datasets of licensure and population information were obtained, and live feeds from licensure systems are being established. Several technologies were used to develop an intuitive, user-friendly application. Custom programming was completed in JavaScript so the application could run on most platforms, including mobile devices. The application allows users to identify and query geographic locations of individual or aggregated physicians based on attributes included in the licensure data, to perform drive time or buffer analyses, and to explore sociodemographic population data by geographic area of choice. This Web-based application with analytical tools visually represents the physician workforce licensed in Mississippi and its attributes, and provides access to much-needed information for statewide health workforce planning and research. The success of the application is not only based on the practicality of the tool but also on its ease of use. Feedback has been positive and has come from a wide variety of organizations across the state.

  1. Structure and software tools of AIDA.

    Science.gov (United States)

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development and easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill their needs and program other subsystems not developed with AIDA. The AIDA software uses as host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates terminal-independently and is even, to a great extent, multi-lingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible, code from these table definitions.
By separating the AIDA software in a source and a run-time version, one is able to write
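    The record above describes a relational layer built on top of MUMPS hierarchical globals. A rough Python sketch of that idea follows; all names and structures are invented for illustration (real AIDA/MUMPS code looks nothing like this). Flat relational tuples are stored under sorted hierarchical keys, much as a global `^TABLE(key,field)=value` would hold them:

    ```python
    # Hypothetical sketch: a relational insert/select layer over a
    # hierarchical key-value store, analogous to a MUMPS global
    # ^PATIENT(id, field) = value. Names are invented for illustration.

    globals_store = {}   # stands in for the MUMPS global structure

    def rel_insert(table, key, row):
        # Each field of the relational row becomes one hierarchical node.
        for field, value in row.items():
            globals_store[(table, key, field)] = value

    def rel_select(table, key):
        # Reassemble the flat row from the sorted hierarchical keys.
        return {f: v for (t, k, f), v in sorted(globals_store.items())
                if t == table and k == key}

    rel_insert("PATIENT", 42, {"name": "Doe", "dob": "1950-01-01"})
    print(rel_select("PATIENT", 42))
    ```

    The sorted-key iteration mirrors the abstract's remark that the global structure is especially powerful for sorting.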

  2. A Tool and Application Programming Interface for Browsing Historical Geostationary Satellite Data

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Minnis, P.; Spangenberg, D.; Ayers, J.

    2013-12-01

    Providing access to information is a key concern for NASA Langley Research Center. We describe a tool and method that allows end users to easily browse and access information that is otherwise difficult to acquire and manipulate. The tool described has as its core the application-programming interface that is made available to the public. One goal of the tool is to provide a demonstration to end users so that they can use the enhanced imagery as an input into their own work flows. This project builds upon NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite imagery accessible and easily searchable. As we see the increasing use of virtual supply chains that provide additional value at each link there is value in making satellite imagery available through a simple access method as well as allowing users to browse and view that imagery as they need rather than in a manner most convenient for the data provider.

  3. Spent fuel treatment to allow storage in air

    International Nuclear Information System (INIS)

    Williams, K.L.

    1988-01-01

    During Fiscal Year 1987 (FY-87), research began at the Idaho National Engineering Laboratory (INEL) to develop a treatment material and process to coat fuel rods in commercial spent fuel assemblies to allow the assemblies to be stored in hot (up to 380 °C) air without oxidation of the fuel. This research was conducted under a research and development fund provided by the U.S. Department of Energy (DOE) and independently administered by EG and G Idaho, Inc., DOE's prime contractor at the INEL. The objectives of the research were to identify and evaluate possible treatment processes and materials, identify areas of uncertainty, and to recommend the most likely candidate to allow spent fuel dry storage in hot air. The results of the research are described: results were promising and several good candidates were identified, but further research is needed to examine the candidates to the point where comparison is possible.

  4. Science in the Eyes of Preschool Children: Findings from an Innovative Research Tool

    Science.gov (United States)

    Dubosarsky, Mia D.

    How do young children view science? Do these views reflect cultural stereotypes? When do these views develop? These fundamental questions in the field of science education have rarely been studied with the population of preschool children. One main reason is the lack of an appropriate research instrument that addresses preschool children's developmental competencies. An extensive body of research has pointed to the significance of early childhood experiences in developing positive attitudes and interests toward learning in general and the learning of science in particular. Theoretical and empirical research suggests that stereotypical views of science may be replaced by authentic views following inquiry science experience. However, no preschool science intervention program could be designed without a reliable instrument that provides baseline information about preschool children's current views of science. The current study presents preschool children's views of science as gathered from a pioneering research tool. This tool, in the form of a computer "game," does not require reading, writing, or expressive language skills and is operated by the children. The program engages children in several simple tasks involving picture recognition and yes/no answers in order to reveal their views about science. The study was conducted with 120 preschool children in two phases and found that by the age of 4 years, participants possess an emergent concept of science. Gender and school differences were detected. Findings from this interdisciplinary study will contribute to the fields of early childhood, science education, learning technologies, program evaluation, and early childhood curriculum development.

  5. I-RREACH: an engagement and assessment tool for improving implementation readiness of researchers, organizations and communities in complex interventions.

    Science.gov (United States)

    Maar, Marion; Yeates, Karen; Barron, Marcia; Hua, Diane; Liu, Peter; Moy Lum-Kwong, Margaret; Perkins, Nancy; Sleeth, Jessica; Tobe, Joshua; Wabano, Mary Jo; Williamson, Pamela; Tobe, Sheldon W

    2015-05-04

    Non-communicable chronic diseases are the leading causes of mortality globally, and nearly 80% of these deaths occur in low- and middle-income countries (LMICs). In high-income countries (HICs), inequitable distribution of resources affects poorer and otherwise disadvantaged groups, including Aboriginal peoples. Cardiovascular mortality in HICs has recently begun to fall; however, these improvements are not realized among citizens of LMICs or those subgroups in HICs who are disadvantaged in the social determinants of health, including Aboriginal people. It is critical to develop multi-faceted, affordable and realistic health interventions in collaboration with groups who experience health inequalities. Based on community-based participatory research (CBPR), we aimed to develop implementation tools to guide complex interventions to ensure that health gains can be realized in low-resource environments. We developed the I-RREACH (Intervention and Research Readiness Engagement and Assessment of Community Health Care) tool to guide implementation of interventions in low-resource environments. We employed CBPR and a consensus methodology to (1) develop the theoretical basis of the tool and (2) identify key implementation factor domains; we then (3) collected participant evaluation data to validate the tool during implementation. The I-RREACH tool was successfully developed using a community-based consensus method and is rooted in participatory principles, equalizing the importance of the knowledge and perspectives of researchers and community stakeholders while encouraging respectful dialogue. The I-RREACH tool consists of three phases: fact finding, stakeholder dialogue and community member/patient dialogue. The evaluation of our first implementation of I-RREACH by participants was overwhelmingly positive, with 95% or more of participants indicating comfort with and support for the process and the dialogue it creates. The I

  6. Handbook of Research on Technology Tools for Real-World Skill Development (2 Volumes)

    Science.gov (United States)

    Rosen, Yigel, Ed.; Ferrara, Steve, Ed.; Mosharraf, Maryam, Ed.

    2016-01-01

    Education is expanding to include a stronger focus on the practical application of classroom lessons in an effort to prepare the next generation of scholars for a changing world economy centered on collaborative and problem-solving skills for the digital age. "The Handbook of Research on Technology Tools for Real-World Skill Development"…

  7. The Virtual Museum for Meteorites: an Online Tool for Researchers Educators and Students

    Science.gov (United States)

    Madiedo, J. M.

    2013-09-01

    The Virtual Museum for Meteorites (Figure 1) was created as a tool for students, educators and researchers [1, 2]. One of the aims of this online resource is to promote interest in meteorites. Thus, the role of meteorites in education and outreach is fundamental, as these are very valuable tools to promote the public's interest in Astronomy and Planetary Sciences. Meteorite exhibitions reveal the fascination of students, educators and even researchers for these extraterrestrial rocks and how these can explain many key questions related to the origin and evolution of our Solar System. However, despite the efforts of private collectors, museums and other institutions to organize meteorite exhibitions, the reach of these is usually limited. The Virtual Museum for Meteorites takes advantage of HTML and related technologies to overcome local boundaries and offer its contents to a global audience. A description of the recent developments performed in the framework of this virtual museum is given in this work.

  8. Facebook as a recruitment tool for adolescent health research: a systematic review.

    Science.gov (United States)

    Amon, Krestina L; Campbell, Andrew J; Hawke, Catherine; Steinbeck, Katharine

    2014-01-01

    Researchers are increasingly using social media to recruit participants to surveys and clinical studies. However, the evidence of the efficacy and validity of adolescent recruitment through Facebook is yet to be established. To conduct a systematic review of the literature on the use of Facebook to recruit adolescents for health research. Nine electronic databases and reference lists were searched for articles published between 2004 and 2013. Studies were included in the review if: 1) participants were aged ≥ 10 to ≤ 18 years, 2) studies addressed a physical or mental health issue, 3) Facebook was identified as a recruitment tool, 4) recruitment details using Facebook were outlined in the methods section and considered in the discussion, or information was obtained by contacting the authors, 5) results revealed how many participants were recruited using Facebook, and 6) studies addressed how adolescent consent and/or parental consent was obtained. Titles, abstracts, and keywords were scanned and duplicates removed by 2 reviewers. Full text was evaluated for inclusion criteria, and 2 reviewers independently extracted data. The search resulted in 587 publications, of which 25 full-text papers were analyzed. Six studies met all the criteria for inclusion in the review. Three recruitment methods using Facebook were identified: 1) paid Facebook advertising, 2) use of the Facebook search tool, and 3) creation and use of a Facebook Page. Eligible studies described the use of paid Facebook advertising and Facebook as a search tool as methods to successfully recruit adolescent participants. Online and verbal consent was obtained from participants recruited from Facebook. Copyright © 2014 Academic Pediatric Association. Published by Elsevier Inc. All rights reserved.

  9. Building Bridges: The Use of Reflective Oral Diaries as a Qualitative Research Tool

    Science.gov (United States)

    Hewitt, Elizabeth

    2017-01-01

    The article is a reflection on the use of an oral diary as a qualitative research tool, the role that it played during fieldwork and the methodological issues that emerged. It draws on a small-scale empirical study into primary school teachers' use of group discussion, during which oral diaries were used to explore and document teacher reflective…

  10. Using a web-based survey tool to undertake a Delphi study: application for nurse education research.

    Science.gov (United States)

    Gill, Fenella J; Leslie, Gavin D; Grech, Carol; Latour, Jos M

    2013-11-01

    The Internet is increasingly being used as a data collection medium to access research participants. This paper reports on the experience and value of using web-survey software to conduct an eDelphi study to develop Australian critical care course graduate practice standards. The eDelphi technique used involved the iterative process of administering three rounds of surveys to a national expert panel. The survey was developed online using SurveyMonkey. Panel members responded to statements using one rating scale for round one and two scales for rounds two and three. Text boxes for panel comments were provided. For each round, the SurveyMonkey's email tool was used to distribute an individualized email invitation containing the survey web link. The distribution of panel responses, individual responses and a summary of comments were emailed to panel members. Stacked bar charts representing the distribution of responses were generated using the SurveyMonkey software. Panel response rates remained greater than 85% over all rounds. An online survey provided numerous advantages over traditional survey approaches including high quality data collection, ease and speed of survey administration, direct communication with the panel and rapid collation of feedback allowing data collection to be undertaken in 12 weeks. Only minor challenges were experienced using the technology. Ethical issues, specific to using the Internet to conduct research and external hosting of web-based software, lacked formal guidance. High response rates and an increased level of data quality were achieved in this study using web-survey software and the process was efficient and user-friendly. However, when considering online survey software, it is important to match the research design with the computer capabilities of participants and recognize that ethical review guidelines and processes have not yet kept pace with online research practices. Copyright © 2013 Elsevier Ltd. All rights reserved.
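    The round-by-round feedback loop described above, in which the distribution of panel ratings is summarized and returned to the panel, can be sketched in Python. The rating labels and the 70% consensus threshold below are invented for illustration, not taken from the study:

    ```python
    # Sketch of a Delphi round summary: rating distribution, modal rating,
    # and whether an (assumed) 70% agreement threshold was reached.
    from collections import Counter

    def round_summary(ratings, threshold=0.7):
        counts = Counter(ratings)
        modal, n = counts.most_common(1)[0]
        return {"distribution": dict(counts),
                "modal_rating": modal,
                "consensus": n / len(ratings) >= threshold}

    panel = ["essential", "essential", "useful", "essential", "essential"]
    print(round_summary(panel))
    ```

    In practice each statement on the survey would get its own summary, echoing the stacked bar charts the authors generated in SurveyMonkey.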

  11. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and maximize equipment production efficiency. This paper is based on SEMI standards in semiconductor equipment control, defines the transition rules between different tool states, and presents a TEA system model that analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified successfully; the parameter values used to measure equipment performance were obtained, along with advice for improvement.
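    A finite-state tool model of the kind the abstract describes might look like the following Python sketch. The state names and transition rules are illustrative (loosely modeled on SEMI E10 equipment states), not the paper's actual TEA definitions:

    ```python
    # Illustrative tool-state machine: states and allowed transitions are
    # assumptions, not the TEA model's actual rules.

    ALLOWED = {
        "idle":             {"productive", "scheduled_down"},
        "productive":       {"idle", "unscheduled_down"},
        "scheduled_down":   {"idle"},
        "unscheduled_down": {"idle"},
    }

    class ToolStateMachine:
        def __init__(self, state="idle"):
            self.state = state
            self.history = [state]

        def transition(self, new_state):
            if new_state not in ALLOWED.get(self.state, set()):
                raise ValueError(f"illegal transition {self.state} -> {new_state}")
            self.state = new_state
            self.history.append(new_state)

        def utilization(self):
            # Fraction of recorded states spent productive: a crude
            # stand-in for the performance parameters a TEA system derives.
            return self.history.count("productive") / len(self.history)

    tool = ToolStateMachine()
    tool.transition("productive")
    tool.transition("idle")
    print(tool.utilization())
    ```

    A real system would time-stamp each transition and compute utilization from elapsed time per state rather than from state counts.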

  12. Research report on design allowable values of structural materials for LMFBR

    International Nuclear Information System (INIS)

    1978-11-01

    The present report is composed of the following two main parts: i) review and re-evaluation of test results from the FCI Sub-committee studies, performed from 1973 to 1976, and ii) review of procedures for determining design allowable values of structural materials for LMFBR components. Re-evaluation work has been done on monotonic tensile properties at elevated temperatures, creep and creep rupture properties, and creep-fatigue properties (strain rate and tensile strain hold time effects on strain fatigue properties at elevated temperatures) of Types 316 and 304 stainless steel and 2 1/4Cr-1Mo steel (base and weld metals) produced in Japan. In the first half of the present report, creep-fatigue test results obtained by the FCI Sub-committee studies are re-evaluated by the present P-FCI Sub-committee. Reviews have been made of the testing methods in FCI's creep-fatigue experiments together with other data on the test materials: high-temperature monotonic tensile data, creep and creep rupture data, and the origin of the test materials. The data of the FCI studies are compared with other reference data obtained by several Japanese laboratories. In the latter half of the present report, procedures, including ASME's, are reviewed for setting design allowable values for LMFBR components on the basis of high-temperature strength properties obtained with materials produced in Japan. Creep rupture data of Japanese steels are compiled and examined to propose a design allowable stress S_t through a parameter survey. (author)

  13. Making Your Tools Useful to a Broader Audience

    Science.gov (United States)

    Lyness, M. D.; Broten, M. J.

    2006-12-01

    With the increasing growth of Web Services and SOAP, the ability to connect to and reuse computational and visualization tools from all over the world, via Web interfaces that can be easily displayed in any current browser, has provided the means to construct an ideal online research environment. The age-old question of usability is a major factor in determining whether a particular tool will find great success in its community. An interface that can be understood purely by a user's intuition is desirable and more closely obtainable than ever before. Through the use of increasingly sophisticated web-oriented technologies, including JavaScript, AJAX, and the DOM, web interfaces are able to harness the advantages of the Internet along with the functional capabilities of native applications such as menus, partial page changes, background processing, and visual effects, to name a few. Also, with computers becoming a normal part of the educational process, companies such as Google and Microsoft give us a synthetic intuition as a foundation for new designs. Understanding the way earth science researchers know how to use computers will allow the VLab portal (http://vlab.msi.umn.edu) and other projects to create interfaces that will get used. To provide detailed communication with the users of VLab's computational tools, projects like the Porky Portlet (http://www.gorerle.com/vlab-wiki/index.php?title=Porky_Portlet) were spawned to empower users with a fully detailed, interactive visual representation of progressing workflows. With the well-thought-out design of such tools and interfaces, researchers around the world will become accustomed to new, highly engaging, visual web-based research environments.

  14. How Design-based Research, Action Research and Interaction Design Contributes to the Development of Designs for Learning

    DEFF Research Database (Denmark)

    Majgaard, Gunver; Misfeldt, Morten; Nielsen, Jacob

    2011-01-01

    This article explores how action research, design-based research and interaction design can be combined and used in the development of educational robotic tools. Our case study is the development of Number Blocks, which combines physical interaction, learning, and immediate feedback. Number Blocks supports the children's understanding of place value in the sense that it allows them to experiment with creating large numbers. The development was done in collaboration with a class of 7-8 year old children and their mathematics teacher. The article argues that elements from different research methods allowed a structured approach to projects that combine educational research and innovation of new learning technologies. Key elements of this approach are acknowledging the users' input, developing a theoretical pre-analysis and using an iterative approach.

  15. Development of dosimetry tools for proton therapy research

    International Nuclear Information System (INIS)

    Kim, Jong-Won; Kim, Dogyun

    2010-01-01

    Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.

  16. DEVELOPMENT OF AN ENVIRONMENTAL RATING TOOL FOR BUILDINGS THROUGH A NEW KIND OF DIALOGUE BETWEEN STAKEHOLDERS AND RESEARCHERS

    Directory of Open Access Journals (Sweden)

    Mauritz Glaumann

    2009-03-01

    Full Text Available Buildings need to be more environmentally benign, since the building sector is responsible for about 40% of all energy and material use in Sweden. For this reason a unique cooperation between companies, municipalities and the Government, called "Building-Living and Property Management for the Future" ("The Building Living Dialogue" for short), has been going on since 2003. The project focuses on: (a) a healthy indoor environment, (b) efficient use of energy, and (c) efficient resource management. In accordance with the dialogue targets, two research projects were initiated aiming at developing an environmental rating tool that takes into account both building-sector requirements and expectations and national and international research findings. This paper describes the first phase of the development work, in which stakeholders and researchers cooperate. It includes results from inventories and, based on this experience, discusses procedures for developing assessment tools and what the desirable features of a broadly accepted building rating tool could be.

  17. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    Analysis Tools NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) to find cost-competitive solutions. ADOPT Vehicle Simulator to analyze the performance and fuel economy of conventional and advanced light- and

  18. The Article Idea Chart: A participatory action research tool to aid involvement in dissemination

    Directory of Open Access Journals (Sweden)

    Cheryl Forchuk

    2014-06-01

    Full Text Available Participatory-action research encourages the involvement of all key stakeholders in the research process and is especially well suited to mental health research. Previous literature outlines the importance of engaging stakeholders in the development of research questions and methodologies, but little has been written about ensuring the involvement of all stakeholders (especially non-academic members in dissemination opportunities such as publication development. The Article Idea Chart was developed as a specific methodology for engaging all stakeholders in data analysis and publication development. It has been successfully utilised in a number of studies and is an effective tool for ensuring the dissemination process of participatory-action research results is both inclusive and transparent to all team members, regardless of stakeholder group. Keywords: participatory-action research, mental health, dissemination, community capacity building, publications, authorship

  19. A tool to assess sex-gender when selecting health research projects.

    Science.gov (United States)

    Tomás, Concepción; Yago, Teresa; Eguiluz, Mercedes; Samitier, M A Luisa; Oliveros, Teresa; Palacios, Gemma

    2015-04-01

    To validate the questionnaire "Gender Perspective in Health Research" (GPIHR) to assess the inclusion of the gender perspective (GP) in research projects. Validation study in two stages: feasibility was analysed in the first, and reliability, internal consistency and validity in the second. Aragón Institute of Health Science, Aragón, Spain. GPIHR was applied to 118 research projects funded in national and international competitive tenders from 2003 to 2012. Analysis of inter- and intra-observer reliability with the Kappa index and internal consistency with Cronbach's alpha. Content validity was analysed through a literature review and construct validity with an exploratory factor analysis. The validated GPIHR has 10 questions: 3 in the introduction, 1 for objectives, 3 for methodology and 3 for research purpose. Average time of application was 13 min. Inter-observer reliability (Kappa) varied between 0.35 and 0.94 and intra-observer between 0.40 and 0.94. The theoretical construct is supported in the literature. Factor analysis identifies three levels of GP inclusion: "difference by sex", "gender sensitive" and "feminist research", with internal consistencies of 0.64, 0.87 and 0.81, respectively, which explain 74.78% of the variance. The GPIHR questionnaire is a valid tool to assess GP and useful for researchers who would like to include GP in their projects. Copyright © 2014 Elsevier España, S.L.U. All rights reserved.
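    The inter-observer reliability reported above is measured with Cohen's kappa, which corrects raw agreement for the agreement expected by chance. As a reminder of the computation, here is a small self-contained Python version applied to invented binary ratings (not the study's data):

    ```python
    # Cohen's kappa for two raters; the example ratings are invented,
    # standing in for two observers scoring the same projects on one
    # binary GPIHR item.
    from collections import Counter

    def cohens_kappa(r1, r2):
        n = len(r1)
        po = sum(a == b for a, b in zip(r1, r2)) / n          # observed agreement
        c1, c2 = Counter(r1), Counter(r2)
        pe = sum(c1[k] * c2[k] for k in set(r1) | set(r2)) / n**2  # chance agreement
        return (po - pe) / (1 - pe)

    rater1 = [1, 1, 0, 1, 0, 1, 1, 0]
    rater2 = [1, 0, 0, 1, 0, 1, 1, 1]
    print(round(cohens_kappa(rater1, rater2), 2))
    ```

    Values near 0 indicate chance-level agreement and values near 1 near-perfect agreement, which is why the study's range of 0.35 to 0.94 spans "fair" to "almost perfect" reliability across items.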

  20. The jabber chat tool EFDA Messenger and screen sharing tool EFDATV

    Energy Technology Data Exchange (ETDEWEB)

    Thomsen, K. [EFDA Close Support Unit Garching, Boltzmannstr. 2, D-85748 Garching (Germany)], E-mail: Knud.Thomsen@efda.org; Beck, S. [EFDA Close Support Unit Garching, Boltzmannstr. 2, D-85748 Garching (Germany); Wilhelm, B. [EFDA CSU Barcelona, c/Josep Pla n.2, Torres Diag. Litoral Edificio B3, 7a planta, 08019 Barcelona (Spain)

    2008-04-15

    Two Remote Participation (RP) tools are described. The first tool, named EFDA Messenger, is a secure Instant Messaging (IM) tool based on a Jabber server that only accepts SSL-encrypted communication and does not allow file transfers or audio and video transmissions. This tool is useful as another means of communication during video or teleconferences. The second tool, named EFDATV, is a multipurpose Virtual Network Computing (VNC)-based desktop screen sharing system used to share presentations via the Internet. A Java-enabled web browser or a VNC client is sufficient for the presenter and the audience to use EFDATV. It is also possible from an EFDATV channel to connect to another VNC server and broadcast the view from that VNC server.

  1. The jabber chat tool EFDA Messenger and screen sharing tool EFDATV

    International Nuclear Information System (INIS)

    Thomsen, K.; Beck, S.; Wilhelm, B.

    2008-01-01

    Two Remote Participation (RP) tools are described. The first tool, named EFDA Messenger, is a secure Instant Messaging (IM) tool based on a Jabber server that only accepts SSL-encrypted communication and does not allow file transfers or audio and video transmissions. This tool is useful as another means of communication during video or teleconferences. The second tool, named EFDATV, is a multipurpose Virtual Network Computing (VNC)-based desktop screen sharing system used to share presentations via the Internet. A Java-enabled web browser or a VNC client is sufficient for the presenter and the audience to use EFDATV. It is also possible from an EFDATV channel to connect to another VNC server and broadcast the view from that VNC server.

  2. The Diesel Combustion Collaboratory: Combustion Researchers Collaborating over the Internet

    Energy Technology Data Exchange (ETDEWEB)

    C. M. Pancerella; L. A. Rahn; C. Yang

    2000-02-01

    The Diesel Combustion Collaboratory (DCC) is a pilot project to develop and deploy collaborative technologies to combustion researchers distributed throughout the DOE national laboratories, academia, and industry. The result is a problem-solving environment for combustion research. Researchers collaborate over the Internet using DCC tools, which include: a distributed execution management system for running combustion models on widely distributed computers, including supercomputers; web-accessible data archiving capabilities for sharing graphical experimental or modeling data; electronic notebooks and shared workspaces for facilitating collaboration; visualization of combustion data; and video-conferencing and data-conferencing among researchers at remote sites. Security is a key aspect of the collaborative tools. In many cases, the authors have integrated these tools to allow data, including large combustion data sets, to flow seamlessly, for example, from modeling tools to data archives. In this paper the authors describe the work of a larger collaborative effort to design, implement and deploy the DCC.

  3. 42 CFR 61.9 - Payments: Stipends; dependency allowances; travel allowances.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Payments: Stipends; dependency allowances; travel... FELLOWSHIPS, INTERNSHIPS, TRAINING FELLOWSHIPS Regular Fellowships § 61.9 Payments: Stipends; dependency allowances; travel allowances. Payments for stipends, dependency allowances, and the travel allowances...

  4. A knowledge transfer scheme to bridge the gap between science and practice: an integration of existing research frameworks into a tool for practice.

    Science.gov (United States)

    Verhagen, Evert; Voogt, Nelly; Bruinsma, Anja; Finch, Caroline F

    2014-04-01

    Evidence of effectiveness does not equal successful implementation. To progress the field, practical tools are needed to bridge the gap between research and practice and to truly unite effectiveness and implementation evidence. This paper describes the Knowledge Transfer Scheme, which integrates existing implementation research frameworks into a tool developed specifically to bridge the gap between knowledge derived from research on one side and evidence-based, usable information and tools for practice on the other.

  5. The Insight ToolKit Image Registration Framework

    Directory of Open Access Journals (Sweden)

    Brian eAvants

    2014-04-01

    Full Text Available Publicly available scientific resources help establish evaluation standards, provide a platform for teaching and improve reproducibility. Version 4 of the Insight ToolKit (ITK4) seeks to establish new standards in publicly available image registration methodology. ITK4 makes several advances in comparison to previous versions of ITK. ITK4 supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources, when available. Furthermore, ITK4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations versus translations). A related approach also constrains step sizes for gradient-based optimizers. The result is that tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to more easily focus on design/comparison of registration strategies. In total, the ITK4 contribution is intended as a structure to support reproducible research practices, will provide a more extensive foundation against which to evaluate new work in image registration, and will also offer application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling.
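The idea of a composite transform chaining arbitrary geometric mappings can be illustrated conceptually (plain Python, not the ITK4 API; the function names here are invented):

```python
# Conceptual sketch of a composite transform: a point flows through each
# geometric mapping in order, mirroring how registration pipelines chain
# e.g. a rigid, then affine, then deformable stage.

def affine(matrix, offset):
    # Build a 2-D affine map p -> M @ p + t as a plain callable.
    (a, b), (c, d) = matrix
    tx, ty = offset
    return lambda p: (a * p[0] + b * p[1] + tx, c * p[0] + d * p[1] + ty)

def composite(*transforms):
    # Chain mappings; they are applied left to right.
    def apply(p):
        for t in transforms:
            p = t(p)
        return p
    return apply

scale2 = affine(((2, 0), (0, 2)), (0, 0))      # uniform scaling
shift = affine(((1, 0), (0, 1)), (5, -1))      # pure translation
mapped = composite(scale2, shift)((1.0, 1.0))  # -> (7.0, 1.0)
```

Because the composite is itself a callable of the same shape, stages of different dimensionality (in ITK4, anything from affine matrices to dense deformation fields) can be mixed without the caller knowing the difference.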

  6. Developing an Interactive Data Visualization Tool to Assess the Impact of Decision Support on Clinical Operations.

    Science.gov (United States)

    Huber, Timothy C; Krishnaraj, Arun; Monaghan, Dayna; Gaskin, Cree M

    2018-05-18

    Due to mandates from recent legislation, clinical decision support (CDS) software is being adopted by radiology practices across the country. This software provides imaging study decision support for referring providers at the point of order entry. CDS systems produce a large volume of data, providing opportunities for research and quality improvement. In order to better visualize and analyze trends in this data, an interactive data visualization dashboard was created using a commercially available data visualization platform. Following the integration of a commercially available clinical decision support product into the electronic health record, a dashboard was created using a commercially available data visualization platform (Tableau, Seattle, WA). Data generated by the CDS were exported from the data warehouse, where they were stored, into the platform. This allowed for real-time visualization of the data generated by the decision support software. The creation of the dashboard allowed the output from the CDS platform to be more easily analyzed and facilitated hypothesis generation. Integrating data visualization tools into clinical decision support tools allows for easier data analysis and can streamline research and quality improvement efforts.

  7. Design of the GLARE tool. A grease lubrication apparatus for research and education

    International Nuclear Information System (INIS)

    Rawlings, B.

    2012-01-01

    The GLARE: Grease Lubrication Apparatus for Research and Education was designed as a fourth-year thesis project with the University of Ontario Institute of Technology (UOIT). The purpose of the apparatus is to train Ontario Power Generation Nuclear (OPGN) staff to properly lubricate bearings with grease and to help detect early equipment failures. Proper re-lubrication is critical to the nuclear industry, as equipment may be inaccessible for long periods of time. A secondary purpose of the tool is for UOIT research and undergraduate laboratories. This abstract provides an overview of the project and its application to the nuclear industry. (author)

  8. Astonishing advances in mouse genetic tools for biomedical research.

    Science.gov (United States)

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  9. Web-based tools for microRNAs involved in human cancer.

    Science.gov (United States)

    Mar-Aguilar, Fermín; Rodríguez-Padilla, Cristina; Reséndez-Pérez, Diana

    2016-06-01

    MicroRNAs (miRNAs/miRs) are a family of small, endogenous and evolutionarily-conserved non-coding RNAs that are involved in the regulation of several cellular and functional processes. miRNAs can act as oncogenes or tumor suppressors in all types of cancer, and could be used as prognostic and diagnostic biomarkers. Databases and computational algorithms are behind the majority of the research performed on miRNAs. These tools assemble and curate the relevant information on miRNAs and present it in a user-friendly manner. The current review presents 14 online databases that address every aspect of miRNA cancer research. Certain databases focus on miRNAs and a particular type of cancer, while others analyze the behavior of miRNAs in different malignancies at the same time. Additional databases allow researchers to search for mutations in miRNAs or their targets, and to review the naming history of a particular miRNA. All these databases are open-access, and are a valuable tool for those researchers working with these molecules, particularly those who lack access to an advanced computational infrastructure.

  10. DisEpi: Compact Visualization as a Tool for Applied Epidemiological Research.

    Science.gov (United States)

    Benis, Arriel; Hoshen, Moshe

    2017-01-01

    Outcomes research and evidence-based medical practice are being positively impacted by the proliferation of healthcare databases. Modern epidemiologic studies require complex data comprehension. A new tool, DisEpi, facilitates visual exploration of epidemiological data, supporting Public Health Knowledge Discovery. It provides domain experts a compact visualization of information at the population level. In this study, DisEpi is applied to Attention-Deficit/Hyperactivity Disorder (ADHD) patients within Clalit Health Services, analyzing the socio-demographic and ADHD filled-prescription data between 2006 and 2016 of 1,605,800 children aged 6 to 17 years. DisEpi's goals are to facilitate the identification of (1) links between attributes and/or events, (2) changes in these relationships over time, and (3) clusters of population attributes with similar trends. DisEpi combines hierarchical clustering graphics and a heatmap in which color shades reflect disease time-trends. In the ADHD context, DisEpi allowed the domain expert to visually analyze a snapshot summary of data mining results. Accordingly, the domain expert was able to efficiently identify that: (1) relatively younger children, and particularly the youngest children in a class, are treated more often; (2) medication incidence increased between 2006 and 2011 but then stabilized; and (3) progression rates of medication incidence differ for each of the 3 main discovered clusters (i.e., profiles) of treated children. DisEpi delivered results similar to those previously published using classical statistical approaches. DisEpi requires minimal preparation and fewer iterations, generating results in a user-friendly format for the domain expert. DisEpi will be wrapped as a package containing the end-to-end discovery process. Optionally, it may provide automated annotation using calendar events (such as policy changes or media interest), which can improve discovery efficiency, interpretation, and policy implementation.
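The grouping of attribute time-trends into profiles can be sketched with a minimal single-linkage agglomerative clustering over toy incidence series (an illustration of the general technique only, with invented data; this is not the DisEpi implementation):

```python
# Illustrative sketch: group attribute time-trends (e.g. yearly medication
# incidence per age group) by merging the closest clusters until a target
# number of profiles remains (single-linkage agglomerative clustering).

def distance(a, b):
    # Euclidean distance between two time series of equal length.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def cluster(series, k):
    clusters = [[name] for name in series]  # start: one cluster per attribute
    while len(clusters) > k:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = min(distance(series[a], series[b])
                        for a in clusters[i] for b in clusters[j])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        clusters[i] += clusters.pop(j)  # merge the closest pair
    return clusters

trends = {"age6": [1, 2, 3], "age7": [1, 2, 4], "age16": [9, 9, 8]}
profiles = cluster(trends, 2)  # -> [['age6', 'age7'], ['age16']]
```

In a tool like DisEpi the resulting cluster order would drive the row ordering of the heatmap, so attributes with similar trends sit next to each other.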

  11. Bioluminescent imaging: a critical tool in pre-clinical oncology research.

    LENUS (Irish Health Repository)

    O'Neill, Karen

    2010-02-01

    Bioluminescent imaging (BLI) is a non-invasive imaging modality widely used in the field of pre-clinical oncology research. Imaging of small animal tumour models using BLI involves the generation of light by luciferase-expressing cells in the animal following administration of substrate. This light may be imaged using an external detector. The technique allows a variety of tumour-associated properties to be visualized dynamically in living models. The increasing use of BLI as a small-animal imaging modality has led to advances in the development of xenogeneic, orthotopic, and genetically engineered animal models expressing luciferase genes. This review aims to provide insight into the principles of BLI and its applications in cancer research. Many studies to assess tumour growth and development, as well as efficacy of candidate therapeutics, have been performed using BLI. More recently, advances have also been made using bioluminescent imaging in studies of protein-protein interactions, genetic screening, cell-cycle regulators, and spontaneous cancer development. Such novel studies highlight the versatility and potential of bioluminescent imaging in future oncological research.

  12. Trial Promoter: A Web-Based Tool for Boosting the Promotion of Clinical Research Through Social Media.

    Science.gov (United States)

    Reuter, Katja; Ukpolo, Francis; Ward, Edward; Wilson, Melissa L; Angyan, Praveen

    2016-06-29

    Scarce information about clinical research, in particular clinical trials, is among the top reasons why potential participants do not take part in clinical studies. Without volunteers, on the other hand, clinical research and the development of novel approaches to preventing, diagnosing, and treating disease are impossible. Promising digital options such as social media have the potential to work alongside traditional methods to boost the promotion of clinical research. However, investigators and research institutions are challenged to leverage these innovations while saving time and resources. The objective of this study was to develop and test the efficiency of a Web-based tool that automates the generation and distribution of user-friendly social media messages about clinical trials. Trial Promoter is developed in Ruby on Rails, HTML, cascading style sheets (CSS), and JavaScript. In order to test the tool and the correctness of the generated messages, clinical trials (n=46) were randomized into social media messages and distributed via the microblogging social media platform Twitter and the social network Facebook. The percent correct was calculated to determine the probability with which Trial Promoter generates accurate messages. During a 10-week testing phase, Trial Promoter automatically generated and published 525 user-friendly social media messages on Twitter and Facebook. On average, Trial Promoter correctly used the message templates and substituted the message parameters (text, URLs, and disease hashtags) 97.7% of the time (1563/1600). Trial Promoter may serve as a promising tool to render clinical trial promotion more efficient while requiring limited resources. It supports the distribution of any research or other types of content. The Trial Promoter code and installation instructions are freely available online.
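The template-and-parameter substitution step that such a tool automates can be sketched as follows (a hypothetical Python illustration; the actual tool is written in Ruby on Rails, and the template text and field names here are invented):

```python
from string import Template

# Hypothetical sketch of the core step: substitute trial-specific
# parameters (text, URL, disease hashtag) into a reusable message
# template and verify that no placeholder was left unfilled.

TEMPLATE = Template("Interested in a $condition study? Learn more: $url $hashtag")

def build_message(trial):
    msg = TEMPLATE.substitute(trial)  # raises KeyError if a field is missing
    assert "$" not in msg             # crude check that every slot was filled
    return msg

post = build_message({"condition": "diabetes",
                      "url": "https://example.org/trial42",
                      "hashtag": "#diabetes"})
```

An automated correctness check of this kind is what makes a 97.7% substitution-accuracy figure measurable at all: each generated message can be validated against its template before publication.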

  13. Research on Test-bench for Sonic Logging Tool

    Directory of Open Access Journals (Sweden)

    Xianping Liu

    2016-01-01

    Full Text Available In this paper, a test-bench for the sonic logging tool is proposed and designed to realize automatic calibration and testing of the sonic logging tool. The test-bench system consists of a Host Computer, an Embedded Controlling Board, and functional boards. The Host Computer serves as the Human Machine Interface (HMI) and processes uploaded data. The software running on the Host Computer is developed in VC++, using multithreading, Dynamic Link Library (DLL) and Multiple Document Interface (MDI) techniques. The Embedded Controlling Board uses an ARM7 microcontroller and communicates with the Host Computer via Ethernet. The Embedded Controlling Board software is built on the embedded uClinux operating system with a layered architecture. The functional boards are designed around Field Programmable Gate Arrays (FPGAs) and provide test interfaces for the logging tool. The functional board software is divided into independent sub-modules that can be reused by various functional boards and are then integrated in the top layer. With this layered architecture and modularized design, the software system is highly reliable and extensible. With the help of the designed system, a test was conducted quickly and successfully on the electronic receiving cabin of the sonic logging tool, demonstrating that the system can greatly improve the production efficiency of the sonic logging tool.

  14. Approaches, tools and methods used for setting priorities in health research in the 21st century.

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-06-01

    Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion ("consultation process") but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure - such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix - it is likely that the Delphi method and non-replicable consultation processes will gradually be replaced by these emerging tools, which offer more

  15. Approaches, tools and methods used for setting priorities in health research in the 21st century

    Science.gov (United States)

    Yoshida, Sachiyo

    2016-01-01

    Background Health research is difficult to prioritize, because the number of possible competing ideas for research is large, the outcome of research is inherently uncertain, and the impact of research is difficult to predict and measure. A systematic and transparent process to assist policy makers and research funding agencies in making investment decisions is a permanent need. Methods To obtain a better understanding of the landscape of approaches, tools and methods used to prioritize health research, I conducted a methodical review using the PubMed database for the period 2001-2014. Results A total of 165 relevant studies were identified, in which health research prioritization was conducted. They most frequently used the CHNRI method (26%), followed by the Delphi method (24%), James Lind Alliance method (8%), the Combined Approach Matrix (CAM) method (2%) and the Essential National Health Research method (priorities were set. A further 19% used a combination of expert panel interview and focus group discussion (“consultation process”) but provided few details, while a further 2% used approaches that were clearly described, but not established as a replicable method. Online surveys that were not accompanied by face-to-face meetings were used in 8% of studies, while 9% used a combination of literature review and questionnaire to scrutinise the research options for prioritization among the participating experts. Conclusion The number of priority setting exercises in health research published in PubMed-indexed journals is increasing, especially since 2010. These exercises are being conducted at a variety of levels, ranging from the global level to the level of an individual hospital. With the development of new tools and methods which have a well-defined structure, such as the CHNRI method, James Lind Alliance Method and Combined Approach Matrix, it is likely that the Delphi method and non-replicable consultation processes will gradually be

  16. Cash Reconciliation Tool

    Data.gov (United States)

    US Agency for International Development — CART is a cash reconciliation tool that allows users to reconcile Agency cash disbursements with Treasury fund balances; track open unreconciled items; and create an...

  17. Inquiring Sport and Physical Activity students’ perceptions using metaphors as research tools

    OpenAIRE

    Martínez Ruiz, María Ángeles; Ávalos Ramos, María Alejandra; Merma Molina, Gladys

    2017-01-01

    The aim of this study is to analyse the metaphorical expressions designed by Science of Sport and Physical Activity university students, as a tool for inquiring into two research questions: their perceptions of their physical education teachers, and the meaning physical activity has in the students' personal lives. 51 students from the University of Alicante participated in the study. The qualitative data analysis software AQUAD 6 was used for data processing. The results obtained from the analysis of ...

  18. TACIT: An open-source text analysis, crawling, and interpretation tool.

    Science.gov (United States)

    Dehghani, Morteza; Johnson, Kate M; Garten, Justin; Boghrati, Reihane; Hoover, Joe; Balasubramanian, Vijayan; Singh, Anurag; Shankar, Yuvarani; Pulickal, Linda; Rajkumar, Aswin; Parmar, Niki Jitendra

    2017-04-01

    As human activity and interaction increasingly take place online, the digital residues of these activities provide a valuable window into a range of psychological and social processes. A great deal of progress has been made toward utilizing these opportunities; however, the complexity of managing and analyzing the quantities of data currently available has limited both the types of analysis used and the number of researchers able to make use of these data. Although fields such as computer science have developed a range of techniques and methods for handling these difficulties, making use of those tools has often required specialized knowledge and programming experience. The Text Analysis, Crawling, and Interpretation Tool (TACIT) is designed to bridge this gap by providing an intuitive tool and interface for making use of state-of-the-art methods in text analysis and large-scale data management. Furthermore, TACIT is implemented as an open, extensible, plugin-driven architecture, which will allow other researchers to extend and expand these capabilities as new methods become available.
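The plugin-driven extensibility described above can be sketched with a minimal registry (an illustrative Python sketch with invented plugin names; TACIT's actual plugin architecture is different):

```python
# Minimal sketch of a plugin-driven design: analysis plugins register
# themselves by name, and the host discovers and runs them at run time
# without being modified when a new plugin is added.

PLUGINS = {}

def register(name):
    def wrap(fn):
        PLUGINS[name] = fn
        return fn
    return wrap

@register("word_count")
def word_count(text):
    return len(text.split())

@register("char_count")
def char_count(text):
    return len(text)

def run(name, text):
    # The host only knows the registry, not the individual plugins.
    return PLUGINS[name](text)

n = run("word_count", "digital residues of online activity")  # -> 5
```

The key property is that the host's `run` dispatcher never changes: contributing a new analysis means adding one decorated function, which is how a plugin architecture lets outside researchers extend a tool's capabilities.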

  19. Quantitative Imaging In Pathology (QUIP) | Informatics Technology for Cancer Research (ITCR)

    Science.gov (United States)

    This site hosts web accessible applications, tools and data designed to support analysis, management, and exploration of whole slide tissue images for cancer research. The following tools are included: caMicroscope: A digital pathology data management and visualization platform that enables interactive viewing of whole slide tissue images and segmentation results. caMicroscope can also be used independently of QUIP. FeatureExplorer: An interactive tool that allows patient-level feature exploration across multiple dimensions.

  20. Employment of Questionnaire as Tool for Effective Business Research Outcome: Problems and Challenges

    Directory of Open Access Journals (Sweden)

    ADENIYI AKINGBADE WAIDI

    2016-06-01

    Full Text Available A questionnaire consists of questions designed to gather information or data for analysis. A questionnaire must be adequate, simple, focused and related to the subject the research is set to address, and must test the hypotheses and questions formulated for the study. However, many questionnaires are constructed and administered without following proper guidelines, which undermines their end results. This paper assesses some of the guidelines for constructing questionnaires, as well as their uses and the extent to which they enhance managers' access to reliable data and information. A descriptive method is employed for the study. Findings revealed that poorly prepared questionnaires do not provide effective results. Managers and researchers who use such questionnaires rarely achieve their organisational and research objectives. The need for a good, well-prepared and adequate questionnaire is exemplified by its being the primary tool for analytical research. The study recommends that questionnaires be properly prepared for effective research outcomes.

  1. Data Center IT Equipment Energy Assessment Tools: Current State of Commercial Tools, Proposal for a Future Set of Assessment Tools

    Energy Technology Data Exchange (ETDEWEB)

    Radhakrishnan, Ben D. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); National Univ., San Diego, CA (United States). School of Engineering

    2012-06-30

    This research project, which was conducted during the Summer and Fall of 2011, investigated some commercially available assessment tools with a focus on IT equipment to see if such tools could round out the DC Pro tool suite. In this research, the assessment capabilities of the various tools were compiled to help make “non-biased” information available to the public. This research should not be considered exhaustive of all existing vendor tools, although a number of vendors were contacted. Large IT equipment OEMs like IBM and Dell provide proprietary internal automated software which does not work on any other vendor's IT equipment. However, the research found two companies with products that showed promise in performing automated assessments for IT equipment from different OEM vendors. This report documents the research and provides a list of software products reviewed, contacts and websites, product details, discussions with specific companies, a set of recommendations, and next steps. As a result of this research, a simple 3-level approach to an IT assessment tool is proposed, along with an example of an assessment using a simple IT equipment data collection tool (Level 1, spreadsheet). The tool has been reviewed with the Green Grid and LBNL staff. The initial feedback has been positive, although further refinement of the tool will be necessary. Proposed next steps include a field trial of at least two vendors' software in two different data centers, with the objectives of proving the concept, ascertaining the extent of energy and computational assessment, and evaluating ease of installation and opportunities for continuous improvement. Based on the discussions, field trials (or case studies) are proposed with two vendors: JouleX (expected to be completed in 2012) and Sentilla.

  2. adLIMS: a customized open source software that allows bridging clinical and basic molecular research studies.

    Science.gov (United States)

    Calabria, Andrea; Spinozzi, Giulio; Benedicenti, Fabrizio; Tenderini, Erika; Montini, Eugenio

    2015-01-01

    Many biological laboratories that deal with genomic samples are facing the problem of sample tracking, both for pure laboratory management and for efficiency. Our laboratory exploits PCR techniques and Next Generation Sequencing (NGS) methods to perform high-throughput integration site monitoring in different clinical trials and scientific projects. Because of the huge number of samples that we process every year, which result in hundreds of millions of sequencing reads, we need to standardize data management and tracking systems, building up a scalable and flexible structure with web-based interfaces, usually called a Laboratory Information Management System (LIMS). We started by collecting end-users' requirements, composed of desired functionalities of the system and Graphical User Interfaces (GUIs), and then evaluated available tools that could address our requirements, spanning from pure LIMS to Content Management Systems (CMS) up to enterprise information systems. Our analysis identified ADempiere ERP, an open source Enterprise Resource Planning system written in Java J2EE, as the best software, which also natively implements some highly desirable technological advances, such as the high usability and modularity that grant high use-case flexibility and software scalability for custom solutions. We extended and customized ADempiere ERP to fulfil LIMS requirements, and we developed adLIMS. It has been validated by our end-users, verifying functionalities and GUIs through test cases for PCR samples and pre-sequencing data, and it is currently in use in our laboratories. adLIMS implements authorization and authentication policies, allowing multiple-user management and role definitions that enable specific permissions, operations and data views for each user. For example, adLIMS allows creating sample sheets from stored data using available exporting operations. This simplicity and process standardization may avoid manual errors and information backtracking, features

  3. Haystack, a web-based tool for metabolomics research.

    Science.gov (United States)

    Grace, Stephen C; Embry, Stephen; Luo, Heng

    2014-01-01

    Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown at two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data is described. It offers users a range of data visualization options and supports non
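The core binning procedure described above can be sketched as follows (an illustrative simplification with invented numbers, not Haystack's actual implementation): each spectrum collapses into a fixed-length vector of summed intensities per mass interval, so every sample becomes directly comparable for PCA or clustering.

```python
# Illustrative sketch of mass binning: convert (m/z, intensity) pairs
# into fixed-width interval variables so that every sample is described
# by the same fixed-length feature vector.

def bin_spectrum(peaks, mz_min, mz_max, width):
    n_bins = int((mz_max - mz_min) / width)
    bins = [0.0] * n_bins
    for mz, intensity in peaks:
        if mz_min <= mz < mz_max:           # drop peaks outside the range
            bins[int((mz - mz_min) / width)] += intensity
    return bins

peaks = [(100.2, 5.0), (100.7, 2.0), (103.4, 1.0)]
vector = bin_spectrum(peaks, mz_min=100.0, mz_max=104.0, width=1.0)
# vector == [7.0, 0.0, 0.0, 1.0]
```

Because every sample maps onto the same bin grid, the vectors can be stacked row-wise into the sample-by-feature matrix that exploratory methods such as PCA expect.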

  4. Neighborhood Mapping Tool

    Data.gov (United States)

    Department of Housing and Urban Development — This tool assists the public and Choice Neighborhoods applicants to prepare data to submit with their grant application by allowing applicants to draw the exact...

  5. Advanced pressure tube sampling tools

    International Nuclear Information System (INIS)

    Wittich, K.C.; King, J.M.

    2002-01-01

    Deuterium concentration is an important parameter that must be assessed to evaluate the fitness for service of CANDU pressure tubes. In-reactor pressure tube sampling allows accurate deuterium concentration assessments to be made without the expenses associated with fuel channel removal. This technology, which AECL has developed over the past fifteen years, has become the standard method for deuterium concentration assessment. AECL is developing a multi-head tool that would reduce in-reactor handling overhead by allowing one tool to sequentially sample at all four axial pressure tube locations before removal from the reactor. Four sets of independent cutting heads, like those on the existing sampling tools, facilitate this, incorporating proven technology demonstrated in over 1400 in-reactor samples taken to date. The multi-head tool is delivered by AECL's Advanced Delivery Machine or other similar delivery machines. Further, AECL has developed an automated sample handling system that receives and processes the tool once it is out of the reactor. This system retrieves samples from the tool; dries, weighs and places them in labelled vials; and then directs the vials into shielded shipping flasks. The multi-head wet sampling tool and the automated sample handling system are based on proven technology and offer continued savings and dose reduction to utilities in a competitive electricity market. (author)

  6. NDT-Tool: A case tool to deal with requirements in web information systems

    OpenAIRE

    Escalona Cuaresma, María José; Torres Valderrama, Jesús; Mejías Risoto, Manuel

    2003-01-01

    Internet progress and the rising interest in developing systems for the web environment have given rise to several methodological proposals intended to serve as a suitable reference in the development process. However, there is a gap in CASE tool support [3][4][6]. This work presents a CASE tool named NDT-Tool that allows users to apply the algorithms and techniques proposed in NDT (Navigational Development Techniques) [2], a methodological proposal to specify, analyze and desi...

  7. Making Research Fly in Schools: "Drosophila" as a Powerful Modern Tool for Teaching Biology

    Science.gov (United States)

    Harbottle, Jennifer; Strangward, Patrick; Alnuamaani, Catherine; Lawes, Surita; Patel, Sanjai; Prokop, Andreas

    2016-01-01

    The "droso4schools" project aims to introduce the fruit fly "Drosophila" as a powerful modern teaching tool to convey curriculum-relevant specifications in biology lessons. Flies are easy and cheap to breed and have been at the forefront of biology research for a century, providing unique conceptual understanding of biology and…

  8. Ddi Tool: A serious game for the development of competences of graduate and postgraduate students in the Operations Management environment

    Directory of Open Access Journals (Sweden)

    F. Javier Ramirez

    2017-06-01

    Full Text Available A serious game known as Ddi Tool is presented by the authors to improve the competences of graduate and postgraduate students in Operations Management. The game is applied to the resolution of multistage industrial processes, allowing students to gain a global vision of the manufacturing process while combining their skills in operations management research and learning. The tool also allows an economic evaluation of the whole process by means of process cost analysis, improving this cost as a function of the main process variables and parameters: raw material, workforce, energy consumption, etc. The game was developed in Java with a user-friendly interface for quick comprehension by students during practical classes. In this manner, the tool allows students to develop competences by applying scientific, technological, mathematical, economic and sustainability knowledge.

  9. The Neuroscience of Storing and Molding Tool Action Concepts: how plastic is grounded cognition?

    Directory of Open Access Journals (Sweden)

    J.C. Mizelle

    2010-11-01

    Full Text Available Choosing how to use tools to accomplish a task is a natural and seemingly trivial aspect of our lives, yet engages complex neural mechanisms. Recently, work in healthy populations has led to the idea that tool knowledge is grounded to allow for appropriate recall based on some level of personal history. This grounding has presumed neural loci for tool use, centered on parieto-temporo-frontal areas to fuse perception and action representations into one dynamic system. A challenge for this idea is related to one of its great benefits. For such a system to exist, it must be very plastic, to allow for the introduction of novel tools or concepts of tool use and modification of existing ones. Thus, learning new tool usage (familiar tools in new situations and new tools in familiar situations) must involve mapping into this grounded network while maintaining existing rules for tool usage. This plasticity may present a challenging breadth of encoding that needs to be optimally stored and accessed. The aim of this work is to explore the challenges of plasticity related to changing or incorporating representations of tool action within the theory of grounded cognition and to propose a modular model of tool-object goal-related accomplishment. While considering the neuroscience evidence for this approach, we will focus on the requisite plasticity for this system. Further, we will highlight challenges for flexibility and organization of already grounded tool actions and provide thoughts on future research to better evaluate mechanisms of encoding in the theory of grounded cognition.

  10. Advances in the Use of Neuroscience Methods in Research on Learning and Instruction

    Science.gov (United States)

    De Smedt, Bert

    2014-01-01

    Cognitive neuroscience offers a series of tools and methodologies that allow researchers in the field of learning and instruction to complement and extend the knowledge they have accumulated through decades of behavioral research. The appropriateness of these methods depends on the research question at hand. Cognitive neuroscience methods allow…

  11. German translation of the Alberta context tool and two measures of research use: methods, challenges and lessons learned

    Science.gov (United States)

    2013-01-01

    Background Understanding the relationship between organizational context and research utilization is key to reducing the research-practice gap in health care. This is particularly true in the residential long term care (LTC) setting where relatively little work has examined the influence of context on research implementation. Reliable, valid measures and tools are a prerequisite for studying organizational context and research utilization. Few such tools exist in German. We thus translated three such tools (the Alberta Context Tool and two measures of research use) into German for use in German residential LTC. We point out challenges and strategies for their solution unique to German residential LTC, and demonstrate how resolving specific challenges in the translation of the health care aide instrument version streamlined the translation process of versions for registered nurses, allied health providers, practice specialists, and managers. Methods Our translation methods were based on best practices and included two independent forward translations, reconciliation of the forward translations, expert panel discussions, two independent back translations, reconciliation of the back translations, back translation review, and cognitive debriefing. Results We categorized the challenges in this translation process into seven categories: (1) differing professional education of Canadian and German care providers, (2) risk that German translations would become grammatically complex, (3) wordings at risk of being misunderstood, (4) phrases/idioms non-existent in German, (5) lack of corresponding German words, (6) limited comprehensibility of corresponding German words, and (7) target persons’ unfamiliarity with activities detailed in survey items. Examples of each challenge are described with strategies that we used to manage the challenge. 
Conclusion Translating an existing instrument is complex and time-consuming, but a rigorous approach is necessary to obtain instrument

  12. 'Screening audit' as a quality assurance tool in good clinical practice compliant research environments.

    Science.gov (United States)

    Park, Sinyoung; Nam, Chung Mo; Park, Sejung; Noh, Yang Hee; Ahn, Cho Rong; Yu, Wan Sun; Kim, Bo Kyung; Kim, Seung Min; Kim, Jin Seok; Rha, Sun Young

    2018-04-25

    With the growing amount of clinical research, regulations and research ethics are becoming more stringent. This trend introduces a need for quality assurance measures for ensuring adherence to research ethics and human research protection beyond Institutional Review Board approval. Audits, one of the most effective tools for assessing quality assurance, are measures used to evaluate Good Clinical Practice (GCP) and protocol compliance in clinical research. However, they are laborious, time-consuming, and require expertise. Therefore, we developed a simple auditing process (a screening audit) and evaluated its feasibility and effectiveness. The screening audit was developed using a routine audit checklist based on the Severance Hospital's Human Research Protection Program policies and procedures. The measure includes 20 questions, and results are summarized in five categories of audit findings. We analyzed 462 studies that were reviewed by the Severance Hospital Human Research Protection Center between 2013 and 2017. We retrospectively analyzed research characteristics, reply rate, audit findings, associated factors and post-screening audit compliance, etc. Results: Investigator reply rates gradually increased, except for the first year (73% → 26% → 53% → 49% → 55%). The studies were graded as "critical," "major," "minor," and "not a finding" (11.9, 39.0, 42.9, and 6.3%, respectively), based on findings and number of deficiencies. The auditors' decisions showed fair agreement, with weighted kappa values of 0.316, 0.339, and 0.373. Low-risk level studies, single center studies, and non-phase clinical research showed more prevalent frequencies of being "major" or "critical" (p = 0.002, audit grade (p audit results of post-screening audit compliance checks in "non-responding" and "critical" studies upon applying the screening audit. Our screening audit is a simple and effective way to assess overall GCP compliance by institutions and to

  13. Examining Student Research Choices and Processes in a Disintermediated Searching Environment

    Science.gov (United States)

    Rempel, Hannah Gascho; Buck, Stefanie; Deitering, Anne-Marie

    2013-01-01

    Students today perform research in a disintermediated environment, which often allows them to struggle directly with the process of selecting research tools and choosing scholarly sources. The authors conducted a qualitative study with twenty students, using structured observations to ascertain the processes students use to select databases and…

  14. Automatic generation of bioinformatics tools for predicting protein-ligand binding sites.

    Science.gov (United States)

    Komiyama, Yusuke; Banno, Masaki; Ueki, Kokoro; Saad, Gul; Shimizu, Kentaro

    2016-03-15

    Predictive tools that model protein-ligand binding on demand are needed to promote ligand research in an innovative drug-design environment. However, it takes considerable time and effort to develop predictive tools that can be applied to individual ligands. An automated production pipeline that can rapidly and efficiently develop user-friendly protein-ligand binding predictive tools would be useful. We developed a system for automatically generating protein-ligand binding predictions. Implementation of this system in a pipeline of Semantic Web technique-based web tools will allow users to specify a ligand and receive the tool within 0.5-1 day. We demonstrated high prediction accuracy for three machine learning algorithms and eight ligands. The source code and web application are freely available for download at http://utprot.net. They are implemented in Python and supported on Linux. Contact: shimizu@bi.a.u-tokyo.ac.jp. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.

  15. RESEARCH CAPACITIES OF UNIVERSITIES: ESTIMATION OF PARAMETERS AND MODELING OF THE DYNAMICS OF THE RESEARCH SYSTEMS

    Directory of Open Access Journals (Sweden)

    CAROLINA DELGADO HURTADO

    2017-12-01

    Full Text Available Research capacities are developed scientific skills that enable universities to accomplish the dissemination of high-quality scientific knowledge. Nowadays, modeling their dynamics is one of the most important concerns for stakeholders in scientific activity, including university managers, the private sector and government. In this context, the present article addresses the modeling of the capacities of universities' research systems, presenting System Dynamics as an effective methodological tool for the treatment of data contained in intellectual capital indicators, allowing parameters, conditions and scenarios to be estimated. The main contribution lies in the modeling and simulations carried out for several scenarios, which reveal the critical variables and the most sensitive ones when building or strengthening research capacities. The establishment of parameters through regression techniques made it possible to model the dynamics of the variables more accurately. This is an interesting contribution in terms of the accuracy of the simulations, which might later be used to propose and carry out changes related to the management of university research. Future research with alternative modeling approaches for social systems will allow the scope of the study to be broadened.

  16. A GUI-based intuitive tool for analyzing formats and extracting contents of binary data in fusion research

    International Nuclear Information System (INIS)

    Naito, O.

    2015-01-01

    Highlights: • A GUI-based intuitive tool for data format analysis is presented. • Data can be viewed as any data type specified by the user in real time. • Analyzed formats are saved and reused as templates for other data of the same form. • Users can easily extract contents in any form by writing a simple script file. • The tool would be useful for exchanging data in collaborative fusion research. - Abstract: An intuitive tool with a graphical user interface (GUI) for analyzing formats and extracting contents of binary data in fusion research is presented. Users can examine structures of binary data at arbitrary addresses by selecting their type from a list of radio buttons in the data inspection window and checking their representations instantly on the computer screen. The result of analysis is saved in a file which contains information such as the name, data type, start address, and array size of the data. If the array size of some data depends on other data that appear earlier, and the user specifies this relation in the inspection window, the resultant file can also be used as a format template for the same series of data. By writing a simple script, users can extract the contents of data to either a text or binary file in the format of their preference. As a real-life example, the tool is applied to the MHD equilibrium data at JT-60U, where poloidal flux data are extracted and converted to a format suitable for contour plotting in another data visualization program. The tool would be useful in collaborative fusion research for exchanging relatively small-size data that do not fit well into the standard routine processes.
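    The core inspection operation described above, interpreting the bytes at an arbitrary address as a user-selected data type, can be sketched with Python's standard struct module. This is a hypothetical illustration of the idea, not the tool's implementation; type codes follow struct notation ('<i' little-endian int32, '<d' little-endian float64):

```python
import struct

def inspect(buf, offset, dtype, count=1):
    """Interpret `count` values of type `dtype` starting at byte `offset`.

    `dtype` is a two-character struct code such as '<i' (little-endian
    int32) or '<d' (little-endian float64).  Hypothetical sketch of the
    'pick a type, view the bytes' inspection the GUI tool performs.
    """
    fmt = dtype[0] + str(count) + dtype[1]
    size = struct.calcsize(fmt)
    return struct.unpack(fmt, buf[offset:offset + size])

# A fabricated record: two int32 values followed by three float64 values.
data = struct.pack("<2i3d", 7, 42, 1.5, 2.5, 3.5)
ints = inspect(data, 0, "<i", count=2)    # (7, 42)
reals = inspect(data, 8, "<d", count=3)   # (1.5, 2.5, 3.5)
```

Once offsets and types are confirmed interactively, the same information can drive a script that extracts the contents to a text or binary file, as the abstract describes.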

  17. User Manual for the PROTEUS Mesh Tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Micheal A. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, Emily R. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2015-06-01

    This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and the MT_RadialLattice.x codes. The former allows the conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific to a given mesh tool (such as .axial or .merge) can be used as “mesh” input for any of the mesh tools discussed in this manual.

  18. Visualizing Cloud Properties and Satellite Imagery: A Tool for Visualization and Information Integration

    Science.gov (United States)

    Chee, T.; Nguyen, L.; Smith, W. L., Jr.; Spangenberg, D.; Palikonda, R.; Bedka, K. M.; Minnis, P.; Thieman, M. M.; Nordeen, M.

    2017-12-01

    Providing public access to research products, including cloud macro- and microphysical properties and satellite imagery, is a key concern for the NASA Langley Research Center Cloud and Radiation Group. This work describes a web-based visualization tool and API that allow end users to easily create customized cloud product and satellite imagery, ground site data and satellite ground track information, all generated dynamically. The tool has two uses: one to visualize the dynamically created imagery, and the other to provide access to the dynamically generated imagery directly at a later time. Internally, we leverage our practical experience with large, scalable application practices to develop a system with the greatest potential for scalability and the ability to be deployed on the cloud. We build upon the NASA Langley Cloud and Radiation Group's experience with making real-time and historical satellite cloud product information, satellite imagery, ground site data and satellite track information accessible and easily searchable. This tool is the culmination of our prior experience with dynamic imagery generation and provides a way to build a "mash-up" of dynamically generated imagery and related kinds of information that are visualized together to add value to disparate but related information. In support of NASA strategic goals, our group aims to make as much scientific knowledge, observations and products as possible available to citizen science, research and interested communities, as well as to automated systems that can acquire the same information for data mining or other analytic purposes. This tool and the underlying APIs provide a valuable research tool to a wide audience, both as a standalone research tool and as an easily accessed data source that can be mined or used with existing tools.

  19. Patient registries: useful tools for clinical research in myasthenia gravis.

    Science.gov (United States)

    Baggi, Fulvio; Mantegazza, Renato; Antozzi, Carlo; Sanders, Donald

    2012-12-01

    Clinical registries may facilitate research on myasthenia gravis (MG) in several ways: as a source of demographic, clinical, biological, and immunological data on large numbers of patients with this rare disease; as a source of referrals for clinical trials; and by allowing rapid identification of MG patients with specific features. Physician-derived registries have the added advantage of incorporating diagnostic and treatment data that may allow comparison of outcomes from different therapeutic approaches, which can be supplemented with patient self-reported data. We report the demographic analysis of MG patients in two large physician-derived registries, the Duke MG Patient Registry, at the Duke University Medical Center, and the INNCB MG Registry, at the Istituto Neurologico Carlo Besta, as a preliminary study to assess the consistency of the two data sets. These registries share a common structure, with an inner core of common data elements (CDE) that facilitate data analysis. The CDEs are concordant with the MG-specific CDEs developed under the National Institute of Neurological Disorders and Stroke Common Data Elements Project. © 2012 New York Academy of Sciences.

  20. iPat: intelligent prediction and association tool for genomic research.

    Science.gov (United States)

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.

  1. Handbook of bibliometric indicators quantitative tools for studying and evaluating research

    CERN Document Server

    Todeschini, Roberto

    2016-01-01

    At last, the first systematic guide to the growing jungle of citation indices and other bibliometric indicators. Written with the aim of providing a complete and unbiased overview of all available statistical measures for scientific productivity, the core of this reference is an alphabetical dictionary of indices and other algorithms used to evaluate the importance and impact of researchers and their institutions. In 150 major articles, the authors describe all indices in strictly mathematical terms without passing judgement on their relative merit. From widely used measures, such as the journal impact factor or the h-index, to highly specialized indices, all indicators currently in use in the sciences and humanities are described, and their application explained. The introductory section and the appendix contain a wealth of valuable supporting information on data sources, tools and techniques for bibliometric and scientometric analysis - for individual researchers as well as their funders and publishers.
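    Of the measures mentioned, the h-index has a definition compact enough to state in a few lines: the largest h such that h of a researcher's papers each have at least h citations. A minimal sketch in Python (illustrative only; the handbook itself defines the indices in strict mathematical terms):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank      # this paper still supports a larger h
        else:
            break
    return h

h_index([10, 8, 5, 4, 3])  # -> 4: four papers have >= 4 citations each
```

The same sorted-rank pattern underlies several of the h-index variants the handbook catalogues, which differ mainly in how the citation counts are weighted or truncated.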

  2. Unique life sciences research facilities at NASA Ames Research Center

    Science.gov (United States)

    Mulenburg, G. M.; Vasques, M.; Caldwell, W. F.; Tucker, J.

    1994-01-01

    The Life Science Division at NASA's Ames Research Center has a suite of specialized facilities that enable scientists to study the effects of gravity on living systems. This paper describes some of these facilities and their use in research. Seven centrifuges, each with its own unique abilities, allow testing of a variety of parameters on test subjects ranging from single cells through hardware to humans. The Vestibular Research Facility allows the study of both centrifugation and linear acceleration on animals and humans. The Biocomputation Center uses computers for 3D reconstruction of physiological systems, and interactive research tools for virtual reality modeling. Psychophysiological, cardiovascular, exercise physiology, and biomechanical studies are conducted in the 12-bed Human Research Facility, and samples are analyzed in the certified Central Clinical Laboratory and other laboratories at Ames. Human bedrest, water immersion and lower body negative pressure equipment are also available to study physiological changes associated with weightlessness. These and other weightlessness models are used in specialized laboratories for the study of basic physiological mechanisms, metabolism and cell biology. Visual-motor performance, perception, and adaptation are studied using ground-based models as well as short-term weightlessness experiments (parabolic flights). The unique combination of Life Science research facilities, laboratories, and equipment at Ames Research Center is described in detail in relation to its research contributions.

  3. The Role of Hypothesis in Constructive Design Research

    DEFF Research Database (Denmark)

    Bang, Anne Louise; Krogh, Peter; Ludvigsen, Martin

    2012-01-01

    and solid perspective on how to keep constructive design research on track, this paper offers a model for understanding the role of hypothesis in constructive design research. The model allows for understanding the hypothesis’s relation to research motivation, questions, experiments, evaluation...... and knowledge production. The intention of the model is to have it serve as a tool in the research process aiding the researcher to understand at what “level” discussions and claims are brought forward, and what consequences these might have for the research work at hand. Thus, the paper claims the central...

  4. Tools of online Marketing

    OpenAIRE

    Hossain, M. S.; Rahman, M. F.

    2017-01-01

    Abstract Online marketing is the most crucial issue in the modern marketing era, but no previous research had identified the tools of internet marketing before this study, which was the first in the field of online marketing tools. This research was descriptive in nature and attempted to identify the major tools of internet marketing from the concepts of traditional marketing tools. The worldwide network known as the Internet can exchange information between use...

  5. Interactive and Approachable Web-Based Tools for Exploring Global Geophysical Data Records

    Science.gov (United States)

    Croteau, M. J.; Nerem, R. S.; Merrifield, M. A.; Thompson, P. R.; Loomis, B. D.; Wiese, D. N.; Zlotnicki, V.; Larson, J.; Talpe, M.; Hardy, R. A.

    2017-12-01

    Making global and regional data accessible and understandable for non-experts can be both challenging and hazardous. While data products are often developed with end users in mind, the ease of use of these data can vary greatly. Scientists must take care to provide detailed guides for how to use data products to ensure users are not incorrectly applying data to their problem. For example, terrestrial water storage data from the Gravity Recovery and Climate Experiment (GRACE) satellite mission are notoriously difficult for non-experts to access and use correctly. However, making these data easily accessible to scientists outside the GRACE community is desirable because it would allow the data to see much more widespread use. We have developed a web-based interactive mapping and plotting tool that provides easy access to geophysical data. This work presents an intuitive method for making such data widely accessible to experts and non-experts alike, making the data approachable and ensuring its proper use. This tool has proven helpful to experts by providing fast and detailed access to the data. Simultaneously, the tool allows non-experts to gain familiarity with the information contained in the data and access to that information for both scientific studies and public use. In this presentation, we discuss the development of this tool and its application to both the GRACE and ocean altimetry satellite missions, and demonstrate the capabilities of the tool. Focusing on the data visualization aspects of the tool, we showcase our integrations of the Mapbox API and the D3.js data-driven web document framework. We then explore the potential of these tools in other web-based visualization projects, and how incorporation of such tools into science can improve the presentation of research results. We demonstrate how the development of an interactive and exploratory resource can enable further layers of exploratory and scientific discovery.

  6. Climate Action Planning Tool | NREL

    Science.gov (United States)

    NREL's Climate Action Planning Tool provides a quick, basic estimate of how various technology options can contribute to an overall climate action plan for your research campus. (Related resources: Tool Calculation Formulas and Assumptions; Climate Neutral Research Campuses Website.)

  7. In Search of Samoan Research Approaches to Education: Tofa'a'Anolasi and the Foucauldian Tool Box

    Science.gov (United States)

    Galuvao, Akata Sisigafu'aapulematumua

    2018-01-01

    This article introduces Tofa'a'anolasi, a novel Samoan research framework created by drawing on the work of other Samoan and Pacific education researchers, in combination with adapting the 'Foucauldian tool box' to use for research carried out from a Samoan perspective. The article starts with an account and explanation of the process of…

  8. SEPHYDRO: An Integrated Multi-Filter Web-Based Tool for Baseflow Separation

    Science.gov (United States)

    Serban, D.; MacQuarrie, K. T. B.; Popa, A.

    2017-12-01

    Knowledge of baseflow contributions to streamflow is important for understanding watershed scale hydrology, including groundwater-surface water interactions, impact of geology and landforms on baseflow, estimation of groundwater recharge rates, etc. Baseflow (or hydrograph) separation methods can be used as supporting tools in many areas of environmental research, such as the assessment of the impact of agricultural practices, urbanization and climate change on surface water and groundwater. Over the past few decades various digital filtering and graphically-based methods have been developed in an attempt to improve the assessment of the dynamics of the various sources of streamflow (e.g. groundwater, surface runoff, subsurface flow); however, these methods are not available under an integrated platform and, individually, often require significant effort for implementation. Here we introduce SEPHYDRO, an open access, customizable web-based tool, which integrates 11 algorithms allowing for separation of streamflow hydrographs. The streamlined interface incorporates a reference guide as well as additional information that allows users to import their own data, customize the algorithms, and compare, visualise and export results. The tool includes one-, two- and three-parameter digital filters as well as graphical separation methods and has been successfully applied in Atlantic Canada, in studies dealing with nutrient loading to fresh water and coastal water ecosystems. Future developments include integration of additional separation algorithms as well as incorporation of geochemical separation methods. SEPHYDRO has been developed through a collaborative research effort between the Canadian Rivers Institute, University of New Brunswick (Fredericton, New Brunswick, Canada), Agriculture and Agri-Food Canada and Environment and Climate Change Canada and is currently available at http://canadianriversinstitute.com/tool/
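    As one example of the one-parameter digital filters the abstract mentions, the widely used Lyne-Hollick recursive filter can be sketched in a few lines of Python. This shows a single forward pass for illustration; SEPHYDRO's implementations, parameter choices, and end-of-series handling may differ:

```python
def lyne_hollick(q, alpha=0.925):
    """One-parameter recursive digital filter (Lyne-Hollick type).

    Splits a streamflow series `q` into (baseflow, quickflow).  A single
    forward pass is shown; in practice multiple forward/backward passes
    are commonly applied.  Sketch only, not SEPHYDRO's implementation.
    """
    quick = [0.0] * len(q)
    for k in range(1, len(q)):
        f = alpha * quick[k - 1] + 0.5 * (1 + alpha) * (q[k] - q[k - 1])
        # Constrain quickflow so baseflow stays within [0, q[k]].
        quick[k] = min(max(f, 0.0), q[k])
    base = [qk - fk for qk, fk in zip(q, quick)]
    return base, quick

# A small synthetic hydrograph with one storm peak:
base, quick = lyne_hollick([1.0, 1.0, 5.0, 3.0, 1.0])
```

The filter parameter alpha controls how much of a flow rise is attributed to quickflow; a value near 0.925 is a common starting point for daily data.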

  9. CrossQuery: a web tool for easy associative querying of transcriptome data.

    Directory of Open Access Journals (Sweden)

    Toni U Wagner

    Full Text Available Enormous amounts of data are being generated by modern methods such as transcriptome or exome sequencing and microarray profiling. Primary analyses such as quality control, normalization, statistics and mapping are highly complex and need to be performed by specialists. Thereafter, results are handed back to biomedical researchers, who are then confronted with complicated data lists. For rather simple tasks like data filtering, sorting and cross-association, there is a need for new tools which can be used by non-specialists. Here, we describe CrossQuery, a web tool that enables straightforward, simple-syntax queries to be executed on transcriptome sequencing and microarray datasets. We provide deep-sequencing data sets of stem cell lines derived from the model fish Medaka and microarray data of human endothelial cells. In the example datasets provided, mRNA expression levels, gene, transcript and sample identification numbers, GO-terms and gene descriptions can be freely correlated, filtered and sorted. Queries can be saved for later reuse, and results can be exported to standard formats that allow copy-and-paste to all widespread data visualization tools such as Microsoft Excel. CrossQuery enables researchers to quickly and freely work with transcriptome and microarray data sets, requiring only minimal computer skills. Furthermore, CrossQuery allows growing association of multiple datasets as long as at least one common point of correlated information, such as transcript identification numbers or GO-terms, is shared between samples. For advanced users, the object-oriented plug-in and event-driven code design of both server-side and client-side scripts allow easy addition of new features, data sources and data types.
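    The filter/sort/cross-associate workflow described above can be sketched for a list of annotated transcript records in plain Python. This is a hypothetical illustration of the concept, not CrossQuery's actual query syntax or data model:

```python
def query(records, predicate=None, sort_by=None, reverse=False):
    """Filter and sort a list of record dicts, CrossQuery-style.

    `predicate` is a function over one record; `sort_by` names a field.
    Hypothetical sketch of the filter/sort/cross-associate workflow,
    not CrossQuery's actual query syntax.
    """
    hits = [r for r in records if predicate is None or predicate(r)]
    if sort_by is not None:
        hits.sort(key=lambda r: r[sort_by], reverse=reverse)
    return hits

records = [
    {"transcript": "T1", "expression": 12.0, "go": "GO:0008150"},
    {"transcript": "T2", "expression": 3.5,  "go": "GO:0003674"},
    {"transcript": "T3", "expression": 20.1, "go": "GO:0008150"},
]
# All GO:0008150 transcripts, highest expression first (T3, then T1):
top = query(records, lambda r: r["go"] == "GO:0008150",
            sort_by="expression", reverse=True)
```

Cross-association of multiple datasets then amounts to joining such record lists on a shared field, for example transcript identifiers or GO-terms.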

  10. CrossQuery: a web tool for easy associative querying of transcriptome data.

    Science.gov (United States)

    Wagner, Toni U; Fischer, Andreas; Thoma, Eva C; Schartl, Manfred

    2011-01-01

    Enormous amounts of data are being generated by modern methods such as transcriptome or exome sequencing and microarray profiling. Primary analyses such as quality control, normalization, statistics and mapping are highly complex and need to be performed by specialists. Thereafter, results are handed back to biomedical researchers, who are then confronted with complicated data lists. For rather simple tasks like data filtering, sorting and cross-association there is a need for new tools which can be used by non-specialists. Here, we describe CrossQuery, a web tool that enables straightforward, simple-syntax queries to be executed on transcriptome sequencing and microarray datasets. We provide deep-sequencing data sets of stem cell lines derived from the model fish Medaka and microarray data of human endothelial cells. In the example datasets provided, mRNA expression levels, gene, transcript and sample identification numbers, GO-terms and gene descriptions can be freely correlated, filtered and sorted. Queries can be saved for later reuse and results can be exported to standard formats that allow copy-and-paste to all widespread data visualization tools such as Microsoft Excel. CrossQuery enables researchers to quickly and freely work with transcriptome and microarray data sets requiring only minimal computer skills. Furthermore, CrossQuery allows growing association of multiple datasets as long as at least one common point of correlated information, such as transcript identification numbers or GO-terms, is shared between samples. For advanced users, the object-oriented plug-in and event-driven code design of both server-side and client-side scripts allow easy addition of new features, data sources and data types.
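The kind of associative query CrossQuery exposes (correlate two datasets on a shared key, then filter and sort) can be sketched in a few lines of pandas; the table and column names below are hypothetical and do not reproduce CrossQuery's schema:

```python
import pandas as pd

# Hypothetical stand-ins for a transcriptome dataset and a GO-term annotation.
expr = pd.DataFrame({
    "transcript_id": ["t1", "t2", "t3"],
    "expression":    [120.0, 3.5, 48.0],
})
go = pd.DataFrame({
    "transcript_id": ["t1", "t2", "t3"],
    "go_term":       ["GO:0008283", "GO:0006915", "GO:0008283"],
})

# Associate the two datasets on their shared key, then filter and sort --
# the kind of simple query CrossQuery exposes through its web syntax.
merged = expr.merge(go, on="transcript_id")
result = (merged[merged["go_term"] == "GO:0008283"]
          .sort_values("expression", ascending=False))
print(result["transcript_id"].tolist())   # -> ['t1', 't3']
```

The merged table can then be exported (e.g. `result.to_csv(...)`) for pasting into spreadsheet or visualization tools, mirroring the export path the record describes.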

  11. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma ray sources; (3) accuracy of the current and future γ-ray energy scale, and (4) new solid state X and γ-ray detectors

  12. Web-based communication tools in a European research project: the example of the TRACE project

    Directory of Open Access Journals (Sweden)

    Baeten V.

    2009-01-01

    Full Text Available The multi-disciplinary and international nature of large European projects requires powerful managerial and communicative tools to ensure the transmission of information to the end-users. One such project is TRACE, entitled “Tracing Food Commodities in Europe”. One of its objectives is to provide a communication system intended to be the central source of information on food authenticity and traceability in Europe. This paper explores the web tools and communication vehicles offered to scientists involved in the TRACE project to communicate internally as well as with the public. Two main tools have been built: an intranet and a public website. The TRACE website can be accessed at http://www.trace.eu.org. Particular emphasis was placed on the efficiency, relevance and accessibility of the information, the publicity of the website, and the use of the collaborative utilities. The rationale of the web space design as well as the integration of proprietary software solutions are presented. Perspectives on the use of web tools in research projects are discussed.

  13. Benefits, Challenges and Tools of Big Data Management

    Directory of Open Access Journals (Sweden)

    Fernando L. F. Almeida

    2017-10-01

    Full Text Available Big Data is one of the most prominent fields of knowledge and research, and it has had a strong impact on the digital transformation of organizations in recent years. Big Data's main goal is to improve work processes through the analysis and interpretation of large amounts of data. Knowing how Big Data works, along with its benefits, challenges and tools, is essential for business success. Our study performs a systematic review of the Big Data field adopting a mind-map approach, which allows us to easily and visually identify its main elements and dependencies. The findings identified and mapped a total of 12 main branches of benefits, challenges and tools, and a total of 52 sub-branches across the main areas of the model.

  14. High-tech hammer : BBJ Tools transforms the traditional fluid hammer into a revolutionary drilling tool

    Energy Technology Data Exchange (ETDEWEB)

    Byfield, M.

    2010-12-15

    This article described BBJ Tools' patent-pending fluid hammer that enhances drilling rate of penetration. The technology won the 2010 award for best drilling technology for a company with fewer than 100 employees. The fluid hammer features several improvements in terms of maintaining drill-bit integrity, steering ability, and operating flexibility. The hammer incorporates a positive displacement motor and adjustable housing that uniquely allow the driller to steer the drill bit. The fluid hammer works with both polycrystalline diamond compact bits and roller cones. The unique weight-to-bit-transfer design allows the operator to have diversified percussion control. More weight on the bit results in more force, and hammering stops when weight is taken off the bit. The major components of the mud motor are incorporated into the fluid hammer, allowing the tool to compete in every application in which a mud motor is used. The percussion mechanism transmits left-hand reactive torque to the housing. The rate of penetration is substantially better than that of similar tools on the market. 2 figs.

  15. Tools for adequacy of research lines of research institutes: the example of IPEN

    International Nuclear Information System (INIS)

    Sacramento, Jose Miguel Noronha

    2011-01-01

    This work aims to assist research institutes, notably IPEN, in improving their assertiveness in the process of defining their lines of research. The pace of change has increased exponentially, requiring greater synchronism and coordinated action from the three fundamental elements that assure the development of contemporary society: government, the productive structure, and the science and technology infrastructure. This increasingly dynamic and changing environment demands greater proximity to the socioeconomic milieu, now that the former client-consumer has become a co-creator of knowledge and a supplier of energy within a new pattern of social relations, the Networked Society. The different time scales of the university, the productive structure and government are a function of their main activities: science, the market, and the winning of public opinion, respectively. Harmonizing and finding synergies among these three dimensions is the contemporary challenge for those who seek to innovate and advance knowledge in order to improve society's standard of living. This work argues that research institutes should heed the words of Robert Plomin and start connecting to the several links of different chains, making use of a collective intelligence that expands continuously, with greater speed and quality than at any other time in human history. Comparing the results obtained from the different analysis methodologies proposed in this work reveals the strengths, weaknesses, threats and opportunities of IPEN, providing input for finding better ways to tailor its performance to the new demands. (author)

  16. Handbook of Research on Science Education and University Outreach as a Tool for Regional Development

    Science.gov (United States)

    Narasimharao, B. Pandu, Ed.; Wright, Elizabeth, Ed.; Prasad, Shashidhara, Ed.; Joshi, Meghana, Ed.

    2017-01-01

    Higher education institutions play a vital role in their surrounding communities. Besides providing a space for enhanced learning opportunities, universities can utilize their resources for social and economic interests. The "Handbook of Research on Science Education and University Outreach as a Tool for Regional Development" is a…

  17. Evaluation of Crew-Centric Onboard Mission Operations Planning and Execution Tool: Year 2

    Science.gov (United States)

    Hillenius, S.; Marquez, J.; Korth, D.; Rosenbaum, M.; Deliz, Ivy; Kanefsky, Bob; Zheng, Jimin

    2018-01-01

    could effectively self-schedule. In parallel, we added new features and functionality to the Playbook tool based on our insights from crew self-scheduling in the NASA analogs. In particular, this year we added the ability for the crew to add, edit, and remove their own activities in Playbook, expanding the type of planning and re-planning possible in the tool and opening up more free-form plan creation. The ability to group and manipulate groups of activities from the plan task list was also added, allowing crew members to add predefined sets of activities to their mission timeline. We also added a way for crew members to roll back changes to their plan, providing an undo-like capability. These features expand and complement the initial self-scheduling features added in year one, with the goal of making crew autonomous planning more efficient. As part of this work we also finished developing the first version of our Playbook Data Analysis Tool, a research tool built to interpret and analyze the data unobtrusively collected through Playbook during the NASA analog missions. This data, which includes user click interactions as well as plan change information, allows us to play back a session through the Playbook Data Analysis Tool as if a video camera had been mounted over the crew member's tablet. While the primary purpose of this tool is to allow usability analysis of the crew self-scheduling sessions used in the NASA analogs, the structured nature of the collected data lets the tool automatically derive metrics that would traditionally be tedious to obtain without manual analysis of video playback. We will demonstrate and discuss the ability for future derived metrics to be added to the tool. In addition to the data and results gathered in year two, we will also discuss the preparation and goals of our International Space Station (ISS) onboard technology demonstration with Playbook. This

  18. Informatization of radiological protection:new tools of information dissemination and sharing knowledge

    International Nuclear Information System (INIS)

    Levy, Denise S.; Sordi, Gian Maria A.A.

    2013-01-01

    This project aims to computerize the radiological protection optimization programs in a single system, offering unified programs and inter-related information in Portuguese and providing Brazilian radioactive facilities with a complete repository for research, consultation and information. To meet both national and international recommendations within the scope of this work, we carried out a comprehensive assessment of each program's contents and real dimensions, identifying and detailing the vital parts of the programs. The content includes concepts, definitions and theory in addition to the optimization programs, decision-aiding techniques, and information related to protection costs, radiation doses and detriment. It allows every question arising during the elaboration of an optimization program to be answered according to the decision maker's specific situation. To dimension the computerization work and develop the web platform according to the profile of the target public, we conducted extensive research on access to information and communication technology in companies throughout the country, which allowed us to define the best interface tools and resources. The processing power of the servers, added to relational database technology, allows information from different sources to be integrated, enabling complex queries with reduced response times. The project was implemented in a web environment using Web 2.0 tools and resources, supporting the entire organizational structure and the inter-relationships needed for the proper use of information technology in radiological protection. This project combines multiple technologies, maximizing the resources available in each of them in order to achieve our goals. Investigation of the usage profile over five months yielded important data that suggest new possibilities for the development of computerization of

  19. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    International Nuclear Information System (INIS)

    Messroghli, Daniel R; Rudolph, Andre; Abdel-Aty, Hassan; Wassmuth, Ralf; Kühne, Titus; Dietz, Rainer; Schulz-Menger, Jeanette

    2010-01-01

    In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet

  20. An open-source software tool for the generation of relaxation time maps in magnetic resonance imaging

    Directory of Open Access Journals (Sweden)

    Kühne Titus

    2010-07-01

    Full Text Available Abstract Background In magnetic resonance (MR) imaging, T1, T2 and T2* relaxation times represent characteristic tissue properties that can be quantified with the help of specific imaging strategies. While there are basic software tools for specific pulse sequences, until now there is no universal software program available to automate pixel-wise mapping of relaxation times from various types of images or MR systems. Such a software program would allow researchers to test and compare new imaging strategies and thus would significantly facilitate research in the area of quantitative tissue characterization. Results After defining requirements for a universal MR mapping tool, a software program named MRmap was created using a high-level graphics language. Additional features include a manual registration tool for source images with motion artifacts and a tabular DICOM viewer to examine pulse sequence parameters. MRmap was successfully tested on three different computer platforms with image data from three different MR system manufacturers and five different sorts of pulse sequences: multi-image inversion recovery T1; Look-Locker/TOMROP T1; modified Look-Locker (MOLLI) T1; single-echo T2/T2*; and multi-echo T2/T2*. Computing times varied between 2 and 113 seconds. Estimates of relaxation times compared favorably to those obtained from non-automated curve fitting. Completed maps were exported in DICOM format and could be read in standard software packages used for analysis of clinical and research MR data. Conclusions MRmap is a flexible cross-platform research tool that enables accurate mapping of relaxation times from various pulse sequences. The software allows researchers to optimize quantitative MR strategies in a manufacturer-independent fashion. The program and its source code were made available as open-source software on the internet.
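The pixel-wise mono-exponential fitting that tools such as MRmap automate can be sketched as a log-linear least-squares fit over the echo dimension. This generic example (not MRmap's implementation) maps T2 or T2* from a stack of multi-echo images:

```python
import numpy as np

def t2_map(images, echo_times):
    """Pixel-wise mono-exponential T2 (or T2*) fit, S = S0 * exp(-TE/T2),
    done as a log-linear least-squares fit over the echo dimension.

    images: array of shape (n_echoes, ny, nx); echo_times: (n_echoes,) in ms.
    Returns a (ny, nx) map of T2 in the same units as echo_times.
    """
    te = np.asarray(echo_times, dtype=float)
    y = np.log(np.maximum(images, 1e-12)).reshape(len(te), -1)  # avoid log(0)
    # Fit ln S = ln S0 - TE/T2 for every pixel at once.
    A = np.vstack([np.ones_like(te), -te]).T          # design matrix
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)      # coef[1] = 1/T2 per pixel
    with np.errstate(divide="ignore"):
        t2 = 1.0 / coef[1]
    return t2.reshape(images.shape[1:])
```

Real mapping tools add refinements this sketch omits (noise-floor handling, nonlinear refinement, masking of background pixels), but the vectorized log-linear fit is the core idea.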

  1. User Manual for the PROTEUS Mesh Tools

    International Nuclear Information System (INIS)

    Smith, Micheal A.; Shemon, Emily R.

    2015-01-01

    This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and the output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider are the MT_MeshToMesh.x and MT_RadialLattice.x codes. The former allows conversion between most mesh types handled by PROTEUS, while the second allows the merging of multiple (assembly) meshes into a radial structured grid. Note that the mesh generation process is recursive in nature and that each input specific to a given mesh tool (such as .axial or .merge) can be used as "mesh" input for any of the mesh tools discussed in this manual.

  2. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing.

    Science.gov (United States)

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D'Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-04-05

    International guidelines recommend the use of appropriate informed consent procedures in low-literacy research settings because written information alone is not known to guarantee comprehension of study information. This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants' comprehension of the study information was measured by using a validated digitised audio questionnaire. The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants' 'recall' and 'understanding' between the first and second visits were statistically significant (F(1,41)=25.38). Further research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings.

  3. Thermal Error Test and Intelligent Modeling Research on the Spindle of High Speed CNC Machine Tools

    Science.gov (United States)

    Luo, Zhonghui; Peng, Bin; Xiao, Qijun; Bai, Lu

    2018-03-01

    Thermal error is the main factor affecting the accuracy of precision machining. Reflecting the current research focus on machine tool thermal error, this paper presents experimental thermal error testing and intelligent modeling for the spindle of vertical high-speed CNC machine tools. Several thermal error testing devices were designed, in which 7 temperature sensors measure the temperature of the machine tool spindle system and 2 displacement sensors detect the thermal error displacement. A thermal error compensation model with good inversion prediction ability is established by applying principal component analysis, optimizing the temperature measuring points, extracting the characteristic values closely associated with the thermal error displacement, and using artificial neural network technology.
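The modeling chain described in this record (reduce many correlated temperature channels with principal component analysis, then regress thermal displacement on the retained components) can be sketched with synthetic data; a linear least-squares model stands in for the paper's neural network, and all data below are simulated, not the paper's measurements:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: 200 samples of 7 spindle temperature channels driven
# mainly by two underlying heat sources, plus the resulting displacement.
sources = rng.normal(size=(200, 2))
temps = sources @ rng.normal(size=(2, 7)) + 0.01 * rng.normal(size=(200, 7))
error = sources @ np.array([3.0, -1.5]) + 0.01 * rng.normal(size=200)

# Principal component analysis on the temperature channels via SVD.
tc = temps - temps.mean(axis=0)
_, _, vt = np.linalg.svd(tc, full_matrices=False)
scores = tc @ vt[:2].T                       # keep the 2 dominant components

# Regress the thermal displacement on the retained components
# (a linear model standing in for the artificial neural network).
X = np.column_stack([np.ones(len(scores)), scores])
w, *_ = np.linalg.lstsq(X, error, rcond=None)
pred = X @ w
r2 = 1 - np.sum((error - pred) ** 2) / np.sum((error - error.mean()) ** 2)
```

Because the two retained components span the simulated heat-source subspace, the regression recovers the displacement almost exactly; on real spindle data the nonlinear network earns its keep.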

  4. NeuroDebian Virtual Machine Deployment Facilitates Trainee-Driven Bedside Neuroimaging Research.

    Science.gov (United States)

    Cohen, Alexander; Kenney-Jung, Daniel; Botha, Hugo; Tillema, Jan-Mendelt

    2017-01-01

    Freely available software, derived from the past 2 decades of neuroimaging research, is significantly more flexible for research purposes than presently available clinical tools. Here, we describe and demonstrate the utility of rapidly deployable analysis software to facilitate trainee-driven translational neuroimaging research. A recipe and video tutorial were created to guide the creation of a NeuroDebian-based virtual computer that conforms to current neuroimaging research standards and can exist within a HIPAA-compliant system. This allows for retrieval of clinical imaging data, conversion to standard file formats, and rapid visualization and quantification of individual patients' cortical and subcortical anatomy. As an example, we apply this pipeline to a pediatric patient's data to illustrate the advantages of research-derived neuroimaging tools in asking quantitative questions "at the bedside." Our goal is to provide a path of entry for trainees to become familiar with common neuroimaging tools and foster an increased interest in translational research.

  5. Near-infrared spectroscopy (NIRS) as a new tool for neuroeconomic research

    Science.gov (United States)

    Kopton, Isabella M.; Kenning, Peter

    2014-01-01

    Over the last decade, the application of neuroscience to economic research has gained in importance and the number of neuroeconomic studies has grown extensively. The most common method for these investigations is fMRI. However, fMRI has limitations (particularly concerning situational factors) that should be countered with other methods. This review elaborates on the use of functional near-infrared spectroscopy (fNIRS) as a new and promising tool for investigating economic decision making both in field experiments and outside the laboratory. We describe results of studies investigating the reliability of prototype NIRS devices, as well as experiments using conventional, stationary fNIRS devices to analyze this potential. This review shows that further research using mobile fNIRS for studies of economic decision making outside the laboratory could be a fruitful avenue, helping to develop the potential of a new method for field experiments. PMID:25147517

  6. 42 CFR 61.8 - Benefits: Stipends; dependency allowances; travel allowances; vacation.

    Science.gov (United States)

    2010-10-01

    ... 42 Public Health 1 2010-10-01 2010-10-01 false Benefits: Stipends; dependency allowances; travel...; dependency allowances; travel allowances; vacation. Individuals awarded regular fellowships shall be entitled...) Stipend. (b) Dependency allowances. (c) When authorized in advance, separate allowances for travel. Such...

  7. Development of a journal recommendation tool based upon co-citation analysis of journals cited in Wageningen UR research articles

    NARCIS (Netherlands)

    Veller, van M.G.P.; Gerritsma, W.

    2015-01-01

    Wageningen UR Library has developed a tool based upon co-citation analysis to recommend alternative journals to researchers for a journal they look up in the tool. The journal recommendations can be tuned in such a way to include citation preferences for each of the five science groups that comprise

  8. Tool for Military Logistics Planning of Peace Support Operations: The OTAS Planning Tool

    NARCIS (Netherlands)

    Merrienboer, S.A. van

    1998-01-01

    Within the research group Operations Research Studies Army of the TNO Physics and Electronics Laboratory the OTAS planning tool is developed for the Royal Netherlands Armed Forces. This paper gives a general and brief description of the tool.

  9. A MORET tool to assist code bias estimation

    International Nuclear Information System (INIS)

    Fernex, F.; Richet, Y.; Letang, E.

    2003-01-01

    This new graphical user interface (GUI), developed in Java, is one of the post-processing tools for the MORET4 code. It aims to help users estimate the importance of the k-eff bias due to the code in order to better define the upper safety limit. Moreover, it allows visualizing the distance between an actual configuration case and evaluated critical experiments. This tool depends on a validated experiments database, on sets of physical parameters, and on various statistical tools allowing interpolation of the calculation bias over the database or display of the projections of experiments on a reduced base of parameters. The development of this tool is still in progress. (author)

  10. PMD2HD--a web tool aligning a PubMed search results page with the local German Cancer Research Centre library collection.

    Science.gov (United States)

    Bohne-Lang, Andreas; Lang, Elke; Taube, Anke

    2005-06-27

    Web-based searching is the accepted contemporary mode of retrieving relevant literature, and retrieving as many full text articles as possible is a typical prerequisite for research success. In most cases only a proportion of references will be directly accessible as digital reprints through displayed links. A large number of references, however, have to be verified in library catalogues and, depending on their availability, are accessible as print holdings or by interlibrary loan request. The problem of verifying local print holdings from an initial retrieval set of citations can be solved using Z39.50, an ANSI protocol for interactively querying library information systems. Numerous systems include Z39.50 interfaces and therefore can process Z39.50 interactive requests. However, the programmed query interaction command structure is non-intuitive and inaccessible to the average biomedical researcher. For the typical user, it is necessary to implement the protocol within a tool that hides and handles Z39.50 syntax, presenting a comfortable user interface. PMD2HD is a web tool implementing Z39.50 to provide an appropriately functional and usable interface to integrate into the typical workflow that follows an initial PubMed literature search, providing users with an immediate asset to assist in the most tedious step in literature retrieval, checking for subscription holdings against a local online catalogue. PMD2HD can facilitate literature access considerably with respect to the time and cost of manual comparisons of search results with local catalogue holdings. The example presented in this article is related to the library system and collections of the German Cancer Research Centre. However, the PMD2HD software architecture and use of common Z39.50 protocol commands allow for transfer to a broad range of scientific libraries using Z39.50-compatible library information systems.

  11. Synthetic biology in mammalian cells: Next generation research tools and therapeutics

    Science.gov (United States)

    Lienert, Florian; Lohmueller, Jason J; Garg, Abhishek; Silver, Pamela A

    2014-01-01

    Recent progress in DNA manipulation and gene circuit engineering has greatly improved our ability to programme and probe mammalian cell behaviour. These advances have led to a new generation of synthetic biology research tools and potential therapeutic applications. Programmable DNA-binding domains and RNA regulators are leading to unprecedented control of gene expression and elucidation of gene function. Rebuilding complex biological circuits such as T cell receptor signalling in isolation from their natural context has deepened our understanding of network motifs and signalling pathways. Synthetic biology is also leading to innovative therapeutic interventions based on cell-based therapies, protein drugs, vaccines and gene therapies. PMID:24434884

  12. CERR: A computational environment for radiotherapy research

    International Nuclear Information System (INIS)

    Deasy, Joseph O.; Blanco, Angel I.; Clark, Vanessa H.

    2003-01-01

    A software environment is described, called the computational environment for radiotherapy research (CERR, pronounced 'sir'). CERR partially addresses four broad needs in treatment planning research: (a) it provides a convenient and powerful software environment to develop and prototype treatment planning concepts, (b) it serves as a software integration environment to combine treatment planning software written in multiple languages (MATLAB, FORTRAN, C/C++, JAVA, etc.), together with treatment plan information (computed tomography scans, outlined structures, dose distributions, digital films, etc.), (c) it provides the ability to extract treatment plans from disparate planning systems using the widely available AAPM/RTOG archiving mechanism, and (d) it provides a convenient and powerful tool for sharing and reproducing treatment planning research results. The functional components currently being distributed, including source code, include: (1) an import program which converts the widely available AAPM/RTOG treatment planning format into a MATLAB cell-array data object, facilitating manipulation; (2) viewers which display axial, coronal, and sagittal computed tomography images, structure contours, digital films, and isodose lines or dose colorwash, (3) a suite of contouring tools to edit and/or create anatomical structures, (4) dose-volume and dose-surface histogram calculation and display tools, and (5) various predefined commands. CERR allows the user to retrieve any AAPM/RTOG key word information about the treatment plan archive. The code is relatively self-describing, because it relies on MATLAB structure field name definitions based on the AAPM/RTOG standard. New structure field names can be added dynamically or permanently. New components of arbitrary data type can be stored and accessed without disturbing system operation. 
CERR has been applied to aid research in dose-volume-outcome modeling, Monte Carlo dose calculation, and treatment planning optimization
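As one example of the plan-evaluation tools CERR provides, a cumulative dose-volume histogram reduces a 3-D dose grid and a structure mask to a curve of volume fraction versus dose level. The sketch below is generic Python, not CERR's MATLAB code:

```python
import numpy as np

def cumulative_dvh(dose, mask, bins=100):
    """Cumulative dose-volume histogram for one structure.

    dose: 3-D dose grid (Gy); mask: boolean grid marking the structure.
    Returns dose levels and the fraction of structure volume receiving
    at least each level.
    """
    d = dose[mask]
    levels = np.linspace(0.0, d.max(), bins)
    volume_frac = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_frac
```

By construction the curve starts at 1.0 (the whole structure receives at least zero dose) and decreases monotonically, which is a convenient sanity check on any DVH implementation.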

  13. A Portfolio Analysis Tool for Measuring NASAs Aeronautics Research Progress toward Planned Strategic Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. The strategic planning process for determining the community Outcomes is also briefly described. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples are also presented.

  14. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    Science.gov (United States)

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories, used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows tools, 5 web-based tools, 9 R-based packages and 5 tools for other platforms. The reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.
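
    One quantification strategy nearly all such tools implement is relative quantification by the 2^-ΔΔCt (Livak) method. The sketch below is a minimal illustration with hypothetical Ct values and function names, not code from any surveyed package:

```python
def ddct_fold_change(ct_target_sample, ct_ref_sample,
                     ct_target_control, ct_ref_control):
    """Relative expression by the 2^-ddCt (Livak) method.

    Ct values are qPCR cycle thresholds; 'ref' is a housekeeping
    (reference) gene, 'control' an untreated calibrator sample.
    """
    dct_sample = ct_target_sample - ct_ref_sample
    dct_control = ct_target_control - ct_ref_control
    ddct = dct_sample - dct_control
    return 2.0 ** (-ddct)

# Target amplifies 2 cycles earlier (relative to control) in the
# treated sample, i.e. a 4-fold up-regulation:
fold = ddct_fold_change(24.0, 18.0, 26.0, 18.0)  # → 4.0
```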

  15. RESEARCH CENTRIFUGE - AN ADVANCED SEPARATION TOOL

    OpenAIRE

    Mahajan Ashwini; Prof. B.V. Jain; Dr Surajj Sarode

    2015-01-01

    A centrifuge is a critical piece of laboratory equipment. The purpose of this study was to examine the research centrifuge in detail: its applications, its uses in different branches, and its salient features. Two types of research centrifuge are studied here: the revolutionary research centrifuge and the microprocessor research centrifuge. A centrifuge is a device that separates particles from a solution through use of a rotor. In biology, the particles are usually cells, subcellular organelles, or large mo...

  16. Longbow: A Lightweight Remote Job Submission Tool

    Directory of Open Access Journals (Sweden)

    James Gebbie-Rayet

    2016-01-01

    Full Text Available We present Longbow, a lightweight console-based remote job submission tool and library. Longbow allows the user to quickly and simply run jobs on high performance computing facilities without leaving their familiar desktop environment. Not only does Longbow greatly simplify the management of compute-intensive jobs for experienced researchers, it also lowers the technical barriers surrounding high performance computation for the next generation of scientists and engineers. Longbow has already been used to remotely submit jobs in a number of projects and has the potential to redefine the manner in which high performance computers are used.
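
    Longbow's actual interface is not reproduced here, but the core move of any such tool — wrapping a scheduler submission in a remote shell call so the user never leaves their desktop — can be sketched generically. The host name, paths, and the use of SLURM's sbatch are illustrative assumptions:

```python
import shlex

def build_submit_command(host, remote_dir, batch_script):
    """Compose (but do not run) an ssh command line that submits a
    batch script on a remote HPC machine via SLURM's sbatch."""
    remote = "cd {} && sbatch {}".format(
        shlex.quote(remote_dir), shlex.quote(batch_script))
    return ["ssh", host, remote]

cmd = build_submit_command("hpc.example.org", "/scratch/md_run", "job.sh")
# subprocess.run(cmd, check=True) would perform the actual submission.
```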

  17. PDBlocal: A web-based tool for local inspection of biological macromolecular 3D structures

    Directory of Open Access Journals (Sweden)

    Pan Wang

    2018-03-01

    Full Text Available Functional research on biological macromolecules must focus on specific local regions. PDBlocal is a web-based tool developed to overcome the limitations of traditional molecular visualization tools for three-dimensional (3D) inspection of local regions. PDBlocal provides an intuitive and easy-to-manipulate web page interface and some useful new functions. It can keep local regions flashing, display sequence text that stays dynamically consistent with the 3D structure as local manipulations are applied, use two scenes to help users inspect the same local region in different states, list all historical manipulation states in a tree structure, allow users to annotate regions of interest, and save all historical states and other data to a web server for future research. PDBlocal has met expectations and shown satisfactory performance for both expert and novice users. This tool is available at http://labsystem.scuec.edu.cn/pdblocal/.

  18. Application of the enterprise management tools Lean Six Sigma and PMBOK in developing a program of research management.

    Science.gov (United States)

    Hors, Cora; Goldberg, Anna Carla; Almeida, Ederson Haroldo Pereira de; Babio Júnior, Fernando Galan; Rizzo, Luiz Vicente

    2012-01-01

    To introduce a program for the management of scientific research in a general hospital, employing the business management tools Lean Six Sigma and PMBOK for project management in this area. The Lean Six Sigma methodology was used to improve the management of the institution's scientific research through a specific tool (DMAIC) for identification and implementation of solutions, followed by analysis of those solutions based on PMBOK practices. We present our solutions for the management of institutional research projects at the Sociedade Beneficente Israelita Brasileira Albert Einstein. The solutions were classified under four headings: people, processes, systems and organizational culture. A preliminary analysis showed these solutions to be completely or partially compliant with the processes described in the PMBOK Guide. In this post facto study, we verified that the solutions drawn from a project using the Lean Six Sigma methodology and based on PMBOK enabled the improvement of our processes for managing the scientific research carried out in the institution, and constitute a model that can contribute to the search for innovative science management solutions by other institutions dealing with scientific research in Brazil.

  19. Automated riverine landscape characterization: GIS-based tools for watershed-scale research, assessment, and management.

    Science.gov (United States)

    Williams, Bradley S; D'Amico, Ellen; Kastens, Jude H; Thorp, James H; Flotemersch, Joseph E; Thoms, Martin C

    2013-09-01

    River systems consist of hydrogeomorphic patches (HPs) that emerge at multiple spatiotemporal scales. Functional process zones (FPZs) are HPs that exist at the river valley scale and are important strata for framing whole-watershed research questions and management plans. Hierarchical classification procedures aid in HP identification by grouping sections of river based on their hydrogeomorphic character; however, collecting data required for such procedures with field-based methods is often impractical. We developed a set of GIS-based tools that facilitate rapid, low cost riverine landscape characterization and FPZ classification. Our tools, termed RESonate, consist of a custom toolbox designed for ESRI ArcGIS®. RESonate automatically extracts 13 hydrogeomorphic variables from readily available geospatial datasets and datasets derived from modeling procedures. An advanced 2D flood model, FLDPLN, designed for MATLAB® is used to determine valley morphology by systematically flooding river networks. When used in conjunction with other modeling procedures, RESonate and FLDPLN can assess the character of large river networks quickly and at very low costs. Here we describe tool and model functions in addition to their benefits, limitations, and applications.
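
    FLDPLN itself is a proprietary MATLAB model, but the basic operation it builds on — growing an inundated region outward from a seed cell while elevations stay below a given flood stage — can be illustrated with a toy flood-fill over a small elevation grid (all data and names below are hypothetical):

```python
from collections import deque

def flood_extent(dem, seed, stage):
    """Cells connected to `seed` whose elevation is below `stage`.
    A toy stand-in for systematic valley flooding; FLDPLN itself is
    far more sophisticated. `dem` is a 2-D list of elevations."""
    rows, cols = len(dem), len(dem[0])
    seen, queue = {seed}, deque([seed])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in seen and dem[nr][nc] < stage):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

dem = [[5, 5, 5, 5],
       [5, 1, 2, 5],
       [5, 1, 5, 5]]
wet = flood_extent(dem, (1, 1), 3.0)  # the three low valley cells
```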

  20. Enhancement of Local Climate Analysis Tool

    Science.gov (United States)

    Horsfall, F. M.; Timofeyeva, M. M.; Dutton, J.

    2012-12-01

    The National Oceanic and Atmospheric Administration (NOAA) National Weather Service (NWS) will enhance its Local Climate Analysis Tool (LCAT) to incorporate specific capabilities to meet the needs of various users, including the energy, health, and other communities. LCAT is an online interactive tool that provides quick and easy access to climate data and allows users to conduct analyses at the local level, such as time series analysis, trend analysis, compositing, and correlation and regression techniques, with others to be incorporated as needed. LCAT applies principles of artificial intelligence to connect human and computer perspectives on the application of data and scientific techniques while multiprocessing simultaneous users' tasks. Future development includes expanding the type of data currently imported by LCAT (historical data at stations and climate divisions) to gridded reanalysis and General Circulation Model (GCM) data, which are available on global grids and thus will allow climate studies to be conducted at international locations. We will describe ongoing activities to incorporate NOAA Climate Forecast System (CFS) reanalysis data (CFSR) and NOAA model output data, including output from the National Multi Model Ensemble Prediction System (NMME) and longer term projection models, as well as plans to integrate LCAT into the Earth System Grid Federation (ESGF) and its protocols for accessing model output and observational data, to ensure there is no redundancy in the development of tools that facilitate scientific advancements and the use of climate model information in applications. Validation and inter-comparison of forecast models will be included as part of the enhancement to LCAT. To ensure sustained development, we will investigate options for open sourcing LCAT development, in particular through the University Corporation for Atmospheric Research (UCAR).
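
    The trend analysis LCAT exposes reduces, at its simplest, to fitting an ordinary least-squares slope to a local climate record. A minimal sketch with hypothetical station temperatures:

```python
def linear_trend(years, values):
    """Ordinary least-squares slope (units per year), the kind of
    trend statistic a local climate analysis would report."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
    den = sum((x - mean_x) ** 2 for x in years)
    return num / den

# Hypothetical annual mean temperatures rising 0.2 degrees per year:
slope = linear_trend([2000, 2001, 2002, 2003], [14.1, 14.3, 14.5, 14.7])
```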

  1. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    Science.gov (United States)

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle by particle. Optical methods continue to provide powerful support for research into metamaterials.

  2. msBiodat analysis tool, big data analysis for high-throughput experiments.

    Science.gov (United States)

    Muñoz-Torres, Pau M; Rokć, Filip; Belužic, Robert; Grbeša, Ivana; Vugrek, Oliver

    2016-01-01

    Mass spectrometry (MS) encompasses a group of high-throughput techniques used to increase knowledge about biomolecules. These techniques produce a large amount of data, presented as a list of hundreds or thousands of proteins. Filtering those data efficiently is the first step in extracting biologically relevant information, and the filtering gains value when experimental data are merged with data obtained from public databases, resulting in an accurate list of proteins that meet the predetermined conditions. In this article we present msBiodat Analysis Tool, a web-based application designed to bring big data analysis to proteomics. With this tool, researchers can easily select the most relevant information from their MS experiments using an easy-to-use web interface. An interesting feature of msBiodat Analysis Tool is the possibility of selecting proteins by their Gene Ontology annotation using Gene ID, Ensembl or UniProt codes. The msBiodat Analysis Tool is a web-based application that gives researchers, regardless of programming experience, the advantages of efficient database querying. Its versatility and user-friendly interface make it easy to perform fast and accurate data screening using complex queries. Once the analysis is finished, the result is delivered by e-mail. msBiodat Analysis Tool is freely available at http://msbiodata.irb.hr.
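
    The GO-based selection such a tool offers can be pictured as a set intersection between each protein's annotations and the terms of interest. The sketch below uses a hand-made annotation map; msBiodat itself queries annotations from public databases, and the accessions and terms here are only illustrative:

```python
def filter_by_go(hits, go_annotations, wanted_terms):
    """Keep MS protein hits annotated with at least one wanted GO term.
    `go_annotations` maps an accession to a set of GO IDs."""
    wanted = set(wanted_terms)
    return [p for p in hits if go_annotations.get(p, set()) & wanted]

# Illustrative annotation map (accessions and GO terms for example only):
annotations = {
    "P69905": {"GO:0005344", "GO:0015671"},  # oxygen transport
    "P04406": {"GO:0006096"},                # glycolysis
}
kept = filter_by_go(["P69905", "P04406", "Q9XYZ1"],
                    annotations, ["GO:0006096"])  # → ["P04406"]
```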

  3. Computer system for identification of tool wear model in hot forging

    Directory of Open Access Journals (Sweden)

    Wilkus Marek

    2016-01-01

    Full Text Available The aim of the research was to create a methodology that will enable effective and reliable prediction of tool wear. The idea of a hybrid model, which accounts for various mechanisms of tool material deterioration, is proposed in the paper. The mechanisms considered include abrasive wear, adhesive wear, thermal fatigue, mechanical fatigue, oxidation and plastic deformation. Individual models of various complexity were used for the separate phenomena, and a strategy for combining these models into one hybrid system was developed to account for the synergy of the various mechanisms. The complex hybrid model was built on the basis of these individual models for the various wear mechanisms. The individual models range from phenomenological ones for abrasive wear to multi-scale methods for modelling micro-crack initiation and propagation that utilize virtual representations of granular microstructures. The latter have been intensively developed recently and form a potentially powerful tool that allows modelling of thermal and mechanical fatigue, accounting explicitly for the tool material microstructure.
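
    The phenomenological abrasive-wear component of such a hybrid model is typically a variant of Archard's law, h = k·p·s/H. A minimal sketch with illustrative forging numbers (not values from the paper):

```python
def archard_wear_depth(k, normal_pressure, sliding_distance, hardness):
    """Archard's law: wear depth h = k * p * s / H, a common
    phenomenological model for the abrasive component of tool wear.
    k is a dimensionless wear coefficient; p and H share units."""
    return k * normal_pressure * sliding_distance / hardness

# Hypothetical numbers: p in MPa, s in mm, H in MPa, h in mm.
h = archard_wear_depth(k=1e-4, normal_pressure=500.0,
                       sliding_distance=2000.0, hardness=5000.0)  # → 0.02
```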

  4. Facilitating Research and Learning in Petrology and Geochemistry through Classroom Applications of Remotely Operable Research Instrumentation

    Science.gov (United States)

    Ryan, J. G.

    2012-12-01

    Bringing the use of cutting-edge research tools into student classroom experiences has long been a popular educational strategy in the geosciences and other STEM disciplines. The NSF CCLI and TUES programs have funded a large number of projects that placed research-grade instrumentation at educational institutions for instructional use and use in supporting undergraduate research activities. While student and faculty response to these activities has largely been positive, a range of challenges exist related to their educational effectiveness. Many of the obstacles these approaches have faced relate to "scaling up" of research mentoring experiences (e.g., providing training and time for use for an entire classroom of students, as opposed to one or two), and to time tradeoffs associated with providing technical training for effective instrument use versus course content coverage. The biggest challenge has often been simple logistics: a single instrument, housed in a different space, is difficult to integrate effectively into instructional activities. My CCLI-funded project sought primarily to knock down the logistical obstacles to research instrument use by taking advantage of remote instrument operation technologies, which allow the in-classroom use of networked analytical tools. Remote use of electron microprobe and SEM instruments of the Florida Center for Analytical Electron Microscopy (FCAEM) in Miami, FL was integrated into two geoscience courses at USF in Tampa, FL. Remote operation permitted the development of whole-class laboratory exercises to familiarize students with the tools, their function, and their capabilities; and it allowed students to collect high-quality chemical and image data on their own prepared samples in the classroom during laboratory periods. 
These activities improve student engagement in the course, appear to improve learning of key concepts in mineralogy and petrology, and have led to students pursuing independent research projects, as

  5. A multimedia consent tool for research participants in the Gambia: a randomized controlled trial.

    Science.gov (United States)

    Afolabi, Muhammed Olanrewaju; McGrath, Nuala; D'Alessandro, Umberto; Kampmann, Beate; Imoukhuede, Egeruan B; Ravinetto, Raffaella M; Alexander, Neal; Larson, Heidi J; Chandramohan, Daniel; Bojang, Kalifa

    2015-05-01

    To assess the effectiveness of a multimedia informed consent tool for adults participating in a clinical trial in the Gambia. Adults eligible for inclusion in a malaria treatment trial (n = 311) were randomized to receive information needed for informed consent using either a multimedia tool (intervention arm) or a standard procedure (control arm). A computerized, audio questionnaire was used to assess participants' comprehension of informed consent. This was done immediately after consent had been obtained (at day 0) and at subsequent follow-up visits (days 7, 14, 21 and 28). The acceptability and ease of use of the multimedia tool were assessed in focus groups. On day 0, the median comprehension score in the intervention arm was 64% compared with 40% in the control arm (P = 0.042). The difference remained significant at all follow-up visits. Poorer comprehension was independently associated with female sex (odds ratio, OR: 0.29; 95% confidence interval, CI: 0.12-0.70) and residing in Jahaly rather than Basse province (OR: 0.33; 95% CI: 0.13-0.82). There was no significant independent association with educational level. The risk that a participant's comprehension score would drop to half of the initial value was lower in the intervention arm (hazard ratio 0.22, 95% CI: 0.16-0.31). Overall, 70% (42/60) of focus group participants from the intervention arm found the multimedia tool clear and easy to understand. A multimedia informed consent tool significantly improved comprehension and retention of consent information by research participants with low levels of literacy.

  6. Environmental Remediation Data Management Tools

    International Nuclear Information System (INIS)

    Wierowski, J. V.; Henry, L. G.; Dooley, D. A.

    2002-01-01

    Computer software tools for data management can improve site characterization, planning and execution of remediation projects. This paper discusses the use of two such products that have primarily been used within the nuclear power industry to enhance the capabilities of radiation protection department operations. Advances in digital imaging, web application development and programming technologies have made development of these tools possible. The Interactive Visual Tour System (IVTS) allows the user to easily create and maintain a comprehensive catalog containing digital pictures of the remediation site. Pictures can be cataloged in groups (termed "tours") that can be organized either chronologically or spatially. Spatial organization enables the user to "walk around" the site and view desired areas or components instantly. Each photo is linked to a map (floor plan, topographical map, elevation drawing, etc.) with graphics displaying the location on the map and any available tour/component links. Chronological organization enables the user to view the physical results of the remediation efforts over time. Local and remote management teams can view these pictures at any time and from any location. The Visual Survey Data System (VSDS) allows users to record survey and sample data directly on photos and/or maps of areas and/or components. As survey information is collected for each area, survey data trends can be reviewed for any repetitively measured location or component. All data is stored in a Quality Assurance (Q/A) records database with reference to its physical sampling point on the site as well as other information to support the final closeout report for the site. The ease of use of these web-based products has allowed nuclear power plant clients to plan outage work from their desktop and realize significant savings with respect to dose and cost.
These same tools are invaluable for remediation and decommissioning planning of any scale and for recording

  7. Tools for the functional interpretation of metabolomic experiments.

    Science.gov (United States)

    Chagoyen, Monica; Pazos, Florencio

    2013-11-01

    The so-called 'omics' approaches used in modern biology aim to massively characterize the molecular repertoires of living systems at different levels. Metabolomics is one of the latest additions to the 'omics' family and deals with the characterization of the set of metabolites in a given biological system. As metabolomic techniques become more massive and allow larger sets of metabolites to be characterized, automatic methods for analyzing these sets to obtain meaningful biological information are required. Only recently have the first tools specifically designed for this task in metabolomics appeared. They are based on approaches previously used in transcriptomics and other 'omics', such as annotation enrichment analysis. These, together with generic tools for metabolic analysis and visualization not specifically designed for metabolomics, will surely be in the toolbox of researchers performing metabolomic experiments in the near future.
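
    Annotation enrichment analysis, as borrowed from transcriptomics, boils down to a hypergeometric tail test: given N annotated metabolites of which K belong to a pathway, how surprising is it that k of the n detected metabolites fall in that pathway? A minimal sketch (all counts hypothetical):

```python
from math import comb

def enrichment_pvalue(N, K, n, k):
    """P(X >= k) for a hypergeometric draw: N metabolites total, K in
    the pathway, n in the experimental set, k of those in the pathway.
    This is the test behind annotation enrichment analysis."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 3 of 5 detected metabolites hit a 10-member pathway out of 100:
p = enrichment_pvalue(N=100, K=10, n=5, k=3)  # well below 0.01
```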

  8. Community-based participatory research and user-centered design in a diabetes medication information and decision tool.

    Science.gov (United States)

    Henderson, Vida A; Barr, Kathryn L; An, Lawrence C; Guajardo, Claudia; Newhouse, William; Mase, Rebecca; Heisler, Michele

    2013-01-01

    Together, community-based participatory research (CBPR), user-centered design (UCD), and health information technology (HIT) offer promising approaches to improve health disparities in low-resource settings. This article describes the application of CBPR and UCD principles to the development of iDecide/Decido, an interactive, tailored, web-based diabetes medication education and decision support tool delivered by community health workers (CHWs) to African American and Latino participants with diabetes in Southwest and Eastside Detroit. The decision aid is offered in English or Spanish and is delivered on an iPad in participants' homes. The overlapping principles of CBPR and UCD used to develop iDecide/Decido include a user-focused or community approach, equitable academic and community partnership in all study phases, an iterative development process that relies on input from all stakeholders, and a program experience that is specified, adapted, and implemented with the target community. Collaboration between community members, researchers, and developers is especially evident in the program's design concept, animations, pictographs, issue cards, goal setting, tailoring, and additional CHW tools. The principles of CBPR and UCD can be successfully applied in developing health information tools that are easy to use and understand, interactive, and target health disparities.

  9. IMPROVEMENT OF METHODS FOR HYDROBIOLOGICAL RESEARCH AND MODIFICATION OF STANDARD TOOLS FOR SAMPLE COLLECTION

    Directory of Open Access Journals (Sweden)

    M. M. Aligadjiev

    2015-01-01

    Full Text Available Aim. The paper discusses the improvement of hydrobiological research methods by modifying tools for collecting plankton and benthic samples. Methods. In order to improve the standard methods of hydrobiological research, we have developed tools for sampling zooplankton and the benthic environment of the Caspian Sea. Results. Long-term practice in collecting hydrobiological samples in the Caspian Sea shows that the sampling tools used to collect hydrobiological material require modernization. With the introduction of the invasive Azov and Black Sea comb jelly Mnemiopsis leidyi A. Agassiz into the Caspian Sea, there is a need to collect plankton samples without disturbing their integrity. Tools for collecting benthic fauna do not always give a complete picture of the state of benthic ecosystems because of the lack of visual site selection for sampling. Moreover, while sampling by dredge there is a probable loss of samples, especially in areas with difficult terrain. Conclusion. We propose to modify a small model of the Apstein net (applied in shallow water) to collect zooplankton samples with an upper inverted cone, which will significantly improve the catchability of the net in the Caspian Sea. The bottom sampler can be improved by installing a video camera for visual inspection of the bottom topography, and using sensors to determine the tilt of the dredge and the position of the valves of the bucket.

  10. The European source term code ESTER - basic ideas and tools for coupling of ATHLET and ESTER

    International Nuclear Information System (INIS)

    Schmidt, F.; Schuch, A.; Hinkelmann, M.

    1993-04-01

    The French software house CISI and IKE of the University of Stuttgart developed, during 1990 and 1991 in the frame of the Shared Cost Action on Reactor Safety, the informatic structure of the European Source TERm Evaluation System (ESTER). This work made tools available that allow both code development and code application in the area of severe core accident research to be unified on a European basis. The behaviour of reactor cores is determined by thermal hydraulic conditions. For the development of ESTER it was therefore important to investigate how to integrate thermal hydraulic code systems with ESTER applications. This report describes the basic ideas of ESTER and improvements of the ESTER tools in view of a possible coupling of the thermal hydraulic code system ATHLET and ESTER. Through the work performed during this project, the ESTER tools became the most modern informatic tools presently available in the area of severe accident research. A sample application is given which demonstrates the use of the new tools. (orig.) [de

  11. Integrating research tools to support the management of social-ecological systems under climate change

    Science.gov (United States)

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  12. deepTools2: a next generation web server for deep-sequencing data analysis.

    Science.gov (United States)

    Ramírez, Fidel; Ryan, Devon P; Grüning, Björn; Bhardwaj, Vivek; Kilpert, Fabian; Richter, Andreas S; Heyne, Steffen; Dündar, Friederike; Manke, Thomas

    2016-07-08

    We present an update to our Galaxy-based web server for processing and visualizing deeply sequenced data. Its core tool set, deepTools, allows users to perform complete bioinformatic workflows ranging from quality controls and normalizations of aligned reads to integrative analyses, including clustering and visualization approaches. Since we first described our deepTools Galaxy server in 2014, we have implemented new solutions for many requests from the community and our users. Here, we introduce significant enhancements and new tools to further improve data visualization and interpretation. deepTools continues to be open to all users and freely available as a web service at deeptools.ie-freiburg.mpg.de. The new deepTools2 suite can be easily deployed within any Galaxy framework via the toolshed repository, and we also provide source code for command line usage under Linux and Mac OS X. A public and documented API for access to deepTools functionality is also available. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Development of materials for the rapid manufacture of die cast tooling

    Science.gov (United States)

    Hardro, Peter Jason

    The focus of this research is to develop a material composition that can be processed by rapid prototyping (RP) in order to produce tooling for the die casting process. These rapidly produced tools are intended to be superior to tooling produced by traditional methods by offering one or more of the following advantages: reduced tooling cost, shortened tooling creation time, reduced man-hours for tool creation, increased tool life, and shortened die casting cycle time. By utilizing RP's additive build process and vast material selection, there was a prospect that die cast tooling could be produced more quickly and with superior material properties. To this end, the material properties that influence die life and cycle time were determined, and a list of materials that fulfill these "optimal" properties was highlighted. Physical testing was conducted in order to grade the processability of each of the material systems and to optimize the manufacturing process for the downselected material system. Sample specimens were produced and microscopy techniques were utilized to determine a number of physical properties of the material system. Additionally, a benchmark geometry was selected and die casting dies were produced from traditional tool materials (H13 steel) and techniques (machining) and from the newly developed materials and RP techniques (selective laser sintering (SLS) and laser engineered net shaping (LENS)). Once the tools were created, a die cast alloy was selected and a preset number of parts were shot into each tool. During tool creation, the manufacturing time and cost were closely monitored, and an economic model was developed to compare traditional tooling to RP tooling. This model allows one to determine, in the early design stages, when it is advantageous to implement RP tooling and when traditional tooling would be best.
The results of the physical testing and economic analysis has shown that RP tooling is able to achieve a number of the research objectives, namely
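
    The economic model itself is developed in the thesis; a toy version of the underlying break-even comparison — total tooling cost for a production run when a die must be replaced each time its life in shots is exhausted — might look like this (all numbers and names hypothetical):

```python
import math

def cheaper_tooling(parts_needed, trad_cost, trad_life, rp_cost, rp_life):
    """Compare total tooling cost for a run: a new die is bought each
    time the previous one reaches its life (shots per tool)."""
    trad_total = math.ceil(parts_needed / trad_life) * trad_cost
    rp_total = math.ceil(parts_needed / rp_life) * rp_cost
    return "traditional" if trad_total <= rp_total else "RP"

# Short run: cheaper, shorter-lived RP dies win despite replacements.
choice = cheaper_tooling(parts_needed=30000,
                         trad_cost=40000, trad_life=100000,
                         rp_cost=12000, rp_life=20000)  # → "RP"
```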

  14. PumpKin: A tool to find principal pathways in plasma chemical models

    Science.gov (United States)

    Markosyan, A. H.; Luque, A.; Gordillo-Vázquez, F. J.; Ebert, U.

    2014-10-01

    PumpKin is a software package to find all principal pathways, i.e. the dominant reaction sequences, in chemical reaction systems. Although many tools are available to integrate numerically arbitrarily complex chemical reaction systems, few tools exist in order to analyze the results and interpret them in relatively simple terms. In particular, due to the large disparity in the lifetimes of the interacting components, it is often useful to group reactions into pathways that recycle the fastest species. This allows a researcher to focus on the slow chemical dynamics, eliminating the shortest timescales. Based on the algorithm described by Lehmann (2004), PumpKin automates the process of finding such pathways, allowing the user to analyze complex kinetics and to understand the consumption and production of a certain species of interest. We designed PumpKin with an emphasis on plasma chemical systems but it can also be applied to atmospheric modeling and to industrial applications such as plasma medicine and plasma-assisted combustion.
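
    The elementary step in Lehmann's pathway-construction algorithm is merging two reactions so that a fast intermediate cancels out, leaving the net pathway. A minimal sketch using signed stoichiometries; the ozone example and function name are illustrative, not PumpKin's API:

```python
def merge_reactions(r1, r2, intermediate):
    """Sum two reactions (species -> signed stoichiometric coefficient)
    and check that the chosen intermediate cancels, returning the net
    pathway with zero-coefficient species dropped."""
    net = {}
    for reaction in (r1, r2):
        for species, coeff in reaction.items():
            net[species] = net.get(species, 0) + coeff
    assert net.get(intermediate, 0) == 0, "intermediate must cancel"
    return {s: c for s, c in net.items() if c != 0}

# Ozone formation: O2 photolysis followed by twice O + O2 -> O3.
r1 = {"O2": -1, "O": 2}             # O2 -> 2 O
r2 = {"O": -2, "O2": -2, "O3": 2}   # 2 x (O + O2 -> O3)
net = merge_reactions(r1, r2, "O")  # → {"O2": -3, "O3": 2}, i.e. 3 O2 -> 2 O3
```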

  15. Using the Nine Common Themes of Good Practice checklist as a tool for evaluating the research priority setting process of a provincial research and program evaluation program.

    Science.gov (United States)

    Mador, Rebecca L; Kornas, Kathy; Simard, Anne; Haroun, Vinita

    2016-03-23

    Given the context-specific nature of health research prioritization and the obligation to effectively allocate resources to initiatives that will achieve the greatest impact, evaluation of priority setting processes can refine and strengthen such exercises and their outcomes. However, guidance is needed on evaluation tools that can be applied to research priority setting. This paper describes the adaptation and application of a conceptual framework to evaluate a research priority setting exercise operating within the public health sector in Ontario, Canada. The Nine Common Themes of Good Practice checklist, described by Viergever et al. (Health Res Policy Syst 8:36, 2010) was used as the conceptual framework to evaluate the research priority setting process developed for the Locally Driven Collaborative Projects (LDCP) program in Ontario, Canada. Multiple data sources were used to inform the evaluation, including a review of selected priority setting approaches, surveys with priority setting participants, document review, and consultation with the program advisory committee. The evaluation assisted in identifying improvements to six elements of the LDCP priority setting process. The modifications were aimed at improving inclusiveness, information gathering practices, planning for project implementation, and evaluation. In addition, the findings identified that the timing of priority setting activities and the level of control over the process were key factors that influenced the ability to effectively implement changes. The findings demonstrate the novel adaptation and application of the 'Nine Common Themes of Good Practice checklist' as a tool for evaluating a research priority setting exercise. The tool can guide the development of evaluation questions and enables the assessment of key constructs related to the design and delivery of a research priority setting process.

  16. [Confusing the confused: thoughts on impact factor, h(irsch) index, Q value, and other cofactors that influence the researcher's happiness].

    Science.gov (United States)

    Quindós, Guillermo

    2009-06-30

    The need to evaluate curricula for sponsorship of research projects or for professional promotion has led to the search for tools that allow an objective valuation. However, the total number of papers published, the number of citations of a particular author's articles, or the impact factor of the journal where they are published are inadequate indicators for evaluating the quality and productivity of researchers. The h index, proposed by Hirsch, categorises papers according to the number of citations per article. This tool appears to lack the limitations of other bibliometric tools but is less useful for non-English-speaking authors. To propose and debate the usefulness of the existing bibliometric indicators and tools for the evaluation and categorization of researchers and scientific journals. Search for papers on bibliometric tools. There are some hot spots in the debate on the national and international evaluation of researchers' productivity and the quality of scientific journals. Opinions on impact factors and the h index have been discussed. Positive discrimination, using the Q value, is proposed as an alternative for the evaluation of Spanish and Iberoamerican researchers. It is very important to demystify the importance of bibliometric indicators. The impact factor is useful for evaluating journals from the same scientific area but not for the evaluation of researchers' curricula. For the comparison of curricula of two or more researchers, we must use the h index or the proposed Q value. The latter allows positive discrimination of the work of Spanish and Iberoamerican researchers.
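    The h index discussed above has a simple operational definition: the largest h such that the author has h papers with at least h citations each. A minimal sketch (Hirsch's definition; the function name and example counts are illustrative):

```python
def h_index(citations):
    """Return the largest h such that the author has h papers with at
    least h citations each (Hirsch, 2005)."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h
```

For example, citation counts [10, 8, 5, 4, 3] give h = 4: four papers have at least four citations each, but not five papers with five.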

  17. Learning motion concepts using real-time microcomputer-based laboratory tools

    Science.gov (United States)

    Thornton, Ronald K.; Sokoloff, David R.

    1990-09-01

    Microcomputer-based laboratory (MBL) tools have been developed which interface to Apple II and Macintosh computers. Students use these tools to collect physical data that are graphed in real time and then can be manipulated and analyzed. The MBL tools have made possible discovery-based laboratory curricula that embody results from educational research. These curricula allow students to take an active role in their learning and encourage them to construct physical knowledge from observation of the physical world. The curricula encourage collaborative learning by taking advantage of the fact that MBL tools present data in an immediately understandable graphical form. This article describes one of the tools—the motion detector (hardware and software)—and the kinematics curriculum. The effectiveness of this curriculum compared to traditional college and university methods for helping students learn basic kinematics concepts has been evaluated by pre- and post-testing and by observation. There is strong evidence for significantly improved learning and retention by students who used the MBL materials, compared to those taught in lecture.

  18. MoManI: a tool to facilitate research, analysis, and teaching of computer models

    Science.gov (United States)

    Howells, Mark; Pelakauskas, Martynas; Almulla, Youssef; Tkaczyk, Alan H.; Zepeda, Eduardo

    2017-04-01

    Allocating limited resources efficiently is the task to which good planning and policy design aspire, and it may be non-trivial. For example, the seventh Sustainable Development Goal (SDG) of Agenda 2030 is to provide access to affordable, sustainable energy for all. On the one hand, energy is required to realise almost all other SDGs (a clinic requires electricity for fridges to store vaccines for maternal health, irrigated agriculture requires energy to pump water to crops in dry periods, etc.). On the other hand, the energy system is non-trivial: it requires mapping resources, their conversion into useable energy, and the machines we use to meet our needs. This requires new tools that draw on standard techniques and best-in-class models while allowing the analyst to develop new models. Thus we present the Model Management Infrastructure (MoManI). MoManI is used to develop, manage, run, and store input and results data for linear programming models. MoManI is a browser-based, open source interface for systems modelling, available to various user audiences, from policy makers and planners through to academics. For example, we implement the Open Source energy Modelling System (OSeMOSYS) in MoManI. OSeMOSYS is a specialized energy model generator. A typical OSeMOSYS model represents the current energy system of a country, region or city; in it, equations and constraints are specified and calibrated to a base year. From that, future technologies and policy options are represented; from those, scenarios are designed and run. The efficient allocation of energy resources and of expenditure on technology is calculated. Finally, results are visualized. At present this is done in relatively rigid interfaces or via (for some) cumbersome text files. Implementing and operating OSeMOSYS in MoManI shortens the learning curve and reduces the phobia associated with the complexity of computer modelling, thereby supporting effective capacity building activities. The novel
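    The allocation problem OSeMOSYS solves is, at its core, a cost-minimizing linear program. For a single period with linear variable costs and simple capacity limits, the LP optimum reduces to merit-order dispatch, which can be sketched without a solver. The technology names, costs, and capacities below are hypothetical; a real OSeMOSYS model adds many periods, constraints, and investment decisions.

```python
def dispatch(demand, technologies):
    """Least-cost dispatch: meet `demand` from technologies given as
    (name, variable_cost, capacity) tuples. With linear costs and one
    demand constraint, the LP optimum is merit order: cheapest first."""
    plan, remaining = {}, demand
    for name, cost, capacity in sorted(technologies, key=lambda t: t[1]):
        use = min(capacity, remaining)  # run this plant as hard as needed
        if use > 0:
            plan[name] = use
            remaining -= use
    if remaining > 1e-9:
        raise ValueError("demand exceeds total capacity")
    return plan
```

For instance, a demand of 120 met from solar (cost 5, cap 50), hydro (cost 10, cap 40) and gas (cost 30, cap 100) fills solar and hydro fully and covers the rest with gas.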

  19. Long range manipulator development and experiments with dismantling tools

    International Nuclear Information System (INIS)

    Mueller, K.

    1993-01-01

    An existing handling system (EMIR) was used as a carrier system for various tools for concrete dismantling and radiation protection monitoring. It combined the advantages of long reach and high payload with highly dexterous kinematics. This system was enhanced mechanically to allow the use of different tools. Tool attachment devices for automatic tool exchange were investigated, as well as interfaces (electric, hydraulic, compressed air, cooling water and signals). The control system was improved with regard to accuracy and sensor data processing. Programmable logic controller functions for tool control were incorporated. A free field mockup of the EMIR was built that allowed close simulation of dismantling scenarios without radioactive inventory. Aged concrete was provided for the integration tests. The development schedule included the basic concept investigation; the development of tools and sensors; the EMIR hardware enhancement, including a tool exchange; the adaptation of tools and mockup; and the final evaluation of the system during experiments

  20. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Full Text Available Modern Google Analytics tools were investigated with respect to identifying effective user acquisition channels and detecting bottlenecks. The investigation allowed a method for the effective use of Google Analytics tools to be proposed. The method is based on the analysis of the main traffic indicators, as well as on a deep analysis of goals and their successive tuning. The method makes it possible to increase website conversion and may be useful for SEO and web analytics specialists

  1. New generation of Sour Service Drill Pipe allows addressing highly sour field challenges

    Directory of Open Access Journals (Sweden)

    Thomazic A.

    2013-11-01

    Full Text Available Drill pipes are commonly produced by assembling pipe and tool joints through friction welding. The weld resulting from this process presents some challenges for preserving corrosion resistance, due to metallurgical factors such as a heterogeneous microstructure, different chemical compositions of the tool joint and the pipe body, and heterogeneous mechanical properties close to the weld line. Hence a new drill pipe configuration has been developed, including a modified chemical composition and a modified manufacturing process. These modifications allow for improved mechanical performance and corrosion resistance in the welded zone.

  2. The t-test: An Influential Inferential Tool in Chaplaincy and Other Healthcare Research.

    Science.gov (United States)

    Jankowski, Katherine R B; Flannelly, Kevin J; Flannelly, Laura T

    2018-01-01

    The t-test developed by William S. Gosset (also known as Student's t-test and the two-sample t-test) is commonly used to compare one sample mean on a measure with another sample mean on the same measure. The outcome of the t-test is used to draw inferences about how different the samples are from each other. It is probably one of the most frequently relied-upon statistics in inferential research. It is easy to use: a researcher can calculate the statistic with nothing more than paper, a pen, and a calculator. A computer program can quickly calculate the t-test for large samples. This ease of use can result in the misuse of the t-test. This article discusses the development of the original t-test, basic principles of the t-test, two additional types of t-tests (the one-sample t-test and the paired t-test), and recommendations about what to consider when using the t-test to draw inferences in research.
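    The two-sample computation the abstract refers to is short enough to sketch directly. This is the classic pooled-variance (equal-variances) Student's t statistic; the function name and sample data are illustrative.

```python
from statistics import mean, variance

def students_t(sample_a, sample_b):
    """Two-sample Student's t statistic with pooled variance, assuming
    equal population variances. Returns (t, degrees of freedom), where
    df = n_a + n_b - 2."""
    na, nb = len(sample_a), len(sample_b)
    va, vb = variance(sample_a), variance(sample_b)  # sample variances (n-1)
    pooled = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)
    t = (mean(sample_a) - mean(sample_b)) / (pooled * (1 / na + 1 / nb)) ** 0.5
    return t, na + nb - 2
```

For samples [1, 2, 3, 4, 5] and [3, 4, 5, 6, 7] (means 3 and 5, each with sample variance 2.5), the pooled variance is 2.5, the standard error is 1, and t = -2.0 on 8 degrees of freedom.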

  3. Exploring the Types of SMEs Which Could use Blogs as a Marketing Tool: a Proposed Future Research Agenda

    OpenAIRE

    Adeline Phaik Harn Chua; Kenneth R. Deans; Craig M. Parker

    2009-01-01

    Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, most significantly concerning the types of small and medium enterprises (SMEs) for which blogs might have potential as a marketing tool. In an attempt to...

  4. Development of a Systems Engineering Competency Model Tool for the Aviation and Missile Research, Development, and Engineering Center (AMRDEC)

    Science.gov (United States)

    2017-06-01

    grade level (GS-7 to GS-15). This foundational model is structured to support the individual needs of any Department of Defense organization and is... organizational level with traceability to the approved OPM competencies. The Redstone SECCM Tool will allow documentation of system engineering competencies and...assessment of individual and organizational development and training needs. This report documents the requirements analysis, system design, and system

  5. Open source tools for fluorescent imaging.

    Science.gov (United States)

    Hamilton, Nicholas A

    2012-01-01

    As microscopy becomes increasingly automated and imaging expands in the spatial and time dimensions, quantitative analysis tools for fluorescent imaging are becoming critical both to remove bottlenecks in throughput and to fully extract and exploit the information contained in the imaging. In recent years there has been a flurry of activity in the development of bio-image analysis tools and methods, with the result that there are now many high-quality, well-documented, and well-supported open source bio-image analysis projects with large user bases that cover essentially every aspect from image capture to publication. These open source solutions are now providing a viable alternative to commercial solutions. More importantly, they are forming an interoperable and interconnected network of tools that allow data and analysis methods to be shared between many of the major projects. Just as researchers build on, transmit, and verify knowledge through publication, open source analysis methods and software are creating a foundation that can be built upon, transmitted, and verified. Here we describe many of the major projects, their capabilities, and features. We also give an overview of the current state of open source software for fluorescent microscopy analysis and the many reasons to use and develop open source methods. Copyright © 2012 Elsevier Inc. All rights reserved.

  6. A tool for exploring space-time patterns : an animation user research

    Directory of Open Access Journals (Sweden)

    Ogao Patrick J

    2006-08-01

    Full Text Available Abstract Background Ever since Dr. John Snow (1813–1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, the use of spatio-temporal maps has become a vital tool in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping, all critical aspects of a country's socio-economic development. In this paper, user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Results Three types of animation were used, namely: passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with the basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred

  7. A tool for exploring space-time patterns: an animation user research.

    Science.gov (United States)

    Ogao, Patrick J

    2006-08-29

    Ever since Dr. John Snow (1813-1854) used a case map to identify a water well as the source of a cholera outbreak in London in the 1800s, the use of spatio-temporal maps has become a vital tool in a wide range of disease mapping and control initiatives. The increasing use of spatio-temporal maps in these life-threatening sectors warrants that they are accurate and easy to interpret, to enable prompt decision making by health experts. Similar spatio-temporal maps are observed in urban growth and census mapping--all critical aspects of a country's socio-economic development. In this paper, user test research was carried out to determine the effectiveness of spatio-temporal maps (animation) in exploring geospatial structures encompassing disease, urban and census mapping. Three types of animation were used, namely: passive, interactive and inference-based animation, with the key differences between them being the level of interactivity and the complementary domain knowledge that each offers to the user. Passive animation maintains a view-only status; the user has no control over its contents and dynamic variables. Interactive animation provides users with the basic media player controls, navigation and orientation tools. Inference-based animation incorporates these interactive capabilities together with a complementary automated intelligent view that alerts users to interesting patterns, trends or anomalies that may be inherent in the data sets. The test focussed on the role of animation's passive and interactive capabilities in exploring space-time patterns by engaging test subjects in a thinking-aloud evaluation protocol. The test subjects were selected from a geoinformatics (map reading, interpretation and analysis abilities) background. Every test subject used each of the three types of animation, and their performance in each session was assessed. The results show that interactivity in animation is a preferred exploratory tool in identifying, interpreting and

  8. Statistical methods for the forensic analysis of striated tool marks

    Energy Technology Data Exchange (ETDEWEB)

    Hoeksema, Amy Beth [Iowa State Univ., Ames, IA (United States)

    2013-01-01

    In forensics, fingerprints can be used to uniquely identify suspects in a crime. Similarly, a tool mark left at a crime scene can be used to identify the tool that was used. However, the current practice of identifying matching tool marks involves visual inspection of marks by forensic experts which can be a very subjective process. As a result, declared matches are often successfully challenged in court, so law enforcement agencies are particularly interested in encouraging research in more objective approaches. Our analysis is based on comparisons of profilometry data, essentially depth contours of a tool mark surface taken along a linear path. In current practice, for stronger support of a match or non-match, multiple marks are made in the lab under the same conditions by the suspect tool. We propose the use of a likelihood ratio test to analyze the difference between a sample of comparisons of lab tool marks to a field tool mark, against a sample of comparisons of two lab tool marks. Chumbley et al. (2010) point out that the angle of incidence between the tool and the marked surface can have a substantial impact on the tool mark and on the effectiveness of both manual and algorithmic matching procedures. To better address this problem, we describe how the analysis can be enhanced to model the effect of tool angle and allow for angle estimation for a tool mark left at a crime scene. With sufficient development, such methods may lead to more defensible forensic analyses.
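    Comparisons of profilometry data of the kind described above ultimately reduce to a similarity score between two 1-D depth profiles. As a toy stand-in for the statistics fed into such a likelihood-ratio comparison (not the method of Chumbley et al. or the thesis itself), one can take the best Pearson correlation over a window of relative shifts:

```python
from statistics import mean

def _pearson(x, y):
    """Pearson correlation of two equal-length numeric sequences."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def max_shift_correlation(pa, pb, max_shift=5):
    """Best Pearson correlation between two depth profiles over a range
    of relative shifts, to tolerate imperfect alignment of the marks."""
    best = -1.0
    for s in range(-max_shift, max_shift + 1):
        # overlapping portions of the two profiles at relative shift s
        x = pa[max(s, 0):len(pa) + min(s, 0)]
        y = pb[max(-s, 0):len(pb) + min(-s, 0)]
        n = min(len(x), len(y))
        if n > 2:
            best = max(best, _pearson(x[:n], y[:n]))
    return best
```

A profile compared against a shifted copy of itself scores 1.0 at the aligning shift, while unrelated profiles score much lower; distributions of such scores for known matches and known non-matches are what a likelihood-ratio test would contrast.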

  9. Software Tools Streamline Project Management

    Science.gov (United States)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard, Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention, Query

  10. The Arabidopsis co-expression tool (act): a WWW-based tool and database for microarray-based gene expression analysis

    DEFF Research Database (Denmark)

    Jen, C. H.; Manfield, I. W.; Michalopoulos, D. W.

    2006-01-01

    We present a new WWW-based tool for plant gene analysis, the Arabidopsis Co-Expression Tool (act), based on a large Arabidopsis thaliana microarray data set obtained from the Nottingham Arabidopsis Stock Centre. The co-expression analysis tool allows users to identify genes whose expression... be examined using the novel clique finder tool to determine the sets of genes most likely to be regulated in a similar manner. In combination, these tools offer three levels of analysis: creation of correlation lists of co-expressed genes, refinement of these lists using two-dimensional scatter plots...

  11. Computer-Aided Modelling Methods and Tools

    DEFF Research Database (Denmark)

    Cameron, Ian; Gani, Rafiqul

    2011-01-01

    The development of models for a range of applications requires methods and tools. In many cases a reference model is required that allows the generation of application specific models that are fit for purpose. There are a range of computer aided modelling tools available that help to define the m...

  12. PHEBUS: a new powerful laser facility for I.C.F. research

    International Nuclear Information System (INIS)

    Andre, M.; Coudeville, A.; Dautray, R.

    1986-09-01

    The new laser system called Phebus, recently commissioned at the Centre d'Etudes de Limeil-Valenton, is presented, together with the wide range of diagnostic tools that can be set up in the vacuum vessel. Research directions for target fabrication are also given

  13. Creating a Collaborative Research Network for Scientists

    Science.gov (United States)

    Gunn, W.

    2012-12-01

    This abstract proposes a discussion of how professional science communication and scientific cooperation can become more efficient through the use of modern social network technology, using the example of Mendeley. Mendeley is a research workflow and collaboration tool which crowdsources real-time research trend information and semantic annotations of research papers in a central data store, thereby creating a "social research network" that is emergent from the research data added to the platform. We describe how Mendeley's model can overcome barriers for collaboration by turning research papers into social objects, making academic data publicly available via an open API, and promoting more efficient collaboration. Central to the success of Mendeley has been the creation of a tool that works for the researcher without the requirement of being part of an explicit social network. Mendeley automatically extracts metadata from research papers, and allows a researcher to annotate, tag and organize their research collection. The tool integrates with the paper writing workflow and provides advanced collaboration options, thus significantly improving researchers' productivity. By anonymously aggregating usage data, Mendeley enables the emergence of social metrics and real-time usage stats on top of the articles' abstract metadata. In this way a social network of collaborators, and people genuinely interested in content, emerges. By building this research network around the article as the social object, a social layer of direct relevance to academia emerges. As science, particularly the Earth sciences with their large shared resources, becomes more and more global, the management and coordination of research is more and more dependent on technology to support these distributed collaborations.

  14. Pressure Vessel Steel Research: Belgian Activities

    International Nuclear Information System (INIS)

    Van Walle, E.; Fabry, A.; Ait Abderrahim, H.; Chaouadi, R.; D'hondt, P.; Puzzolante, J.L.; Van de Velde, J.; Van Ransbeeck, T.; Gerard, R.

    1994-03-01

    A review of the Belgian research activities on Nuclear Reactor Pressure Vessel Steels (RPVS) and on related Neutron Dosimetry Aspects is presented. Born out of the surveillance programmes of the Belgian nuclear power plants, this research has led to the development of material saving techniques, like reconstitution and miniaturization, and to improved neutron dosimetry techniques. A physically justified RPVS fracture toughness indexation methodology, supported by micro-mechanistic modelling, is based on the elaborate use of the instrumented Charpy impact signal. Computational tools for neutron dosimetry allow the uncertainties on surveillance capsule fluences to be significantly reduced

  15. Pressure Vessel Steel Research: Belgian Activities

    Energy Technology Data Exchange (ETDEWEB)

    Van Walle, E; Fabry, A; Ait Abderrahim, H; Chaouadi, R; D'hondt, P; Puzzolante, J L; Van de Velde, J; Van Ransbeeck, T [Centre d'Etude de l'Energie Nucleaire, Mol (Belgium); Gerard, R [TRACTEBEL, Brussels (Belgium)

    1994-03-01

    A review of the Belgian research activities on Nuclear Reactor Pressure Vessel Steels (RPVS) and on related Neutron Dosimetry Aspects is presented. Born out of the surveillance programmes of the Belgian nuclear power plants, this research has led to the development of material saving techniques, like reconstitution and miniaturization, and to improved neutron dosimetry techniques. A physically justified RPVS fracture toughness indexation methodology, supported by micro-mechanistic modelling, is based on the elaborate use of the instrumented Charpy impact signal. Computational tools for neutron dosimetry allow the uncertainties on surveillance capsule fluences to be significantly reduced.

  16. GREAT: a web portal for Genome Regulatory Architecture Tools.

    Science.gov (United States)

    Bouyioukos, Costas; Bucchini, François; Elati, Mohamed; Képès, François

    2016-07-08

    GREAT (Genome REgulatory Architecture Tools) is a novel web portal for tools designed to generate user-friendly and biologically useful analysis of genome architecture and regulation. The online tools of GREAT are freely accessible and compatible with essentially any operating system which runs a modern browser. GREAT is based on the analysis of genome layout, defined as the respective positioning of co-functional genes, and its relation with chromosome architecture and gene expression. GREAT tools allow users to systematically detect regular patterns along co-functional genomic features in an automatic way consisting of three individual steps and respective interactive visualizations. In addition to the complete analysis of regularities, GREAT tools enable the use of periodicity and position information for improving the prediction of transcription factor binding sites using a multi-view machine learning approach. The outcome of this integrative approach features a multivariate analysis of the interplay between the location of a gene and its regulatory sequence. GREAT results are plotted in web interactive graphs and are available for download either as individual plots, self-contained interactive pages or as machine readable tables for downstream analysis. The GREAT portal can be reached at the following URL https://absynth.issb.genopole.fr/GREAT and each individual GREAT tool is available for downloading. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
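    Detecting a regular pattern along genomic positions, as GREAT does, can be illustrated with a simple circular statistic: wrap the positions onto a candidate period and measure how tightly their phases cluster. This mean-resultant-length score is a toy proxy introduced here for illustration, not GREAT's actual statistic, and the positions below are made up.

```python
import cmath
import math

def period_score(positions, period):
    """Mean resultant length of positions wrapped onto a candidate
    period: 1.0 means the features repeat perfectly with that spacing,
    values near 0 mean no preference for that period."""
    phases = [cmath.exp(2j * math.pi * (p % period) / period)
              for p in positions]
    # average of unit phase vectors; magnitude measures clustering
    return abs(sum(phases)) / len(positions)
```

Positions spaced exactly 100 apart score 1.0 for a candidate period of 100 and much lower for an unrelated period; scanning candidate periods this way reveals the dominant spacing.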

  17. Assessing children’s competence to consent in research by a standardized tool: a validity study

    Science.gov (United States)

    2012-01-01

    Background Currently over 50% of drugs prescribed to children have not been evaluated properly for use in their age group. One key reason why children have been excluded from clinical trials is that they are not considered able to exercise meaningful autonomy over the decision to participate. Dutch law states that competence to consent can be presumed present at the age of 12 and above; however, in pediatric practice children’s competence is not that clearly presented and the transition from assent to active consent is gradual. A gold standard for competence assessment in children does not exist. In this article we describe a study protocol on the development of a standardized tool for assessing competence to consent in research in children and adolescents. Methods/design In this study we modified the MacCAT-CR, the best evaluated competence assessment tool for adults, for use in children and adolescents. We will administer the tool prospectively to a cohort of pediatric patients from 6 to 18 years during the selection stages of ongoing clinical trials. The outcomes of the MacCAT-CR interviews will be compared to a reference standard, established by the judgments of clinical investigators, and an expert panel consisting of child psychiatrists, child psychologists and medical ethicists. The reliability, criterion-related validity and reproducibility of the tool will be determined. As MacCAT-CR is a multi-item scale consisting of 13 items, power was justified at 130–190 subjects, providing a minimum of 10–15 observations per item. MacCAT-CR outcomes will be correlated with age, life experience, IQ, ethnicity, socio-economic status and competence judgment of the parent(s). It is anticipated that 160 participants will be recruited over 2 years to complete enrollment. Discussion A validity study on an assessment tool of competence to consent is strongly needed in research practice, particularly in the child and adolescent population. In this study we will establish

  18. Assessing children’s competence to consent in research by a standardized tool: a validity study

    Directory of Open Access Journals (Sweden)

    Hein Irma M

    2012-09-01

    Full Text Available Abstract Background Currently over 50% of drugs prescribed to children have not been evaluated properly for use in their age group. One key reason why children have been excluded from clinical trials is that they are not considered able to exercise meaningful autonomy over the decision to participate. Dutch law states that competence to consent can be presumed present at the age of 12 and above; however, in pediatric practice children’s competence is not that clearly presented and the transition from assent to active consent is gradual. A gold standard for competence assessment in children does not exist. In this article we describe a study protocol on the development of a standardized tool for assessing competence to consent in research in children and adolescents. Methods/design In this study we modified the MacCAT-CR, the best evaluated competence assessment tool for adults, for use in children and adolescents. We will administer the tool prospectively to a cohort of pediatric patients from 6 to 18 years during the selection stages of ongoing clinical trials. The outcomes of the MacCAT-CR interviews will be compared to a reference standard, established by the judgments of clinical investigators, and an expert panel consisting of child psychiatrists, child psychologists and medical ethicists. The reliability, criterion-related validity and reproducibility of the tool will be determined. As MacCAT-CR is a multi-item scale consisting of 13 items, power was justified at 130–190 subjects, providing a minimum of 10–15 observations per item. MacCAT-CR outcomes will be correlated with age, life experience, IQ, ethnicity, socio-economic status and competence judgment of the parent(s). It is anticipated that 160 participants will be recruited over 2 years to complete enrollment. Discussion A validity study on an assessment tool of competence to consent is strongly needed in research practice, particularly in the child and adolescent population. In

  19. Research tools application for female fashion underwear comfort assessment

    Directory of Open Access Journals (Sweden)

    Andreia Salvan Pagnan

    2016-06-01

    Full Text Available Within the universe of women's clothing, underwear long occupied an insignificant place with regard to the development of new textile materials, shapes and colors. Panties, once known as breeches or long underwear, only became a necessity around the twentieth century with the vaporous dresses of Christian Dior in the 1950s. Technological advances in the textile industry brought spandex, created by the American laboratory DuPont and better known as lycra. The elasticity of the fabric gave comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand the desires of users, a qualitative study was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared to a new one to be launched. Through the Quality Function Deployment (QFD) tool, the answers given by users were interpreted so as to prioritize targets for the development of a product based on analyses of desired characteristics, which were converted into technical attributes.

  20. Short peptides allowing preferential detection of Candida albicans hyphae.

    Science.gov (United States)

    Kaba, Hani E J; Pölderl, Antonia; Bilitewski, Ursula

    2015-09-01

    Whereas the detection of pathogens via recognition of surface structures by specific antibodies and various types of antibody mimics is frequently described, the applicability of short linear peptides as sensor molecules or diagnostic tools is less well-known. We selected peptides which were previously reported to bind to recombinant S. cerevisiae cells, expressing members of the C. albicans Agglutinin-Like-Sequence (ALS) cell wall protein family. We slightly modified amino acid sequences to evaluate peptide sequence properties influencing binding to C. albicans cells. Among the selected peptides, decamer peptides with an "AP"-N-terminus were superior to shorter peptides. The new decamer peptide FBP4 stained viable C. albicans cells more efficiently in their mature hyphal form than in their yeast form. Moreover, it allowed distinction of C. albicans from other related Candida spp. and could thus be the basis for the development of a useful tool for the diagnosis of invasive candidiasis.

  1. Human Pluripotent Stem Cell-Derived Cardiomyocytes as Research and Therapeutic Tools

    Directory of Open Access Journals (Sweden)

    Ivana Acimovic

    2014-01-01

    Full Text Available Human pluripotent stem cells (hPSCs), namely embryonic stem cells (ESCs) and induced pluripotent stem cells (iPSCs), with their ability of indefinite self-renewal and capability to differentiate into cell types derived from all three germ layers, represent a powerful research tool in developmental biology, for drug screening, disease modelling, and potentially cell replacement therapy. Efficient differentiation protocols that result in the cell type of interest are needed for maximal exploitation of these cells. In the present work, we focus on protocols for the differentiation of hPSCs into functional cardiomyocytes in vitro, as well as achievements in heart disease modelling and drug testing on patient-specific iPSC-derived cardiomyocytes (iPSC-CMs).

  2. Tools for Ephemeral Gully Erosion Process Research

    Science.gov (United States)

    Techniques to quantify ephemeral gully erosion have been identified by the USDA Natural Resources Conservation Service (NRCS) as one of the gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty of quantifying changes in channel geometry to assess...

  3. Recovery Action Mapping Tool

    Data.gov (United States)

    National Oceanic and Atmospheric Administration, Department of Commerce — The Recovery Action Mapping Tool is a web map that allows users to visually interact with and query actions that were developed to recover species listed under the...

  4. PRACTICAL APPLICATION OF QUALITY TOOLS

    Directory of Open Access Journals (Sweden)

    Duško Pavletić

    2008-09-01

    Full Text Available The paper deals with one segment of broader research into the universality and systematic application of the seven basic quality tools (7QC tools), which can be used in different areas: power plants, process industry, government, and health and tourism services. The aim of the paper was to show through practical examples that there is a real possibility of applying the 7QC tools. Furthermore, the research had to show to what extent the selected tools are in use and what the reasons for avoiding their broader application are. A simple example of successful application of the quality tools is shown for a selected company in the process industry.

  5. Collaborative Data Mining Tool for Education

    Science.gov (United States)

    Garcia, Enrique; Romero, Cristobal; Ventura, Sebastian; Gea, Miguel; de Castro, Carlos

    2009-01-01

    This paper describes a collaborative educational data mining tool based on association rule mining for the continuous improvement of e-learning courses, allowing teachers with similar course profiles to share and score the discovered information. This mining tool is oriented to be used by instructors who are not experts in data mining, such that its…
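    A minimal sketch of the support/confidence computation that underlies association rule mining of this kind; the transactions and activity names below are hypothetical, not taken from the tool described above.

```python
# Minimal association-rule sketch over hypothetical e-learning data:
# each "transaction" is the set of activities one student completed.
transactions = [
    {"quiz1", "forum", "video"},
    {"quiz1", "forum"},
    {"quiz1", "video"},
    {"forum", "video"},
]

def support(itemset):
    """Fraction of transactions containing every item in `itemset`."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Estimated P(consequent | antecedent)."""
    return support(antecedent | consequent) / support(antecedent)

# Example rule "forum -> quiz1": students who post in the forum
# also tend to attempt quiz 1.
rule_conf = confidence({"forum"}, {"quiz1"})  # 2/3
```

    A rule would typically be surfaced to instructors only when both its support and its confidence exceed chosen thresholds.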

  6. Using Social Media as a Research Recruitment Tool: Ethical Issues and Recommendations.

    Science.gov (United States)

    Gelinas, Luke; Pierce, Robin; Winkler, Sabune; Cohen, I Glenn; Lynch, Holly Fernandez; Bierer, Barbara E

    2017-03-01

    The use of social media as a recruitment tool for research with humans is increasing, and likely to continue to grow. Despite this, to date there has been no specific regulatory guidance and there has been little in the bioethics literature to guide investigators and institutional review boards (IRBs) faced with navigating the ethical issues such use raises. We begin to fill this gap by first defending a nonexceptionalist methodology for assessing social media recruitment; second, examining respect for privacy and investigator transparency as key norms governing social media recruitment; and, finally, analyzing three relatively novel aspects of social media recruitment: (i) the ethical significance of compliance with website "terms of use"; (ii) the ethics of recruiting from the online networks of research participants; and (iii) the ethical implications of online communication from and between participants. Two checklists aimed at guiding investigators and IRBs through the ethical issues are included as appendices.

  7. Personal Health and Finance Quiz: A Tool for Outreach, Research, and Evaluation

    Directory of Open Access Journals (Sweden)

    Barbara O'Neill

    2015-02-01

    Full Text Available Rutgers Cooperative Extension developed an online self-assessment tool called the Personal Health and Finance Quiz available at http://njaes.rutgers.edu/money/health-finance-quiz/. Believed to be among the first public surveys to simultaneously query users about their health and personal finance practices, the quiz is part of Small Steps to Health and Wealth™ (SSHW), a Cooperative Extension program developed to motivate Americans to take action to improve both their health and personal finances (see http://njaes.rutgers.edu/sshw/). Respondents indicate one of four frequencies for performance of 20 daily activities and receive a Health, Finance, and Total score indicating their frequency of performing activities that health and financial experts recommend. In addition to providing users with personalized feedback, the quiz collects data for research about the health and financial practices of Americans to inform future Extension outreach and can be used as a pre-/post-test to evaluate the impact of SSHW programs. Initial research analyses are planned for 2015.
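    The scoring mechanics can be illustrated with a small sketch; the abstract does not give the actual point values, so the 0-3 scale per item below is an assumption, and the short response lists stand in for the quiz's 20 activities.

```python
# Hypothetical scoring for a health-and-finance self-assessment quiz:
# each daily activity is rated on one of four frequencies, assumed
# here to be worth 0-3 points (the real scoring rules may differ).
POINTS = {"never": 0, "sometimes": 1, "usually": 2, "always": 3}

def score(responses):
    """Sum the points for a list of frequency responses."""
    return sum(POINTS[r] for r in responses)

health_responses = ["always", "usually", "sometimes", "never"]
finance_responses = ["always", "always", "never", "sometimes"]

health = score(health_responses)    # 3 + 2 + 1 + 0 = 6
finance = score(finance_responses)  # 3 + 3 + 0 + 1 = 7
total = health + finance            # 13
```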

  8. Domain-Specific Thesaurus as a Tool for Information Retrieval and Collection of Knowledge

    Directory of Open Access Journals (Sweden)

    Vladimir N. Boikov

    2013-01-01

    Full Text Available This paper reports basic approaches to the construction of an open resource named "Domain-specific thesaurus of poetics", which is one of the levels of an information-analytical system of Russian poetry (IAS RP). Poetics is a group of disciplines focused on comprehensive theoretical and historical study of poetry. IAS RP will be used as a tool for a wide range of studies, allowing researchers to determine the characteristic features of the analyzed works of poetry. Consequently, the thesaurus is the knowledge base from which one can borrow input data for training the system. The aim of our research requires a specific approach to forming the knowledge base. The thesaurus is a web-based resource which includes a domain-specific directory, information retrieval tools and tools for further analyses. The study of a glossary consisting of three thousand terms and a set of semantic fields is reviewed in this paper. An RDF graph of the domain-specific thesaurus of poetics is presented, containing 9 types of objects and different kinds of relationships among them. Wiki technologies are used to implement a resource which allows data to be stored in Semantic Web formats.
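    The RDF representation described above can be sketched with plain subject-predicate-object triples; the term and relation names below are illustrative, not taken from the actual thesaurus.

```python
# Thesaurus fragment as subject-predicate-object triples, SKOS-style.
# Terms and relations are made up for illustration.
triples = [
    ("poetics:Metre",   "rdf:type",     "poetics:Term"),
    ("poetics:Iamb",    "rdf:type",     "poetics:Term"),
    ("poetics:Iamb",    "skos:broader", "poetics:Metre"),
    ("poetics:Trochee", "skos:broader", "poetics:Metre"),
]

def objects(subject, predicate):
    """All objects linked to `subject` by `predicate`."""
    return [o for s, p, o in triples if s == subject and p == predicate]

def narrower(term):
    """Terms whose skos:broader relation points at `term`."""
    return sorted(s for s, p, o in triples
                  if p == "skos:broader" and o == term)

feet = narrower("poetics:Metre")  # ['poetics:Iamb', 'poetics:Trochee']
```

    A production resource would store such triples in an RDF store and query them with SPARQL rather than list comprehensions, but the graph structure is the same.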

  9. Evaluating the Contribution of NASA Remotely-Sensed Data Sets on a Convection-Allowing Forecast Model

    Science.gov (United States)

    Zavodsky, Bradley T.; Case, Jonathan L.; Molthan, Andrew L.

    2012-01-01

    The Short-term Prediction Research and Transition (SPoRT) Center is a collaborative partnership between NASA and operational forecasting partners, including a number of National Weather Service forecast offices. SPoRT provides real-time NASA products and capabilities to help its partners address specific operational forecast challenges. One challenge that forecasters face is using guidance from local and regional deterministic numerical models configured at convection-allowing resolution to help assess a variety of mesoscale/convective-scale phenomena such as sea-breezes, local wind circulations, and mesoscale convective weather potential on a given day. While guidance from convection-allowing models has proven valuable in many circumstances, the potential exists for model improvements by incorporating more representative land-water surface datasets, and by assimilating retrieved temperature and moisture profiles from hyper-spectral sounders. In order to help increase the accuracy of deterministic convection-allowing models, SPoRT produces real-time, 4-km CONUS forecasts using a configuration of the Weather Research and Forecasting (WRF) model (hereafter SPoRT-WRF) that includes unique NASA products and capabilities including 4-km resolution soil initialization data from the Land Information System (LIS), 2-km resolution SPoRT SST composites over oceans and large water bodies, high-resolution real-time Green Vegetation Fraction (GVF) composites derived from the Moderate-resolution Imaging Spectroradiometer (MODIS) instrument, and retrieved temperature and moisture profiles from the Atmospheric Infrared Sounder (AIRS) and Infrared Atmospheric Sounding Interferometer (IASI). NCAR's Model Evaluation Tools (MET) verification package is used to generate statistics of model performance compared to in situ observations and rainfall analyses for three months during the summer of 2012 (June-August). Detailed analyses of specific severe weather outbreaks during the summer

  10. New tools for chloroplast genetic engineering allow the synthesis of human growth hormone in the green alga Chlamydomonas reinhardtii.

    Science.gov (United States)

    Wannathong, Thanyanan; Waterhouse, Janet C; Young, Rosanna E B; Economou, Chloe K; Purton, Saul

    2016-06-01

    In recent years, there has been an increasing interest in the exploitation of microalgae in industrial biotechnology. Potentially, these phototrophic eukaryotes could be used for the low-cost synthesis of valuable recombinant products such as bioactive metabolites and therapeutic proteins. The algal chloroplast in particular represents an attractive target for such genetic engineering, both because it houses major metabolic pathways and because foreign genes can be targeted to specific loci within the chloroplast genome, resulting in high-level, stable expression. However, routine methods for chloroplast genetic engineering are currently available only for one species, Chlamydomonas reinhardtii, and even here, there are limitations to the existing technology, including the need for an expensive biolistic device for DNA delivery, the lack of robust expression vectors, and the undesirable use of antibiotic resistance markers. Here, we describe a new strain and vectors for targeted insertion of transgenes into a neutral chloroplast locus that (i) allow scar-less fusion of a transgenic coding sequence to the promoter/5'UTR element of the highly expressed endogenous genes psaA or atpA, (ii) employ the endogenous gene psbH as an effective but benign selectable marker, and (iii) ensure the successful integration of the transgene construct in all transformant lines. Transformation is achieved by a simple and cheap method of agitation of a DNA/cell suspension with glass beads, with selection based on the phototrophic rescue of a cell wall-deficient ΔpsbH strain. We demonstrate the utility of these tools in the creation of a transgenic line that produces high levels of functional human growth hormone.

  11. Twitter Chats as a Research Tool: A Study of Young Adult Financial Decisions

    Directory of Open Access Journals (Sweden)

    Barbara O’Neill

    2018-02-01

    Full Text Available Many researchers collect online survey data because it is cost-effective and less time-consuming than traditional research methods. This paper describes Twitter chats as a research tool vis-à-vis two other online research methods: providing links to electronic surveys to respondents and use of commercially available survey panels through vendors with readily available respondents. Similar to a face-to-face focus group, Twitter chats provide a synchronous environment for participants to answer a structured series of questions and to respond to both the chat facilitator and each other. This paper also reports representative responses from a Twitter chat that explored financial decisions of young adults. The chat was sponsored by a multi-state group of land-grant university researchers, in cooperation with WiseBread, a personal finance website targeted to millennials, to recruit respondents for a more extensive month-long online survey about the financial decisions of young adults. The Twitter chat responses suggest that student loans were the top concern of participants, and debt and housing rounded out the top three concerns. The internet, both websites and social media, was the most frequently cited source of financial information. The article concludes with a discussion of lessons learned from the Twitter chat experience and suggestions for professional practice.

  12. HIFSuite: Tools for HDL Code Conversion and Manipulation

    Directory of Open Access Journals (Sweden)

    Bombieri Nicola

    2010-01-01

    Full Text Available Abstract HIFSuite is a set of tools and application programming interfaces (APIs) that provide support for modeling and verification of HW/SW systems. The core of HIFSuite is the HDL Intermediate Format (HIF) language, upon which a set of front-end and back-end tools has been developed to allow the conversion of HDL code into HIF code and vice versa. HIFSuite allows designers to manipulate and integrate heterogeneous components implemented using different hardware description languages (HDLs). Moreover, HIFSuite includes tools, which rely on HIF APIs, for manipulating HIF descriptions in order to support code abstraction/refinement and post-refinement verification.

  13. Exploring the Types of SMEs Which Could use Blogs as a Marketing Tool: a Proposed Future Research Agenda

    Directory of Open Access Journals (Sweden)

    Adeline Phaik Harn Chua

    2009-08-01

    Full Text Available Blogs appear to be gaining momentum as a marketing tool which can be used by organisations for such strategies and processes as branding, managing reputation, developing customer trust and loyalty, niche marketing, gathering marketing intelligence and promoting their online presence. There has been limited academic research in this area, and most significantly concerning the types of small and medium enterprises (SMEs for which blogs might have potential as a marketing tool. In an attempt to address the knowledge gap, this paper presents a future research agenda (in the form of research questions which can guide the eBusiness research community in conducting much needed studies in this area. This paper is particularly novel in that it aims to demonstrate how the heterogeneity of SMEs and their specific business uses of eBusiness technology such as blogs can form the central plank of a future research agenda. This is important because the existing eBusiness literature tends to treat eBusiness collectively rather than focusing on the specific business uses of different eBusiness technologies, and to treat SMEs as a homogeneous group. The paper concludes with a discussion of how this research agenda can form the basis of studies which use a range of different research methods, and how this "big picture" agenda approach might help the eBusiness research community build theory which better explains SME adoption and use of eBusiness.

  14. Research of a smart cutting tool based on MEMS strain gauge

    Science.gov (United States)

    Zhao, Y.; Zhao, Y. L.; Shao, YW; Hu, T. J.; Zhang, Q.; Ge, X. H.

    2018-03-01

    Cutting force is an important factor that affects machining accuracy, cutting vibration and tool wear. Machining condition monitoring by cutting force measurement is a key technology for intelligent manufacturing. Current cutting force sensors suffer from problems of large volume, complex structure and poor compatibility in practical application. To address these problems, a smart cutting tool for cutting force measurement is proposed in this paper. Commercial MEMS (Micro-Electro-Mechanical System) strain gauges with high sensitivity and small size are adopted as the transducing element of the smart tool, and a structurally optimized cutting tool is fabricated for MEMS strain gauge bonding. Static calibration results show that the developed smart cutting tool is able to measure cutting forces in both the X and Y directions, and the cross-interference error is within 3%. Its overall accuracy is 3.35% and 3.27% in the X and Y directions, and its sensitivity is 0.1 mV/N, which is very suitable for measuring small cutting forces in high-speed precision machining. The smart cutting tool is portable and reliable for practical application in CNC machine tools.
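    The calibration step can be sketched as a 2x2 linear inversion. The diagonal entries below reuse the reported 0.1 mV/N sensitivity, while the off-diagonal cross-interference terms and the test forces are invented for illustration.

```python
# Hedged sketch: recovering X/Y cutting forces from two strain-gauge
# channel voltages via a 2x2 calibration matrix. Off-diagonal terms
# model cross-interference between the channels (values illustrative).
S = [[0.1, 0.003],   # channel 1 response (mV) per N of Fx, Fy
     [0.002, 0.1]]   # channel 2 response (mV) per N of Fx, Fy

def forces_from_voltages(v1, v2):
    """Invert v = S @ F analytically for the 2x2 case."""
    det = S[0][0] * S[1][1] - S[0][1] * S[1][0]
    fx = ( S[1][1] * v1 - S[0][1] * v2) / det
    fy = (-S[1][0] * v1 + S[0][0] * v2) / det
    return fx, fy

# A 50 N X-force and 20 N Y-force should be recovered from the
# voltages they would produce.
v1 = S[0][0] * 50 + S[0][1] * 20
v2 = S[1][0] * 50 + S[1][1] * 20
fx, fy = forces_from_voltages(v1, v2)
```

    In practice the matrix S would be fitted from the static calibration data rather than assumed.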

  15. Subsetting Tools for Enabling Easy Access to International Airborne Chemistry Data

    Science.gov (United States)

    Northup, E. A.; Chen, G.; Quam, B. M.; Beach, A. L., III; Silverman, M. L.; Early, A. B.

    2017-12-01

    In response to the Research Opportunities in Earth and Space Science (ROSES) 2015 release announcement for Advancing Collaborative Connections for Earth System Science (ACCESS), researchers at NASA Langley Research Center (LaRC) proposed to extend the capabilities of the existing Toolsets for Airborne Data (TAD) to include subsetting functionality to allow for easier access to international airborne field campaign data. Airborne field studies are commonly used to gain a detailed understanding of atmospheric processes for scientific research on international climate change and air quality issues. To accommodate the rigorous process for manipulating airborne field study chemistry data, and to lessen barriers for researchers, TAD was created with the ability to geolocate data from various sources measured on different time scales from a single flight. The analysis of airborne chemistry data typically requires data subsetting, which can be challenging and resource-intensive for end users. In an effort to streamline this process, new data subsetting features and updates to the current database model will be added to the TAD toolset. These will include two subsetters: temporal and spatial, and vertical profile. The temporal and spatial subsetter will allow users to both focus on data from a specific location and/or time period. The vertical profile subsetter will retrieve data collected during an individual aircraft ascent or descent spiral. These new web-based tools will allow for automation of the typically labor-intensive manual data subsetting process, which will provide users with data tailored to their specific research interests. The system has been designed to allow for new in-situ airborne missions to be added as they become available, with only minor pre-processing required. The development of these enhancements will be discussed in this presentation.
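    The temporal and spatial subsetter described above can be sketched as a simple filter over geolocated observations; the record fields and values below are hypothetical, not the TAD schema.

```python
# Illustrative temporal/spatial subsetter for airborne observations.
from datetime import datetime

records = [
    {"time": datetime(2016, 7, 1, 14, 0), "lat": 34.1, "lon": -118.2, "o3_ppb": 61.0},
    {"time": datetime(2016, 7, 1, 14, 5), "lat": 34.4, "lon": -118.0, "o3_ppb": 55.0},
    {"time": datetime(2016, 7, 1, 16, 0), "lat": 36.0, "lon": -115.1, "o3_ppb": 48.0},
]

def subset(records, t0, t1, lat_min, lat_max, lon_min, lon_max):
    """Keep records inside both the time window and the bounding box."""
    return [
        r for r in records
        if t0 <= r["time"] <= t1
        and lat_min <= r["lat"] <= lat_max
        and lon_min <= r["lon"] <= lon_max
    ]

hits = subset(records, datetime(2016, 7, 1, 13), datetime(2016, 7, 1, 15),
              34.0, 35.0, -119.0, -117.0)
# Two of the three records fall inside both the window and the box.
```

    A vertical-profile subsetter would instead scan the altitude series for a monotonic ascent or descent segment and return the records within it.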

  16. User Manual for the PROTEUS Mesh Tools

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Micheal A. [Argonne National Lab. (ANL), Argonne, IL (United States); Shemon, Emily R [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-09-19

    PROTEUS is built around a finite element representation of the geometry for visualization. In addition, the PROTEUS-SN solver was built to solve the even-parity transport equation on a finite element mesh provided as input. Similarly, PROTEUS-MOC and PROTEUS-NEMO were built to apply the method of characteristics on unstructured finite element meshes. Given the complexity of real-world problems, experience has shown that using a commercial mesh generator to create even rather simple input geometries is overly complex and slow. As a consequence, significant effort has been put into creating multiple codes that assist in mesh generation and manipulation. There are three input means to create a mesh in PROTEUS: UFMESH, GRID, and NEMESH. At present, UFMESH is a simple way to generate two-dimensional Cartesian and hexagonal fuel assembly geometries. The UFMESH input allows for simple assembly mesh generation, while the GRID input allows the generation of Cartesian, hexagonal, and regular triangular structured grid geometry options. NEMESH is a way for the user to create their own mesh or convert another mesh file format into a PROTEUS input format. Given an input mesh format acceptable to PROTEUS, we have constructed several tools which allow further mesh and geometry construction (e.g. mesh extrusion and merging). This report describes the various mesh tools that are provided with the PROTEUS code, giving descriptions of both the input and output. In many cases the examples are provided with a regression test of the mesh tools. The most important mesh tools for any user to consider using are the MT_MeshToMesh.x and the MT_RadialLattice.x codes. The former allows conversion between most mesh types handled by PROTEUS, while the latter allows the merging of multiple (assembly) meshes into a radial structured grid.
Note that the mesh generation process is recursive in nature and that each input specific for a given mesh tool (such as .axial
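    Mesh extrusion of the kind mentioned above can be sketched as sweeping 2D elements through a list of axial levels; the data structures below are illustrative, not the PROTEUS file formats.

```python
# Hedged sketch of 2D-to-3D mesh extrusion: each 2D triangle is swept
# through the axial levels, producing one wedge (prism) element per
# triangle per axial layer.
nodes_2d = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]   # one triangle
triangles = [(0, 1, 2)]
levels = [0.0, 0.5, 1.0]                           # axial mesh planes

def extrude(nodes_2d, triangles, levels):
    """Copy the 2D nodes to each level and connect adjacent layers."""
    nodes_3d = [(x, y, z) for z in levels for (x, y) in nodes_2d]
    n = len(nodes_2d)
    prisms = []
    for k in range(len(levels) - 1):
        for (a, b, c) in triangles:
            base, top = k * n, (k + 1) * n
            prisms.append((base + a, base + b, base + c,
                           top + a, top + b, top + c))
    return nodes_3d, prisms

nodes_3d, prisms = extrude(nodes_2d, triangles, levels)
# 3 nodes x 3 levels = 9 nodes; 1 triangle x 2 layers = 2 prisms
```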

  17. Interaction Matrices as a Tool for Prioritizing Radioecology Research

    Energy Technology Data Exchange (ETDEWEB)

    Mora, J.C.; Robles, Beatriz [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Bradshaw, Clare; Stark, Karolina [Stockholm University (Sweden); Sweeck, Liev; Vives i Batlle, Jordi [Belgian Nuclear Research Centre SCK-CEN (Belgium); Beresford, Nick [Centre for Ecology and Hydrology - CEH (United Kingdom); Thoerring, Havard; Dowdall, Mark [Norwegian Radiation Protection Authority - NRPA (Norway); Outola, Iisa; Turtiainen, Tuukka; Vetikko, Virve [STUK - Radiation and Nuclear Safety Authority (Finland); Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Beaugelin-Seiller, Karine; Fevrier, Laureline; Hurtevent, Pierre; Boyer, Patrick [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)

    2014-07-01

    In 2010 the Strategy for Allied Radioecology (STAR) was launched with several objectives aimed at integrating the radioecology research efforts of nine institutions in Europe. One of these objectives was the creation of European Radioecology Observatories. The Chernobyl Exclusion Zone (CEZ) and the Upper Silesian Coal Basin (USCB), a coal mining area in Poland, were chosen after a selection process. A second objective was to develop a system for improving and validating the capabilities of predicting the behaviour of the main radionuclides existing at these observatories. Interaction Matrices (IMs) have been used since the 1990s as a tool for developing ecological conceptual models, and have also been used within radioecology. The Interaction Matrix system relies on expert judgement for structuring knowledge of a given ecosystem at the conceptual level, and was selected for use in the STAR project. A group of experts, selected from each institution of STAR, designed two matrices with the main compartments for each ecosystem (a forest in the CEZ and a lake in the USCB). All the features, events and processes (FEPs) which could affect the behaviour of the considered radionuclides, focusing on radiocaesium in the Chernobyl forest and radium in the Rontok-Wielki lake, were also included in each IM. Two new sets of experts were appointed to review, improve and prioritize the processes included in each IM. A first processing of the various candidate interaction matrices produced a single interaction matrix for each ecosystem which incorporated the experts' combined knowledge. The prioritization of processes in the IMs, directed towards developing a predictive model of radionuclide behaviour in those ecosystems, raised interesting issues related to the processes and parameters involved and the existing knowledge of them. This exercise revealed several processes

  18. Streamlining Research by Using Existing Tools

    OpenAIRE

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...

  19. Extraction Analysis and Creation of Three-Dimensional Road Profiles Using Matlab OpenCRG Tool

    Directory of Open Access Journals (Sweden)

    Rakesh Hari Borse

    2015-08-01

    Full Text Available In vehicle system dynamics there are wide applications for the simulation of vehicles on road surfaces. These simulation applications are related to vehicle handling, ride comfort and durability. For accurate prediction of results there is a need for reliable and efficient road representations. An efficient representation of road surface profiles is to represent them in three-dimensional space. This is made possible by the CRG (Curved Regular Grid) approach. OpenCRG is a completely open-source project including a tool suite for the creation, modification and evaluation of road surfaces. Its objective is a standardized, detailed road surface description, and it may be used for applications like tire models, vibration or driving simulation. The Matlab tool suite of OpenCRG provides powerful modification and creation tools and allows the 3D road data representation to be visualized. The current research focuses on the basic concepts of OpenCRG and its Matlab environment. The extraction of longitudinal two-dimensional road profiles from the three-dimensional CRG format is researched. The creation of simple virtual three-dimensional roads has been programmed. A Matlab software tool to extract, create and analyze three-dimensional road profiles is to be developed.
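    Extracting a longitudinal profile from a curved-regular-grid description can be sketched as follows; the grid values are invented, and the sketch is in Python rather than the OpenCRG Matlab tool suite.

```python
# Illustrative curved-regular-grid road data: elevations on a regular
# (u, v) grid, with u along the road and v the lateral offset. A
# longitudinal 2D profile is the elevation sequence at one fixed v.
du = 0.5                      # longitudinal grid spacing in metres
v_offsets = [-1.0, 0.0, 1.0]  # lateral grid positions in metres
z = [                         # z[i][j]: elevation at u = i*du, v = v_offsets[j]
    [0.00, 0.00, 0.00],
    [0.01, 0.02, 0.01],
    [0.02, 0.03, 0.02],
]

def longitudinal_profile(z, j):
    """(u, elevation) pairs along the road at lateral index j."""
    return [(i * du, row[j]) for i, row in enumerate(z)]

centerline = longitudinal_profile(z, v_offsets.index(0.0))
# [(0.0, 0.0), (0.5, 0.02), (1.0, 0.03)]
```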

  20. REMOD: a computational tool for remodeling neuronal dendrites

    Directory of Open Access Journals (Sweden)

    Panagiotis Bozelos

    2014-05-01

    Full Text Available In recent years, several modeling studies have indicated that dendritic morphology is a key determinant of how individual neurons acquire a unique signal processing profile. The highly branched dendritic structure originates from the cell body and explores the surrounding 3D space in a fractal-like manner until it reaches a certain amount of complexity. Its shape undergoes significant alterations not only in various neuropathological conditions, but in physiological ones, too. Yet, despite the profound effect that these alterations can have on neuronal function, the causal relationship between structure and function remains largely elusive. The lack of a systematic approach for remodeling neuronal cells and their dendritic trees is a key limitation that contributes to this problem. In this context, we developed a computational tool that allows the remodeling of any type of neuron, given a set of exemplar morphologies. The tool is written in Python and provides a simple GUI that guides the user through various options to manipulate selected neuronal morphologies. It provides the ability to load one or more morphology files (.swc or .hoc) and choose specific dendrites on which to perform one of the following actions: shrink, remove, extend or branch (as shown in Figure 1). The user retains complete control over the extent of each alteration, and if a chosen action is not possible due to pre-existing structural constraints, appropriate warnings are produced. Importantly, the tool can also be used to extract morphology statistics for one or multiple morphologies, including features such as the total dendritic length, path length to the root, branch order, diameter tapering, etc. Finally, an experimental utility enables the user to remodel entire dendritic trees based on preloaded statistics from a database of cell-type-specific neuronal morphologies. To our knowledge, this is the first tool that allows (a) the remodeling of existing, as opposed to de novo
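    One of the morphology statistics mentioned above, total dendritic length, can be sketched from the SWC format, which stores one sample per line as "id type x y z radius parent_id"; the three-sample morphology below is made up.

```python
# Total dendritic length from SWC text: sum the distances from each
# dendritic sample (SWC types 3/4) to its parent sample.
import math

swc_text = """\
1 1 0 0 0 1.0 -1
2 3 3 4 0 0.5 1
3 3 3 4 12 0.4 2
"""

def total_dendritic_length(swc_text, dendrite_types={3, 4}):
    samples = {}
    for line in swc_text.splitlines():
        sid, stype, x, y, z, _r, parent = line.split()
        samples[int(sid)] = (int(stype), float(x), float(y), float(z), int(parent))
    length = 0.0
    for stype, x, y, z, parent in samples.values():
        if stype in dendrite_types and parent in samples:
            _pt, px, py, pz, _pp = samples[parent]
            length += math.dist((x, y, z), (px, py, pz))
    return length

length = total_dendritic_length(swc_text)  # 5.0 + 12.0 = 17.0
```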

  1. Using perinatal morbidity scoring tools as a primary study outcome.

    Science.gov (United States)

    Hutcheon, Jennifer A; Bodnar, Lisa M; Platt, Robert W

    2017-11-01

    Perinatal morbidity scores are tools that score or weight different adverse events according to their relative severity. Perinatal morbidity scores are appealing for maternal-infant health researchers because they provide a way to capture a broad range of adverse events to mother and newborn while recognising that some events are considered more serious than others. However, they have proved difficult to implement as a primary outcome in applied research studies because of challenges in testing if the scores are significantly different between two or more study groups. We outline these challenges and describe a solution, based on Poisson regression, that allows differences in perinatal morbidity scores to be formally evaluated. The approach is illustrated using an existing maternal-neonatal scoring tool, the Adverse Outcome Index, to evaluate the safety of labour and delivery before and after the closure of obstetrical services in small rural communities. Applying the proposed Poisson regression to the case study showed a protective risk ratio for adverse outcome following closures as compared with the original analysis, where no difference was found. This approach opens the door for considerably broader use of perinatal morbidity scoring tools as a primary outcome in applied population and clinical maternal-infant health research studies. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
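The article's exact Poisson-regression specification is not reproduced in this record, but for a single binary comparison (for example post- vs pre-closure), the Poisson rate ratio of a morbidity score reduces to the ratio of group means, with an approximate robust confidence interval available via the delta method. A minimal sketch with invented scores:

```python
import math
from statistics import mean, pvariance

def score_rate_ratio(reference, comparison, z=1.96):
    """Rate ratio of mean morbidity scores (comparison vs reference)
    with an approximate robust 95% CI via the delta method. With a
    single binary covariate, the Poisson-regression rate ratio
    equals this ratio of group means."""
    m0, m1 = mean(reference), mean(comparison)
    rr = m1 / m0
    se = math.sqrt(pvariance(comparison) / (len(comparison) * m1 ** 2)
                   + pvariance(reference) / (len(reference) * m0 ** 2))
    return rr, (rr * math.exp(-z * se), rr * math.exp(z * se))

# invented Adverse Outcome Index-style scores per delivery
pre_closure = [0, 0, 25, 0, 100, 0, 0, 50, 0, 0]
post_closure = [0, 0, 0, 25, 0, 0, 0, 0, 0, 0]
rr, (lo, hi) = score_rate_ratio(pre_closure, post_closure)
print(f"rate ratio {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A rate ratio below 1 here corresponds to the protective direction the authors report; in practice one would fit the full Poisson model with robust standard errors to adjust for covariates.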

  2. A Multidisciplinary Delphi Consensus-Based Checklist to Define Clinical Documentation Tools for Both Routine and Research Purposes

    Directory of Open Access Journals (Sweden)

    Cecilia Veraar

    2018-01-01

    Full Text Available Background: To the best of our knowledge, a strategic approach to defining the contents of structured clinical documentation tools for both routine clinical patient care and research purposes has not been reported so far, although electronic health records will become more and more structured and detailed in the future. Objective: To achieve an interdisciplinary consensus on a checklist to be considered in the preparation of disease- and situation-specific clinical documentation tools. Methods: A 2-round Delphi consensus-based process was conducted with 19 physicians of different disciplines and 14 students from Austria, Switzerland, and Germany. Agreement was defined as 80% or more positive votes of the participants. Results: The participants agreed that a working group should be set up for the development of structured disease- or situation-specific documentation tools (97% agreement). The final checklist included 4 recommendations concerning the setup of the working group, 12 content-related recommendations, and 3 general and technical recommendations (mean agreement [standard deviation] = 97.4% [4.0%], ranging from 84.2% to 100.0%). Discussion and Conclusion: In the future, disease- and situation-specific structured documentation tools will provide an important bridge between registries and electronic health records. Clinical documentation tools defined according to this Delphi consensus-based checklist will provide data for registries while serving as high-quality data acquisition tools in routine clinical care.

  3. Sandia Generated Matrix Tool (SGMT) v. 1.0

    Energy Technology Data Exchange (ETDEWEB)

    2010-03-24

    Provides a tool with which to create and characterize a very large set of matrix-based visual analogy problems that have properties similar to Raven's Progressive Matrices (RPMs)™. The software uses the same underlying patterns found in RPMs to generate large numbers of unique matrix problems using parameters chosen by the researcher. Specifically, the software is designed so that researchers can choose the type, direction, and number of relations in a problem and then create any number of unique matrices that share the same underlying structure (e.g. changes in numerosity in a diagonal pattern) but have different surface features (e.g. shapes, colors). Raven's Progressive Matrices (RPMs)™ are a widely used test for assessing intelligence and reasoning ability. Since the test is non-verbal, it can be applied to many different populations and has been used all over the world. However, there are relatively few matrices in the sets developed by Raven, which limits their use in experiments requiring large numbers of stimuli. This tool creates a matrix set in a systematic way that allows researchers to have a great deal of control over the underlying structure, surface features, and difficulty of the matrix problems while providing a large set of novel matrices with which to conduct experiments.
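SGMT's internals are not shown in the record, but its generation principle — hold the underlying relation fixed while sampling the surface features — can be sketched as follows (the shape vocabulary and cell encoding are our own, not SGMT's):

```python
import random

SHAPES = ["circle", "square", "triangle", "star", "diamond", "cross"]

def generate_matrix(seed=None):
    """3x3 matrix with a fixed numerosity relation (1, 2, 3 objects
    across each row) and randomly sampled surface features (one
    distinct shape per row). The answer cell is matrix[2][2]."""
    rng = random.Random(seed)
    row_shapes = rng.sample(SHAPES, 3)
    return [[(row_shapes[r], c + 1) for c in range(3)] for r in range(3)]

m = generate_matrix(seed=1)
# every generated matrix shares the structure, only surface features vary
assert all(count == c + 1 for row in m for c, (_, count) in enumerate(row))
```

Richer relation types (rotation, shading, diagonal patterns) would be added as further parameterized rules in the same style.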

  4. Collaboratively Teaching and Doing History: Promoting Historical Research in the 21st Century

    Science.gov (United States)

    Carey, Elaine; Pun, Raymond

    2016-01-01

    A collaborative course introduced history students to a variety of digital tools and printed materials for historical research. The authors explore the development of this program by a historian and a librarian as a case study to address the value of teaching history outside of the classroom and allowing students to conduct research on-site. This…

  5. PBPK Modeling - A Predictive, Eco-Friendly, Bio-Waiver Tool for Drug Research.

    Science.gov (United States)

    De, Baishakhi; Bhandari, Koushik; Mukherjee, Ranjan; Katakam, Prakash; Adiki, Shanta K; Gundamaraju, Rohit; Mitra, Analava

    2017-01-01

    The world has witnessed growing complexities in the disease scenario, influenced by drastic changes in the host-pathogen-environment triadic relation. Pharmaceutical R&Ds are in constant search of novel therapeutic entities to hasten the transition of drug molecules from lab bench to patient bedside. Extensive animal studies and human pharmacokinetics are still the "gold standard" in investigational new drug research and bio-equivalency studies. Apart from the cost, time and ethical issues of animal experimentation, burning questions arise relating to ecological disturbances, environmental hazards and biodiversity. Grave concerns arise when the adverse environmental outcomes of continued studies on one particular disease give rise to several other pathogenic agents, further complicating the overall scenario. Thus pharmaceutical R&Ds face a challenge to develop bio-waiver protocols. Lead optimization, selection of drug candidates with favorable pharmacokinetics and pharmacodynamics, and toxicity assessment are vital steps in drug development. Simulation tools like GastroPlus™, PK-Sim® and SimCyp find applications for this purpose. Advanced technologies like organ-on-a-chip or human-on-a-chip, where a 3D representation of human organs and systems can mimic the related processes and activities, thereby linking them to major features of human biology, can be successfully incorporated into the drug development toolbox. PBPK modeling provides a state-of-the-art alternative to animal experimentation. PBPK models can successfully bypass bio-equivalency studies and predict bioavailability and drug interactions, and, in combination with in vitro-in vivo correlation, can be extrapolated to humans, thus serving as a bio-waiver. PBPK can serve as an eco-friendly, predictive bio-waiver tool in drug development. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
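Full PBPK models chain many tissue compartments, but the flavor of such simulation can be conveyed with a one-compartment oral-absorption model (the Bateman equation). All parameter values below are illustrative, not drawn from the article or from any of the named commercial tools:

```python
import math

def concentration(t, dose=500.0, f_abs=0.8, vd=40.0, ka=1.2, ke=0.2):
    """Plasma concentration (mg/L) at time t (h) for first-order
    absorption (ka, 1/h) and elimination (ke, 1/h), bioavailability
    f_abs and volume of distribution vd (L) -- the Bateman equation."""
    coef = (f_abs * dose * ka) / (vd * (ka - ke))
    return coef * (math.exp(-ke * t) - math.exp(-ka * t))

# the time of peak concentration has a closed form for this model
tmax = math.log(1.2 / 0.2) / (1.2 - 0.2)
print(f"Tmax = {tmax:.2f} h, Cmax = {concentration(tmax):.2f} mg/L")
```

A PBPK model replaces the single well-stirred compartment with physiologically parameterized organs (blood flows, partition coefficients), which is what allows extrapolation between species and to humans.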

  6. Induction hardening of tool steel for heavily loaded aircraft engine components

    Directory of Open Access Journals (Sweden)

    Rokicki P.

    2017-03-01

    Full Text Available Induction hardening is an innovative process that allows modification of a material's surface in a more effective, cheaper and more reproducible way compared with the conventional hardening methods used in the aerospace industry. However, the high requirements and strict regulations concerning this branch of industry call for in-depth research to obtain results that can be used for numerical modelling of the process; only in this way can industrial application of the process begin. The main scope of the presented paper is results concerning the investigation of microstructure evolution of tool steel after a single-frequency induction hardening process. The specimens, which are intended to represent final industrial products (such as heavily loaded gears), were heat-treated with the induction method and subjected to metallographic preparation, after which a complex microstructure investigation was performed. The results obtained within the research will be a basis for numerical modelling of the induction hardening process, with the potential to be introduced for aviation industry components.

  7. The NIDDK Information Network: A Community Portal for Finding Data, Materials, and Tools for Researchers Studying Diabetes, Digestive, and Kidney Diseases.

    Directory of Open Access Journals (Sweden)

    Patricia L Whetzel

    Full Text Available The NIDDK Information Network (dkNET; http://dknet.org) was launched to serve the needs of basic and clinical investigators in metabolic, digestive and kidney disease by facilitating access to research resources that advance the mission of the National Institute of Diabetes and Digestive and Kidney Diseases (NIDDK). By research resources, we mean the multitude of data, software tools, materials, services, projects and organizations available to researchers in the public domain. Most of these are accessed via web-accessible databases or web portals, each developed, designed and maintained by numerous different projects, organizations and individuals. While many of the large government-funded databases, maintained by agencies such as the European Bioinformatics Institute and the National Center for Biotechnology Information, are well known to researchers, many more that have been developed by and for the biomedical research community are unknown or underutilized. At least part of the problem is the nature of dynamic databases, which are considered part of the "hidden" web, that is, content that is not easily accessed by search engines. dkNET was created specifically to address the challenge of connecting researchers to research resources via these types of community databases and web portals. dkNET functions as a "search engine for data", searching across millions of database records contained in hundreds of biomedical databases developed and maintained by independent projects around the world. A primary focus of dkNET is centers and projects specifically created to provide high-quality data and resources to NIDDK researchers. Through the novel data ingest process used in dkNET, additional data sources can easily be incorporated, allowing it to scale with the growth of digital data and the needs of the dkNET community. Here, we provide an overview of the dkNET portal and its functions. We show how dkNET can be used to address a variety of use cases

  8. Mendeley as an integral tool in the arsenal of modern scientist

    Directory of Open Access Journals (Sweden)

    Taras Kotyk

    2016-11-01

    Full Text Available This paper presents the possibilities of Mendeley – a reference manager and social network for researchers. The key aspects of using this software as an effective reference manager, as well as a tool for organizing a full-text archive of publications and processing scientific sources when conducting research, are highlighted. The possibilities of Mendeley as a social network – namely, a means of communication and collaboration between researchers, sharing of reference databases and searching for new scientific publications – are presented as well. In general, Mendeley, due to its functionality, is an integral part of the scientific research carried out by students, scientists or laboratory research groups. The use of Mendeley by all members of a research project will allow them to effectively search for original sources and analyze them; to quickly create reference lists according to different styles; to follow other researchers in order to view relevant papers; to greatly enhance the quality of the research; and to expand the potential readership of their publications.

  9. Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom

    Directory of Open Access Journals (Sweden)

    Sarah E. Council

    2016-01-01

    Full Text Available The field of citizen science is exploding and offers not only a great way to engage the general public in science literacy through primary research, but also an avenue for teaching professionals to engage their students in meaningful community research experiences. Though this field is expanding, there are many hurdles for researchers and participants, as well as challenges for teaching professionals who want to engage their students. Here we highlight one of our projects that engaged many citizens in Raleigh, NC, and across the world, and we use this as a case study to highlight ways to engage citizens in all kinds of research. Through the use of numerous tools to engage the public, we gathered citizen scientists to study skin microbes and their associated odors, and we offer valuable ideas for teachers to tap into resources for their own students and potential citizen-science projects.

  10. Reproducible research in palaeomagnetism

    Science.gov (United States)

    Lurcock, Pontus; Florindo, Fabio

    2015-04-01

    The reproducibility of research findings is attracting increasing attention across all scientific disciplines. In palaeomagnetism as elsewhere, computer-based analysis techniques are becoming more commonplace, complex, and diverse. Analyses can often be difficult to reproduce from scratch, both for the original researchers and for others seeking to build on the work. We present a palaeomagnetic plotting and analysis program designed to make reproducibility easier. Part of the problem is the divide between interactive and scripted (batch) analysis programs. An interactive desktop program with a graphical interface is a powerful tool for exploring data and iteratively refining analyses, but usually cannot operate without human interaction. This makes it impossible to re-run an analysis automatically, or to integrate it into a larger automated scientific workflow - for example, a script to generate figures and tables for a paper. In some cases the parameters of the analysis process itself are not saved explicitly, making it hard to repeat or improve the analysis even with human interaction. Conversely, non-interactive batch tools can be controlled by pre-written scripts and configuration files, allowing an analysis to be 'replayed' automatically from the raw data. However, this advantage comes at the expense of exploratory capability: iteratively improving an analysis entails a time-consuming cycle of editing scripts, running them, and viewing the output. Batch tools also tend to require more computer expertise from their users. PuffinPlot is a palaeomagnetic plotting and analysis program which aims to bridge this gap. First released in 2012, it offers both an interactive, user-friendly desktop interface and a batch scripting interface, both making use of the same core library of palaeomagnetic functions. 
We present new improvements to the program that help to integrate the interactive and batch approaches, allowing an analysis to be interactively explored and refined
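The architecture described — one core library of palaeomagnetic functions behind both the desktop GUI and the batch scripting interface — can be sketched generically. PuffinPlot itself is written in Java; the function below is an illustrative stand-in, not its actual API:

```python
import math

# core library: pure analysis functions with no I/O or GUI dependencies
def mean_declination(declinations_deg):
    """Circular mean of declination angles, in degrees."""
    s = sum(math.sin(math.radians(d)) for d in declinations_deg)
    c = sum(math.cos(math.radians(d)) for d in declinations_deg)
    return round(math.degrees(math.atan2(s, c)), 6) % 360

# batch front end: replayable from raw data with no human interaction,
# so the analysis can sit inside an automated workflow or paper script
def run_batch(raw_declinations):
    return {"mean_declination": mean_declination(raw_declinations)}

# an interactive GUI front end would call mean_declination from its
# callbacks, so both modes share one implementation and stay consistent
print(run_batch([350.0, 10.0]))  # circular mean wraps correctly across north
```

Because both front ends delegate to the same core, an analysis refined interactively can later be replayed verbatim in batch mode, which is the reproducibility property the abstract argues for.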

  11. Medicare Physician and Other Supplier Look-up Tool

    Data.gov (United States)

    U.S. Department of Health & Human Services — This look-up tool is a searchable database that allows you to look up a provider by National Provider Identifier (NPI), or by name and location. The look-up tool...

  12. A Web-based Tool Combining Different Type Analyses

    DEFF Research Database (Denmark)

    Henriksen, Kim Steen; Gallagher, John Patrick

    2006-01-01

    of both, and they can be goal-dependent or goal-independent. We describe a prototype tool that can be accessed from a web browser, allowing various type analyses to be run. The first goal of the tool is to allow the analysis results to be examined conveniently by clicking on points in the original program...... the minimal "domain model" of the program with respect to the corresponding pre-interpretation, which can give more precise information than the original descriptive type....

  13. Evidence & Gap Maps: A tool for promoting evidence informed policy and strategic research agendas.

    Science.gov (United States)

    Snilstveit, Birte; Vojtkova, Martina; Bhavsar, Ami; Stevenson, Jennifer; Gaarder, Marie

    2016-11-01

    A range of organizations are engaged in the production of evidence on the effects of health, social, and economic development programs on human welfare outcomes. However, evidence is often scattered around different databases, web sites, and the gray literature and is often presented in inaccessible formats. Lack of overview of the evidence in a specific field can be a barrier to the use of existing research and prevent efficient use of limited resources for new research. Evidence & Gap Maps (EGMs) aim to address these issues and complement existing synthesis and mapping approaches. EGMs are a new addition to the tools available to support evidence-informed policymaking. To provide an accessible resource for researchers, commissioners, and decision makers, EGMs provide thematic collections of evidence structured around a framework which schematically represents the types of interventions and outcomes of relevance to a particular sector. By mapping the existing evidence using this framework, EGMs provide a visual overview of what we know and do not know about the effects of different programs. They make existing evidence available, and by providing links to user-friendly summaries of relevant studies, EGMs can facilitate the use of existing evidence for decision making. They identify key "gaps" where little or no evidence from impact evaluations and systematic reviews is available and can be a valuable resource to inform a strategic approach to building the evidence base in a particular sector. The article will introduce readers to the concept and methods of EGMs and present a demonstration of the EGM tool using existing examples. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Carbon Footprint Estimation Tool for Residential Buildings for Non-Specialized Users: OERCO2 Project

    Directory of Open Access Journals (Sweden)

    Jaime Solís-Guzmán

    2018-04-01

    Full Text Available Existing tools for environmental certification of buildings are failing in their ability to reach the general public and to create social awareness, since they require not only specialized knowledge regarding construction and energy sources, but also environmental knowledge. In this paper, an open-source online tool for the estimation of the carbon footprint of residential buildings by non-specialized users is presented as a product from the OERCO2 Erasmus + project. The internal calculations, data management and operation of this tool are extensively explained. The ten most common building typologies built in the last decade in Spain are analysed by using the OERCO2 tool, and the order of magnitude of the results is analysed by comparing them to the ranges determined by other authors. The OERCO2 tool proves itself to be reliable, with its results falling within the defined logical value ranges. Moreover, the major simplification of the interface allows non-specialized users to evaluate the sustainability of buildings. Further research is oriented towards its inclusion in other environmental certification tools and in Building Information Modeling (BIM environments.
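OERCO2's internal data are not reproduced in this record, but the calculation pattern such tools share — quantities multiplied by emission factors and summed over construction items — can be sketched with purely hypothetical factors:

```python
# hypothetical emission factors (kg CO2e per unit); a real tool draws
# these from audited life-cycle inventory databases
EMISSION_FACTORS = {
    "concrete_m3": 250.0,
    "steel_kg": 1.85,
    "ceramic_brick_kg": 0.24,
}

def carbon_footprint(bill_of_quantities):
    """Sum quantity x factor over the items of a building's bill of
    quantities; unknown items raise KeyError rather than being skipped."""
    return sum(EMISSION_FACTORS[item] * qty
               for item, qty in bill_of_quantities.items())

print(carbon_footprint({"concrete_m3": 120, "steel_kg": 9000}))  # 46650.0
```

Hiding this lookup-and-sum behind a guided interface is what lets a non-specialized user obtain a result without knowing the factors themselves.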

  15. Involving citizens in priority setting for public health research: Implementation in infection research.

    Science.gov (United States)

    Rawson, Timothy M; Castro-Sánchez, Enrique; Charani, Esmita; Husson, Fran; Moore, Luke S P; Holmes, Alison H; Ahmad, Raheelah

    2018-02-01

    Public sources fund the majority of UK infection research, but citizens currently have no formal role in resource allocation. To explore the feasibility and willingness of citizens to engage in strategic decision making, we developed and tested a practical tool to capture public priorities for research. A scenario including six infection themes for funding was developed to assess citizen priorities for research funding. This was tested over two days at a university public festival. Votes were cast anonymously along with the rationale for selection. The scenario was then implemented during a three-hour focus group exploring views on engagement in strategic decisions and an in-depth evaluation of the tool. 188/491 (38%) prioritized funding research into drug-resistant infections, followed by emerging infections (18%). Results were similar between the two days. Focus groups contained a total of 20 citizens with an equal gender split, a range of ethnicities, and ages ranging from 18 to >70 years. The tool was perceived as clear, with participants able to make informed comparisons. The rationales for funding choices provided by voters and focus group participants are grouped into three major themes: (i) Information processing; (ii) Knowledge of the problem; (iii) Responsibility; and a unique theme within the focus groups, (iv) The potential role of citizens in decision making. Divergent perceptions of the relevance and confidence of "non-experts" as decision makers were expressed. Voting scenarios can be used to collect, en masse, citizens' choices and rationales for research priorities. Ensuring adequate levels of citizen information and confidence is important to allow deployment in other formats. © 2017 The Authors Health Expectations Published by John Wiley & Sons Ltd.

  16. Multidimensional Ranking: A New Transparency Tool for Higher Education and Research

    Science.gov (United States)

    van Vught, Frans; Westerheijden, Don F.

    2010-01-01

    This paper sets out to analyse the need for better "transparency tools" which inform university stakeholders about the quality of universities. First, we give an overview of what we understand by the concept of transparency tools and those that are currently available. We then critique current transparency tools' methodologies, looking in detail…

  17. The Use of a Drawing Tool to Assess the Implicit Ageism of Students

    Directory of Open Access Journals (Sweden)

    Loredana Ivan

    2018-06-01

    Full Text Available Ageism has been generally defined as a prejudice that people from a certain age group hold towards other age groups (Butler, 1969; 1975). Although such definitions do not restrict the use of the term to researching prejudices regarding a certain age group, currently ageism is deployed in studies concerning prejudices regarding older people and includes cognitive evaluations (negative stereotypes people might have regarding older people) as well as affective-emotional reactions towards older people in different instances of daily life. Researchers acknowledge that some ageist reactions (both cognitive and emotional) can be captured by implicit measures. Implicit association tests have been used to measure subtle cues of ageism (see Levy & Banaji, 2002), and the validity of these measurements is widely discussed in the international psychological literature (see Greenwald, McGhee & Schwartz, 1998; Rudman et al., 1999, for a review). Drawing could also be used as a tool to research implicit ageism, though it has been applied to a lesser extent in research on ageism (see, for example, Barrett & Cantwell, 2007). In the current research, we employ the drawing technique on a sample of undergraduate students from a public university (N=165) to assess their visual representations of older people. Examining the features of the drawings allows us to discuss implicit ageism and the extent to which drawing can be a valid tool for examining it.

  18. Machine Assistance in Collection Building: New Tools, Research, Issues, and Reflections

    Directory of Open Access Journals (Sweden)

    Steve Mitchell

    2006-12-01

    Full Text Available Digital tool making offers many challenges, involving much trial and error. Developing machine learning and assistance in automated and semi-automated Internet resource discovery, metadata generation, and rich-text identification provides opportunities for great discovery, innovation, and the potential for transformation of the library community. The areas of computer science involved, as applied to the library applications addressed, are among that discipline’s leading edges. Making applied research practical and applicable, through placement within library/collection-management systems and services, involves equal parts computer scientist, research librarian, and legacy-systems archaeologist. Still, the early harvest is there for us now, with a large harvest pending. Data Fountains and iVia, the projects discussed, demonstrate this. Clearly, then, the present would be a good time for the library community to more proactively and significantly engage with this technology and research, to better plan for its impacts, to more proactively take up the challenges involved in its exploration, and to better and more comprehensively guide effort in this new territory. The alternative to doing this is that others will develop this territory for us, do it not as well, and sell it back to us at a premium. Awareness of this technology and its current capabilities, promises, limitations, and probable major impacts needs to be generalized throughout the library management, metadata, and systems communities. This article charts recent work, promising avenues for new research and development, and issues the library community needs to understand.

  19. Common data elements for clinical research in mitochondrial disease: a National Institute for Neurological Disorders and Stroke project

    NARCIS (Netherlands)

    Karaa, A.; Rahman, S.; Lombes, A.; Yu-Wai-Man, P.; Sheikh, M.K.; Alai-Hansen, S.; Cohen, B.H.; Dimmock, D.; Emrick, L.; Falk, M.J.; McCormack, S.; Mirsky, D.; Moore, T.; Parikh, S.; Shoffner, J.; Taivassalo, T.; Tarnopolsky, M.; Tein, I.; Odenkirchen, J.C.; Goldstein, A.; Koene, S.; Smeitink, J.A.M.; et al.,

    2017-01-01

    OBJECTIVES: The common data elements (CDE) project was developed by the National Institute of Neurological Disorders and Stroke (NINDS) to provide clinical researchers with tools to improve data quality and allow for harmonization of data collected in different research studies. CDEs have been

  20. Application of bioinformatics tools and databases in microbial dehalogenation research (a review).

    Science.gov (United States)

    Satpathy, R; Konkimalla, V B; Ratha, J

    2015-01-01

    Microbial dehalogenation is a biochemical process in which halogenated substances are catalyzed enzymatically into their non-halogenated form. Microorganisms have a wide range of organohalogen degradation abilities, both specific and non-specific in nature. Most of these halogenated organic compounds, being pollutants, need to be remediated; therefore, current approaches explore the potential of microbes at a molecular level for effective biodegradation of these substances. Several microorganisms with dehalogenation activity have been identified and characterized. In this respect, bioinformatics plays a key role in gaining deeper knowledge of dehalogenation. To facilitate data mining, many tools have been developed to annotate these data from databases. Therefore, with the discovery of a microorganism, one can predict genes/proteins, perform sequence analysis, structural modelling, metabolic pathway analysis, biodegradation studies and so on. This review highlights various bioinformatics approaches, describing the application of various databases and specific tools in the microbial dehalogenation field, with special focus on dehalogenase enzymes. Attempts have also been made to decipher some recent applications of in silico modelling methods comprising gene finding, protein modelling, Quantitative Structure-Biodegradability Relationship (QSBR) studies and reconstruction of metabolic pathways employed in dehalogenation research.
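As a minimal example of the "gene finding" step mentioned above, the sketch below scans one strand of a DNA sequence for open reading frames. Real annotation pipelines use dedicated tools; this is illustrative only, and the toy sequence is our own:

```python
def find_orfs(seq, min_codons=10):
    """Naive forward-strand ORF scan: an in-frame ATG...stop run of at
    least min_codons codons, checked in each of the three frames."""
    stops = {"TAA", "TAG", "TGA"}
    seq = seq.upper()
    orfs = []
    for frame in range(3):
        start = None
        for i in range(frame, len(seq) - 2, 3):
            codon = seq[i:i + 3]
            if start is None and codon == "ATG":
                start = i
            elif start is not None and codon in stops:
                if (i + 3 - start) // 3 >= min_codons:
                    orfs.append(seq[start:i + 3])
                start = None
    return orfs

# toy sequence: one 14-codon ORF embedded in flanking bases
toy = "CC" + "ATG" + "GCT" * 12 + "TAA" + "GGATC"
print(len(find_orfs(toy)))  # 1
```

A candidate ORF found this way would then feed the downstream steps the review lists: protein modelling, pathway reconstruction and QSBR analysis.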

  1. Multi Sector Planning Tools for Trajectory-Based Operations

    Science.gov (United States)

    Prevot, Thomas; Mainini, Matthew; Brasil, Connie

    2010-01-01

    This paper discusses a suite of multi sector planning tools for trajectory-based operations that were developed and evaluated in the Airspace Operations Laboratory (AOL) at the NASA Ames Research Center. The toolset included tools for traffic load and complexity assessment as well as trajectory planning and coordination. The situation assessment tools included an integrated suite of interactive traffic displays, load tables, load graphs, and dynamic aircraft filters. The planning toolset allowed for single and multi aircraft trajectory planning and data communication-based coordination of trajectories between operators. Also newly introduced was a real-time computation of sector complexity into the toolset that operators could use in lieu of aircraft count to better estimate and manage sector workload, especially in situations with convective weather. The tools were used during a joint NASA/FAA multi sector planner simulation in the AOL in 2009 that had multiple objectives with the assessment of the effectiveness of the tools being one of them. Current air traffic control operators who were experienced as area supervisors and traffic management coordinators used the tools throughout the simulation and provided their usefulness and usability ratings in post simulation questionnaires. This paper presents these subjective assessments as well as the actual usage data that was collected during the simulation. The toolset was rated very useful and usable overall. Many elements received high scores by the operators and were used frequently and successfully. Other functions were not used at all, but various requests for new functions and capabilities were received that could be added to the toolset.

  2. Developmental screening tools: feasibility of use at primary healthcare level in low- and middle-income settings.

    Science.gov (United States)

    Fischer, Vinicius Jobim; Morris, Jodi; Martines, José

    2014-06-01

    An estimated 150 million children have a disability. Early identification of developmental disabilities is a high priority for the World Health Organization to allow action to reduce impairments through Gap Action Program on mental health. The study identified the feasibility of using the developmental screening and monitoring tools for children aged 0-3 year(s) by non-specialist primary healthcare providers in low-resource settings. A systematic review of the literature was conducted to identify the tools, assess their psychometric properties, and feasibility of use in low- and middle-income countries (LMICs). Key indicators to examine feasibility in LMICs were derived from a consultation with 23 international experts. We identified 426 studies from which 14 tools used in LMICs were extracted for further examination. Three tools reported adequate psychometric properties and met most of the feasibility criteria. Three tools appear promising for use in identifying and monitoring young children with disabilities at primary healthcare level in LMICs. Further research and development are needed to optimize these tools.

  3. Tracer-tracer relations as a tool for research on polar ozone loss

    Energy Technology Data Exchange (ETDEWEB)

    Mueller, Rolf

    2010-07-01

    The report includes the following chapters: (1) Introduction: ozone in the atmosphere, anthropogenic influence on the ozone layer, polar stratospheric ozone loss; (2) Tracer-tracer relations in the stratosphere: tracer-tracer relations as a tool in atmospheric research; impact of cosmic-ray-induced heterogeneous chemistry on polar ozone; (3) Quantifying polar ozone loss from ozone-tracer relations: principles of tracer-tracer correlation techniques; reference ozone-tracer relations in the early polar vortex; impact of mixing on ozone-tracer relations in the polar vortex; impact of mesospheric intrusions on ozone-tracer relations in the stratospheric polar vortex; calculation of chemical ozone loss in the Arctic in March 2003 based on ILAS-II measurements; (4) Epilogue.

  4. Mobile Devices: A Distraction, or a Useful Tool to Engage Nursing Students?

    Science.gov (United States)

    Gallegos, Cara; Nakashima, Hannah

    2018-03-01

    Engaging nursing students in theoretical courses, such as research, can be challenging, and innovative instructional strategies are essential. This article describes an educational innovation using technology as a tool in an undergraduate nursing research class. All students in the course received iPads for the semester. Lecture material was presented in class using Nearpod, an interactive presentation platform embedding slides, multimedia components, and learning activities. Students reported that using the mobile technology helped them minimize off-task activities, interact more with each other and the instructor, solve problems in class, and develop skills and confidence related to their career. Allowing devices such as iPads and interactive mobile applications in the classroom can thus support learning, and intentional pairing of technology with pedagogy can increase student engagement and interaction. [J Nurs Educ. 2018;57(3):170-173.]. Copyright 2018, SLACK Incorporated.

  5. Research design: the methodology for interdisciplinary research framework.

    Science.gov (United States)

    Tobi, Hilde; Kampen, Jarl K

    2018-01-01

    Many of today's global scientific challenges require the joint involvement of researchers from different disciplinary backgrounds (social sciences, environmental sciences, climatology, medicine, etc.). Such interdisciplinary research teams face many challenges resulting from differences in training and scientific culture. Interdisciplinary education programs are required to train truly interdisciplinary scientists with the critical skills and competences. For that purpose this paper presents the Methodology for Interdisciplinary Research (MIR) framework. The MIR framework was developed to help cross disciplinary borders, especially those between the natural sciences and the social sciences. The framework has been specifically constructed to facilitate the design of interdisciplinary scientific research, and can be applied in an educational program, as a reference for monitoring the phases of interdisciplinary research, and as a tool to design such research in a process approach. It is suitable for research projects of different sizes and levels of complexity, and it allows for a range of combinations of methods (case study, mixed methods, etc.). The different phases of designing interdisciplinary research in the MIR framework are described and illustrated by real-life applications in teaching and research. We further discuss the framework's utility for research design in landscape architecture and in mixed-methods research, and provide an outlook on its potential in inclusive interdisciplinary research and, last but not least, in research integrity.

  6. Improving nutrition surveillance and public health research in Central and Eastern Europe/Balkan Countries using the Balkan Food Platform and dietary tools.

    Science.gov (United States)

    Gurinović, Mirjana; Milešević, Jelena; Novaković, Romana; Kadvan, Agnes; Djekić-Ivanković, Marija; Šatalić, Zvonimir; Korošec, Mojca; Spiroski, Igor; Ranić, Marija; Dupouy, Eleonora; Oshaug, Arne; Finglas, Paul; Glibetić, Maria

    2016-02-15

    The objective of this paper is to share experience and provide updated information on Capacity Development in the Central and Eastern Europe/Balkan Countries (CEE/BC) region relevant to public health nutrition, particularly in the creation of food composition databases (FCDBs), the application of dietary intake assessment and monitoring tools, and the harmonization of methodology for nutrition surveillance. The Balkan Food Platform was established by a Memorandum of Understanding among EuroFIR AISBL, the Institute for Medical Research, Belgrade, the Capacity Development Network in Nutrition in CEE (CAPNUTRA), and institutions from nine countries in the region. An inventory of FCDB status identified a lack of harmonized and standardized research tools. To strengthen harmonization in CEE/BC in line with European research trends, the Network members collaborated on the development of a regional FCDB, using web-based food composition database management software following EuroFIR standards. The comprehensive nutrition assessment and planning tool DIET ASSESS & PLAN could enable synchronization of nutrition surveillance across countries. Copyright © 2015 Elsevier Ltd. All rights reserved.

  7. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    Science.gov (United States)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams (the Educational Game, the Training Game, the Mitigation Game, and the Decision-Making Game) and created four diverse games with topics from elementary plate tectonics to earthquake risk mitigation, with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is a 4D, interactive visualization software package that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. 
SCEC-VDO enables the user to create animated movies during a session, and is now part

  8. The MEU web platform: a tool dedicated to urban energy management

    OpenAIRE

    Scartezzini, Jean-Louis; Puerto, Pablo; Pernet, Mathias; Capezzali, Massimiliano; Darmayan, Loïc; Cherix, Gaëtan

    2015-01-01

    The MEU GIS-enabled web platform [1] has been developed in close collaboration with four Swiss cities. The tool enables detailed monitoring and planning of both energy demand and supply at the individual building, neighborhood, and whole-city scale (http://meu.epfl.ch). The web platform acts as an interface between different tools and makes it possible to establish detailed energy balances for entire cities comprising several thousand buildings. In its present configuration, the MEU tool does not allow y...

  9. Diagnostic framework and health check tool for engineering and technology projects

    Directory of Open Access Journals (Sweden)

    Simon P Philbin

    2014-10-01

    Full Text Available Purpose: Development of a practitioner-oriented diagnostic framework and health check tool to support the robust assessment of engineering and technology projects. Design/methodology/approach: The research is based on a literature review that draws together insights on project assessment and critical success factors to establish an integrated systems view of projects. This is extended to develop a comprehensive diagnostic framework along with a high-level health check tool that can be readily deployed on projects. The utility of the diagnostic framework and health check tool is explored through three illustrative case studies, two from Canada and one from the United Kingdom. Findings and Originality/value: The performance of engineering and technology projects can be viewed through a systems perspective as a function of six sub-systems: process, technology, resources, impact, knowledge, and culture. The diagnostic framework developed through this research integrates these sub-systems to provide a comprehensive assessment methodology for projects, which is linked to existing best practice for project reviews, performance management, and maturity models. The case studies provide managerial insights that are related to the diagnostic framework but, crucially, also position the approach in the context of industrial applications for construction engineering and technology management. Research limitations/implications: Two case studies are from the construction and facilities development sector and the third is from the research and technology sector. Further work is required to investigate the use of the diagnostic framework and health check tool in other sectors. Practical implications: The health check tool will be of practical benefit to new project managers that require access to a robust and convenient project review methodology for assessing the status and health of a

  10. PANTHER version 11: expanded annotation data from Gene Ontology and Reactome pathways, and data analysis tool enhancements.

    Science.gov (United States)

    Mi, Huaiyu; Huang, Xiaosong; Muruganujan, Anushya; Tang, Haiming; Mills, Caitlin; Kang, Diane; Thomas, Paul D

    2017-01-04

    The PANTHER database (Protein ANalysis THrough Evolutionary Relationships, http://pantherdb.org) contains comprehensive information on the evolution and function of protein-coding genes from 104 completely sequenced genomes. PANTHER software tools allow users to classify new protein sequences, and to analyze gene lists obtained from large-scale genomics experiments. In the past year, major improvements include a large expansion of classification information available in PANTHER, as well as significant enhancements to the analysis tools. Protein subfamily functional classifications have more than doubled due to progress of the Gene Ontology Phylogenetic Annotation Project. For human genes (as well as a few other organisms), PANTHER now also supports enrichment analysis using pathway classifications from the Reactome resource. The gene list enrichment tools include a new 'hierarchical view' of results, enabling users to leverage the structure of the classifications/ontologies; the tools also allow users to upload genetic variant data directly, rather than requiring prior conversion to a gene list. The updated coding single-nucleotide polymorphisms (SNP) scoring tool uses an improved algorithm. The hidden Markov model (HMM) search tools now use HMMER3, dramatically reducing search times and improving accuracy of E-value statistics. Finally, the PANTHER Tree-Attribute Viewer has been implemented in JavaScript, with new views for exploring protein sequence evolution. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
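    Gene list enrichment of the kind PANTHER performs is commonly based on an over-representation test. The following is a generic hypergeometric sketch (PANTHER's exact statistic, corrections, and options may differ): given N annotated genes in the reference set, K of which carry a given category, an uploaded list of n genes containing k hits is scored by the probability of seeing k or more hits by chance.

```python
# Generic hypergeometric over-representation test (illustrative; not
# PANTHER's implementation).
from math import comb

def enrichment_p(N, K, n, k):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): the chance of drawing
    at least k category members when sampling n genes from N, of which
    K belong to the category."""
    total = comb(N, n)
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / total

# 20,000 reference genes, 200 in the category, a 100-gene list with 8
# hits (about 1 expected by chance):
p = enrichment_p(20000, 200, 100, 8)
```

    A small p here indicates the category is over-represented in the gene list; real tools additionally correct for testing many categories at once.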

  11. Correction Notice: Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom

    Directory of Open Access Journals (Sweden)

    JMBE Production Editor

    2016-05-01

    Full Text Available Correction for Sarah E. Council and Julie E. Horvath, “Tools for Citizen-Science Recruitment and Student Engagement in Your Research and in Your Classroom,” which appeared in the Journal of Microbiology & Biology Education, volume 17, number 1, March 2016, pages 38–40.

  12. Tools for Understanding Identity

    Energy Technology Data Exchange (ETDEWEB)

    Creese, Sadie; Gibson-Robinson, Thomas; Goldsmith, Michael; Hodges, Duncan; Kim, Dee DH; Love, Oriana J.; Nurse, Jason R.; Pike, William A.; Scholtz, Jean

    2013-12-28

    Identity attribution and enrichment is critical to many aspects of law enforcement and intelligence gathering; this identity typically spans a number of domains in the natural world, such as biographic information (factual information, e.g. names and addresses), biometric information (e.g. fingerprints), and psychological information. In addition to these natural-world projections of identity, identity elements are projected in the cyber-world. Conversely, undesirable elements may use similar techniques to target individuals for spear-phishing attacks (or worse), and potential targets or their organizations may want to determine how to minimize the attack surface they expose. Our research has been exploring the construction of a mathematical model for identity that supports such holistic identities. The model captures the ways in which an identity is constructed through a combination of data elements (e.g. a username on a forum, an address, a telephone number). Some of these elements may allow new characteristics to be inferred, hence enriching the holistic view of the identity. An example use-case would be the inference of real names from usernames; the ‘path’ created by inferring new elements of identity is highlighted in the ‘critical information’ panel. Individual attribution exercises can be understood as paths through a number of elements. Intuitively, the entire realizable ‘capability’ can be modeled as a directed graph, where the elements are nodes and the inferences are represented by links connecting one or more antecedents with a conclusion. The model can be operationalized with two levels of tool support described in this paper; the first is a working prototype, and the second is expected to reach prototype stage by July 2013. Understanding the Model: the tool allows a user to easily determine, given a particular set of inferences and attributes, which elements or inferences are of most value to an investigator (or an attacker). The tool is also able to take
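    The directed-graph view of identity inference can be sketched as a fixed-point closure over inference rules. All element names and inference rules below are hypothetical illustrations, not data from the actual model:

```python
# Toy identity-inference graph: each inference maps a set of antecedent
# identity elements to a new element. Starting from observed elements,
# repeatedly apply inferences until nothing new can be derived.

inferences = [
    ({"username"}, "real_name"),          # hypothetical rule
    ({"real_name", "city"}, "address"),   # hypothetical rule
    ({"address"}, "telephone_number"),    # hypothetical rule
]

def enrich(observed, inferences):
    """Return every identity element derivable from the observed set."""
    known = set(observed)
    changed = True
    while changed:
        changed = False
        for antecedents, conclusion in inferences:
            if antecedents <= known and conclusion not in known:
                known.add(conclusion)
                changed = True
    return known

# Observing only a username and a city already yields an address and a
# telephone number via the chained inferences above.
reachable = enrich({"username", "city"}, inferences)
```

    The chain of rules that fires for a given conclusion corresponds to the ‘path’ the abstract describes, and counting which rules appear on many such paths identifies the inferences of most value to an investigator (or attacker).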

  13. Distribution view: a tool to write and simulate distributions

    OpenAIRE

    Coelho, José; Branco, Fernando; Oliveira, Teresa

    2006-01-01

    In our work we present a tool to write and simulate distributions. The tool allows users to write mathematical expressions that can contain not only functions and variables but also statistical distributions, including mixtures. Each time an expression is evaluated, a value is generated for every inner distribution according to that distribution and used in determining the expression's value. The inversion method can be used in this language, allowing the generation of all distributions...
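    The inversion method mentioned in the abstract can be illustrated for a single distribution: if F is a cumulative distribution function and U is uniform on (0, 1), then F⁻¹(U) follows F. A minimal sketch for the exponential distribution, whose inverse CDF is -ln(1 - u)/rate:

```python
# Inverse-transform sampling for the exponential distribution.
import math
import random

def sample_exponential(rate, rng=random.random):
    u = rng()                          # U ~ Uniform(0, 1)
    return -math.log(1.0 - u) / rate   # F^{-1}(u) for Exp(rate)

random.seed(42)
samples = [sample_exponential(2.0) for _ in range(10000)]
mean = sum(samples) / len(samples)  # should be close to 1/rate = 0.5
```

    The same recipe works for any distribution with a computable inverse CDF, which is why a tool like the one described can support arbitrary distributions through a single mechanism.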

  14. The Potential to use Publication of Undergraduate Research as a Teaching Tool

    Science.gov (United States)

    Brevik, Eric C.; Lindbo, David L.; Belcher, Christopher

    2015-04-01

    Several studies crossing numerous disciplinary boundaries have demonstrated that undergraduate students benefit from research experiences. These benefits include personal and intellectual development, more and closer contact with faculty, the use of active learning techniques, the creation of high expectations, the development of creative and problem-solving skills, and the development of greater independence and intrinsic motivation to learn. The discipline also gains in that studies show undergraduates who engage in research experiences are more likely to remain science majors and finish their degree program. Research experiences come as close as possible to allowing undergraduates to experience what it is like to be an academic or research member of their profession working to advance their discipline, therefore enhancing their professional socialization into their chosen field. If the goals achieved by undergraduate research include introducing these students to the advancement of their chosen field, it stands to reason the ultimate ending to this experience would be the publication of a peer-reviewed paper. While not all undergraduate projects will end with a product worthy of peer-reviewed publication, some definitely do, and the personal experience of the authors indicates that undergraduate students who achieve publication get great satisfaction and a sense of personal achievement from that publication. While a top-tier international journal probably isn't going to be the ultimate destination for many of these projects, there are several appropriate outlets. The SSSA journal Soil Horizons has published several undergraduate projects in recent years, and good undergraduate projects can often be published in state academy of science journals. Journals focused expressly on publishing undergraduate research include the Journal of Undergraduate Research and Scholarly Excellence, Reinvention, and the American Journal of Undergraduate Research. Case studies of

  15. Polymerase chain reaction: A molecular diagnostic tool in periodontology

    Science.gov (United States)

    Maheaswari, Rajendran; Kshirsagar, Jaishree Tukaram; Lavanya, Nallasivam

    2016-01-01

    This review discusses the principles of the polymerase chain reaction (PCR) and its application as a diagnostic tool in periodontology. The relevant MEDLINE- and PubMed-indexed journals were searched manually and electronically by typing PCR, applications of PCR, PCR in periodontics, polymorphism studies in periodontitis, and molecular techniques in periodontology. The searches were limited to articles in the English language, and articles describing the PCR process and its relation to periodontology were collected and used to prepare a concise review. PCR has now become a standard diagnostic and research tool in periodontology. Various studies reveal that its sensitivity and specificity make it a rapid, efficient method of detecting, identifying, and quantifying organisms. Different immune and inflammatory markers can be identified at the mRNA expression level, as can genetic polymorphisms, providing deeper insight into the mechanisms underlying periodontal disease. PMID:27143822

  16. Raman Spectroscopy: An Emerging Tool in Neurodegenerative Disease Research and Diagnosis.

    Science.gov (United States)

    Devitt, George; Howard, Kelly; Mudher, Amrit; Mahajan, Sumeet

    2018-03-21

    The pathogenesis underlying many neurodegenerative diseases remains incompletely understood. The lack of effective biomarkers and disease-preventative medicine demands the development of new techniques to efficiently probe the mechanisms of disease and to detect early biomarkers predictive of disease onset. Raman spectroscopy is an established technique that allows the label-free fingerprinting and imaging of molecules based on their chemical constitution and structure. While analysis of isolated biological molecules has been widespread in the chemical community, applications of Raman spectroscopy to clinically relevant biological species, disease pathogenesis, and diagnosis have increased rapidly over the past decade. The growing number of biomedical applications has shown the potential of Raman spectroscopy for the detection of novel biomarkers that could enable rapid and accurate screening of disease susceptibility and onset. Here we provide an overview of Raman spectroscopy and related techniques and their application to neurodegenerative diseases. We further discuss their potential utility in research, biomarker detection, and diagnosis. Challenges to the routine use of Raman spectroscopy in neuroscience research are also presented.

  17. MARs Tools for Interactive ANalysis (MARTIAN): Google Maps Tools for Visual Exploration of Geophysical Modeling on Mars

    Science.gov (United States)

    Dimitrova, L. L.; Haines, M.; Holt, W. E.; Schultz, R. A.; Richard, G.; Haines, A. J.

    2006-12-01

    Interactive maps of surface-breaking faults and stress models on Mars provide important tools for engaging undergraduate students, educators, and scientists with current geological and geophysical research. We have developed a map based on the Google Maps API, an Internet-based tool combining DHTML and AJAX that allows very large maps to be viewed over the World Wide Web. Typically, small portions of the maps are downloaded as needed, rather than the entire image at once. This set-up enables relatively fast access for users with low bandwidth. Furthermore, Google Maps provides an extensible interactive interface, making it ideal for visualizing multiple data sets at the user's choice. The Google Maps API works primarily with data referenced to latitudes and longitudes, which it then maps in Mercator projection only. We have developed utilities for general cylindrical coordinate systems by converting their coordinates into the equivalent Mercator projection before including them on the map. The MARTIAN project is available at http://rock.geo.sunysb.edu/~holt/Mars/MARTIAN/. We begin with an introduction to the Martian surface using a topography model. Faults from several datasets are classified by type (extension vs. compression) and by time epoch. Deviatoric stresses due to gravitational potential energy differences, calculated from the topography and crustal thickness, can be overlain. Several quantitative measures of the fit of the stress field to the faults are also included. We provide introductory text and exercises spanning a range of topics: how faults are identified, what stress is and how it relates to faults, what gravitational potential energy is and how variations in it produce stress, how the models are created, and how these models can be evaluated and interpreted. The MARTIAN tool is used at Stony Brook University in GEO 310: Introduction to Geophysics, a class geared towards junior and senior geosciences majors. Although this project is in its
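    The latitude/longitude-to-Mercator step the abstract relies on can be sketched with the standard spherical Mercator forward mapping (this is the textbook formula, not the MARTIAN project's own code; coordinates are returned in radians on a unit sphere):

```python
# Spherical Mercator forward projection: longitude maps linearly to x,
# latitude maps to y = ln(tan(pi/4 + lat/2)), which stretches toward
# the poles and is undefined at +/-90 degrees.
import math

def mercator(lon_deg, lat_deg):
    """Project (longitude, latitude) in degrees to Mercator (x, y)."""
    x = math.radians(lon_deg)
    y = math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# The equator maps to y = 0; y grows without bound toward the poles.
x, y = mercator(45.0, 0.0)
```

    Data in another cylindrical projection would first be converted back to latitude/longitude and then passed through a mapping like this before being tiled for the web map.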

  18. Assessment of the condition of a consumer market: interactive research

    Directory of Open Access Journals (Sweden)

    Anastasiya Yevgenyevna Sudakova

    2014-09-01

    Full Text Available Results of an assessment of the condition of the consumer market, based on official statistics, are presented in the article. The assessment rests on the method of indicative analysis. The technique includes five modules: quality of consumer goods, works, and services; food security; non-food safety; safety of services; and security of participants in the consumer market. The article also presents the results of an interactive Internet study of the condition of the ultimate consumer. The interactive study comprises three blocks: a general block (which allows a portrait of the respondent to be drawn), a special block (which allows changes in the price, quality, and range of consumer goods and services to be estimated), and an additional block (which allows respondents to leave comments). On the basis of the research conducted, it can be concluded that although the assessment obtained with these methodical tools shows positive dynamics, the condition of the consumer market remains unsatisfactory, which is also confirmed by the results of the interactive study. Recommendations for containing price rises and improving the quality of consumer goods and services are presented in the article.

  19. American Recovery and Reinvestment Act-comparative effectiveness research infrastructure investments: emerging data resources, tools and publications.

    Science.gov (United States)

    Segal, Courtney; Holve, Erin

    2014-11-01

    The Recovery Act provided a substantial, one-time investment in data infrastructure for comparative effectiveness research (CER). A review of the publications, data, and tools developed as a result of this support has informed understanding of the level of effort undertaken by these projects. Structured search queries, as well as outreach efforts, were conducted to identify and review resources from American Recovery and Reinvestment Act of 2009 CER projects building electronic clinical data infrastructure. The findings from this study reveal a spectrum of productivity across a range of topics and settings. A total of 451 manuscripts published in 192 journals and 141 data resources and tools were identified; together they address gaps in evidence on priority populations, conditions, and the infrastructure needed to support CER.

  20. Keeping research reactors relevant: A pro-active approach for SLOWPOKE-2

    International Nuclear Information System (INIS)

    Cosby, L.R.; Bennett, L.G.I.; Nielsen, K.; Weir, R.

    2010-01-01

    The SLOWPOKE is a small, inherently safe, pool-type research reactor that was engineered and marketed by Atomic Energy of Canada Limited (AECL) in the 1970s and 80s. The original reactor, SLOWPOKE-1, was moved from Chalk River to the University of Toronto in 1970 and was operated until it was upgraded to the SLOWPOKE-2 in 1973. In all, eight reactors in the two versions were produced; five are still in operation today and three have been decommissioned. All of the remaining reactors are designated SLOWPOKE-2 reactors. These research reactors face two major issues: aging components and a lack of relevance to a younger audience. To combat these problems, one SLOWPOKE-2 facility has embraced a strategy of modernizing its reactor to keep it up to date and relevant. In 2001, this facility replaced its aging analogue reactor control system with a digital control system. The system was successfully commissioned and has provided a renewed platform for student learning and research. The digital control system provides a better interface and allows flexibility in data storage and retrieval that was never possible with the analogue control system. The facility has started work on another upgrade to the digital control and instrumentation system, to be installed in 2010. The upgrade includes new computer hardware, updated software, and a web-based simulation and training system that will allow licensed operators, students, and researchers to use an online simulation tool for training, education, and research. The tool consists of: 1) a dynamic simulation of reactor kinetics (e.g., core flux, power, core temperatures, etc.), useful for operator training and student education; 2) dynamic mapping of the reactor and pool container gamma and neutron fluxes as well as the vertical neutron beam tube flux, a research planning tool used by researchers who wish to do irradiations (e.g., neutron
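    Reactor-kinetics simulations of the kind described are commonly introduced through the point-kinetics equations. The sketch below is a textbook one-delayed-group model with illustrative parameter values, not the facility's actual simulator:

```python
# One-delayed-group point kinetics, integrated with explicit Euler.
# n = relative power, c = delayed-neutron precursor concentration.
#   dn/dt = ((rho - beta) / Lambda) * n + lam * c
#   dc/dt = (beta / Lambda) * n - lam * c
# beta, lam (decay constant), and Lambda (generation time) are
# illustrative textbook-style values.

def point_kinetics(rho, beta=0.0076, lam=0.08, Lambda=1e-4,
                   t_end=10.0, dt=1e-4):
    """Integrate at constant reactivity rho; return final relative power."""
    n = 1.0
    c = beta * n / (lam * Lambda)   # precursors in equilibrium with n = 1
    for _ in range(round(t_end / dt)):
        dn = ((rho - beta) / Lambda) * n + lam * c
        dc = (beta / Lambda) * n - lam * c
        n += dn * dt
        c += dc * dt
    return n

# Zero reactivity holds power steady; a small positive insertion
# (below prompt critical, rho < beta) makes power rise on a stable period.
steady = point_kinetics(0.0, t_end=1.0)
rising = point_kinetics(0.001, t_end=1.0)
```

    A training simulator would extend this with temperature feedback and control-rod models, but the core dynamics follow the same equations.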

  1. Post-Flight Data Analysis Tool

    Science.gov (United States)

    George, Marina

    2018-01-01

    A software tool that facilitates the retrieval and analysis of post-flight data. This allows our team and other teams to effectively and efficiently analyze and evaluate post-flight data in order to certify commercial providers.

  2. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    OpenAIRE

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background: International guidelines recommend the use of appropriate informed consent procedures in low-literacy research settings because written information is not known to guarantee comprehension of study information. Objectives: This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods: We developed the informed consent document of the malaria treatment trial into a m...

  3. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping

    2015-01-01

    Background: Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. Methods: To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. Results: A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. Conclusion: A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting. PMID:26697911

  4. The Biobank Economic Modeling Tool (BEMT): Online Financial Planning to Facilitate Biobank Sustainability.

    Science.gov (United States)

    Odeh, Hana; Miranda, Lisa; Rao, Abhi; Vaught, Jim; Greenman, Howard; McLean, Jeffrey; Reed, Daniel; Memon, Sarfraz; Fombonne, Benjamin; Guan, Ping; Moore, Helen M

    2015-12-01

    Biospecimens are essential resources for advancing basic and translational research. However, there are little data available regarding the costs associated with operating a biobank, and few resources to enable their long-term sustainability. To support the research community in this effort, the National Institutes of Health, National Cancer Institute's Biorepositories and Biospecimen Research Branch has developed the Biobank Economic Modeling Tool (BEMT). The tool is accessible at http://biospecimens.cancer.gov/resources/bemt.asp. To obtain market-based cost information and to inform the development of the tool, a survey was designed and sent to 423 biobank managers and directors across the world. The survey contained questions regarding infrastructure investments, salary costs, funding options, types of biospecimen resources and services offered, as well as biospecimen pricing and service-related costs. A total of 106 responses were received. The data were anonymized, aggregated, and used to create a comprehensive database of cost and pricing information that was integrated into the web-based tool, the BEMT. The BEMT was built to allow the user to input cost and pricing data through a seven-step process to build a cost profile for their biobank, define direct and indirect costs, determine cost recovery fees, perform financial forecasting, and query the anonymized survey data from comparable biobanks. A survey was conducted to obtain a greater understanding of the costs involved in operating a biobank. The anonymized survey data was then used to develop the BEMT, a cost modeling tool for biobanks. Users of the tool will be able to create a cost profile for their biobanks' specimens, products and services, establish pricing, and allocate costs for biospecimens based on percent cost recovered, and perform project-specific cost analyses and financial forecasting.
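    The cost-recovery step the BEMT abstract describes can be illustrated with a minimal calculation; the function name and formula below are assumptions for illustration, as the tool's actual model is more detailed:

```python
# Illustrative biobank cost-recovery fee: annual direct and indirect
# costs are pooled, and a per-specimen fee is set so that a chosen
# fraction of total cost is recovered from users.

def cost_recovery_fee(direct_costs, indirect_costs, specimens_per_year,
                      recovery_fraction):
    """Per-specimen fee recovering the given fraction of annual costs."""
    total = direct_costs + indirect_costs
    return total * recovery_fraction / specimens_per_year

# Recover 60% of a $250k annual budget across 5,000 specimens per year:
fee = cost_recovery_fee(200_000, 50_000, 5_000, 0.60)
```

    Varying the recovery fraction is what lets a biobank model partial subsidy versus full cost recovery, the trade-off at the heart of the sustainability question the tool addresses.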

  5. The Meuse-Haute Marne underground research laboratory. A scientific research tool for the study of deep geologic disposal of radioactive wastes

    International Nuclear Information System (INIS)

    2006-01-01

    The Meuse-Haute Marne underground research laboratory is an essential scientific tool for the achievement of one of ANDRA's missions, defined in the framework of the law of December 30, 1991, on the long-term management of high-level and long-lived radioactive wastes. This document presents the laboratory: site characterization, characteristics of the Callovo-Oxfordian clay, and creation of the laboratory; the coordinated experiments carried out at the surface and at depth; and the results obtained (published exhaustively in the 'Clay 2005' dossier). (J.S.)

  6. A Data Management System for International Space Station Simulation Tools

    Science.gov (United States)

    Betts, Bradley J.; DelMundo, Rommel; Elcott, Sharif; McIntosh, Dawn; Niehaus, Brian; Papasin, Richard; Mah, Robert W.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Groups associated with the design, operational, and training aspects of the International Space Station make extensive use of modeling and simulation tools. Users of these tools often need to access and manipulate large quantities of data associated with the station, ranging from design documents to wiring diagrams. Retrieving and manipulating this data directly within the simulation and modeling environment can provide substantial benefit to users. An approach for providing these kinds of data management services, including a database schema and class structure, is presented. Implementation details are also provided as a data management system is integrated into the Intelligent Virtual Station, a modeling and simulation tool developed by the NASA Ames Smart Systems Research Laboratory. One use of the Intelligent Virtual Station is generating station-related training procedures in a virtual environment. The data management component allows users to quickly and easily retrieve information related to objects on the station, enhancing their ability to generate accurate procedures. Users can associate new information with objects and have that information stored in a database.
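
    The object-to-document association described above can be sketched with a minimal in-memory class structure. The class and field names here are hypothetical, chosen for illustration; they are not NASA's actual schema.

    ```python
    # Minimal sketch of associating station objects with design documents;
    # names and identifiers are invented, not the Intelligent Virtual
    # Station's real data model.
    from dataclasses import dataclass, field

    @dataclass
    class StationObject:
        object_id: str
        name: str
        documents: list = field(default_factory=list)  # linked docs, diagrams

    class DataStore:
        """Tiny in-memory stand-in for the database behind the simulation tool."""
        def __init__(self):
            self._objects = {}

        def add_object(self, obj):
            self._objects[obj.object_id] = obj

        def attach_document(self, object_id, doc_ref):
            # Users can associate new information with station objects.
            self._objects[object_id].documents.append(doc_ref)

        def lookup(self, object_id):
            # Retrieval from within the virtual environment.
            return self._objects[object_id]

    store = DataStore()
    store.add_object(StationObject("NODE1-HATCH", "Node 1 forward hatch"))
    store.attach_document("NODE1-HATCH", "wiring-diagram-0042.pdf")
    print(store.lookup("NODE1-HATCH").documents)  # ['wiring-diagram-0042.pdf']
    ```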

  7. The tools for evaluating logistics processes

    Directory of Open Access Journals (Sweden)

    Michał Adamczak

    2013-12-01

    Background: The growing importance of the business process approach and of dynamic management is driven by market expectations of shorter lead times and by pressure to cut costs. Efficient process management requires measurement and assessment skills. This article presents the tools used to evaluate processes and the way they work together under simulated conditions. Methods: The authors assume that a process can be assessed by measuring its attributes: cost, time and quality. An assessment tool has been developed for each attribute: for cost, activity-based costing; for time, value stream mapping; for quality, statistical process control. Each tool evaluates one attribute at any level of the process hierarchy. The methods presented in the paper are supplemented with process modelling and simulation. Results: To show how process assessment tools combine with process simulation, the authors analyse a sample process in three variants (serial, parallel and mixed). A variant simulation (using iGrafx software) determines the attribute values of the entire process from the data set for its component activities. In the example under investigation, the process variant has no impact on quality, but process cost and time are affected. Conclusions: The tools for identifying attribute values, combined with process modelling and simulation, can prove very beneficial in business practice. First, they allow a process to be evaluated from the attribute values of its particular activities, which in turn makes process configuration possible at the design stage. The solution presented in the paper can be developed further with a view to process standardization and best-variant recommendation.
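
    The serial-versus-parallel effect the simulation study examines can be illustrated in a few lines. The activity names and durations below are invented, and no simulation package (such as iGrafx) is used; the point is only that the variant changes total lead time while the set of activities stays the same.

    ```python
    # Toy illustration of how a process variant changes lead time:
    # activity durations (in hours) are hypothetical.
    activity_times = {"order entry": 2.0, "picking": 5.0, "packing": 3.0}

    serial_time = sum(activity_times.values())    # activities one after another
    parallel_time = max(activity_times.values())  # fully parallel execution

    print(serial_time, parallel_time)  # -> 10.0 5.0
    ```

    A mixed variant would fall between these two bounds, which is why variant simulation on the component activities is needed to evaluate the whole process.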

  8. Ubuntunet Alliance: A Collaborative Research Platform for Sharing of Technological Tools for Eradication of Brain Drain

    Directory of Open Access Journals (Sweden)

    Jameson Mbale

    2012-12-01

    The UbuntuNet Alliance is well placed to facilitate interaction between African education and research institutions and African academics and researchers in the diaspora, so that together they can strengthen research that exploits new technological tools and expands the industrial base. It is envisaged that the Alliance will become an important vehicle for linkages that facilitate the repatriation of scientific knowledge and skills to Africa, and even help reduce and eventually eradicate the brain drain that has taken so many excellent intellectuals to the developed world. As organisational vehicles for inter-institutional collaboration, both established and emerging NRENs can play a critical role in reversing these trends and in mitigating the negative impact of the brain drain.

  9. Conducting Creativity Brainstorming Sessions in Small and Medium-Sized Enterprises Using Computer-Mediated Communication Tools

    Science.gov (United States)

    Murthy, Uday S.

    A variety of Web-based low cost computer-mediated communication (CMC) tools are now available for use by small and medium-sized enterprises (SME). These tools invariably incorporate chat systems that facilitate simultaneous input in synchronous electronic meeting environments, allowing what is referred to as “electronic brainstorming.” Although prior research in information systems (IS) has established that electronic brainstorming can be superior to face-to-face brainstorming, there is a lack of detailed guidance regarding how CMC tools should be optimally configured to foster creativity in SMEs. This paper discusses factors to be considered in using CMC tools for creativity brainstorming and proposes recommendations for optimally configuring CMC tools to enhance creativity in SMEs. The recommendations are based on lessons learned from several recent experimental studies on the use of CMC tools for rich brainstorming tasks that require participants to invoke domain-specific knowledge. Based on a consideration of the advantages and disadvantages of the various configuration options, the recommendations provided can form the basis for selecting a CMC tool for creativity brainstorming or for creating an in-house CMC tool for the purpose.

  10. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information.

    Science.gov (United States)

    Khushi, Matloob; Edwards, Georgina; de Marcos, Diego Alonso; Carpenter, Jane E; Graham, J Dinny; Clarke, Christine L

    2013-02-12

    Virtual microscopy includes the digitisation of histology slides and the use of computer technologies for the complex investigation of diseases such as cancer. However, automated image analysis, and website publishing of such digital images, are hampered by their large file sizes. We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a JPEG image of the desired quality. The image is linked to the patient's clinical and treatment information in customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using the open source Deep Zoom technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria, such as cancer type or biomarkers expressed. NDPI-Splitter splits a large image file into smaller TIFF sections so that they can be easily analysed by image analysis software such as Metamorph or Matlab; it also has the capacity to filter out empty images. Snapshot Creator and NDPI-Splitter are novel open source Java tools that convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource, and these tools enable large files to be sectioned for image analysis. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934.
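
    The tiling-and-filtering idea behind a splitter like NDPI-Splitter can be sketched without any imaging library. This is a conceptual illustration only: it operates on a nested-list "grayscale image" rather than a real NDPI/TIFF file, and the tile sizes and background value are assumptions.

    ```python
    # Sketch of splitting a large image into tiles and filtering empty ones;
    # a real tool would read NDPI/TIFF data instead of nested lists.

    def split_image(image, tile_h, tile_w):
        """Yield (row, col, tile) for each tile of a 2-D grayscale image."""
        for r in range(0, len(image), tile_h):
            for c in range(0, len(image[0]), tile_w):
                tile = [row[c:c + tile_w] for row in image[r:r + tile_h]]
                yield r // tile_h, c // tile_w, tile

    def is_empty(tile, background=255):
        # "Empty" tiles contain only background pixels and can be discarded.
        return all(px == background for row in tile for px in row)

    # 4x4 image: top-left quadrant has tissue (0), the rest is background (255).
    image = [[0, 0, 255, 255],
             [0, 0, 255, 255],
             [255, 255, 255, 255],
             [255, 255, 255, 255]]

    kept = [(r, c) for r, c, t in split_image(image, 2, 2) if not is_empty(t)]
    print(kept)  # only the tile containing tissue survives -> [(0, 0)]
    ```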

  11. Open source tools for management and archiving of digital microscopy data to allow integration with patient pathology and treatment information

    Directory of Open Access Journals (Sweden)

    Khushi Matloob

    2013-02-01

    Abstract. Background: Virtual microscopy includes digitisation of histology slides and the use of computer technologies for complex investigation of diseases such as cancer. However, automated image analysis, or website publishing of such digital images, is hampered by their large file sizes. Results: We have developed two Java-based open source tools: Snapshot Creator and NDPI-Splitter. Snapshot Creator converts a portion of a large digital slide into a desired quality JPEG image. The image is linked to the patient's clinical and treatment information in a customised open source cancer data management software (Caisis) in use at the Australian Breast Cancer Tissue Bank (ABCTB) and then published on the ABCTB website (http://www.abctb.org.au) using Deep Zoom open source technology. Using the ABCTB online search engine, digital images can be searched by defining various criteria such as cancer type, or biomarkers expressed. NDPI-Splitter splits a large image file into smaller sections of TIFF images so that they can be easily analysed by image analysis software such as Metamorph or Matlab. NDPI-Splitter also has the capacity to filter out empty images. Conclusions: Snapshot Creator and NDPI-Splitter are novel open source Java tools. They convert digital slides into files of smaller size for further processing. In conjunction with other open source tools such as Deep Zoom and Caisis, this suite of tools is used for the management and archiving of digital microscopy images, enabling digitised images to be explored and zoomed online. Our online image repository also has the capacity to be used as a teaching resource. These tools also enable large files to be sectioned for image analysis. Virtual Slides: The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/5330903258483934

  12. Investigation of the Processing Parameters Impact on the Flexural Tool Vibrations While Drilling

    Directory of Open Access Journals (Sweden)

    I. I. Ivanov

    2015-01-01

    The paper considers an approach to analysing the dynamic stability of the drilling process in terms of tool flexibility. The proposed technique takes into account the regenerative effect, which introduces a time delay into the dynamic system; this regenerative delay is the main source of dynamically unstable machining conditions. The paper describes the mechanism by which self-vibrations emerge during cutting, and notes the undesirable nature of transverse bending self-vibrations of the tool, which degrade the surface quality of the machined hole. The suggested approach consists in building a stability diagram of the drilling process for a tool model that allows only flexural vibrations. A feature of the study is that tool dynamics are described with a finite element model based on a quadratic approximation of displacements; the assumption of axially symmetric drill geometry was discarded. A reduced tool model was built from the two eigenvectors corresponding to tool bending. This model contains two degrees of freedom (DOF), which are essentially the rotations of the drill tip; rigid multi-point constraints connect these DOFs with the solid finite element nodes. A system of delay differential equations describing the reduced tool model dynamics was derived to assess the dynamic stability of the drilling process, and Floquet theory was applied to build a stability diagram of the maximum multiplier value versus drill rotation rate. The diagram shows that transverse bending self-vibrations can be excited over a wide range of rotation frequencies. The results obtained and the calculation technique may be used to choose operating modes free from undesirable flexural self-vibrations of the tool. The reported study was supported by RFBR within the framework of research project "mol_a" № 14-08-31603 "Development of methods and algorithms for
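
    The regenerative-delay mechanism can be illustrated with a toy single-DOF delayed oscillator, x'' + 2ζωx' + ω²x = k(x(t−τ) − x(t)). This is not the authors' finite element model, and all parameter values below are invented; it only shows the qualitative behaviour: with damping and weak regenerative coupling the motion decays (stable cutting), while near a stability-lobe boundary the same loop would show growing amplitude (chatter).

    ```python
    # Toy delayed oscillator with a regenerative term, integrated with
    # semi-implicit Euler; parameters are hypothetical.
    import math

    z, w, k, tau = 0.05, 2 * math.pi, 1.0, 1.0  # damping, freq, coupling, delay
    dt, steps = 1e-3, 20000                      # 20 s of simulated time
    m = int(tau / dt)                            # delay measured in time steps

    xs = [1.0] * (m + 1)                         # history: x = 1 for t <= 0
    v = 0.0
    for n in range(steps):
        x, x_del = xs[-1], xs[-1 - m]
        a = -2 * z * w * v - w * w * x + k * (x_del - x)
        v += dt * a
        xs.append(x + dt * v)

    # Stable parameters: amplitude over the last delay period is far smaller
    # than over the first one.
    print(max(abs(x) for x in xs[-m:]) < max(abs(x) for x in xs[:m]))  # True
    ```

    A Floquet-style analysis, as in the paper, would instead build the monodromy matrix of this delayed system and check whether its largest multiplier exceeds one in magnitude.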

  13. History plotting tool for Data Quality Monitoring

    International Nuclear Information System (INIS)

    Giordano, D.; Le Bihan, A.-C.; Pierro, A.; De Mattia, M.

    2010-01-01

    The size and complexity of the CMS detector makes the Data Quality Monitoring (DQM) system very challenging. Given the high granularity of the CMS sub-detectors, several approaches and tools have been developed to monitor the detector performance closely. We describe here the History DQM, a tool allowing the detector performance monitoring over time.

  14. A Collaborative Educational Association Rule Mining Tool

    Science.gov (United States)

    Garcia, Enrique; Romero, Cristobal; Ventura, Sebastian; de Castro, Carlos

    2011-01-01

    This paper describes a collaborative educational data mining tool based on association rule mining for the ongoing improvement of e-learning courses, which allows teachers with similar course profiles to share and score the discovered information. The mining tool is designed to be used by instructors who are not experts in data mining, so its internal…

  15. Heat Treatment Optimization and Properties Correlation for H11-Type Hot-Work Tool Steel

    Science.gov (United States)

    Podgornik, B.; Puš, G.; Žužek, B.; Leskovšek, V.; Godec, M.

    2018-02-01

    The aim of this research was to determine the effect of vacuum-heat-treatment process parameters on the material properties and their correlations for low-Si-content AISI H11-type hot-work tool steel using a single Circumferentially Notched and fatigue Pre-cracked Tensile Bar (CNPTB) test specimen. The work was also focused on the potential of the proposed approach for designing advanced tempering diagrams and optimizing the vacuum heat treatment and design of forming tools. The results show that the CNPTB specimen allows a simultaneous determination and correlation of multiple properties for hot-work tool steels, with the compression and bending strength both increasing with hardness, and the strain-hardening exponent and bending strain increasing with the fracture toughness. On the other hand, the best machinability and surface quality of the hardened hot-work tool steel are obtained for hardness values between 46 and 50 HRC and a fracture toughness below 60 MPa√m.

  16. GIS tools for analyzing accidents and road design: A review

    Energy Technology Data Exchange (ETDEWEB)

    Satria, R.

    2016-07-01

    A significant adverse outcome of transportation systems is road accidents, with injuries and loss of life. In recent years, the number of studies on tools for analyzing accidents and road design has increased considerably. Among these tools, Geographical Information Systems (GIS) stand out for their ability to perform complex spatial analyses. However, GIS has sometimes been used only as a geographical database to store and represent data about accidents and road characteristics, or merely to display the results of statistical studies of accidents that were not themselves carried out with GIS. Owing to its integrated statistical-analysis capabilities, GIS provides several advantages. First, it allows more careful and accurate data selection, screening, and reduction, as well as spatial analysis of the results in pre- and post-processing. Second, GIS supports the development of spatial statistics that rely on geographically referenced data. In this paper, several GIS tools used to model accidents are examined. Understanding these tools will help the analyst decide which tool to apply in each particular condition and context. (Author)
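
    One of the simplest spatial analyses such tools perform is grid-based hotspot detection: binning accident locations into cells and ranking cells by count. The sketch below illustrates the idea with made-up coordinates; a real analysis would use projected coordinates and a GIS package rather than plain tuples.

    ```python
    # Toy grid-density hotspot detection on hypothetical accident coordinates.
    from collections import Counter

    accidents = [(1.2, 3.4), (1.3, 3.5), (1.25, 3.45), (5.0, 5.1), (9.9, 0.2)]
    cell = 0.5  # cell size in the map's units

    # Assign each point to a grid cell and count points per cell.
    counts = Counter((int(x // cell), int(y // cell)) for x, y in accidents)
    hotspot, n = counts.most_common(1)[0]
    print(hotspot, n)  # densest cell and its accident count -> (2, 6) 2
    ```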

  17. Application of genomic tools in plant breeding.

    Science.gov (United States)

    Pérez-de-Castro, A M; Vilanova, S; Cañizares, J; Pascual, L; Blanca, J M; Díez, M J; Prohens, J; Picó, B

    2012-05-01

    Plant breeding has been very successful in developing improved varieties using conventional tools and methodologies. Nowadays, the availability of genomic tools and resources is leading to a new revolution in plant breeding, as they facilitate the study of the genotype and its relationship with the phenotype, in particular for complex traits. Next-generation sequencing (NGS) technologies allow the mass sequencing of genomes and transcriptomes, producing a vast array of genomic information. The analysis of NGS data by means of bioinformatics tools allows the discovery of new genes and regulatory sequences and their positions, and makes available large collections of molecular markers. Genome-wide expression studies provide breeders with an understanding of the molecular basis of complex traits. Genomic approaches include TILLING and EcoTILLING, which make it possible to screen mutant and germplasm collections for allelic variants in target genes. Re-sequencing of genomes is very useful for the genome-wide discovery of markers amenable to high-throughput genotyping platforms, such as SSRs and SNPs, and for the construction of high-density genetic maps. All these tools and resources facilitate the study of genetic diversity, which is important for germplasm management, enhancement and use. They also allow the identification of markers linked to genes and QTLs, using a diversity of techniques such as bulked segregant analysis (BSA), fine genetic mapping, or association mapping. These new markers are used for marker-assisted selection, including marker-assisted backcross selection, 'breeding by design', and new strategies such as genomic selection. In conclusion, advances in genomics are providing breeders with new tools and methodologies that allow a great leap forward in plant breeding, including the 'superdomestication' of crops and the genetic dissection of, and breeding for, complex traits.
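
    At its core, the marker discovery mentioned above reduces to finding positions where sequences differ. The toy sketch below shows SNP calling on two aligned strings; real pipelines work on NGS read alignments with quality filtering, and the sequences here are invented.

    ```python
    # Toy SNP discovery: compare two aligned sequences position by position.
    ref = "ATGGCGTACGTTAGC"   # hypothetical reference sequence
    alt = "ATGGCATACGTTCGC"   # hypothetical re-sequenced individual

    snps = [(i, r, a) for i, (r, a) in enumerate(zip(ref, alt)) if r != a]
    print(snps)  # (position, reference base, variant base)
    ```

    Each reported position is a candidate marker that could be placed on a genotyping platform or a genetic map.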

  18. Report of the 2. research co-ordination meeting of the co-ordinated research programme on the development of computer-based troubleshooting tools and instruments

    International Nuclear Information System (INIS)

    1998-11-01

    The research co-ordination meeting reviewed current results of the co-ordinated research programme on the Development of Computer-Based Troubleshooting Tools and Instruments. Presentations at the meeting were made by the participants, and the project summary reports include: PC-based software for troubleshooting microprocessor-based instruments; technical database software; design and construction of a random pulser for maintenance and quality control of a nuclear counting system; microprocessor-based power conditioner; in-circuit emulator for microprocessor-based nuclear instruments; PC-based analog signal generator for simulated detector signals and arbitrary test waveforms for the testing of nuclear instruments; expert system for nuclear instrument troubleshooting; development and application of versatile computer-based measurement and diagnostic tools; and development of a programmable signal generator for the troubleshooting of nuclear instrumentation

  19. Primers-4-Yeast: a comprehensive web tool for planning primers for Saccharomyces cerevisiae.

    Science.gov (United States)

    Yofe, Ido; Schuldiner, Maya

    2014-02-01

    The budding yeast Saccharomyces cerevisiae is a key model organism of functional genomics, due to the ease and speed of its genetic manipulation. In fact, in this yeast the requirement for homologous sequences for recombination is so small that 40 base pairs (bp) are sufficient. Hence, an enormous variety of genetic manipulations can be performed simply by planning primers with the correct homology, using a defined set of transformation plasmids. Although designing primers for yeast transformations, and for the verification of their correct insertion, is a common task in all yeast laboratories, primer planning is usually done manually, and a tool that would enable easy, automated primer planning for the yeast research community has been lacking. Here we introduce Primers-4-Yeast, a web tool that allows primers to be designed in batches for S. cerevisiae gene-targeting transformations, and for the validation of correct insertions. This novel tool enables fast, automated, accurate primer planning for large sets of genes and introduces consistency in primer planning; it is therefore suggested as a standard in yeast research. Primers-4-Yeast is available at: http://www.weizmann.ac.il/Primers-4-Yeast. Copyright © 2013 John Wiley & Sons, Ltd.
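
    The primer-planning logic such a tool automates is essentially string concatenation: take the last 40 bp of genomic sequence before the integration site and fuse it to the sequence that anneals to the transformation plasmid. The sketch below illustrates this; the genomic sequence is randomly generated and the annealing sequence is a placeholder, not taken from Primers-4-Yeast.

    ```python
    # Sketch of building a gene-targeting forward primer: 40 bp of genomic
    # homology + a plasmid-annealing tail. Sequences are invented.
    import random

    random.seed(1)
    upstream = "".join(random.choice("ACGT") for _ in range(60))  # region before ATG
    anneal = "CGGATCCCCGGGTTAATTAA"  # hypothetical cassette-annealing sequence

    forward_primer = upstream[-40:] + anneal  # last 40 bp of homology + tail
    print(len(forward_primer))  # 40 bp homology + 20 bp tail -> 60
    ```

    Batch planning, as the web tool does, simply repeats this for every gene in a list, plus verification primers flanking each insertion site.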

  20. An Educational Tool for Interactive Parallel and Distributed Processing

    DEFF Research Database (Denmark)

    Pagliarini, Luigi; Lund, Henrik Hautop

    2011-01-01

    In this paper we try to describe how the Modular Interactive Tiles System (MITS) can be a valuable tool for introducing students to interactive parallel and distributed processing programming. This is done by providing an educational hands-on tool that allows a change of representation of the abstract problems related to designing interactive parallel and distributed systems. Indeed, MITS seems to bring a series of goals into the education, such as parallel programming, distributedness, communication protocols, master dependency, software behavioral models, adaptive interactivity, feedback, connectivity, topology, island modeling, and user and multiuser interaction, which can hardly be found in other tools. Finally, we introduce the system of modular interactive tiles as a tool for easy, fast, and flexible hands-on exploration of these issues, and through examples show how to implement interactive parallel and distributed processing.

  1. Lightweight object oriented structure analysis: tools for building tools to analyze molecular dynamics simulations.

    Science.gov (United States)

    Romo, Tod D; Leioatts, Nicholas; Grossfield, Alan

    2014-12-15

    LOOS (Lightweight Object Oriented Structure-analysis) is a C++ library designed to facilitate making novel tools for analyzing molecular dynamics simulations by abstracting out the repetitive tasks, allowing developers to focus on the scientifically relevant part of the problem. LOOS supports input using the native file formats of most common biomolecular simulation packages, including CHARMM, NAMD, Amber, Tinker, and Gromacs. A dynamic atom selection language based on the C expression syntax is included and is easily accessible to the tool-writer. In addition, LOOS is bundled with over 140 prebuilt tools, including suites of tools for analyzing simulation convergence, three-dimensional histograms, and elastic network models. Through modern C++ design, LOOS is both simple to develop with (requiring knowledge of only four core classes and a few utility functions) and is easily extensible. A python interface to the core classes is also provided, further facilitating tool development. © 2014 Wiley Periodicals, Inc.
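
    To convey what a C-style atom-selection language buys the tool-writer, here is a conceptual mini-evaluator. This is emphatically NOT LOOS's actual API or grammar (it uses Python's `and`/comparison syntax rather than C expressions); it only illustrates the idea of filtering atoms with a user-supplied boolean expression over atom fields.

    ```python
    # Conceptual sketch of an atom-selection expression filter; not LOOS's API.
    atoms = [
        {"name": "CA", "resname": "ALA", "resid": 1},
        {"name": "CB", "resname": "ALA", "resid": 1},
        {"name": "CA", "resname": "GLY", "resid": 2},
    ]

    def select(atoms, expr):
        # Each atom's fields become variables visible to the expression.
        return [a for a in atoms if eval(expr, {}, a)]

    calphas = select(atoms, 'name == "CA" and resid <= 2')
    print([a["resname"] for a in calphas])  # -> ['ALA', 'GLY']
    ```

    A real implementation (as in LOOS) parses the expression once into an AST and evaluates it per atom, which is both safer and faster than `eval`.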

  2. Orangutans (Pongo spp.) may prefer tools with rigid properties to flimsy tools.

    Science.gov (United States)

    Walkup, Kristina R; Shumaker, Robert W; Pruetz, Jill D

    2010-11-01

    Preference for tools with either rigid or flexible properties was explored in orangutans (Pongo spp.) through an extension of D. J. Povinelli, J. E. Reaux, and L. A. Theall's (2000) flimsy-tool problem. Three captive orangutans were presented with three unfamiliar pairs of tools to solve a novel problem. Although each orangutan has spontaneously used tools in the past, the tools presented in this study were novel to the apes. Each pair of tools contained one tool with rigid properties (functional) and one tool with flimsy properties (nonfunctional). Solving the problem required selection of a rigid tool to retrieve a food reward. The functional tool was selected in nearly all trials. Moreover, two of the orangutans demonstrated this within the first test trials with each of the three tool types. Although further research is required to test this statistically, it suggests either a preexisting preference for rigid tools or comprehension of the relevant features required in a tool to solve the task. The results of this study demonstrate that orangutans can recognize, or learn to recognize, relevant tool properties and can choose an appropriate tool to solve a problem. (PsycINFO Database Record (c) 2010 APA, all rights reserved).

  3. The Future of Earthquake Relocation Tools

    Science.gov (United States)

    Lecocq, T.; Caudron, C.

    2010-12-01

    Many scientists around the world use earthquake relocation software in their research. Some use well-known packages such as HYPODD or COMPLOC, while others use their own algorithms and codes. Beginners often struggle to get a tool running or to configure its input parameters properly. This poster will bear witness to debates taking place during the meeting, for example addressing questions such as "Which program for which application?"; "Standardized inputs/outputs?"; "Tectonic, volcanic, or other?"; "All programs inside one single super-package?"; "A common base bibliography for the relocation beginner?"; "Continuous or layered velocity model?"; etc. We will also present the outline of a super-package we are working on, grouping HYPODD [Waldhauser 2001], COMPLOC [Lin & Shearer 2006], and LOTOS [Koulakov 2009], allowing standard inputs/outputs for the three programs and thus the comparison of their outputs.

  4. Frontal affinity chromatography: A unique research tool for biospecific interaction that promotes glycobiology

    Science.gov (United States)

    KASAI, Kenichi

    2014-01-01

    The combination of bioaffinity and chromatography gave birth to affinity chromatography; a further combination with frontal analysis resulted in the creation of frontal affinity chromatography (FAC). This versatile research tool enables detailed analysis of the weak interactions that play essential roles in living systems, especially those between complex saccharides and saccharide-binding proteins. FAC has become the best method for the investigation of saccharide-binding proteins (lectins) from the viewpoints of sensitivity, accuracy, and efficiency, and is contributing greatly to the development of glycobiology, opening a door to a deeper understanding of the significance of saccharide recognition in life. The theory is also concisely described. PMID:25169774

  5. U.S. Geological Survey community for data integration: data upload, registry, and access tool

    Science.gov (United States)

    ,

    2012-01-01

    As a leading science and information agency and in fulfillment of its mission to provide reliable scientific information to describe and understand the Earth, the U.S. Geological Survey (USGS) ensures that all scientific data are effectively hosted, adequately described, and appropriately accessible to scientists, collaborators, and the general public. To succeed in this task, the USGS established the Community for Data Integration (CDI) to address data and information management issues affecting the proficiency of earth science research. Through the CDI, the USGS is providing data and metadata management tools, cyber infrastructure, collaboration tools, and training in support of scientists and technology specialists throughout the project life cycle. One of the significant tools recently created to contribute to this mission is the Uploader tool. This tool allows scientists with limited data management resources to address many of the key aspects of the data life cycle: the ability to protect, preserve, publish and share data. By implementing this application inside ScienceBase, scientists also can take advantage of other collaboration capabilities provided by the ScienceBase platform.

  6. Facebook: an effective tool for participant retention in longitudinal research.

    Science.gov (United States)

    Mychasiuk, R; Benzies, K

    2012-09-01

    Facebook is currently one of the world's most visited websites, and home to millions of users who access their accounts on a regular basis. Owing to the website's ease of accessibility and free service, demographic characteristics of users span all domains. As such, Facebook may be a valuable tool for locating and communicating with participants in longitudinal research studies. This article outlines the benefit gained in a longitudinal follow-up study, of an intervention programme for at-risk families, through the use of Facebook as a search engine. Using Facebook as a resource, we were able to locate 19 participants that were otherwise 'lost' to follow-up, decreasing attrition in our study by 16%. Additionally, analysis indicated that hard-to-reach participants located with Facebook differed significantly on measures of receptive language and self-esteem when compared to their easier-to-locate counterparts. These results suggest that Facebook is an effective means of improving participant retention in a longitudinal intervention study and may help improve study validity by reaching participants that contribute differing results. © 2011 Blackwell Publishing Ltd.

  7. Research on PCPV for BWR - physical model as design tool - main results

    International Nuclear Information System (INIS)

    Fumagalli, E.; Verdelli, G.

    1975-01-01

    ISMES (Experimental Institute for Models and Structures) is carrying out a series of tests on physical models as part of a research programme sponsored by DSR (Studies and Research Direction) of ENEL (Italian State Electricity Board), on behalf of CPN (Nuclear Design and Construction Centre) of ENEL, with the aim of testing a thin-walled PCPV for a BWR. The physical model, together with the mathematical model and the rheological model of the materials, is intended as a meaningful design tool. The mathematical model covers the overall structural design phase (geometries) and the linear behaviour, whereas the physical model, besides providing global information to be compared with the results of the mathematical model, supplies data on the non-linear behaviour up to failure and on local conditions (penetration area, etc.). The aim of the first phase of this research programme is to compare calculations and experimental tests with regard to the thicknesses of the wall and the bottom slab, whereas the second phase deals with the behaviour of the removable lid and its connection with the main structure. To this end, a model at scale 1:10 was designed which reproduces, symmetrically with respect to the equator, the bottom part of the structure. The penetrations of the prototype design are reproduced in the bottom slab, whereas the upper slab is plain. This paper describes the model and illustrates the main results, underlining the different behaviour of the upper and bottom slabs up to collapse

  8. FIM imaging and FIMtrack: two new tools allowing high-throughput and cost effective locomotion analysis.

    Science.gov (United States)

    Risse, Benjamin; Otto, Nils; Berh, Dimitri; Jiang, Xiaoyi; Klämbt, Christian

    2014-12-24

    The analysis of neuronal network function requires a reliable measurement of behavioral traits. Since the behavior of freely moving animals is variable to a certain degree, many animals have to be analyzed to obtain statistically significant data. This in turn requires a computer-assisted, automated quantification of locomotion patterns. To obtain high-contrast images of almost translucent and small moving objects, a novel imaging technique based on frustrated total internal reflection, called FIM, was developed. In this setup, animals are only illuminated with infrared light at the very specific position of contact with the underlying crawling surface. This methodology results in very high contrast images. Subsequently, these high-contrast images are processed using established contour tracking algorithms. Based on this, we developed the FIMTrack software, which serves to extract a number of features needed to quantitatively describe a large variety of locomotion characteristics. During the development of this software package, we focused our efforts on an open-source architecture allowing the easy addition of further modules. The program is platform independent and is accompanied by an intuitive GUI guiding the user through data analysis. All locomotion parameter values are given in the form of CSV files, allowing further data analyses. In addition, a Results Viewer integrated into the tracking software provides the opportunity to interactively review and adjust the output, as might be needed during stimulus integration. The power of FIM and FIMTrack is demonstrated by studying the locomotion of Drosophila larvae.
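    Because FIMTrack exports locomotion parameter values as CSV files, downstream analyses are easy to script. A minimal sketch follows, assuming a hypothetical per-frame centroid export; the column names are illustrative, not FIMTrack's actual output schema.

```python
import csv
import io
import math

# Hypothetical FIMTrack-style CSV export: one row per frame with the
# larva's centroid position (column names are illustrative only).
CSV_TEXT = """frame,x_mm,y_mm
0,0.0,0.0
1,0.3,0.4
2,0.9,1.2
3,1.2,1.6
"""

def mean_speed(csv_text, fps=10.0):
    """Average centroid speed (mm/s) from frame-wise positions."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    dists = []
    for prev, cur in zip(rows, rows[1:]):
        dx = float(cur["x_mm"]) - float(prev["x_mm"])
        dy = float(cur["y_mm"]) - float(prev["y_mm"])
        dists.append(math.hypot(dx, dy))
    # mean distance per frame times frames per second -> mm/s
    return sum(dists) / len(dists) * fps

print(round(mean_speed(CSV_TEXT), 2))
```

    The same pattern extends to any of the exported parameters; one script per question, with the CSV file as the stable interface.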

  9. Toxicity Estimation Software Tool (TEST)

    Science.gov (United States)

    The Toxicity Estimation Software Tool (TEST) was developed to allow users to easily estimate the toxicity of chemicals using Quantitative Structure Activity Relationships (QSARs) methodologies. QSARs are mathematical models used to predict measures of toxicity from the physical c...
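    As a toy illustration of the QSAR idea described above (a mathematical model mapping a physico-chemical descriptor to a toxicity measure), the sketch below fits a one-descriptor linear model by least squares. The descriptor values and endpoint are invented; TEST's actual models use far richer descriptor sets and methodologies.

```python
# Minimal one-descriptor QSAR sketch: fit log(toxicity) ~ a * logP + b
# by ordinary least squares. All numbers below are made up.

def fit_line(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return slope, intercept

log_p = [1.0, 2.0, 3.0, 4.0]      # hypothetical hydrophobicity descriptor
log_lc50 = [0.9, 2.1, 2.9, 4.1]   # hypothetical toxicity endpoint

a, b = fit_line(log_p, log_lc50)
predicted = a * 2.5 + b           # predict for a new chemical
print(round(a, 2), round(b, 2), round(predicted, 2))
```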

  10. CHILD ALLOWANCE

    CERN Multimedia

    Human Resources Division

    2001-01-01

    HR Division wishes to clarify to members of the personnel that the allowance for a dependent child continues to be paid during all training courses ('stages'), apprenticeships, 'contrats de qualification', sandwich courses or other courses of similar nature. Any payment received for these training courses, including apprenticeships, is however deducted from the amount reimbursable as school fees. HR Division would also like to draw the attention of members of the personnel to the fact that any contract of employment will lead to the suppression of the child allowance and of the right to reimbursement of school fees.

  11. Integration of ROOT Notebooks as a Web-based ATLAS Analysis tool for public data releases and outreach

    CERN Document Server

    Banda, Tea; CERN. Geneva. EP Department

    2016-01-01

    The project consists of the initial development of ROOT notebooks for a Z boson analysis in the C++ programming language that will allow students and researchers to perform fast and very useful data analysis, using ATLAS public data and Monte Carlo simulations. Several tools are considered: the ROOT Data Analysis Framework, Jupyter Notebook technology and the CERN ROOT computing service SWAN.

  12. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2010-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  13. Security for ICT collaboration tools

    NARCIS (Netherlands)

    Broenink, E.G.; Kleinhuis, G.; Fransen, F.

    2011-01-01

    In order for collaboration tools to be productive in an operational setting, an information base that is shared across the collaborating parties is needed. Therefore, a lot of research is done for tooling to create such a common information base in a collaboration tool. However, security is often

  14. An introduction to joint research by the USEPA and USGS on contaminants of emerging concern in source and treated drinking waters of the United States

    Science.gov (United States)

    Improvements in analytical methodology have allowed low-level detection of an ever increasing number of pharmaceuticals, personal care products, hormones, pathogens and other contaminants of emerging concern (CECs). The use of these improved analytical tools has allowed researche...

  15. New Tools and Methods for Assessing Risk-Management Strategies

    National Research Council Canada - National Science Library

    Vendlinski, Terry P; Munro, Allen; Chung, Gregory K; De la Cruz, Girlie C; Pizzini, Quentin A; Bewley, William L; Stuart, Gale; Baker, Eva L

    2004-01-01

    .... The Decision Analysis Tool (DAT) allowed subjects to use Expected Value and Multi-attribute Utility Theories to evaluate the risks and benefits of various acquisition alternatives, and allowed us to monitor the process subjects used...

  16. Trajectory Calculation as Forecasting Support Tool for Dust Storms

    Directory of Open Access Journals (Sweden)

    Sultan Al-Yahyai

    2014-01-01

    In arid and semiarid regions, dust storms are common during windy seasons. Strong wind can blow loose sand from the dry surface. The rising sand and dust is then transported to other places depending on the wind conditions (speed and direction) at different levels of the atmosphere. Considering dust as a moving object in space and time, trajectory calculation can then be used to determine the path it will follow. Trajectory calculation is used as a forecast-supporting tool for both operational and research activities. Predefined dust sources can be identified and the trajectories can be precalculated from the Numerical Weather Prediction (NWP) forecast. In the case of long-distance transported dust, the tool should allow the operational forecaster to perform online trajectory calculation. This paper presents a case study of using trajectory calculation based on NWP models as a forecast-supporting tool in the Oman Meteorological Service during some dust storm events. Case study validation results showed a good agreement between the calculated trajectories and the real transport paths of the dust storms, and hence trajectory calculation can be used at operational centers for warning purposes.
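    The trajectory idea above can be sketched in a few lines: advect a parcel through a horizontal wind field with explicit Euler steps. The constant wind below is an idealization; an operational tool would sample NWP model winds at the parcel's position, level, and time.

```python
# Minimal forward-trajectory sketch for a dust parcel.

def wind(x_km, y_km, t_h):
    """Wind vector (km/h). Here: constant 20 km/h eastward, 10 northward."""
    return 20.0, 10.0

def trajectory(x0, y0, hours, dt=0.5):
    """Integrate the parcel position forward with Euler steps of dt hours."""
    path = [(x0, y0)]
    x, y, t = x0, y0, 0.0
    while t < hours:
        u, v = wind(x, y, t)
        x += u * dt
        y += v * dt
        t += dt
        path.append((x, y))
    return path

path = trajectory(0.0, 0.0, hours=6.0)
print(path[-1])  # parcel position after 6 h: (120.0, 60.0)
```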

  17. Web Viz 2.0: A versatile suite of tools for collaboration and visualization

    Science.gov (United States)

    Spencer, C.; Yuen, D. A.

    2012-12-01

    Most scientific applications on the web fail to realize the full collaborative potential of the internet by not utilizing web 2.0 technology. To relieve users from the struggle with software tools and allow them to focus on their research, new software developed for scientists and researchers must harness the full suite of web technology. For several years WebViz 1.0 enabled researchers with any web accessible device to interact with the peta-scale data generated by the Hierarchical Volume Renderer (HVR) system. We have developed a new iteration of WebViz that can be easily interfaced with many problem domains in addition to HVR by employing the best practices of software engineering and object-oriented programming. This is done by separating the core WebViz system from domain specific code at an interface, leveraging inheritance and polymorphism to allow newly developed modules access to the core services. We employed several design patterns (model-view-controller, singleton, observer, and application controller) to engineer this highly modular system implemented in Java.
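    Among the design patterns the abstract names, the observer pattern is what lets newly added modules react to core state changes without coupling. A minimal Python sketch follows; the actual WebViz system is implemented in Java, and the class names here are invented for illustration.

```python
# Observer-pattern sketch: a model notifies registered views on change.

class RenderParameters:
    """Model: holds visualization state and notifies registered views."""
    def __init__(self):
        self._observers = []
        self._zoom = 1.0

    def attach(self, observer):
        self._observers.append(observer)

    def set_zoom(self, zoom):
        self._zoom = zoom
        for obs in self._observers:
            obs.update(self)

    @property
    def zoom(self):
        return self._zoom


class LoggingView:
    """View: records every state change it is notified about."""
    def __init__(self):
        self.seen = []

    def update(self, model):
        self.seen.append(model.zoom)


params = RenderParameters()
view = LoggingView()
params.attach(view)
params.set_zoom(2.0)
params.set_zoom(4.0)
print(view.seen)  # [2.0, 4.0]
```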

  18. How many research nurses for how many clinical trials in an oncology setting? Definition of the Nursing Time Required by Clinical Trial-Assessment Tool (NTRCT-AT).

    Science.gov (United States)

    Milani, Alessandra; Mazzocco, Ketti; Stucchi, Sara; Magon, Giorgio; Pravettoni, Gabriella; Passoni, Claudia; Ciccarelli, Chiara; Tonali, Alessandra; Profeta, Teresa; Saiani, Luisa

    2017-02-01

    Few resources are available to quantify clinical trial-associated workload, needed to guide staffing and budgetary planning. The aim of the study is to describe a tool to measure clinical trials nurses' workload expressed in time spent to complete core activities. Clinical trials nurses drew up a list of nursing core activities, integrating results from literature searches with personal experience. The final 30 core activities were timed for each research nurse by an outside observer during daily practice in May and June 2014. Average times spent by nurses for each activity were calculated. The "Nursing Time Required by Clinical Trial-Assessment Tool" was created as an electronic sheet that combines the average times per specified activities and mathematic functions to return the total estimated time required by a research nurse for each specific trial. The tool was tested retrospectively on 141 clinical trials. The increasing complexity of clinical research requires structured approaches to determine workforce requirements. This study provides a tool to describe the activities of a clinical trials nurse and to estimate the associated time required to deliver individual trials. The application of the proposed tool in clinical research practice could provide a consistent structure for clinical trials nursing workload estimation internationally. © 2016 John Wiley & Sons Australia, Ltd.
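    The tool's core arithmetic (average minutes per core activity combined with how often each activity occurs in a given trial) can be sketched as follows. The activity names and times are illustrative, not the study's measured values.

```python
# Workload-estimation sketch in the spirit of the NTRCT-AT: all numbers
# below are invented for illustration.

AVG_MINUTES = {
    "informed_consent": 45,
    "blood_sampling": 15,
    "data_entry_per_visit": 20,
}

def trial_nursing_minutes(activity_counts):
    """Total estimated nursing time for one trial, in minutes."""
    return sum(AVG_MINUTES[name] * n for name, n in activity_counts.items())

demo_trial = {"informed_consent": 10,
              "blood_sampling": 60,
              "data_entry_per_visit": 60}
print(trial_nursing_minutes(demo_trial))  # 45*10 + 15*60 + 20*60 = 2550
```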

  19. Hybrid Design Tools Intuit Interaction

    NARCIS (Netherlands)

    Wendrich, Robert E.; Kyvsgaard Hansen, P.; Rasmussen, J.; Jorgensen, K.A.; Tollestrup, C.

    2012-01-01

    Non-linear, non-explicit, non-standard thinking and ambiguity in design tools has a great impact on enhancement of creativity during ideation and conceptualization. Tacit-tangible representation based on a mere idiosyncratic and individual approach combined with computational assistance allows the

  20. PCTFPeval: a web tool for benchmarking newly developed algorithms for predicting cooperative transcription factor pairs in yeast.

    Science.gov (United States)

    Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng

    2015-01-01

    Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put a lot of effort into constructing it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results of each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Allowing users to select eight existing performance indices and 15
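    As an illustration of the kind of performance index such a framework computes, the sketch below scores a list of predicted cooperative TF pairs against a benchmark set using precision and recall. The TF names are examples, and these two indices merely stand in for the eight the tool actually offers.

```python
# Precision/recall of predicted cooperative TF pairs vs. a benchmark set.

def pair_key(a, b):
    return tuple(sorted((a, b)))  # TF pairs are unordered

def precision_recall(predicted, benchmark):
    pred = {pair_key(*p) for p in predicted}
    bench = {pair_key(*p) for p in benchmark}
    hits = pred & bench
    return len(hits) / len(pred), len(hits) / len(bench)

predicted = [("MBP1", "SWI6"), ("SWI4", "SWI6"), ("GCN4", "FHL1")]
benchmark = [("SWI6", "MBP1"), ("SWI4", "SWI6"),
             ("STE12", "TEC1"), ("INO2", "INO4")]

p, r = precision_recall(predicted, benchmark)
print(round(p, 2), round(r, 2))  # 0.67 0.5
```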

  1. DengueTools: innovative tools and strategies for the surveillance and control of dengue.

    Science.gov (United States)

    Wilder-Smith, Annelies; Renhorn, Karl-Erik; Tissera, Hasitha; Abu Bakar, Sazaly; Alphey, Luke; Kittayapong, Pattamaporn; Lindsay, Steve; Logan, James; Hatz, Christoph; Reiter, Paul; Rocklöv, Joacim; Byass, Peter; Louis, Valérie R; Tozan, Yesim; Massad, Eduardo; Tenorio, Antonio; Lagneau, Christophe; L'Ambert, Grégory; Brooks, David; Wegerdt, Johannah; Gubler, Duane

    2012-01-01

    Dengue fever is a mosquito-borne viral disease estimated to cause about 230 million infections worldwide every year, of which 25,000 are fatal. Global incidence has risen rapidly in recent decades with some 3.6 billion people, over half of the world's population, now at risk, mainly in urban centres of the tropics and subtropics. Demographic and societal changes, in particular urbanization, globalization, and increased international travel, are major contributors to the rise in incidence and geographic expansion of dengue infections. Major research gaps continue to hamper the control of dengue. The European Commission launched a call under the 7th Framework Programme with the title of 'Comprehensive control of Dengue fever under changing climatic conditions'. Fourteen partners from several countries in Europe, Asia, and South America formed a consortium named 'DengueTools' to respond to the call to achieve better diagnosis, surveillance, prevention, and predictive models and improve our understanding of the spread of dengue to previously uninfected regions (including Europe) in the context of globalization and climate change. The consortium comprises 12 work packages to address a set of research questions in three areas: Research area 1: Develop a comprehensive early warning and surveillance system that has predictive capability for epidemic dengue and benefits from novel tools for laboratory diagnosis and vector monitoring. Research area 2: Develop novel strategies to prevent dengue in children. Research area 3: Understand and predict the risk of global spread of dengue, in particular the risk of introduction and establishment in Europe, within the context of parameters of vectorial capacity, global mobility, and climate change. In this paper, we report on the rationale and specific study objectives of 'DengueTools'. DengueTools is funded under the Health theme of the Seventh Framework Programme of the European Community, Grant Agreement Number: 282589 Dengue Tools.

  2. Authoring Issues beyond Tools

    Science.gov (United States)

    Spierling, Ulrike; Szilas, Nicolas

    Authoring is still considered a bottleneck in successful Interactive Storytelling and Drama. The claim for intuitive authoring tools is high, especially for tools that allow storytellers and artists to define dynamic content that can be run with an AI-based story engine. We explored two concrete authoring processes in depth, using various Interactive Storytelling prototypes, and have provided feedback from the practical steps. The result is a presentation of general issues in authoring Interactive Storytelling, rather than of particular problems with a specific system that could be overcome by 'simply' designing the right interface. Priorities for future developments are also outlined.

  3. The GNEMRE Dendro Tool.

    Energy Technology Data Exchange (ETDEWEB)

    Merchant, Bion John

    2007-10-01

    The GNEMRE Dendro Tool provides a previously unrealized analysis capability in the field of nuclear explosion monitoring. Dendro Tool allows analysts to quickly and easily determine the similarity between seismic events using the waveform time-series for each of the events to compute cross-correlation values. Events can then be categorized into clusters of similar events. This analysis technique can be used to characterize historical archives of seismic events in order to determine many of the unique sources that are present. In addition, the source of any new events can be quickly identified simply by comparing the new event to the historical set.
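    The similarity measure behind this kind of analysis can be sketched as the peak normalized cross-correlation between two waveform time series; real tools add filtering, windowing, and precise alignment that are omitted here, and the waveforms below are synthetic.

```python
import math

# Peak normalized cross-correlation between two time series. Values near
# 1.0 suggest events with a common source.

def norm_xcorr_peak(a, b, max_lag=5):
    def corr_at(lag):
        pairs = [(a[i], b[i + lag]) for i in range(len(a))
                 if 0 <= i + lag < len(b)]
        if len(pairs) < 2:
            return 0.0
        xs, ys = zip(*pairs)
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        num = sum((x - mx) * (y - my) for x, y in pairs)
        den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                        sum((y - my) ** 2 for y in ys))
        return num / den if den else 0.0
    return max(corr_at(lag) for lag in range(-max_lag, max_lag + 1))

wave = [0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0, -1.0]
shifted = [0.0, 0.0, 1.0, 0.0, -1.0, 0.0, 1.0, 0.0]  # same event, delayed

print(round(norm_xcorr_peak(wave, shifted), 2))  # 1.0, peak at lag 1
```

    Clustering then follows naturally: events whose pairwise peak correlation exceeds a chosen threshold are grouped as candidates for a common source.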

  4. Emerging role of bioinformatics tools and software in evolution of clinical research

    Directory of Open Access Journals (Sweden)

    Supreet Kaur Gill

    2016-01-01

    Clinical research is making tireless efforts to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases like cancer, hepatitis and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation and lead optimization. This is followed by preclinical trials, intensive clinical trials and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics in the development of new knowledge pertaining to health and disease, data management during clinical trials, and the use of clinical data for secondary research. In addition, new technologies like molecular docking, molecular dynamics simulation, proteomics and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization to remove bias and to plan study design. In clinical trials, software like electronic data capture, remote data capture and the electronic case report form (eCRF) is used to store the data. eClinical and Oracle Clinical are software used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored by drug safety software like Oracle Argus or ARISg. Therefore, software is used from the very early stages of drug design through drug development, clinical trials and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design and clinical research.

  5. Programming database tools for the casual user

    International Nuclear Information System (INIS)

    Katz, R.A.; Griffiths, C.

    1990-01-01

    The AGS Distributed Control System (AGSDCS) uses a relational database management system (INTERBASE) for the storage of all data associated with the control of the particle accelerator complex. This includes the static data which describes the component devices of the complex, as well as data for application program startup and data records that are used in analysis. Due to licensing restrictions, it was necessary to develop tools that allow programs requiring access to a database to be unconcerned with whether or not they are running on a licensed node. An in-house database server program was written, using Apollo mailbox communication protocols, allowing application programs to access the INTERBASE database via calls to this server. Initially, the tools used by the server to actually access the database were written using the GDML C host language interface. Through an evolutionary learning process these tools have been converted to Dynamic SQL. Additionally, these tools have been extracted from the exclusive province of the database server and placed in their own library. This enables application programs to use these same tools on a licensed node without using the database server and without having to modify the application code. The syntax of the C calls remains the same.

  6. Building a Community Infrastructure for Scalable On-Line Performance Analysis Tools around Open|SpeedShop

    Energy Technology Data Exchange (ETDEWEB)

    Galarowicz, James E. [Krell Institute, Ames, IA (United States); Miller, Barton P. [Univ. of Wisconsin, Madison, WI (United States). Computer Sciences Dept.; Hollingsworth, Jeffrey K. [Univ. of Maryland, College Park, MD (United States). Computer Sciences Dept.; Roth, Philip [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Future Technologies Group, Computer Science and Math Division; Schulz, Martin [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States). Center for Applied Scientific Computing (CASC)

    2013-12-19

    In this project we created a community tool infrastructure for program development tools targeting Petascale class machines and beyond. This includes tools for performance analysis, debugging, and correctness tools, as well as tuning and optimization frameworks. The developed infrastructure provides a comprehensive and extensible set of individual tool building components. We started with the basic elements necessary across all tools in such an infrastructure, followed by a set of generic core modules that allow a comprehensive performance analysis at scale. Further, we developed a methodology and workflow that allows others to add or replace modules, to integrate parts into their own tools, or to customize existing solutions. In order to form the core modules, we built on the existing Open|SpeedShop infrastructure and decomposed it into individual modules that match the necessary tool components. At the same time, we addressed the challenges found in performance tools for petascale systems in each module. When assembled, this instantiation of community tool infrastructure provides an enhanced version of Open|SpeedShop, which, while completely different in its architecture, provides scalable performance analysis for petascale applications through a familiar interface. This project also built upon and enhances capabilities and reusability of project partner components as specified in the original project proposal. The overall project team’s work over the project funding cycle was focused on several areas of research, which are described in the following sections. The remainder of this report also highlights related work as well as preliminary work that supported the project. In addition to the project partners funded by the Office of Science under this grant, the project team included several collaborators who contribute to the overall design of the envisioned tool infrastructure. In particular, the project team worked closely with the other two DOE NNSA

  7. Designing Research Services: Cross-Disciplinary Administration and the Research Lifecycle

    Science.gov (United States)

    Madden, G.

    2017-12-01

    The sheer number of technical and administrative offices involved in the research lifecycle, and the lack of shared governance and shared processes across those offices, creates challenges to the successful preservation of research outputs. Universities need a more integrated approach to the research lifecycle that allows us to: recognize a research project as it is being initiated; identify the data associated with the research project; document and track any compliance, security, access, and publication requirements associated with the research and its data; follow the research and its associated components across the research lifecycle; and finally recognize that the research has come to a close so we can trigger the various preservation, access, and communications processes that close the loop, inform the public, and promote the continued progress of science. Such an approach will require cooperation, communications, and shared workflow tools that tie together (often across many years) PIs, research design methodologists, grants offices, contract negotiators, central research administrators, research compliance specialists, desktop IT support units, server administrators, high performance computing facilities, data centers, specialized data transfer networks, institutional research repositories, institutional data repositories, and research communications groups, all of which play a significant role in the technical or administrative success of research. This session will focus on progress towards improving cross-disciplinary administrative and technical cooperation at Penn State University, with an emphasis on generalizable approaches that can be adopted elsewhere.

  8. PET in tumor imaging: research only or a cost effective clinical tool?

    International Nuclear Information System (INIS)

    Wahl, R.L.

    1997-01-01

    PET imaging has for many years been a versatile tool for non-invasive imaging of neuro-physiology and, indeed, whole body physiology. Quantitative PET imaging of trace amounts of radioactivity is scientifically elegant and can be very complex. This lecture focuses on whether and where this test is clinically useful. Because of the research tradition, PET imaging has been perceived as an 'expensive' test, as it costs more per scan than CT and MRI scans at most institutions. Such a superficial analysis is incorrect, however, as it is increasingly recognized that imaging costs, which in some circumstances will be increased by the use of PET, are only a relatively small component of patient care costs. Thus, PET may raise imaging costs and the number of imaging procedures in some settings, though PET may reduce imaging test numbers in other settings. However, the analysis must focus on the total costs of patient management. Analyses focused on total patient care costs, including the cost of hospitalization and the cost of surgery as well as imaging costs, have shown that PET can substantially reduce total patient care costs in several settings. This is achieved by providing a more accurate diagnosis, and thus having fewer instances of an incorrect diagnosis resulting in subsequent inappropriate surgery or investigations. Several institutions have shown scenarios in which PET for tumor imaging is cost effective. While the specific results of the analyses vary based on disease prevalence and cost input values for each procedure, as well as the projected performance of PET, the similar results showing total care cost savings in the management of several common cancers strongly support the rationale for the use of PET in cancer management. In addition, promising clinical results are forthcoming in several other illnesses, suggesting PET will have broader utility than these uses, alone. Thus, while PET is an 'expensive' imaging procedure and has considerable utility as a research

  9. MicroTracker: a Data Management Tool for Facilitating the Education of Undergraduate Students in Laboratory Research Environments

    Directory of Open Access Journals (Sweden)

    David Ammons

    2010-10-01

    Many undergraduate laboratories are, too often, little more than an exercise in “cooking” where students are instructed step-by-step what to add, mix, and, most unfortunately, expect as an outcome. Although the shortcomings of “cookbook” laboratories are well known, they are considerably easier to manage than the more desirable inquiry-based laboratories. Thus the ability to quickly access, share, sort, and analyze research data would make a significant contribution towards the feasibility of teaching/mentoring large numbers of inexperienced students in an inquiry-based research environment, as well as facilitating research collaborations among students. Herein we report on a software tool (MicroTracker) designed to address the educational problems that we experienced with inquiry-based research education due to constraints on data management and accessibility.

  10. HMMEditor: a visual editing tool for profile hidden Markov model

    Directory of Open Access Journals (Sweden)

    Cheng Jianlin

    2008-03-01

    Abstract Background Profile Hidden Markov Model (HMM) is a powerful statistical model to represent a family of DNA, RNA, and protein sequences. Profile HMM has been widely used in bioinformatics research such as sequence alignment, gene structure prediction, motif identification, protein structure prediction, and biological database search. However, few comprehensive, visual editing tools for profile HMM are publicly available. Results We develop a visual editor for profile Hidden Markov Models (HMMEditor). HMMEditor can visualize the profile HMM architecture, transition probabilities, and emission probabilities. Moreover, it provides functions to edit and save HMMs and parameters. Furthermore, HMMEditor allows users to align a sequence against the profile HMM and to visualize the corresponding Viterbi path. Conclusion HMMEditor provides a set of unique functions to visualize and edit a profile HMM. It is a useful tool for biological sequence analysis and modeling. Both the HMMEditor software and web service are freely available.
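    The Viterbi path that HMMEditor visualizes comes from the standard Viterbi recurrence. A minimal sketch on a toy two-state HMM follows; a real profile HMM has match/insert/delete states per model position, and all probabilities below are invented to show the recurrence only.

```python
import math

# Toy two-state HMM and the Viterbi algorithm: find the most probable
# state path for an observed sequence, working in log space.
states = ("match", "insert")
start = {"match": 0.8, "insert": 0.2}
trans = {"match": {"match": 0.7, "insert": 0.3},
         "insert": {"match": 0.4, "insert": 0.6}}
emit = {"match": {"A": 0.6, "C": 0.4},
        "insert": {"A": 0.1, "C": 0.9}}

def viterbi(seq):
    # v[s] = best log-probability of any path ending in state s
    v = {s: math.log(start[s]) + math.log(emit[s][seq[0]]) for s in states}
    back = []
    for sym in seq[1:]:
        nv, ptr = {}, {}
        for s in states:
            prev = max(states, key=lambda p: v[p] + math.log(trans[p][s]))
            nv[s] = v[prev] + math.log(trans[prev][s]) + math.log(emit[s][sym])
            ptr[s] = prev
        v, back = nv, back + [ptr]
    # backtrack from the best final state
    last = max(states, key=v.get)
    path = [last]
    for ptr in reversed(back):
        path.append(ptr[path[-1]])
    return list(reversed(path))

print(viterbi("AACC"))  # ['match', 'match', 'insert', 'insert']
```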

  11. Evolution of allowable stresses in shear for lumber

    Science.gov (United States)

    Robert L. Ethington; William L. Galligan; Henry M. Montrey; Alan D. Freas

    1979-01-01

    This paper surveys research leading to allowable shear stress parallel to grain for lumber. In early flexure tests of lumber, some pieces failed in shear. The estimated shear stress at time of failure was generally lower than shear strength measured on small, clear, straight-grained specimens. This and other engineering observations gave rise to adjustments that...

  12. What Does the Future Hold for Scientific Journals? Visual Abstracts and Other Tools for Communicating Research.

    Science.gov (United States)

    Nikolian, Vahagn C; Ibrahim, Andrew M

    2017-09-01

    Journals fill several important roles within academic medicine, including building knowledge, validating quality of methods, and communicating research. This section provides an overview of these roles and highlights innovative approaches journals have taken to enhance dissemination of research. As journals move away from print formats and embrace web-based content, design-centered thinking will allow for engagement of a larger audience. Examples of recent efforts in this realm are provided, as well as simplified strategies for developing visual abstracts to improve dissemination via social media. Finally, we hone in on principles of learning and education which have driven these advances in multimedia-based communication in scientific research.

  13. FlowPing - The New Tool for Throughput and Stress Testing

    Directory of Open Access Journals (Sweden)

    Ondrej Vondrous

    2015-01-01

    This article presents a new tool for network throughput and stress testing. The FlowPing tool is easy to use, and its basic output is very similar to the standard Linux ping application. The FlowPing tool is not limited to reachability or round-trip time testing but is capable of complex UDP-based throughput stress testing with rich reporting capabilities on the client and server sides. Our new tool implements features which allow the user to perform tests with variable packet size and traffic rate. All these features can be used in one single test run. This allows the user to use and develop new methodologies for network throughput and stress testing. With the FlowPing tool, it is easy to perform a test with a slowly increasing amount of network traffic and monitor the behavior of the network when congestion occurs.
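    A toy version of the UDP traffic such a tool generates can be sketched with standard sockets: send numbered datagrams over the loopback interface and count what arrives. A real throughput test like FlowPing's would additionally pace the packets, ramp the rate, and measure loss and timing on both sides.

```python
import socket

# Send a burst of numbered UDP datagrams over loopback and count arrivals.
server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))          # let the OS pick a free port
server.settimeout(1.0)
addr = server.getsockname()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for seq in range(10):
    # 4-byte sequence number plus padding to a fixed datagram size
    client.sendto(seq.to_bytes(4, "big") + b"x" * 60, addr)

received = []
try:
    while len(received) < 10:
        data, _ = server.recvfrom(2048)
        received.append(int.from_bytes(data[:4], "big"))
except socket.timeout:
    pass                                # loopback rarely drops, but UDP may

client.close()
server.close()
print(len(received), "datagrams received")
```

    Sequence numbers are what let the receiving side report loss and reordering rather than just a packet count.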

  14. Watershed Management Optimization Support Tool (WMOST) ...

    Science.gov (United States)

EPA's Watershed Management Optimization Support Tool (WMOST) version 2 is a decision support tool designed to facilitate integrated water management by communities at the small watershed scale. WMOST allows users to look across management options in stormwater (including green infrastructure), wastewater, drinking water, and land conservation programs to find least-cost solutions. The PDF versions of these presentations accompany the recorded webinar, with closed captions, to be posted on the WMOST web page. The webinar was recorded during a training workshop for EPA's Watershed Management Optimization Support Tool (WMOST, v2).
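
    The "least cost solutions" search across management options can be illustrated with a toy example. WMOST itself formulates the problem as a mathematical program; the sketch below (option names, costs, and savings are all invented) simply enumerates subsets to find the cheapest mix meeting a water-savings target:

    ```python
    from itertools import combinations

    # Hypothetical management options: (name, annual cost in USD, water saved in ML/yr)
    OPTIONS = [
        ("green_infrastructure", 120_000, 40),
        ("leak_repair",           60_000, 25),
        ("land_conservation",    200_000, 90),
        ("wastewater_reuse",     150_000, 70),
    ]

    def least_cost_mix(options, target_saved):
        """Cheapest subset of options whose combined savings meet the target,
        or None if no subset suffices."""
        best = None
        for r in range(1, len(options) + 1):
            for combo in combinations(options, r):
                cost = sum(c for _, c, _ in combo)
                saved = sum(s for _, _, s in combo)
                if saved >= target_saved and (best is None or cost < best[0]):
                    best = (cost, [name for name, _, _ in combo])
        return best
    ```

    Exhaustive enumeration is exponential in the number of options, which is why a real tool uses an optimization solver rather than this brute-force search.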

  15. LabKey Server NAb: A tool for analyzing, visualizing and sharing results from neutralizing antibody assays

    Directory of Open Access Journals (Sweden)

    Gao Hongmei

    2011-05-01

    Full Text Available Abstract Background Multiple types of assays allow sensitive detection of virus-specific neutralizing antibodies. For example, the extent of antibody neutralization of HIV-1, SIV and SHIV can be measured in the TZM-bl cell line through the degree of luciferase reporter gene expression after infection. In the past, neutralization curves and titers for this standard assay have been calculated using an Excel macro. Updating all instances of such a macro with new techniques can be unwieldy and introduce non-uniformity across multi-lab teams. Using Excel also poses challenges in centrally storing, sharing and associating raw data files and results. Results We present LabKey Server's NAb tool for organizing, analyzing and securely sharing data, files and results for neutralizing antibody (NAb assays, including the luciferase-based TZM-bl NAb assay. The customizable tool supports high-throughput experiments and includes a graphical plate template designer, allowing researchers to quickly adapt calculations to new plate layouts. The tool calculates the percent neutralization for each serum dilution based on luminescence measurements, fits a range of neutralization curves to titration results and uses these curves to estimate the neutralizing antibody titers for benchmark dilutions. Results, curve visualizations and raw data files are stored in a database and shared through a secure, web-based interface. NAb results can be integrated with other data sources based on sample identifiers. It is simple to make results public after publication by updating folder security settings. Conclusions Standardized tools for analyzing, archiving and sharing assay results can improve the reproducibility, comparability and reliability of results obtained across many labs. LabKey Server and its NAb tool are freely available as open source software at http://www.labkey.com under the Apache 2.0 license. Many members of the HIV research community can also access the Lab
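
    The calculation chain the abstract describes (luminescence readings → percent neutralization per dilution → titer at a benchmark cutoff) can be sketched as follows. LabKey NAb fits full neutralization curves; this simplified stand-in uses the conventional percent-neutralization formula for control-normalized readings and linear interpolation to the 50% cutoff (function names are illustrative):

    ```python
    def percent_neutralization(sample_rlu, virus_ctrl_rlu, cell_ctrl_rlu):
        """Percent neutralization from luciferase readings (RLU), normalized
        against virus-only and cell-only control wells."""
        return 100.0 * (virus_ctrl_rlu - sample_rlu) / (virus_ctrl_rlu - cell_ctrl_rlu)

    def interpolated_titer(dilutions, neut_pct, cutoff=50.0):
        """Dilution factor at which neutralization crosses the cutoff, by linear
        interpolation between adjacent dilutions; None if no crossing."""
        pairs = sorted(zip(dilutions, neut_pct))  # ascending dilution factor
        for (d1, n1), (d2, n2) in zip(pairs, pairs[1:]):
            if (n1 - cutoff) * (n2 - cutoff) <= 0 and n1 != n2:
                return d1 + (cutoff - n1) * (d2 - d1) / (n2 - n1)
        return None
    ```

    For example, a titration of [95%, 80%, 40%, 10%] neutralization at 1:20, 1:60, 1:180 and 1:540 crosses 50% between 1:60 and 1:180; curve fitting (as in the actual tool) would smooth noise that this point-to-point interpolation passes through.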

  16. Department of defense environmental cleanup cost allowability policy. Master's thesis

    Energy Technology Data Exchange (ETDEWEB)

    Murdock, J.M.

    1994-12-01

The purpose of this thesis was to investigate the factors affecting the allowability determination of defense contractor environmental remediation costs. The primary objective of this thesis was to determine what policies and contracting cost principles the Department of Defense (DOD) should develop to address environmental costs in a consistent manner, providing a "single face" to industry. A secondary objective was to develop an audit framework and questions to allow for consistent policy analysis and application to a contractor's proposed environmental remediation costs based upon the materiality of the situation. Background material was presented to show the amount and complexity of environmental regulations, the effects of current judicial decisions, and DOD's efforts to develop a consistent policy. Research material was provided from Congress, the General Accounting Office, DOD, defense contractors, California, Washington, industry associations and environmental protection coalitions. The researcher's analysis of the material produced an environmental cost principle. This cost principle was applied to a current environmental claim, producing an audit framework and a tailored list of cost and/or pricing data analysis questions. Both the cost principle and the audit framework are recommended for incorporation into DOD's final environmental cost allowability decision.

  17. The Prospect of Neutron Scattering In the 21st Century: A Powerful Tool for Materials Research

    Directory of Open Access Journals (Sweden)

    E. Kartini

    2007-07-01

Full Text Available Over the last 60 years research reactors (RRs) have played an important role in the technological and socio-economic development of mankind, through radioisotope production for medicine, industry, research and education. Neutron scattering has been widely used for research and development in materials science, and its prospects as a powerful tool for materials research are growing in the 21st century. This can be seen from the investment in several new neutron sources all over the world, such as the Spallation Neutron Source (SNS) in the USA, the Japan Proton Accelerator Research Complex (J-PARC) in Japan, and the new OPAL Reactor in Australia, and from upgrades to existing sources at ISIS, Rutherford Appleton Laboratory, UK; the Institut Laue-Langevin (ILL) in Grenoble, France; and the Berlin Reactor, Germany. Developing countries with moderate-flux research reactors, such as India, Malaysia and Indonesia, have also adopted this technique. The Siwabessy Multipurpose Reactor in Serpong, Indonesia, which also produces thermal neutrons, has contributed to research and development in the Asia-Pacific region. International joint research among these countries, however, plays an important role in optimizing the results.

  18. The Prospect of Neutron Scattering in The 21st Century : A Powerful Tool For Materials Research

    International Nuclear Information System (INIS)

    E-Kartini

    2007-01-01

Over the last 60 years research reactors (RRs) have played an important role in the technological and socio-economic development of mankind, through radioisotope production for medicine, industry, research and education. Neutron scattering has been widely used for research and development in materials science, and its prospects as a powerful tool for materials research are growing in the 21st century. This can be seen from the investment in several new neutron sources all over the world, such as the Spallation Neutron Source (SNS) in the USA, the Japan Proton Accelerator Research Complex (J-PARC) in Japan, and the new OPAL Reactor in Australia, and from upgrades to existing sources at ISIS, Rutherford Appleton Laboratory, UK; the Institut Laue-Langevin (ILL) in Grenoble, France; and the Berlin Reactor, Germany. Developing countries with moderate-flux research reactors, such as India, Malaysia and Indonesia, have also adopted this technique. The Siwabessy Multipurpose Reactor in Serpong, Indonesia, which also produces thermal neutrons, has contributed to research and development in the Asia-Pacific region. International joint research among these countries, however, plays an important role in optimizing the results. (author)

  19. A software tool to estimate the dynamic behaviour of the IP2C samples as sensors for didactic purposes

    International Nuclear Information System (INIS)

    Graziani, S.; Pagano, F.; Pitrone, N.; Umana, E.

    2010-01-01

Ionic Polymer-Polymer Composites (IP2Cs) are emerging materials used to realize motion actuators and sensors. In the former case a voltage input causes the membrane to bend, while in the latter case bending an IP2C membrane produces a voltage output. In this paper the authors introduce a software tool able to estimate the dynamic behaviour of sensors based on IP2Cs working in air. In the proposed tool, the geometrical quantities that rule the sensing properties of IP2C-based transducers are taken into account together with their dynamic characteristics. A graphical user interface (GUI) has been developed to give users a tool for understanding the behaviour and the role of the parameters involved in the transduction phenomena. The tool is based on the idea that a graphical user interface will allow persons not skilled in IP2C materials to observe their behaviour and analyze their characteristics. This could greatly increase the interest of researchers in this new class of transducers; moreover, it can support the educational activity of students in advanced academic courses.
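
    The abstract does not give the sensor model the tool uses. As a purely illustrative stand-in, the dynamic behaviour of a hypothetical first-order IP2C sensor responding to a unit bending step could be simulated like this (gain and time constant are invented placeholders for the geometry-dependent parameters):

    ```python
    import math

    def sensor_step_response(gain, tau_s, dt_s, n_steps):
        """Sampled output voltage of a first-order model v(t) = gain * (1 - exp(-t/tau))
        responding to a unit bending step at t = 0 (hypothetical IP2C sensor model)."""
        return [gain * (1.0 - math.exp(-k * dt_s / tau_s)) for k in range(n_steps)]
    ```

    Plotting such responses for different parameter values is the kind of exploration a GUI like the one described would expose to students without requiring them to know the underlying material physics.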

  20. Patient- and Caregiver-Reported Assessment Tools for Palliative Care: Summary of the 2017 Agency for Healthcare Research and Quality Technical Brief.

    Science.gov (United States)

    Aslakson, Rebecca A; Dy, Sydney M; Wilson, Renee F; Waldfogel, Julie; Zhang, Allen; Isenberg, Sarina R; Blair, Alex; Sixon, Joshua; Lorenz, Karl A; Robinson, Karen A

    2017-12-01

    Assessment tools are data collection instruments that are completed by or with patients or caregivers and which collect data at the individual patient or caregiver level. The objectives of this study are to 1) summarize palliative care assessment tools completed by or with patients or caregivers and 2) identify needs for future tool development and evaluation. We completed 1) a systematic review of systematic reviews; 2) a supplemental search of previous reviews and Web sites, and/or 3) a targeted search for primary articles when no tools existed in a domain. Paired investigators screened search results, assessed risk of bias, and abstracted data. We organized tools by domains from the National Consensus Project Clinical Practice Guidelines for Palliative Care and selected the most relevant, recent, and highest quality systematic review for each domain. We included 10 systematic reviews and identified 152 tools (97 from systematic reviews and 55 from supplemental sources). Key gaps included no systematic review for pain and few tools assessing structural, cultural, spiritual, or ethical/legal domains, or patient-reported experience with end-of-life care. Psychometric information was available for many tools, but few studies evaluated responsiveness (sensitivity to change) and no studies compared tools. Few to no tools address the spiritual, ethical, or cultural domains or patient-reported experience with end-of-life care. While some data exist on psychometric properties of tools, the responsiveness of different tools to change and/or comparisons between tools have not been evaluated. Future research should focus on developing or testing tools that address domains for which few tools exist, evaluating responsiveness, and comparing tools. Copyright © 2017 American Academy of Hospice and Palliative Medicine. All rights reserved.