WorldWideScience

Sample records for analytical tool research

  1. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview of the activities of the research groups that have been involved in the fabrication, development and characterization of microplasmas for chemical analysis over the last few years is presented. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included. Finally, an overall assessment of the state of the art of analytical microplasma research is provided.

  2. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multi-gamma-ray sources; (3) the accuracy of the current and future γ-ray energy scale; and (4) new solid-state X- and γ-ray detectors.

  3. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
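
    The provenance-capture idea described above can be sketched in a few lines. The class below is a hypothetical toy, not the API of `recordr` or `matlab-dataone`: it records checksummed input and output entities for one scripted execution and serializes a PROV-flavored trace as JSON.

```python
# Minimal sketch of automatic provenance capture, loosely modeled on the
# ProvONE/W3C PROV idea above. All names here are hypothetical illustrations.
import hashlib
import json
import time


class ProvenanceRecorder:
    """Records which inputs were read and which outputs were written
    during one scripted 'execution', then serializes the trace."""

    def __init__(self, script_name):
        self.trace = {
            "prov:Activity": script_name,
            "prov:startedAtTime": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
            "prov:used": [],        # input entities
            "prov:generated": [],   # derived entities
        }

    def _entity(self, name, content):
        # A short content checksum stands in for fine-grained versioning.
        digest = hashlib.sha256(content.encode()).hexdigest()[:12]
        return {"prov:Entity": name, "checksum": digest}

    def used(self, name, content):
        self.trace["prov:used"].append(self._entity(name, content))

    def generated(self, name, content):
        self.trace["prov:generated"].append(self._entity(name, content))

    def publish(self):
        # In DataONE this would be packaged with a persistent identifier;
        # here we just return the JSON trace.
        return json.dumps(self.trace, indent=2)


rec = ProvenanceRecorder("analysis.py")
rec.used("input.csv", "site,temp\nA,21.5\n")
rec.generated("figure_data.json", '{"mean_temp": 21.5}')
print(rec.publish())
```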

  4. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  5. Biosensors: Future Analytical Tools

    Directory of Open Access Journals (Sweden)

    Vikas

    2007-02-01

    Biosensors offer considerable promise for obtaining analytical information in a faster, simpler and cheaper manner than conventional assays. The biosensing approach is rapidly advancing, and applications ranging from the detection of metabolites, biological/chemical warfare agents, food pathogens and adulterants to genetic screening and programmed drug delivery have been demonstrated. Innovative efforts coupling micromachining and nanofabrication may lead to even more powerful devices that would accelerate the realization of large-scale and routine screening. With the gradual increase in commercialization, a wide range of new biosensors is thus expected to reach the market in the coming years.

  6. Developing a learning analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

    This poster describes how learning analytics and collective intelligence can be combined in order to develop a tool for providing support and feedback to learners and teachers regarding students’ self-initiated learning activities.

  7. Narrative health research: Exploring big and small stories as analytical tools

    NARCIS (Netherlands)

    Sools, Anna Maria

    2013-01-01

    In qualitative health research, many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts of narrative and health, which are used in a variety of ways. The article builds on previous work that

  8. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

    Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. Many of these tools require researchers to develop programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that can help data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
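
    As a loose illustration of the notebook-to-APP practice described above (hypothetical code, not the authors' toolchain), an analytics pipeline can be declared as a plain function whose keyword parameters a front end such as Jupyter widgets could render as interactive controls. Only the stdlib introspection step is shown; the widget rendering itself is omitted.

```python
# Declare a toy analytics pipeline, then introspect its signature so a UI
# layer could expose each defaulted parameter as an interactive control.
import inspect
import statistics


def cohort_summary(values, trim=0.0, round_to=2):
    """Toy analytics pipeline: trimmed mean of a numeric cohort."""
    k = int(len(values) * trim)
    core = sorted(values)[k:len(values) - k] if k else sorted(values)
    return round(statistics.mean(core), round_to)


def describe_controls(pipeline):
    """List (name, default) pairs a front end could turn into widgets."""
    sig = inspect.signature(pipeline)
    return [(p.name, p.default) for p in sig.parameters.values()
            if p.default is not inspect.Parameter.empty]


print(describe_controls(cohort_summary))
print(cohort_summary([1, 2, 3, 100], trim=0.25))
```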

  9. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

    The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned and existing research to explain what is currently known about what analysts want and how to better understand what tools they do and don't need.

  10. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method will be summarized. The authors will also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students

  11. Social Network Analysis as an Analytic Tool for Task Group Research: A Case Study of an Interdisciplinary Community of Practice

    Science.gov (United States)

    Lockhart, Naorah C.

    2017-01-01

    Group counselors commonly collaborate in interdisciplinary settings in health care, substance abuse, and juvenile justice. Social network analysis is a methodology rarely used in counseling research yet has potential to examine task group dynamics in new ways. This case study explores the scholarly relationships among 36 members of an…

  12. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy products: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with the options differentiated by the level of process or plant detail: (1) plant level; (2) process-group level; and (3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases that were established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded the BEST-Dairy tool from the LBNL website. It is expected that use of the BEST-Dairy tool will advance understanding of energy and
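
    The benchmarking logic summarized above amounts to comparing a plant's energy intensity per unit of product against a best-practice reference. The sketch below is a toy illustration with invented reference numbers, not the actual BEST-Dairy algorithm or data.

```python
# Plant-level benchmark sketch: energy intensity vs. a best-practice
# reference, yielding an estimated annual savings potential.

REFERENCE_KWH_PER_TONNE = {   # hypothetical best-practice intensities
    "cheese": 900.0,
    "fluid milk": 150.0,
    "butter": 700.0,
    "milk powder": 3200.0,
}


def savings_potential(product, annual_energy_kwh, annual_output_tonnes):
    """Return (intensity, reference, estimated annual kWh savings)."""
    intensity = annual_energy_kwh / annual_output_tonnes
    ref = REFERENCE_KWH_PER_TONNE[product]
    savings = max(0.0, (intensity - ref) * annual_output_tonnes)
    return intensity, ref, savings


intensity, ref, savings = savings_potential("fluid milk", 2_000_000, 10_000)
print(f"{intensity:.0f} kWh/t vs reference {ref:.0f} kWh/t -> save {savings:.0f} kWh/yr")
```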

  13. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first section takes a qualitative focus, reviewing web analytics tools to explore their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts to obtain useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms in either an IT or a marketing branch. The paper contributes by highlighting the support that the web analytics and web metrics tools available on the market offer to management, based on the growing need to understand and predict global market trends.

  14. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

    The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proven invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health and environmental companies, has shown interest in CERES-derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that web-2.0 technologies offer for both quality control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions to address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: (i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; (ii) an "anomaly" map that shows the regional differences between the current month and the climatological monthly mean; and (iii) a 2-D histogram that can identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution was done on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open-source applications, the multitude of CERES products, and the seamless transition from previous development. For the future, we plan on expanding the analytical capabilities of the
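
    Of the analytical functions listed, the "anomaly" map is the simplest to make concrete: subtract the climatological monthly mean from the current month, cell by cell. The toy 2x3 grid below stands in for the real CERES lat/lon fields; the numbers are invented.

```python
# Element-wise anomaly: current monthly field minus its climatology.
def anomaly_map(current, climatology):
    """Difference between a monthly field and its climatological mean."""
    return [[c - m for c, m in zip(row_c, row_m)]
            for row_c, row_m in zip(current, climatology)]


current = [[241.0, 243.5, 240.2],
           [250.1, 251.0, 249.7]]
climatology = [[240.0, 244.0, 240.0],
               [249.0, 251.5, 250.0]]

for row in anomaly_map(current, climatology):
    print([round(v, 1) for v in row])
```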

  15. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

    The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its number-crunching power and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, together with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control systems and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure are presented.
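
    The kind of calculation PFSAT automates can be hinted at with a first-order estimate (this is emphatically not the PFSAT model, which draws real-fluid properties from REFPROP): heat leak through an insulated line segment scales as Q = k·A·ΔT/t. All values below are assumed for illustration.

```python
# First-order conductive heat leak through cylindrical insulation,
# using a thin-wall approximation for the lateral surface area.
import math


def conductive_heat_leak_w(d_outer_m, length_m, insul_thickness_m,
                           k_insul_w_mk, t_warm_k, t_cold_k):
    """Q = k * A * dT / t for one insulated line segment (watts)."""
    area = math.pi * d_outer_m * length_m          # lateral surface area
    return k_insul_w_mk * area * (t_warm_k - t_cold_k) / insul_thickness_m


# Example: 0.05 m diameter line, 2 m long, 0.02 m of MLI-equivalent
# insulation (assumed effective conductivity 2e-4 W/m-K), 300 K ambient,
# 90 K cryogen.
q = conductive_heat_leak_w(0.05, 2.0, 0.02, 2e-4, 300.0, 90.0)
print(f"heat leak ~ {q:.3f} W")
```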

  16. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies into real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning, identifies the key functions and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable also ex ante, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate

  17. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  18. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  19. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  20. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  1. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  2. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Modern Google Analytics tools were investigated with respect to effective user-acquisition channels and bottleneck detection. The investigation led to a proposed method for the effective use of Google Analytics tools. The method is based on analysis of the main traffic indicators, together with deep analysis of goals and their consecutive tweaking. The method can increase website conversion and may be useful for SEO and web analytics specialists.
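
    The "main traffic indicators" step can be made concrete with a small sketch: compute the conversion rate per acquisition channel and flag the weakest channel as a candidate bottleneck. The channel figures below are invented for illustration, not Google Analytics output.

```python
# Per-channel conversion rates from (sessions, goal completions) pairs,
# plus a naive bottleneck flag for the lowest-converting channel.
def conversion_rates(channels):
    """channels: {name: (sessions, goal_completions)} -> {name: rate}"""
    return {name: goals / sessions
            for name, (sessions, goals) in channels.items()}


traffic = {
    "organic": (12000, 360),
    "paid":    (5000, 200),
    "social":  (8000, 80),
}

rates = conversion_rates(traffic)
bottleneck = min(rates, key=rates.get)
print({k: round(v, 3) for k, v in rates.items()})
print("weakest channel:", bottleneck)
```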

  3. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing.

  4. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations’ need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how organizations’ orientations (process, innovation, and technology), as critical organizational factors, affect attitudes towards the use of the analytical tools of aCRM IS.

  5. Predictive Analytics in Information Systems Research

    OpenAIRE

    Shmueli, Galit; Koppius, Otto

    2011-01-01

    This research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as well as methods for assessing predictive power. Predictive analytics not only assist in creating practically useful models, they also play an important role alongside explanatory modeling in theory bu...
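
    The essay's distinction between explanatory fit and predictive power rests on out-of-sample assessment, which can be shown in miniature: fit on training data, score on held-out data. A trivial mean-baseline "model" keeps the sketch self-contained; it is an illustration of the assessment step, not the essay's methodology.

```python
# Assess predictive power on held-out data rather than in-sample fit.
import statistics


def holdout_rmse(train, test):
    """Fit the simplest model (predict the training mean), score test RMSE."""
    prediction = statistics.mean(train)
    return (sum((y - prediction) ** 2 for y in test) / len(test)) ** 0.5


train = [10.0, 12.0, 11.0, 13.0]
test = [11.0, 15.0]
print(round(holdout_rmse(train, test), 3))
```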

  6. Predictive Analytics in Information Systems Research

    NARCIS (Netherlands)

    G. Shmueli (Galit); O.R. Koppius (Otto)

    2011-01-01

    This research essay highlights the need to integrate predictive analytics into information systems research and shows several concrete ways in which this goal can be accomplished. Predictive analytics include empirical methods (statistical and other) that generate data predictions as

  7. Analytical Tools for Space Suit Design

    Science.gov (United States)

    Aitchison, Lindsay

    2011-01-01

    As indicated by the implementation of multiple small project teams within the agency, NASA is adopting a lean approach to hardware development that emphasizes quick product realization and rapid response to shifting program and agency goals. Over the past two decades, space suit design has been evolutionary in approach, with emphasis on building prototypes and then testing with the largest practical range of subjects possible. The results of these efforts show continuous improvement but make scaled design and performance predictions almost impossible with limited budgets and little time. Thus, in an effort to start changing the way NASA approaches space suit design and analysis, the Advanced Space Suit group has initiated the development of an integrated design and analysis tool. It is a multi-year, if not decadal, development effort that, when fully implemented, is envisioned to generate analysis of any given space suit architecture or, conversely, predictions of ideal space suit architectures given specific mission parameters. The master tool will exchange information to and from a set of five sub-tool groups in order to generate the desired output. The basic functions of each sub-tool group, the initial relationships between the sub-tools, and a comparison to state-of-the-art software and tools are discussed.

  8. Fiscal 1997 research report. International energy use rationalization project (Analytical tool research project for energy consumption efficiency improvement in Asia); 1997 nendo kokusai energy shiyo gorika nado taisaku jigyo chosa hokokusho. Asia energy shohi koritsuka bunseki tool chosa jigyo (honpen)

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-05-01

    Efforts have been under way to prepare inter-industry relations tables and energy data for four Asian countries, namely, China, Taiwan, Singapore and Malaysia, and a tool for energy consumption efficiency analysis has been developed and improved. In Chapter 1, energy supply and demand in the above-named four countries is reviewed on the basis of recent economic conditions in these countries. In Chapter 2, bilateral inter-industry relations tables usable under the project are employed for the analysis of the economic status of each of the countries and energy transactions between them, and a method is described of converting the tables into one-nation inter-industry relations tables which meet the needs of this project. In Chapter 3, national characteristics reflected in the respective energy input tables are described, and a method is shown of converting a nationally characterized unit energy table into a common unit energy input table for registration with a database. In Chapter 4, the constitution of the Asian energy consumption efficiency improvement analyzing tool and a system using the tool are explained. In Chapter 5, some examples of analyses conducted with the analyzing tool are shown, in which the energy-saving effect and CO2 emission reduction effect are estimated for Indonesia. (NEDO)
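The inter-industry (input-output) analysis underlying such a tool can be sketched with the standard Leontief model x = (I - A)^-1 d. The two-sector technology matrix, final demand vector and energy intensities below are invented for illustration; they are not figures from the NEDO report.

```python
import numpy as np

# Hypothetical 2-sector economy (say, "energy" and "manufacturing").
# A[i][j] = units of sector i's output needed per unit of sector j's output.
A = np.array([[0.10, 0.30],
              [0.20, 0.25]])

final_demand = np.array([100.0, 200.0])  # toy final demand per sector

# Leontief model: gross output x satisfies x = A x + d, i.e. x = (I - A)^-1 d.
x = np.linalg.solve(np.eye(2) - A, final_demand)

# With illustrative sector energy intensities (MJ per unit output),
# total energy use follows directly from the computed gross outputs.
energy_intensity = np.array([5.0, 1.2])
total_energy = energy_intensity @ x
```

The same accounting identity scales to the full national tables described in the report; only the matrix dimensions change.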

  9. aRTist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT such as for the qualification of NDT systems, the prediction of its reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with the CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  10. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Taste is an important organoleptic property governing the acceptance of products administered through the mouth, yet the majority of available drugs are bitter. For patient acceptability and compliance, bitter drugs are masked by adding flavoring agents, so taste assessment is an important quality-control parameter for evaluating taste-masked formulations. The primary method for measuring the taste of drug substances and formulations is a panel of human tasters. The use of sensory panelists is difficult and problematic in industry because of the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists and maintaining their motivation are significantly harder when working with unpleasant products, and Food and Drug Administration (FDA)-unapproved molecules cannot be tested at all. Therefore, an analytical multichannel taste-sensing system called the electronic tongue (e-tongue, or artificial tongue), which can assess taste, has been replacing sensory panelists, with benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts behind the instrumentation, performance qualification of the e-tongue, and applications in various fields.
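As an illustration of the pattern-recognition side of an e-tongue (the review itself covers the electrochemical instrumentation), a minimal nearest-centroid classifier over hypothetical multichannel sensor readings might look like this; all numbers and class labels are made up.

```python
import math

# Hypothetical multichannel sensor readings (one vector per sample);
# taste classes and response values are illustrative only.
training = {
    "bitter": [[0.9, 0.2, 0.1], [0.8, 0.3, 0.2]],
    "masked": [[0.3, 0.7, 0.6], [0.2, 0.8, 0.5]],
}

def centroid(vectors):
    """Mean response vector of a taste class."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

CENTROIDS = {label: centroid(vs) for label, vs in training.items()}

def classify(reading):
    """Assign the reading to the nearest class centroid (Euclidean distance)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda lbl: dist(reading, CENTROIDS[lbl]))
```

Real e-tongue software typically applies PCA or similar multivariate methods; a centroid classifier is only the simplest stand-in for that step.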

  11. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which require a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer has nowadays a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different type of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different level of detail between simple back-of-the-envelope calculations and more complex numerical analysis is discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.
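The "back-of-the-envelope calculations" mentioned above can be illustrated with two classic molecular-flow formulas (air at 20 °C, dimensions in cm, conductance in l/s); this is a textbook sketch, not code from the paper.

```python
def tube_conductance_air(d_cm, l_cm):
    """Molecular-flow conductance of a long round tube, air at 20 degC.
    Classic approximation: C ~= 12.1 * d^3 / l  [l/s], d and l in cm."""
    return 12.1 * d_cm ** 3 / l_cm

def effective_speed(pump_speed_ls, conductance_ls):
    """Pump and connecting tube in series: 1/S_eff = 1/S + 1/C."""
    return 1.0 / (1.0 / pump_speed_ls + 1.0 / conductance_ls)

# A 500 l/s pump behind a 10 cm diameter, 1 m long tube is throttled
# to well below its nominal speed by the tube conductance.
c = tube_conductance_air(10.0, 100.0)
s_eff = effective_speed(500.0, c)
```

Comparisons like this one are exactly where the simple formulas end and the numerical (e.g. Monte Carlo) methods reviewed in the paper take over.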

  12. "Research Tools": Tools for supporting research and publications

    OpenAIRE

    Ebrahim, Nader Ale

    2014-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities; they enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that help students follow the correct path in research and ultimately produce high-quality research outputs with greater accuracy and efficiency. They are assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated ...

  13. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols against green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination, and was created using such tools as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, colored green, yellow or red to depict low, medium or high impact, respectively. The proposed tool was used to evaluate analytical procedures applied to the determination of biogenic amines in wine samples and to polycyclic aromatic hydrocarbon determination by EPA methods. The GAPI tool not only gives the user/reader an immediately perceptible overview but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.
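A simplified sketch of a GAPI-style per-step color rating follows; the numeric aggregate score is purely my assumption (the published tool communicates impact through the pictogram itself, not a single number).

```python
# Hypothetical GAPI-style rating: each step of an analytical procedure
# gets a green/yellow/red rating for low/medium/high environmental impact.
IMPACT = {"green": 0, "yellow": 1, "red": 2}

def profile(steps):
    """steps: mapping of step name -> color rating.
    Returns per-color counts and a crude aggregate (lower = greener)."""
    counts = {color: 0 for color in IMPACT}
    for color in steps.values():
        counts[color] += 1
    score = sum(IMPACT[c] * n for c, n in counts.items())
    return counts, score

counts, score = profile({
    "sample collection": "yellow",
    "transport": "green",
    "extraction": "red",
    "determination": "yellow",
})
```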

  14. Research tools | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...

  15. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

    This study aims to construct an instrument for identifying attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The managerial dimension relates to quality improvement program infrastructure: strategy, structure, leadership, people and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques and information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing on the relevant literature; however, individuals have different training, interests and experiences, so the model generated will vary between researchers. The MOT components can serve as guidelines for examining whether patient complaints are used to improve service quality, but the model needs testing and validation through further research before becoming a theory. Empirical studies on patient complaints had not identified any analytical tool for exploring how complaints can drive quality improvement; this study developed such an instrument.

  16. Promoting Efficacy Research on Functional Analytic Psychotherapy

    Science.gov (United States)

    Maitland, Daniel W. M.; Gaynor, Scott T.

    2012-01-01

    Functional Analytic Psychotherapy (FAP) is a form of therapy grounded in behavioral principles that utilizes therapist reactions to shape target behavior. Despite a growing literature base, there is a paucity of research to establish the efficacy of FAP. As a general approach to psychotherapy, and how the therapeutic relationship produces change,…

  17. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...
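The kind of aggregation such dashboards run can be sketched in a few lines; the record fields and site names below are invented, and real deployments would issue these queries against an analytics cluster rather than in-memory Python.

```python
from collections import defaultdict

# Toy stand-in for aggregating operational metrics from log records.
records = [
    {"site": "SITE-A", "bytes": 120, "status": "ok"},
    {"site": "SITE-A", "bytes": 80,  "status": "fail"},
    {"site": "SITE-B", "bytes": 200, "status": "ok"},
]

def aggregate(rows, key, value):
    """Group rows by `key` and sum `value` -- a minimal 'dashboard' query."""
    totals = defaultdict(int)
    for row in rows:
        totals[row[key]] += row[value]
    return dict(totals)

by_site = aggregate(records, "site", "bytes")
```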

  18. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which allows a large panel of milling operations to be simulated. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the type of mill considered. The cutting edge position is described for a constant-lead helix and for a constant local helix angle. A thermomechanical approach to oblique cutting is applied to predict the forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on the predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools
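For the constant-lead helix mentioned above, the angular lag of a cutting-edge point along the flute of a cylindrical end mill follows the standard relation psi = z * tan(i0) / R; a small sketch with illustrative parameter values (not taken from the paper):

```python
import math

def edge_lag_angle(z_mm, radius_mm, helix_angle_deg):
    """Angular lag (radians) of a cutting-edge point at axial height z for a
    cylindrical end mill with a constant-lead helix: psi = z * tan(i0) / R."""
    return z_mm * math.tan(math.radians(helix_angle_deg)) / radius_mm

# Example: 30 deg helix, 5 mm tool radius, point 3 mm up the flute.
psi = edge_lag_angle(3.0, 5.0, 30.0)
```

This lag is what staggers the engagement of the edge along the axis and so shapes the predicted force profile over a revolution.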

  19. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data to the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  20. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool. [German abstract, translated] The ultrasonic trap offers a unique way of handling samples on the microliter scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for analytical use are investigated experimentally. By coupling it with typical contactless analysis methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated for materials ranging from inorganic, organic and pharmaceutical substances to proteins, nanoparticles and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for methods of X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES). For all of these methods the wall-less sample mounting proved advantageous: the results are comparable with those from conventional sample holders and in some cases surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups at the synchrotron beamlines. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the trap's potential for preconcentration was recognized and applied to the study of evaporation-controlled processes. ...

  1. Analytical research using synchrotron radiation based techniques

    International Nuclear Information System (INIS)

    Jha, Shambhu Nath

    2015-01-01

    There are many Synchrotron Radiation (SR) based techniques, such as X-ray Absorption Spectroscopy (XAS), X-ray Fluorescence Analysis (XRF), SR-Fourier-transform Infrared (SRFTIR) spectroscopy and Hard X-ray Photoelectron Spectroscopy (HAXPS), which are increasingly being employed worldwide in analytical research. With the advent of modern synchrotron sources these analytical techniques have been further revitalized, paving the way for variants such as microprobe XRF and XAS, FTIR microscopy and HAXPS. The talk will mainly cover two techniques, XRF and XAS, illustrating their capability in analytical research. XRF spectroscopy: XRF spectroscopy is an analytical technique which involves the detection of emitted characteristic X-rays following excitation of the elements within the sample. While electron, particle (proton or alpha particle), or X-ray beams can be employed as the exciting source for this analysis, the use of X-ray beams from a synchrotron source has been instrumental in advancing the technique in the areas of microprobe XRF imaging and trace-level compositional characterization of samples. Synchrotron-radiation-induced X-ray emission spectroscopy has become competitive with the earlier microprobe and nanoprobe techniques following the advancements in manipulating and detecting these X-rays. Two important features contribute to the superb elemental sensitivities of microprobe SR-induced XRF: (i) the absence of the continuum (Bremsstrahlung) background radiation that is a feature of spectra obtained with charged-particle beams, and (ii) the increased X-ray flux on the sample associated with the use of tunable third-generation synchrotron facilities. Detection sensitivities have been reported in the ppb range, with values of 10^-17 g - 10^-14 g (depending on the particular element and matrix). Keeping its demand in mind, a microprobe XRF beamline has been set up by RRCAT at the Indus-2 synchrotron
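The characteristic X-ray energies that XRF detects can be estimated from Moseley's law; a rough screening-model sketch (an approximation good to a few percent for mid-Z elements, not a substitute for tabulated line energies):

```python
def kalpha_energy_keV(Z):
    """Rough K-alpha line energy from Moseley's law:
    E ~ (3/4) * 13.6 eV * (Z - 1)^2  ~= 10.2 eV * (Z - 1)^2."""
    return 10.2e-3 * (Z - 1) ** 2  # keV

# Copper (Z = 29) comes out near 8.0 keV (tabulated K-alpha: ~8.05 keV).
cu = kalpha_energy_keV(29)
```

Matching such line energies to the peaks in an emission spectrum is what turns detected X-rays into an elemental assignment.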

  2. Learning Analytics: Challenges and Future Research Directions

    Directory of Open Access Journals (Sweden)

    Vlatko Lukarov

    2014-12-01

    In recent years, learning analytics (LA) has attracted a great deal of attention in technology-enhanced learning (TEL) research, as practitioners, institutions, and researchers increasingly see the potential LA has to shape the future TEL landscape. Generally, LA deals with the development of methods that harness educational data sets to support the learning process. This paper provides a foundation for future research in LA. It provides a systematic overview of this emerging field and its key concepts through a reference model for LA based on four dimensions, namely data and environments (what?), stakeholders (who?), objectives (why?), and methods (how?). It further identifies various challenges and research opportunities in the area of LA in relation to each dimension.

  3. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology and geochemistry. Because NAA is capable of non-destructive trace analysis, small or precious samples are among the most suitable for it. In this paper, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the obtained data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even with a high-volume air sampling device; highly sensitive NAA is therefore well suited to determining elements in PM samples. The main components of PM are crust-derived silicates and the like in rural/remote areas, whereas carbonaceous materials and heavy metals are concentrated in PM in urban areas because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with the seasons: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source apportionment techniques are useful. (author)
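The sensitivity of NAA rests on the standard activation equation A = N * sigma * phi * (1 - exp(-lambda * t_irr)); a sketch with illustrative numbers (not taken from the paper):

```python
import math

def induced_activity(n_atoms, sigma_cm2, flux_cm2_s, half_life_s, t_irr_s):
    """Buildup of induced activity during irradiation:
    A = N * sigma * phi * (1 - exp(-lambda * t_irr))   [decays/s].
    Saturates at N * sigma * phi for long irradiations."""
    lam = math.log(2) / half_life_s
    return n_atoms * sigma_cm2 * flux_cm2_s * (1.0 - math.exp(-lam * t_irr_s))

# Illustrative: 1e18 target atoms, 1 barn cross-section (1e-24 cm^2),
# thermal flux 1e13 n/cm^2/s, 1 h half-life.
sat_limit = 1e18 * 1e-24 * 1e13
a_half = induced_activity(1e18, 1e-24, 1e13, 3600.0, 3600.0)
```

Irradiating for one half-life reaches exactly half the saturation activity, which is why irradiation time is chosen per nuclide.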

  4. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The polymers are spray coated onto one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. A change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young's modulus of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively...
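The link between tensile stress and resonance frequency used above is the standard string relation f_n = (n / 2L) * sqrt(sigma / rho); a sketch with illustrative (not measured) parameter values:

```python
import math

def string_frequency(length_m, stress_Pa, density_kg_m3, mode=1):
    """Resonance frequency of a string under tensile stress:
    f_n = (n / (2 L)) * sqrt(sigma / rho)."""
    return mode / (2.0 * length_m) * math.sqrt(stress_Pa / density_kg_m3)

# Illustrative nitride-like microstring: 500 um long, 200 MPa, 3000 kg/m^3.
f1 = string_frequency(500e-6, 200e6, 3000.0)
```

A softening coating relaxes the stress, so the measured frequency drop maps back onto the stress change and hence onto Tg.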

  5. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size.../purification. Of the analytical methods tested, Cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...
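The size-based method selection reported above can be written as a small decision function; the handling of diameters exactly at the 200 and 400 nm boundaries is my assumption, not stated in the abstract.

```python
def imaging_methods(diameter_nm):
    """Suggest imaging techniques by polymersome diameter, following the
    size ranges reported in the abstract above."""
    if diameter_nm < 200:
        return ["Cryo-TEM", "AFM"]
    if diameter_nm > 400:
        return ["confocal microscopy"]
    return ["FF-Cryo-SEM", "NTA"]
```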

  6. A Tool Supporting Collaborative Data Analytics Workflow Design and Management

    Science.gov (United States)

    Zhang, J.; Bao, Q.; Lee, T. J.

    2016-12-01

    Collaborative experiment design could significantly enhance the sharing and adoption of the data analytics algorithms and models emerging in Earth science. Existing data-oriented workflow tools, however, are not suitable for supporting collaborative design of such workflows: they do not support real-time co-design; they cannot track how a workflow evolves over time based on changing designs contributed by multiple Earth scientists; and they cannot capture and retrieve collaboration knowledge on workflow design (the discussions that lead to a design). To address these challenges, we have designed and developed a technique supporting collaborative data-oriented workflow composition and management, as a key component toward supporting big data collaboration through the Internet. Reproducibility and scalability are two major targets demanding fundamental infrastructural support. One outcome of the project is a software tool that supports an elastic number of groups of Earth scientists collaboratively designing and composing data analytics workflows through the Internet. Instead of recreating the wheel, we have extended an existing workflow tool, VisTrails, into an online collaborative environment as a proof of concept.
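Tracking how a workflow evolves across contributors, one of the capabilities described above, can be sketched as an append-only version history; the class and field names below are invented for illustration and are not the VisTrails API.

```python
# Minimal sketch: each edit by a collaborator appends an immutable snapshot,
# so any earlier version of the workflow can be retrieved by id.
class WorkflowHistory:
    def __init__(self):
        self._versions = []  # list of (author, steps) snapshots

    def commit(self, author, steps):
        """Record a new version; returns its version id."""
        self._versions.append((author, list(steps)))
        return len(self._versions) - 1

    def at(self, version_id):
        """Retrieve a past version as {'author': ..., 'steps': [...]}."""
        author, steps = self._versions[version_id]
        return {"author": author, "steps": list(steps)}

h = WorkflowHistory()
v0 = h.commit("alice", ["load", "regrid"])
v1 = h.commit("bob", ["load", "regrid", "plot"])
```

VisTrails itself records provenance as a tree of workflow versions; this linear list is only the simplest flavor of that idea.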

  7. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest growing contributor to reducing CO2 emissions in the electric energy mix. The upwind horizontal-axis wind turbine with three blades has been the preferred technical choice for more than two decades and today widely leads the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal- and vertical-axis concepts is given; however, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical-axis turbine flow are reviewed together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple, so a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in the complex vertical flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle high numbers of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities.
Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
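The conformal-mapping step described above builds on transforms like the classical Joukowski map z = zeta + c^2/zeta, which sends the circle |zeta| = c onto the slit [-2c, 2c]; a minimal numerical check (the thesis derives a more general mapping for arbitrary sections):

```python
import cmath

def joukowski(zeta, c=1.0):
    """Joukowski transform z = zeta + c^2 / zeta: the classical building
    block for mapping airfoil-like sections to a standard circle."""
    return zeta + c * c / zeta

# Points on the unit circle land on the real segment [-2, 2]:
# e^(i t) + e^(-i t) = 2 cos(t).
pts = [joukowski(cmath.exp(1j * t)) for t in (0.0, 0.7, 1.5, 3.14)]
```

Working in the circle plane is what makes the blade flow analytically tractable; forces are then mapped back to the physical section.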

  8. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    /purification. Of the analytical methods tested, Cryo-transmission electron microscopy and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with smaller diameter than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized...... using freeze fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable...

  9. The Journal of Learning Analytics: Supporting and Promoting Learning Analytics Research

    OpenAIRE

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the Journal of Learning Analytics is identified. Analytics is the most significant new initiative of SoLAR.

  10. The "Journal of Learning Analytics": Supporting and Promoting Learning Analytics Research

    Science.gov (United States)

    Siemens, George

    2014-01-01

    The paper gives a brief overview of the main activities for the development of the emerging field of learning analytics led by the Society for Learning Analytics Research (SoLAR). The place of the "Journal of Learning Analytics" is identified. Analytics is the most significant new initiative of SoLAR.

  11. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy-to-use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low-resource settings. The simplicity of the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of the delivery geometry. In this paper, we review conventional optimization and then focus on the latter, outlining analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.
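Dynamic light scattering, one of the tools named above, sizes conjugate particles via the Stokes-Einstein relation r = k_B * T / (6 * pi * eta * D); a sketch with illustrative numbers:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def stokes_einstein_radius(D_m2_s, temp_K=298.15, viscosity_Pa_s=8.9e-4):
    """Hydrodynamic radius from a measured diffusion coefficient via
    Stokes-Einstein: r = k_B T / (6 pi eta D). Defaults: water at 25 degC."""
    return K_B * temp_K / (6.0 * math.pi * viscosity_Pa_s * D_m2_s)

# A conjugate particle diffusing at ~1.2e-11 m^2/s in water at 25 degC
# comes out at roughly 20 nm in radius.
r = stokes_einstein_radius(1.2e-11)
```

Monitoring this radius is one way to catch conjugate aggregation before it degrades assay performance.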

  12. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    33 CFR Section 385.33, Navigation and Navigable Waters, Corps of Engineers, Department of the... Incorporating New Information Into the Plan: § 385.33 Revisions to models and analytical tools. (a) In carrying... on a case-by-case basis what documentation is appropriate for revisions to models and analytic tools...

  13. Mapping healthcare systems: a policy relevant analytic tool.

    Science.gov (United States)

    Sekhri Feachem, Neelam; Afshar, Ariana; Pruett, Cristina; Avanceña, Anton L V

    2017-07-01

    In the past decade, an international consensus on the value of well-functioning health systems has driven considerable health systems research. This research falls into two broad categories: the first provides conceptual frameworks that take complex healthcare systems and create simplified constructs of their interactions and functions; the second focuses on granular inputs and outputs. This paper presents a novel translational mapping tool, the University of California, San Francisco mapping tool (the Tool), which bridges the gap between these two areas of research, creating a platform for multi-country comparative analysis. Using the Murray-Frenk framework, we create a macro-level representation of a country's structure, focusing on how it finances and delivers healthcare. The map visually depicts the fundamental policy questions in healthcare system design: funding sources and the amount spent through each source, purchasers, populations covered, provider categories, and the relationships between these entities. We use the Tool to provide a macro-level comparative analysis of the structure of India's and Thailand's healthcare systems. As part of the systems-strengthening arsenal, the Tool can stimulate debate about the merits and consequences of different structural designs of healthcare systems, using a common framework that fosters multi-country comparative analyses. © The Author 2017. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene.

  14. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    The functional layout of a building, understood as the arrangement of the facility's rooms (or groups of rooms) together with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the building. Achieving a spatial arrangement that is optimal from the occupants' point of view is possible through activities that often go beyond the stage of architectural design. Adopted during architectural design, most often through a trial-and-error process or on the basis of previous experience (evidence-based design), the functional layout is subject to continuous evaluation and dynamic change from the beginning of its use. Such verification during the occupancy phase allows future possible transformations to be planned, as well as model solutions to be developed for use in other settings. In broader terms, the research hypothesis is to examine whether and how collected datasets concerning the facility and its utilization can be used to develop methods for assessing the functional layout of buildings; in other words, whether it is possible to develop an objective method of assessing functional layouts based on a set of building parameters (technical, technological and functional), and whether the method allows a set of tools to be developed that enhances the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable or lean design in every aspect: ecological (by reducing the property's impact on the environment), economic (by optimizing its cost) and social (through the implementation of a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The "lean" approach means the process of analyzing the existing scheme and consequently finding weak points as well as means for eliminating these

  15. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties related to thermal-hydraulic code results in nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make clearer the individual procedures adopted in its development. Three of these have recently been completed and are illustrated in this paper. The first one makes it possible to attribute ''weight factors'' to the experimental Integral Test Facilities; results are also shown. The second one deals with the calculation of the accuracy of code results: the computer program concerned compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third one consists of a computer program suitable for deriving continuous uncertainty bands from single-valued points. (author)
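    The accuracy quantification in the second tool can be illustrated with a simple figure of merit. The sketch below computes a normalized average error amplitude between experimental and calculated trends; it is a time-domain stand-in for the FFT-based accuracy measures associated with UMAE, and the function name and sample pressure trends are invented for illustration.

```python
# Simplified accuracy figure for a code-vs-experiment comparison: the
# normalized average error amplitude. Time-domain stand-in for UMAE's
# FFT-based measures; names and values here are illustrative only.

def average_amplitude(exp, calc):
    """Return sum|calc - exp| / sum|exp|: 0.0 means perfect agreement."""
    if len(exp) != len(calc):
        raise ValueError("trends must be sampled at the same time points")
    err = sum(abs(c - e) for e, c in zip(exp, calc))
    ref = sum(abs(e) for e in exp)
    return err / ref

# Hypothetical pressure trends (MPa) at five time points
experiment = [7.0, 6.5, 5.8, 5.0, 4.4]
calculation = [7.0, 6.3, 5.9, 4.8, 4.5]

aa = average_amplitude(experiment, calculation)
```

    A smaller value indicates a more accurate calculation; thresholds on such a figure are what allow accuracy to be extrapolated across facilities.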

  16. JEarth | Analytical Remote Sensing Imagery Application for Researchers and Practitioners

    Science.gov (United States)

    Prashad, L.; Christensen, P. R.; Anwar, S.; Dickenshied, S.; Engle, E.; Noss, D.

    2009-12-01

    The ASU 100 Cities Project and the ASU Mars Space Flight Facility (MSFF) present JEarth, a set of analytical Geographic Information System (GIS) tools for viewing and processing Earth-based remote sensing imagery and vectors, including high-resolution and hyperspectral imagery such as TIMS and MASTER. JEarth is useful for a wide range of researchers and practitioners who need to access, view, and analyze remote sensing imagery. JEarth stems from existing MSFF applications: the Java application JMars (Java Mission-planning and Analysis for Remote Sensing) for viewing and analyzing remote sensing imagery and THMPROC, a web-based, interactive tool for processing imagery to create band combinations, stretches, and other imagery products. JEarth users can run the application on their desktops by installing Java-based open source software on Windows, Mac, or Linux operating systems.
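    The band-combination processing mentioned above (as in THMPROC) can be sketched with a toy example. The following computes NDVI, one common band combination, over a small invented raster; the array layout and values are assumptions, not JEarth's actual API.

```python
# Sketch of one common band combination (NDVI) of the kind an imagery
# processing tool produces; the 2x2 reflectance rasters are invented.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index per pixel: (NIR - R) / (NIR + R)."""
    return [
        [(n - r) / (n + r) if (n + r) else 0.0 for n, r in zip(nrow, rrow)]
        for nrow, rrow in zip(nir, red)
    ]

# toy near-infrared and red reflectance bands
nir_band = [[0.5, 0.4], [0.6, 0.1]]
red_band = [[0.1, 0.2], [0.1, 0.1]]

result = ndvi(nir_band, red_band)
```

    High NDVI values mark vegetated pixels; a stretch would then rescale these values to the display range.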

  17. Methodological and Analytical Dilemmas in Autoethnographic Research

    Directory of Open Access Journals (Sweden)

    Elena Maydell

    2010-01-01

    Full Text Available This article presents an argument on the application of theoretical and methodological frameworks to the study of identity from an autoethnographic perspective. In order to guide the analysis process, the author employed social constructionism as the main theoretical foundation, whereas thematic analysis and positioning theory were deployed as the methodological frameworks. Further, in the process of using ethnographic methods to study the identity of Russian immigrants to New Zealand, the author found herself also needing to use autoethnography to interrogate and understand her own journey. The insider/outsider position of the author who belongs to the same minority group became the most vital tool in her identity construction. In this regard, it is impossible to engage fully with the autoethnographic research practice without understanding the impact of others on identity construction of self, and a strong theoretical and methodological scholarship can provide a valuable foundation for this process.

  18. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
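    The PageRank-style ranking described above can be sketched on a toy alert network. In this invented example an edge A -> B means country A filed alerts about products from country B, so a high score marks a frequently reported (transgressor) country; the countries, edges, and damping factor are illustrative only.

```python
# Minimal PageRank power iteration over a toy alert network. An edge
# (A, B) means "A filed alerts about products from B"; countries and
# edges are invented, and 0.85 is the conventional damping factor.

def pagerank(edges, nodes, damping=0.85, iters=100):
    rank = {n: 1.0 / len(nodes) for n in nodes}
    out = {n: [b for a, b in edges if a == n] for n in nodes}
    for _ in range(iters):
        new = {n: (1.0 - damping) / len(nodes) for n in nodes}
        for n in nodes:
            targets = out[n] or nodes      # dangling node: spread rank evenly
            share = damping * rank[n] / len(targets)
            for t in targets:
                new[t] += share
        rank = new
    return rank

countries = ["DE", "UK", "CN", "TR"]
alerts = [("DE", "CN"), ("UK", "CN"), ("DE", "TR"), ("UK", "TR"), ("TR", "CN")]
scores = pagerank(alerts, countries)
top = max(scores, key=scores.get)
```

    In this toy graph the most-reported country accumulates the highest score, mirroring how the tool surfaces the dominant transgressors in a trading network.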

  19. Mutated genes as research tool

    International Nuclear Information System (INIS)

    1981-01-01

    Green plants are the ultimate source of all resources required for man's life, his food, his clothes, and almost all his energy requirements. Primitive prehistoric man could live from the abundance of nature surrounding him. Man today, dominating nature in terms of numbers and exploiting its limited resources, cannot exist without employing his intelligence to direct natural evolution. Plant sciences, therefore, are not a matter of curiosity but an essential requirement. From such considerations, the IAEA and FAO jointly organized a symposium to assess the value of mutation research for various kinds of plant science, which directly or indirectly might contribute to sustaining and improving crop production. The benefit through developing better cultivars that plant breeders can derive from using the additional genetic resources resulting from mutation induction has been assessed before at other FAO/IAEA meetings (Rome 1964, Pullman 1969, Bari 1974, Ibadan 1978) and is also monitored in the Mutation Breeding Newsletter, published by the IAEA twice a year. Several hundred plant cultivars which carry economically important characters because their genes have been altered by ionizing radiation or other mutagens are grown by farmers and horticulturists in many parts of the world. But the benefit derived from such mutant varieties is without any doubt surpassed by the contribution which mutation research has made towards the advancement of genetics. For this reason, a major part of the papers and discussions at the symposium dealt with the role induced-mutation research played in providing insight into gene action and gene interaction, the organization of genes in plant chromosomes in view of homology and homoeology, the evolutionary role of gene duplication and polyploidy, the relevance of gene blocks, the possibilities for chromosome engineering, the functioning of cytoplasmic inheritance and the genetic dynamics of populations. In discussing the evolutionary role of

  20. Development of computer-based analytical tool for assessing physical protection system

    International Nuclear Information System (INIS)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for analyzing likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.

  1. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for analyzing likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.

  2. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

    Assessment of physical protection system effectiveness is the priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear materials and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool can offer a solution for analyzing likely threat scenarios. There are several currently available tools that can be used instantly, such as EASI and SAPE; however, for our research purpose it is more suitable to have a tool that can be customized and enhanced further. In this work, we have developed a computer-based analytical tool by utilizing a network methodological approach for modelling the adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool has the capability to analyze the most critical path and to quantify the probability of effectiveness of the system as a performance measure.
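    The network approach to adversary-path modelling described in this record can be sketched as a small graph search. In the hedged example below, each edge carries a detection probability for a protection element, and the most critical path is the one maximizing the adversary's probability of remaining undetected; the facility layout and probabilities are invented, not taken from the tool.

```python
# Sketch of adversary-path analysis over a protection network: edges
# carry detection probabilities, and the most critical path maximizes
# prod(1 - p_detect). Layout and numbers are invented for illustration.

def all_paths(graph, node, target, seen=()):
    if node == target:
        yield [node]
        return
    for nxt in graph.get(node, {}):
        if nxt not in seen:
            for rest in all_paths(graph, nxt, target, seen + (node,)):
                yield [node] + rest

def most_critical_path(graph, start, target):
    """Return (path, non-detection probability) maximizing prod(1 - p_detect)."""
    best, best_p = None, -1.0
    for path in all_paths(graph, start, target):
        p = 1.0
        for a, b in zip(path, path[1:]):
            p *= 1.0 - graph[a][b]
        if p > best_p:
            best, best_p = path, p
    return best, best_p

# hypothetical facility: edge values are detection probabilities
layout = {
    "offsite": {"fence": 0.3, "gate": 0.9},
    "fence": {"building": 0.5},
    "gate": {"building": 0.2},
    "building": {"target": 0.6},
    "target": {},
}

path, p_nd = most_critical_path(layout, "offsite", "target")
```

    Hardening the weakest element on the returned path (here, the fence) is what lowers the adversary's best-case probability of success.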

  3. Scalable Combinatorial Tools for Health Disparities Research

    Directory of Open Access Journals (Sweden)

    Michael A. Langston

    2014-10-01

    Full Text Available Despite staggering investments made in unraveling the human genome, current estimates suggest that as much as 90% of the variance in cancer and chronic diseases can be attributed to factors outside an individual’s genetic endowment, particularly to environmental exposures experienced across his or her life course. New analytical approaches are clearly required as investigators turn to complicated systems theory and ecological, place-based and life-history perspectives in order to understand more clearly the relationships between social determinants, environmental exposures and health disparities. While traditional data analysis techniques remain foundational to health disparities research, they are easily overwhelmed by the ever-increasing size and heterogeneity of available data needed to illuminate latent gene x environment interactions. This has prompted the adaptation and application of scalable combinatorial methods, many from genome science research, to the study of population health. Most of these powerful tools are algorithmically sophisticated, highly automated and mathematically abstract. Their utility motivates the main theme of this paper, which is to describe real applications of innovative transdisciplinary models and analyses in an effort to help move the research community closer toward identifying the causal mechanisms and associated environmental contexts underlying health disparities. The public health exposome is used as a contemporary focus for addressing the complex nature of this subject.

  4. Analytical and ethical complexities in video game research

    DEFF Research Database (Denmark)

    Andersen, Mads Lund; Chimiri, Niklas Alexander; Søndergaard, Dorte Marie

    Session: Sociomaterial complexities in digital-analog spaces. Abstract: Analytical and ethical complexities in video game research. A central issue that video game research seldom explicitly articulates is the ethical complexities involved in its empirical and analytical work. The presentation … explores common research questions posed and analytical foci chosen by video game researchers subscribing to either the media effects tradition, represented by (ref.), or to interdisciplinary Game Studies. Both fields, which tend to depict themselves as polar opposites, build on ethical assumptions … of theoretical or analytical arrogance. The relevance of acknowledging and situating ethical complexity becomes pertinent when alternatively taking a sociomaterial perspective on doing empirical and analytical work on video gaming. From an agential realist point of view, for instance, a researcher…

  5. Cluster Analysis as an Analytical Tool of Population Policy

    Directory of Open Access Journals (Sweden)

    Oksana Mikhaylovna Shubat

    2017-12-01

    Full Text Available The predicted negative trends in Russian demography (falling birth rates, population decline) actualize the need to strengthen measures of family and population policy. Our research purpose is to identify groups of Russian regions with similar characteristics in the family sphere using cluster analysis. The findings should make an important contribution to the field of family policy. We used hierarchical cluster analysis based on the Ward method and the Euclidean distance for segmentation of Russian regions. Clustering is based on four variables which allow assessing the family institution in each region. The authors used data of the Federal State Statistics Service from 2010 to 2015. Clustering and profiling of each segment allowed forming a model of Russian regions depending on the features of the family institution in these regions. The authors revealed four clusters grouping regions with similar problems in the family sphere. This segmentation makes it possible to develop the most relevant family policy measures for each group of regions. Thus, the analysis has shown a high degree of differentiation of the family institution across the regions. This suggests that a unified approach to solving population problems is far from effective. To achieve greater results in the implementation of family policy, a differentiated approach is needed. Methods of multidimensional data classification can be successfully applied as a relevant analytical toolkit. Further research could develop the adaptation of multidimensional classification methods to the analysis of population problems in Russian regions. In particular, the algorithms of nonparametric cluster analysis may be of relevance in future studies.
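    The Ward-method clustering used in this study can be sketched in a few lines. The example below merges clusters by the Ward criterion (the increase in within-cluster sum of squares); the region coordinates are invented stand-ins for standardized family-sphere indicators.

```python
# Sketch of Ward-style agglomerative clustering. The merge cost is the
# Ward criterion: n_a*n_b/(n_a+n_b) * ||centroid_a - centroid_b||^2.
# The four "regions" and their two indicators are invented.

def ward_cost(a, b):
    """Ward merge cost between clusters a and b (lists of points)."""
    ca = [sum(col) / len(a) for col in zip(*a)]
    cb = [sum(col) / len(b) for col in zip(*b)]
    d2 = sum((x - y) ** 2 for x, y in zip(ca, cb))
    return len(a) * len(b) / (len(a) + len(b)) * d2

def ward_clusters(points, k):
    """Merge clusters pairwise by minimum Ward cost until k remain."""
    clusters = [[p] for p in points]
    while len(clusters) > k:
        i, j = min(
            ((i, j) for i in range(len(clusters)) for j in range(i + 1, len(clusters))),
            key=lambda ij: ward_cost(clusters[ij[0]], clusters[ij[1]]),
        )
        clusters[i] = clusters[i] + clusters[j]
        del clusters[j]
    return clusters

# four regions, two standardized indicators (e.g. birth rate, marriage rate)
regions = [(1.0, 1.1), (0.9, 1.0), (-1.0, -0.9), (-1.1, -1.0)]
groups = ward_clusters(regions, 2)
```

    With real data the indicators would first be standardized, since Ward costs are distance-based and scale-sensitive.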

  6. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) and out-of-specification (OOS) results owing to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also relates it to product quality by design and process analytical technology (PAT).

  7. INTELLECTUAL PROPERTY RIGHTS ISSUES FOR RESEARCH TOOLS IN BIOTECHNOLOGY RESEARCH

    Directory of Open Access Journals (Sweden)

    Rekha Chaturvedi

    2015-09-01

    Full Text Available Research tools are the resources researchers need for experimental work. In biotechnology, these can include cell lines, monoclonal antibodies, reagents, animal models, growth factors, combinatorial chemistry libraries, drugs and drug targets, clones and cloning tools (such as PCR), methods, laboratory equipment and machines, databases and computer software. Research tools therefore serve as the basis for upstream research to improve a present product or process. There are several challenges in the way of using patented research tools. IP issues with regard to research tools are important and may sometimes pose a hindrance for researchers. Hence, in the case of patented research tools, IPR issues can constitute a major hurdle for technology development. In the majority of instances research tools are made available through MTAs for academic research and for imparting education. TRIPS provides for an exception to patent rights for experimental use of patented technology in scientific research, and several countries including India have included this provision in their patent legislation. For commercially important work, licensing of research tools can be based on royalties or a one-time lump-sum payment. Some patent owners of important high-end research tools for the development of platform technologies create problems in licensing, which can impede research. Usually the cost of a commercially available research tool is built into its price.

  8. RESEARCH CENTRIFUGE - AN ADVANCED TOOL FOR SEPARATION

    OpenAIRE

    Mahajan Ashwini; Prof. B.V. Jain; Dr Surajj Sarode

    2015-01-01

    A centrifuge is a critical piece of equipment for the laboratory. The purpose of this study was to examine the research centrifuge in detail: its applications, its uses in different branches, and its salient features. Two types of research centrifuge are studied here: the revolutionary research centrifuge and the microprocessor research centrifuge. A centrifuge is a device that separates particles from a solution through use of a rotor. In biology, the particles are usually cells, subcellular organelles, or large mo...
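    One piece of centrifuge arithmetic worth having at hand is the standard conversion between rotor speed and relative centrifugal force, RCF = 1.118e-5 * r_cm * rpm^2; the rotor radius and speed below are arbitrary examples, not values from this study.

```python
# Relative centrifugal force (RCF, in multiples of g) from rotor radius
# and speed: the standard conversion RCF = 1.118e-5 * r_cm * rpm^2.

def rcf(radius_cm, rpm):
    """RCF in units of g for a rotor of given radius (cm) and speed (rpm)."""
    return 1.118e-5 * radius_cm * rpm ** 2

# e.g. an arbitrary 10 cm rotor spun at 3000 rpm gives roughly 1006 x g
force = rcf(10, 3000)
```

    Protocols usually specify RCF rather than rpm precisely because the same rpm produces different forces on rotors of different radii.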

  9. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
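    A minimal sketch of the hybrid FDD idea: compare measured performance against a physics-based expected-efficiency curve and flag underperformance. The quadratic COP curve, the measurements, and the tolerance below are all invented for illustration, not the paper's models.

```python
# Hybrid FDD sketch: a physics-based expected-performance curve (here an
# invented quadratic chiller COP model) compared against measured data,
# flagging points that underperform by more than a tolerance.

def expected_cop(part_load_ratio):
    """Hypothetical chiller efficiency curve: COP as a function of part load."""
    return 3.0 + 4.0 * part_load_ratio - 2.5 * part_load_ratio ** 2

def flag_faults(samples, tolerance=0.5):
    """Return indices of samples whose measured COP underperforms the model."""
    return [
        i for i, (plr, cop) in enumerate(samples)
        if cop < expected_cop(plr) - tolerance
    ]

# (part-load ratio, measured COP) pairs; the third point underperforms
measurements = [(0.5, 4.3), (0.8, 4.7), (0.9, 3.1)]
faults = flag_faults(measurements)
```

    In a real tool the curve would come from manufacturer data or calibrated first-principles models, and the tolerance from measurement uncertainty; the structure of the check is the same.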

  10. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
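    One standard pairwise measure for the kind of clustering comparison XCluSim supports is the adjusted Rand index; the sketch below implements it from the contingency table. This is a generic similarity measure, not necessarily the one XCluSim itself uses, and the two labelings are invented.

```python
# Adjusted Rand index between two clustering results, computed from the
# contingency table of co-assigned pairs. A score of 1.0 means identical
# partitions; values near 0 mean chance-level agreement.

from collections import Counter
from math import comb

def adjusted_rand(a, b):
    n = len(a)
    pairs = Counter(zip(a, b))
    rows, cols = Counter(a), Counter(b)
    index = sum(comb(v, 2) for v in pairs.values())
    row_sum = sum(comb(v, 2) for v in rows.values())
    col_sum = sum(comb(v, 2) for v in cols.values())
    expected = row_sum * col_sum / comb(n, 2)
    max_index = (row_sum + col_sum) / 2
    if max_index == expected:
        return 1.0
    return (index - expected) / (max_index - expected)

labels_x = [0, 0, 1, 1, 2, 2]   # result of clustering algorithm X
labels_y = [0, 0, 1, 1, 1, 2]   # result of clustering algorithm Y
score = adjusted_rand(labels_x, labels_y)
```

    Computing such a score for every pair of results gives the kind of similarity matrix a comparison tool can then visualize.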

  11. Data analytics research in public universities

    OpenAIRE

    Smeaton, Alan F.

    2014-01-01

    Research into big data in publicly-funded Universities and research centres has major disadvantages compared to the private sector, and not just in the obvious areas of funding and access to data. In this abstract we highlight some of these differences around the area of ethics and privacy, and two specific examples of our work are used to illustrate this.

  12. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.

  13. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers.

    Science.gov (United States)

    Al Handawi, Khalil; Vahdati, Nader; Shiryayev, Oleg; Lawand, Lydia

    2017-09-28

    Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber's modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
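    The OTDR localization step described above reduces to round-trip delay arithmetic: the distance to the event is z = c * dt / (2 * n_group). The group index and delay in this sketch are typical assumed values, not parameters from the paper.

```python
# OTDR event localization: distance along the fiber from the round-trip
# delay of the back-scattered pulse, z = c * dt / (2 * n_group). The
# group index 1.468 is a typical silica-fiber value, assumed here.

C = 299_792_458.0  # vacuum speed of light, m/s

def event_distance(delay_s, group_index=1.468):
    """Distance (m) along the fiber to the reflection event."""
    return C * delay_s / (2.0 * group_index)

# a pulse echo arriving 10 microseconds after launch lies about 1 km out
dist = event_distance(10e-6)
```

    The factor of two accounts for the pulse travelling out to the event and back; spatial resolution is then set by the pulse width and the receiver's timing resolution.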

  14. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress

  15. Electrochemical sensors: a powerful tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Stradiotto Nelson R.

    2003-01-01

    Full Text Available Potentiometric, amperometric and conductometric electrochemical sensors have found a number of interesting applications in the areas of environmental, industrial, and clinical analyses. This review presents a general overview of the three main types of electrochemical sensors, describing fundamental aspects, developments and their contribution to the area of analytical chemistry, relating relevant aspects of the development of electrochemical sensors in Brazil.

  16. A collaborative visual analytics suite for protein folding research.

    Science.gov (United States)

    Harvey, William; Park, In-Hee; Rübel, Oliver; Pascucci, Valerio; Bremer, Peer-Timo; Li, Chenglong; Wang, Yusu

    2014-09-01

    Molecular dynamics (MD) simulation is a crucial tool for understanding principles behind important biochemical processes such as protein folding and molecular interaction. With the rapidly increasing power of modern computers, large-scale MD simulation experiments can be performed regularly, generating huge amounts of MD data. An important question is how to analyze and interpret such massive and complex data. One of the (many) challenges involved in analyzing MD simulation data computationally is the high-dimensionality of such data. Given a massive collection of molecular conformations, researchers typically need to rely on their expertise and prior domain knowledge in order to retrieve certain conformations of interest. It is not easy to make and test hypotheses as the data set as a whole is somewhat "invisible" due to its high dimensionality. In other words, it is hard to directly access and examine individual conformations from a sea of molecular structures, and to further explore the entire data set. There is also no easy and convenient way to obtain a global view of the data or its various modalities of biochemical information. To this end, we present an interactive, collaborative visual analytics tool for exploring massive, high-dimensional molecular dynamics simulation data sets. The most important utility of our tool is to provide a platform where researchers can easily and effectively navigate through the otherwise "invisible" simulation data sets, exploring and examining molecular conformations both as a whole and at individual levels. The visualization is based on the concept of a topological landscape, which is a 2D terrain metaphor preserving certain topological and geometric properties of the high-dimensional protein energy landscape. In addition to facilitating easy exploration of conformations, this 2D terrain metaphor also provides a platform where researchers can visualize and analyze various properties (such as contact density) overlaid on the

  17. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser-induced plasma spectrometry (LIPS) were mainly restricted to overall, quantitative determination of elemental composition in bulk solid samples. However, the introduction of compact, reliable solid-state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS in which its advantages (direct sampling of any material, irrespective of its conductivity, without sample preparation and with sensitivity adequate for many elements in different matrices) can be fully exploited. The field of surface analysis can profit from these advantages, particularly given the capability of LIPS for spot analysis, line scans, depth profiling, area analysis and compositional mapping with a single instrument in air at atmospheric pressure. This review outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution of LIPS analysis and presents applications concerning surface examination

  18. Comparison of Left Ventricular Hypertrophy by Electrocardiography and Echocardiography in Children Using Analytics Tool.

    Science.gov (United States)

    Tague, Lauren; Wiggs, Justin; Li, Qianxi; McCarter, Robert; Sherwin, Elizabeth; Weinberg, Jacqueline; Sable, Craig

    2018-05-17

    Left ventricular hypertrophy (LVH) is a common finding on pediatric electrocardiography (ECG), leading to many referrals for echocardiography (echo). This study utilizes a novel analytics tool that combines ECG and echo databases to evaluate ECG as a screening tool for LVH. A SQL Server 2012 data warehouse incorporated ECG and echo databases for all patients from a single institution from 2006 to 2016. Customized queries identified patients 0-18 years old with LVH on ECG and an echo performed within 24 h. Using data visualization (Tableau) and analytic (Stata 14) software, ECG and echo findings were compared. Of 437,699 encounters, 4637 met inclusion criteria. ECG had high sensitivity (≥ 90%) but poor specificity (43%) and low positive predictive value (< 20%) for echo abnormalities. ECG performed only 11-22% better than chance (AROC = 0.50). 83% of subjects with LVH on ECG had normal left ventricle (LV) structure and size on echo. African-Americans with LVH were least likely to have an abnormal echo. There was a low correlation between V6R on ECG and the echo-derived Z score of LV diastolic diameter (r = 0.14) and LV mass index (r = 0.24). The data analytics client was able to mine a database of ECG and echo reports, comparing LVH by ECG against LV measurements and qualitative findings by echo, identifying an abnormal LV by echo in only 17% of cases with LVH on ECG. This novel tool is useful for rapid data mining for both clinical and research endeavors.
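    The screening metrics reported above (sensitivity, specificity, positive predictive value) all derive from a 2x2 confusion matrix. A minimal sketch, using made-up counts chosen only to mirror the abstract's pattern (high sensitivity, ~43% specificity, PPV below 20%), not the study's actual data:

```python
def screening_metrics(tp, fp, fn, tn):
    """Return sensitivity, specificity and positive predictive value
    from the four cells of a 2x2 screening table."""
    sensitivity = tp / (tp + fn)   # true positive rate
    specificity = tn / (tn + fp)   # true negative rate
    ppv = tp / (tp + fp)           # positive predictive value
    return sensitivity, specificity, ppv

# Hypothetical counts for illustration only:
# 90 true positives, 460 false positives, 10 false negatives, 340 true negatives
sens, spec, ppv = screening_metrics(tp=90, fp=460, fn=10, tn=340)
print(f"sensitivity={sens:.2f} specificity={spec:.2f} PPV={ppv:.2f}")
```

    With these illustrative counts the pattern matches the abstract: a sensitive but unspecific screen in which most ECG-positive patients have a normal echo.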

  19. R for marketing research and analytics

    CERN Document Server

    Chapman, Chris

    2015-01-01

    This book is a complete introduction to the power of R for marketing research practitioners. The text describes statistical models from a conceptual point of view with a minimal amount of mathematics, presuming only an introductory knowledge of statistics. Hands-on chapters accelerate the learning curve by asking readers to interact with R from the beginning. Core topics include the R language, basic statistics, linear modeling, and data visualization, which is presented throughout as an integral part of analysis. Later chapters cover more advanced topics yet are intended to be approachable for all analysts. These sections examine logistic regression, customer segmentation, hierarchical linear modeling, market basket analysis, structural equation modeling, and conjoint analysis in R. The text uniquely presents Bayesian models with a minimally complex approach, demonstrating and explaining Bayesian methods alongside traditional analyses for analysis of variance, linear models, and metric and choice-based conj...

  20. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development...... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice...

  1. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of masses of information and to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts; safeguarding communities, organizations, infrastructures, and investments. The collaborative Intelligence Analysis environment delivered by i2 is specifically designed to be: · scalable: supporting business needs as well as operational and end-user environments · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics · interoperable: integrating with existing environments and easing information sharing across partner agencies · extendable: providing an open-source developer essential toolkit, examples, and documentation for custom requirements. i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  2. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  3. Analytical techniques and quality control in biomedical trace element research

    DEFF Research Database (Denmark)

    Heydorn, K.

    1994-01-01

    The small number of analytical results in trace element research calls for special methods of quality control. It is shown that when the analytical methods are in statistical control, only small numbers of duplicate or replicate results are needed to ascertain the absence of systematic errors....../kg. Measurement compatibility is obtained by control of traceability to certified reference materials, (C) 1994 Wiley-Liss, Inc....

  4. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available. Abstract: Bollywood, the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They not only face competition from other movie industries and movies but from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, pubs and discothèques. A lot of manpower, man hours, creative brains, and money are put in to build a quality feature film. Bollywood is the industry which continuously works towards providing the 7 billion population with something new. So it is important for the movie and production team to stand out, to grab the due attention of the maximum audience. Movie makers employ various tools and techniques today to market their movies, leaving no stone unturned. They roll out teasers, first looks, theatrical trailer releases, music launches, city tours, producer's and director's interviews, movie premieres, movie releases, post-release follow-ups etc. to pull viewers to the cineplex. The audience today, which comprises mainly of youth, requires photos, videos, meet-ups, gossip, debate, collaboration and content creation. These requirements of today's generation are most fulfilled through digital platforms. However, traditional media like newspapers, radio, and television are not old school; they reach out to a mass audience and play an upper role in effective marketing. This study aims at analysing these tools for their effectiveness. The objectives are fulfilled through a consumer survey. This study will bring out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts

  5. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... analytical bias based on biological variation. RESULTS: Seventy five percent of the twenty analytes achieved on two COBASs INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. DISCUSSION: Patient results applied in analytical quality performance...

  6. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.

  7. Practical applications of surface analytic tools in tribology

    Science.gov (United States)

    Ferrante, J.

    1980-01-01

    Many of the widely used tools currently available for surface analysis are described. Those which have the highest applicability for giving elemental and/or compound analysis for problems of interest in tribology and are truly surface sensitive (that is, probing less than 10 atomic layers) are presented. The latter group is evaluated in detail in terms of strengths and weaknesses. Emphasis is placed on post facto analysis of experiments performed under 'real' conditions (e.g., in air with lubricants). It is further indicated that such equipment could be used for screening and quality control.

  8. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

    We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drugs and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
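    The abstract mentions statistical disproportionality signal scores computed from drug-adverse-event co-occurrence counts. One widely used disproportionality measure is the proportional reporting ratio (PRR); the sketch below is a generic illustration with hypothetical counts, not the prototype's actual scoring method:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for one drug-event pair.
    a: records mentioning the drug AND the event
    b: records mentioning the drug without the event
    c: records mentioning the event without the drug
    d: records mentioning neither
    PRR > 1 suggests the event is reported disproportionately often
    with this drug; thresholds for signaling vary by convention."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical MeSH co-occurrence counts for illustration only
score = prr(a=20, b=180, c=100, d=9700)
print(f"PRR = {score:.1f}")
```

    In literature-based pharmacovigilance, the four counts would come from MeSH-indexed citation queries rather than spontaneous-report databases, but the disproportionality arithmetic is the same.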

  9. Median of patient results as a tool for assessment of analytical stability.

    Science.gov (United States)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft; Sölétormos, György

    2015-06-15

    In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable analytical bias based on biological variation. Seventy-five percent of the twenty analytes assayed on two COBAS INTEGRA 800 instruments performed in accordance with the optimum and with the desirable specifications for bias. Patient results applied in analytical quality performance control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. Patient medians in the monthly monitoring of analytical stability in laboratory medicine are an inexpensive, simple and reliable tool to monitor the steadiness of the analytical practice. Copyright © 2015 Elsevier B.V. All rights reserved.
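    The monthly-median procedure described above can be sketched in a few lines: take the median of patient results per month, express its deviation from a long-term target as a percentage, and flag months whose deviation exceeds the allowable analytical bias. All numbers below (a sodium-like target of 140 mmol/L and a 0.9% allowable bias) are hypothetical, not taken from the paper:

```python
from statistics import median

def monthly_bias_check(results_by_month, target, allowable_bias_pct):
    """For each month, compare the median of patient results with a
    long-term target and flag months whose percentage deviation exceeds
    the allowable analytical bias (derived elsewhere from biological
    variation). Returns {month: (median, deviation_pct, flagged)}."""
    report = {}
    for month, results in results_by_month.items():
        m = median(results)
        dev_pct = 100.0 * (m - target) / target
        report[month] = (m, dev_pct, abs(dev_pct) > allowable_bias_pct)
    return report

# Hypothetical sodium results (mmol/L); in practice a month holds
# thousands of patient results, not five.
data = {"Jan": [138, 140, 141, 139, 140],
        "Feb": [142, 143, 144, 143, 142]}
report = monthly_bias_check(data, target=140.0, allowable_bias_pct=0.9)
```

    Here February's median (143 mmol/L, +2.1%) would be flagged while January's would not, mimicking how a drifting instrument shows up in patient medians.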

  10. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied for answering the target research question (the race course).

  11. Outlining an analytical framework for mapping research evaluation landscapes

    Energy Technology Data Exchange (ETDEWEB)

    Åström, F.

    2016-07-01

    This paper suggests an infrastructure perspective, as proposed by Star and Bowker (2006), as an analytical framework for studying the research evaluation landscape. An infrastructure is to be understood not as a concrete technology but as a system of contextual factors including ‘Actors/Stakeholders’, ‘Technical systems’, and ‘Evaluation practices’. How the framework can be operationalized is exemplified with previous and ongoing research, as well as by identifying gaps in current research. (Author)

  12. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance possibilities of laser desorption methods and to optimize coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals negative influence of particle spallation on MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary excited by 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7. 
The increase of p

  13. Positron spectroscopy as an analytical tool in material sciences

    International Nuclear Information System (INIS)

    Pujari, P.K.

    2010-01-01

    Full text: Positron annihilation spectroscopy has emerged as a powerful tool in material sciences due to its ability to provide information about the electron momentum distribution and electron density in a given medium. These features help in identifying the altered states of electronic rearrangement that one encounters in phase transitions. In addition, positrons prefer regions of lower electron density such as open-volume defects, i.e., vacancies or vacancy clusters in metals, alloys and semiconductors, or free volumes in molecular solids. Its sensitivity to defects is extremely high; e.g., it can detect a defect as small as a monovacancy at concentrations as low as parts per million (ppm). Innovative nuclear instrumentation has helped in getting chemical specificity at the annihilation site. For example, precipitates, embedded nanoparticles or element-decorated vacancies can now be easily identified. This presentation is structured to introduce the technique and provide a global perspective on areas of application. Specific examples on defect characterization, nanostructure-property correlations in polymers, and the advantages of elemental specificity by indexing the core electron momentum will be given. In addition, slow positron beam based studies on nanostructured materials as well as particle accelerator based positron spectroscopy for volumetric assay of defects in large engineering samples will be presented

  14. Ecotoxicity on a stick: A novel analytical tool for predicting the ecotoxicity of petroleum contaminated samples

    International Nuclear Information System (INIS)

    Parkerton, T.F.; Stone, M.A.

    1995-01-01

    Hydrocarbons generally elicit toxicity via a nonpolar narcotic mechanism. Recent research suggests that chemicals acting by this mode invoke ecotoxicity when the molar concentration in organism lipid exceeds a critical threshold. Since the ecotoxicity of nonpolar narcotic mixtures appears to be additive, the ecotoxicity of hydrocarbon mixtures thus depends upon: (1) the partitioning of the individual hydrocarbons comprising the mixture from the environment to lipids and (2) the total molar sum of the constituent hydrocarbons in lipids. These insights have led previous investigators to advance the concept of biomimetic extraction as a novel tool for assessing potential narcosis-type or baseline ecotoxicity in aqueous samples. Drawing from this earlier work, the authors have developed a method to quantify Bioavailable Petroleum Hydrocarbons (BPHs) in hydrocarbon-contaminated aqueous and soil/sediment samples. A sample is equilibrated with a solid phase microextraction (SPME) fiber that serves as a surrogate for organism lipids. The total moles of hydrocarbons that partition to the SPME fiber is then quantified using a simple GC/FID procedure. Research conducted to support the development and initial validation of this method will be presented. Results suggest that BPH analyses provide a promising, cost-effective approach for predicting the ecotoxicity of environmental samples contaminated with hydrocarbon mixtures. Consequently, BPH analyses may provide a valuable analytical screening tool for ecotoxicity assessment in product and effluent testing, environmental monitoring and site remediation applications

  15. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available. Background: There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods: EPIPOI is freely available software developed in Matlab (The MathWorks Inc.) that runs on both PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results: EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions: EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit from a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
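    One standard way to extract the seasonal parameters a tool like EPIPOI exposes (mean level, amplitude and timing of the annual cycle) is a least-squares fit of a single annual harmonic to the series. The sketch below is a generic illustration of that idea, not EPIPOI's actual Matlab implementation:

```python
import math

def annual_harmonic(series, period=12):
    """Least-squares fit of one annual sinusoid to an evenly sampled
    series (e.g. monthly case counts). Returns (mean, amplitude,
    peak position within the period). For a whole number of periods
    the Fourier-coefficient formulas below are the exact fit."""
    n = len(series)
    mean = sum(series) / n
    a = sum((x - mean) * math.cos(2 * math.pi * t / period)
            for t, x in enumerate(series)) * 2 / n
    b = sum((x - mean) * math.sin(2 * math.pi * t / period)
            for t, x in enumerate(series)) * 2 / n
    amplitude = math.hypot(a, b)
    phase = math.atan2(b, a)                       # radians
    peak = (phase * period / (2 * math.pi)) % period
    return mean, amplitude, peak

# Synthetic 3-year monthly series with mean 10, amplitude 4,
# peaking at month index 1
series = [10 + 4 * math.cos(2 * math.pi * (t - 1) / 12) for t in range(36)]
m, amp, peak = annual_harmonic(series)
```

    Repeating this fit per geographic region gives exactly the kind of comparable seasonal parameters (amplitude, peak timing) that the abstract describes inspecting across regions.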

  16. Software tool for portal dosimetry research.

    Science.gov (United States)

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS, ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves, iii) interpolate correction factors from look-up tables, iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file, v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio .NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects.
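    Steps ii)-iv) amount to: compute the MLC-shielded beam-on fraction per pixel, interpolate a correction factor from a look-up table, and multiply the predicted image pixel-wise. A minimal sketch in Python (the paper's tool is C#) with a hypothetical correction-factor table; real factors depend on the specific EPID and beam:

```python
def interp(x, table):
    """Piecewise-linear interpolation in a sorted (x, y) look-up table,
    clamped at the ends."""
    xs = [p[0] for p in table]
    ys = [p[1] for p in table]
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    for i in range(1, len(xs)):
        if x <= xs[i]:
            t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
            return ys[i - 1] + t * (ys[i] - ys[i - 1])

def correct_pdip(pdip, shielded_fraction, cf_table):
    """Corrected PDIP = original PDIP x correction factor, where the
    factor depends on the fraction of beam-on time each pixel is
    shielded by MLC leaves."""
    return [[p * interp(f, cf_table) for p, f in zip(prow, frow)]
            for prow, frow in zip(pdip, shielded_fraction)]

# Hypothetical look-up table: (shielded fraction, correction factor)
cf_table = [(0.0, 1.00), (0.5, 0.90), (1.0, 0.80)]
pdip = [[100.0, 100.0], [100.0, 100.0]]   # predicted image (arb. units)
frac = [[0.0, 0.25], [0.5, 1.0]]          # MLC-shielded fractions
corrected = correct_pdip(pdip, frac, cf_table)
```

    A fully shielded pixel here is scaled by the transmitted-radiation factor (0.80), an open pixel is left unchanged, and partially shielded pixels are scaled by interpolated values in between.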

  17. Decision analytic tools for resolving uncertainty in the energy debate

    International Nuclear Information System (INIS)

    Renn, O.

    1986-01-01

    Within the context of a Social Compatibility Study on Energy Supply Systems, a complex decision-making model was used to incorporate scientific expertise and public participation into the process of policy formulation and evaluation. The study was directed by the program group 'Technology and Society' of the Nuclear Research Centre Juelich. It consisted of three parts: First, with the aid of value tree analysis, the whole spectrum of concerns and dimensions relevant to the energy issue in Germany was collected and structured in a combined value tree representing the values and criteria of nine important interest groups in the Federal Republic of Germany. Second, the revealed criteria were translated into indicators. Four different energy scenarios were evaluated with respect to each indicator, making use of physical measurement, literature review and expert surveys. Third, the weights for each indicator were elicited by interviewing randomly chosen citizens. Those citizens were informed about the scenarios and their impacts prior to the weighting process in a four-day seminar. As a result, most citizens favoured more moderate energy scenarios, assigning high priority to energy conservation. Nuclear energy was perceived as a necessary energy source in the long run, but should be restricted to meet only the demand that cannot be covered by other energy means. (orig.)

  18. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three-axis machine tool. First, an error synthesis model is constructed to evaluate the position volumetric error due to the geometric errors, and an output variable is defined as the magnitude of the position volumetric error. Next, a global sensitivity analysis is executed using the analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
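
A crude numerical counterpart of such a sensitivity analysis can be sketched by Monte Carlo sampling of a toy error-synthesis model. The coefficients below are illustrative, and the one-at-a-time variance-drop index is a simplification of a full variance-based (Sobol-type) analysis, not the paper's analytical method:

```python
import math
import random

random.seed(0)

# Toy error synthesis model: position error components as weighted sums
# of three geometric error sources (coefficients are illustrative).
def volumetric_error(g1, g2, g3):
    ex = 1.0 * g1 + 0.2 * g3
    ey = 0.5 * g2
    ez = 0.1 * g1 + 0.8 * g3
    return math.sqrt(ex**2 + ey**2 + ez**2)

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

N = 20000
samples = [[random.gauss(0, 1) for _ in range(3)] for _ in range(N)]
total = [volumetric_error(*s) for s in samples]

# Sensitivity proxy: freeze one input at 0 and measure how much the
# output variance drops (one-at-a-time, not full Sobol indices).
indices = []
for i in range(3):
    frozen = [volumetric_error(*(0 if j == i else s[j] for j in range(3)))
              for s in samples]
    indices.append(1 - variance(frozen) / variance(total))
```

In this toy model the first error source carries the largest coefficients and therefore the largest index, while the second is least influential.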

  19. Drawing as a user experience research tool

    DEFF Research Database (Denmark)

    Fleury, Alexandre

    2011-01-01

    such previous work, two case studies are presented, in which drawings helped investigate the relationship between media technology users and two specific devices, namely television and mobile phones. The experiment generated useful data and opened for further consideration of the method as an appropriate HCI...... research tool....

  20. Research Positioning & Trend Identification : a data-analytics toolbox

    NARCIS (Netherlands)

    Ranjbarsahraei, Bijan; Negenborn, R.R.

    2017-01-01

    AIDA was an initiative of TU Delft scientific staff in cooperation with TU Delft Library and Leiden University’s Centre for Science and Technology Studies (CWTS). The aim of the AIDA project was to provide TU Delft researchers and faculties with easy-to-use tools for research positioning and trend

  1. VisualUrText: A Text Analytics Tool for Unstructured Textual Data

    Science.gov (United States)

    Zainol, Zuraini; Jaymes, Mohd T. H.; Nohuddin, Puteri N. E.

    2018-05-01

    The amount of unstructured text on the Internet is growing tremendously. Text repositories come from Web 2.0, business intelligence, and social networking applications. It is also believed that 80-90% of future data growth will be in the form of unstructured text databases that may potentially contain interesting patterns and trends. Text mining is a well-known technique for discovering interesting patterns and trends, i.e., non-trivial knowledge, from massive unstructured text data. Text mining covers multidisciplinary fields involving information retrieval (IR), text analysis, natural language processing (NLP), data mining, machine learning, statistics, and computational linguistics. This paper discusses the development of a text analytics tool that is proficient in extracting, processing, and analyzing unstructured text data, and in visualizing the cleaned text data in multiple forms such as a Document Term Matrix (DTM), frequency graph, network analysis graph, word cloud, and dendrogram. This tool, VisualUrText, is developed to assist students and researchers in extracting interesting patterns and trends in document analysis.
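
The Document Term Matrix mentioned above is straightforward to construct; a minimal sketch with naive whitespace tokenization (real tools add stop-word removal, stemming, and so on):

```python
from collections import Counter

# Two toy documents; rows of the DTM are documents, columns are terms.
docs = [
    "text mining finds patterns in text",
    "mining unstructured text data",
]

tokenized = [d.lower().split() for d in docs]          # naive tokenizer
vocab = sorted(set(term for doc in tokenized for term in doc))
dtm = [[Counter(doc)[term] for term in vocab] for doc in tokenized]
```

Each DTM row is then the raw-count vector that frequency graphs, word clouds, and dendrograms are computed from.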

  2. Applied regression analysis a research tool

    CERN Document Server

    Pantula, Sastry; Dickey, David

    1998-01-01

    Least squares estimation, when used appropriately, is a powerful research tool. A deeper understanding of the regression concepts is essential for achieving optimal benefits from a least squares analysis. This book builds on the fundamentals of statistical methods and provides appropriate concepts that will allow a scientist to use least squares as an effective research tool. Applied Regression Analysis is aimed at the scientist who wishes to gain a working knowledge of regression analysis. The basic purpose of this book is to develop an understanding of least squares and related statistical methods without becoming excessively mathematical. It is the outgrowth of more than 30 years of consulting experience with scientists and many years of teaching an applied regression course to graduate students. Applied Regression Analysis serves as an excellent text for a service course on regression for non-statisticians and as a reference for researchers. It also provides a bridge between a two-semester introduction to...
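
The single-predictor case at the heart of least squares can be shown in a few lines using the closed-form normal equations; the data below are invented for illustration:

```python
# Ordinary least squares for one predictor: slope = Sxy / Sxx,
# intercept = ybar - slope * xbar (invented data).
x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 4.1, 5.9, 8.1, 9.8]

n = len(x)
mx, my = sum(x) / n, sum(y) / n
sxx = sum((xi - mx) ** 2 for xi in x)
sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))

slope = sxy / sxx
intercept = my - slope * mx
```

For multiple predictors the same idea generalizes to solving the normal equations X'Xb = X'y.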

  3. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons about using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable in improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
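
The observation that "not all research is valuable in improving decision making" is often quantified as the expected value of perfect information (EVPI): research is only worth funding when resolving the uncertainty could change the optimal action. A toy sketch with invented payoffs, not the case-study data:

```python
# Toy EVPI: payoffs of two conservation actions under two landscape
# states (all numbers are illustrative).
p_wet = 0.6                        # prior probability the state is "wet"
payoff = {                         # payoff[action][state]
    "restore": {"wet": 10, "dry": 2},
    "protect": {"wet": 6,  "dry": 7},
}

def expected(action):
    return p_wet * payoff[action]["wet"] + (1 - p_wet) * payoff[action]["dry"]

# Best we can do now vs. best if the state were known before deciding.
best_under_uncertainty = max(expected(a) for a in payoff)
best_with_perfect_info = (p_wet * max(payoff[a]["wet"] for a in payoff)
                          + (1 - p_wet) * max(payoff[a]["dry"] for a in payoff))
evpi = best_with_perfect_info - best_under_uncertainty
```

An EVPI of zero would mean the uncertainty does not affect the optimal decision, so research that resolves it has no decision value.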

  4. Process analytical technology (PAT) tools for the cultivation step in biopharmaceutical production

    NARCIS (Netherlands)

    Streefland, M.; Martens, D.E.; Beuvery, E.C.; Wijffels, R.H.

    2013-01-01

    The process analytical technology (PAT) initiative is now 10 years old. This has resulted in the development of many tools and software packages dedicated to PAT application on pharmaceutical processes. However, most applications are restricted to small molecule drugs, mainly for the relatively

  5. APPLICATION OF FUZZY ANALYTIC HIERARCHY PROCESS TO BUILDING RESEARCH TEAMS

    Directory of Open Access Journals (Sweden)

    Karol DĄBROWSKI

    2016-01-01

    Full Text Available Building teams has a fundamental impact on the execution of research and development projects. The teams appointed for the needs of given projects are built from individuals both inside and outside of the organization. Knowledge is not only a product available on the market but also an intangible resource affecting an organization's internal and external processes. Thus it is vitally important for businesses and scientific research facilities to effectively manage knowledge within project teams. The article presents a proposal to use the Fuzzy AHP (Analytic Hierarchy Process) and ANFIS (Adaptive Neuro Fuzzy Inference System) methods in building working groups for R&D projects on the basis of employees' skills.

  6. Application of Fuzzy Analytic Hierarchy Process to Building Research Teams

    Science.gov (United States)

    Dąbrowski, Karol; Skrzypek, Katarzyna

    2016-03-01

    Building teams has a fundamental impact on the execution of research and development projects. The teams appointed for the needs of given projects are built from individuals both inside and outside of the organization. Knowledge is not only a product available on the market but also an intangible resource affecting an organization's internal and external processes. Thus it is vitally important for businesses and scientific research facilities to effectively manage knowledge within project teams. The article presents a proposal to use Fuzzy AHP (Analytic Hierarchy Process) and ANFIS (Adaptive Neuro Fuzzy Inference System) methods in building working groups for R&D projects on the basis of employees' skills.
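
The AHP step in both records derives priority weights from a pairwise-comparison matrix. A minimal (non-fuzzy) sketch using the row geometric-mean approximation; the criteria names and judgments are hypothetical:

```python
# 3x3 AHP pairwise-comparison matrix: a[i][j] says how much more
# important criterion i is than criterion j (Saaty's 1-9 scale).
a = [
    [1.0, 3.0, 5.0],    # hypothetical criterion: "experience"
    [1/3, 1.0, 2.0],    # "publications"
    [1/5, 1/2, 1.0],    # "availability"
]

# Row geometric means, normalized to sum to 1, approximate the
# principal eigenvector of the matrix.
gm = [(row[0] * row[1] * row[2]) ** (1 / 3) for row in a]
total = sum(gm)
weights = [g / total for g in gm]
```

The resulting weights can then be used to score candidate team members; the fuzzy variant replaces the crisp 1-9 judgments with fuzzy numbers.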

  7. Operations other than war: Requirements for analysis tools research report

    Energy Technology Data Exchange (ETDEWEB)

    Hartley, D.S. III

    1996-12-01

    This report documents the research effort to determine the requirements for new or improved analysis tools to support decisions at the strategic and operational levels for military Operations Other than War (OOTW). The work was performed for the Commander in Chief, U.S. Pacific Command (USCINCPAC). The data collection was based on workshops attended by experts in OOTWs: analysis personnel from each of the Combatant Commands, the Services, the Office of the Secretary of Defense (OSD), the Joint Staff, and other knowledgeable personnel. Further data were gathered from other workshops and conferences and from the literature. The results of this research begin with the creation of a taxonomy of OOTWs: categories of operations, attributes of operations, and tasks requiring analytical support. The tasks are connected to the Joint Staff's Universal Joint Task List (UJTL). Historical OOTWs are analyzed to produce frequency distributions by category and responsible CINC. The analysis products are synthesized into a list of requirements for analytical tools and definitions of the requirements. The report concludes with a timeline or roadmap for satisfying the requirements.

  8. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field-tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were high quality (0.81; evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26, df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  9. Application of quality improvement analytic methodology in emergency medicine research: A comparative evaluation.

    Science.gov (United States)

    Harries, Bruce; Filiatrault, Lyne; Abu-Laban, Riyad B

    2018-05-30

    Quality improvement (QI) analytic methodology is rarely encountered in the emergency medicine literature. We sought to comparatively apply QI design and analysis techniques to an existing data set, and discuss these techniques as an alternative to standard research methodology for evaluating a change in a process of care. We used data from a previously published randomized controlled trial on triage-nurse initiated radiography using the Ottawa ankle rules (OAR). QI analytic tools were applied to the data set from this study and evaluated comparatively against the original standard research methodology. The original study concluded that triage nurse-initiated radiographs led to a statistically significant decrease in mean emergency department length of stay. Using QI analytic methodology, we applied control charts and interpreted the results using established methods that preserved the time sequence of the data. This analysis found a compelling signal of a positive treatment effect that would have been identified after the enrolment of 58% of the original study sample, and in the 6th month of this 11-month study. Our comparative analysis demonstrates some of the potential benefits of QI analytic methodology. We found that had this approach been used in the original study, insights regarding the benefits of nurse-initiated radiography using the OAR would have been achieved earlier, and thus potentially at a lower cost. In situations where the overarching aim is to accelerate implementation of practice improvement to benefit future patients, we believe that increased consideration should be given to the use of QI analytic methodology.
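
The control-chart analysis applied above can be illustrated with an individuals (XmR) chart, whose limits are the centre line plus or minus 2.66 times the mean moving range. The data below are invented, not the study's length-of-stay series:

```python
# Individuals (XmR) control chart: centre line +/- 2.66 * mean moving
# range; a point outside the limits is a special-cause signal.
values = [5.1, 4.8, 5.3, 5.0, 4.7, 5.2, 4.9, 5.1, 2.0, 5.0]

centre = sum(values) / len(values)
moving_ranges = [abs(b - a) for a, b in zip(values, values[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

ucl = centre + 2.66 * mr_bar    # upper control limit
lcl = centre - 2.66 * mr_bar    # lower control limit
signals = [i for i, v in enumerate(values) if v > ucl or v < lcl]
```

Because the chart preserves the time order of the data, a sustained shift (e.g. after introducing nurse-initiated radiography) can be detected as it happens rather than only at the end of the study.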

  10. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool for selecting the most appropriate analytical procedures according to their greenness or their analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene is determined in sediment samples. The HDT can be used as a good decision support tool for choosing the proper analytical procedure with respect to green analytical chemistry principles and analytical performance merits.
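
The partial-order relation underlying a Hasse diagram is simple to state: one procedure dominates another when it is at least as good on every variable and strictly better on at least one; otherwise the two are incomparable. A sketch with invented scores (higher = better), not the paper's 11-variable dataset:

```python
# Invented scores for three hypothetical procedures on three criteria.
procedures = {
    "GC-MS":    (3, 2, 4),
    "HPLC-FLD": (3, 3, 4),
    "LC-MS":    (2, 3, 1),
}

def dominates(a, b):
    """a dominates b: >= everywhere and > somewhere."""
    return (all(x >= y for x, y in zip(a, b))
            and any(x > y for x, y in zip(a, b)))

# All ordered pairs (p, q) with p dominating q.
dominance = [(p, q) for p in procedures for q in procedures
             if p != q and dominates(procedures[p], procedures[q])]
```

Incomparable pairs (neither dominates) are exactly what the Hasse diagram leaves unconnected, which is why the "green" and metrological orderings can disagree.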

  11. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the lack of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and those still needed, to support ESDA.

  12. FOSS Tools for Research Data Management

    Science.gov (United States)

    Stender, Vivien; Jankowski, Cedric; Hammitzsch, Martin; Wächter, Joachim

    2017-04-01

    Established initiatives and organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. These infrastructures aim to provide services that support scientists in searching, visualizing, and accessing data, in collaborating and exchanging information, and in publishing data and other results. In this regard, Research Data Management (RDM) gains importance and thus requires the support of appropriate tools integrated into these infrastructures. Different projects provide their own solutions to manage research data. Within two projects, however - SUMARIO for land and water management and TERENO for environmental monitoring - solutions to manage research data have been developed based on Free and Open Source Software (FOSS) components. The resulting framework provides essential components for harvesting, storing, and documenting research data, as well as for discovering, visualizing, and downloading these data on the basis of standardized services, drawing considerably on the enhanced data management approaches of Spatial Data Infrastructures (SDIs). In order to fully exploit the potential of these developments for enhancing data management in the geosciences, publishing the software components, e.g. via GitHub, is not sufficient. We will use our experience to move these solutions into the cloud, e.g. as PaaS or SaaS offerings. Our contribution will present data management solutions for the geosciences developed in the two projects. A construction kit of FOSS components builds the backbone for the assembly and implementation of project-specific platforms. Furthermore, an approach is presented to stimulate the reuse of FOSS RDM solutions with cloud concepts. In further projects, specific RDM platforms can then be set up much faster, customized to individual needs, and tools can be added at run time.

  13. Nuclear and nuclear related analytical methods applied in environmental research

    International Nuclear Information System (INIS)

    Popescu, Ion V.; Gheboianu, Anca; Bancuta, Iulian; Cimpoca, G. V; Stihi, Claudia; Radulescu, Cristiana; Oros Calin; Frontasyeva, Marina; Petre, Marian; Dulama, Ioana; Vlaicu, G.

    2010-01-01

    Nuclear analytical methods can be used in environmental research activities such as water quality assessment, pesticide residue analysis, global climatic change (transboundary) studies, pollution, and remediation. Heavy metal pollution is a problem associated with areas of intensive industrial activity. In this work the moss biomonitoring technique was employed to study atmospheric deposition in Dambovita County, Romania. Complementary nuclear and atomic analytical methods were also used: Neutron Activation Analysis (NAA), Atomic Absorption Spectrometry (AAS), and Inductively Coupled Plasma Atomic Emission Spectrometry (ICP-AES). These high-sensitivity analytical methods were used to determine the chemical composition of moss samples placed in different areas with different industrial pollution sources. The concentrations of Cr, Fe, Mn, Ni, and Zn were determined. The concentration of Fe in the same samples was determined using all of these methods, and very good agreement was obtained within statistical limits, which demonstrates that these analytical methods can be applied to a large spectrum of environmental samples with consistent results. (authors)

  14. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous-energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe two new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  15. Social media analytics and research testbed (SMART): Exploring spatiotemporal patterns of human dynamics with geo-targeted social media messages

    Directory of Open Access Journals (Sweden)

    Jiue-An Yang

    2016-06-01

    Full Text Available The multilevel model of meme diffusion conceptualizes how mediated messages diffuse over time and space. As a pilot application implementing meme diffusion, we developed the Social Media Analytics and Research Testbed (SMART) to monitor Twitter messages and track the diffusion of information in and across different cities and geographic regions. SMART is an online geo-targeted search and analytics tool, including an automatic data processing procedure at the backend and an interactive frontend user interface. SMART is initially designed to facilitate (1) searching and geo-locating tweet topics and terms in different cities and geographic regions; (2) filtering noise from raw data (such as removing redundant retweets and using machine learning methods to improve precision); (3) analyzing social media data from a spatiotemporal perspective; and (4) visualizing social media data in diagnostic ways (such as weekly and monthly trends, trend maps, top media, top retweets, top mentions, or top hashtags). SMART provides researchers and domain experts with a tool that can efficiently facilitate the refinement, formalization, and testing of research hypotheses or questions. Three case studies (flu outbreaks, the Ebola epidemic, and marijuana legalization) are introduced to illustrate how the predictions of meme diffusion can be examined and to demonstrate the potential and key functions of SMART.

  16. Chemometrics-based process analytical technology (PAT) tools: applications and adaptation in pharmaceutical and biopharmaceutical industries.

    Science.gov (United States)

    Challa, Shruthi; Potumarthi, Ravichandra

    2013-01-01

    Process analytical technology (PAT) is used to monitor and control critical process parameters in raw materials and in-process products to maintain critical quality attributes and build quality into the product. Process analytical technology can be successfully implemented in the pharmaceutical and biopharmaceutical industries not only to impart quality into the products but also to prevent out-of-specification results and improve productivity. PAT implementation eliminates the drawbacks of traditional methods, which involve excessive sampling, and facilitates rapid testing through direct sampling without destruction of the sample. However, to successfully adapt PAT tools to pharmaceutical and biopharmaceutical environments, a thorough understanding of the process is needed, along with mathematical and statistical tools to analyze the large multidimensional spectral data generated by PAT tools. Chemometrics is a chemical discipline which incorporates both statistical and mathematical methods to obtain and analyze relevant information from PAT spectral tools. Applications of commonly used PAT tools in combination with appropriate chemometric methods, along with their advantages and working principles, are discussed. Finally, the systematic application of PAT tools in a biopharmaceutical environment to control critical process parameters for achieving product quality is diagrammatically represented.
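
A typical chemometric workhorse paired with spectral PAT tools is principal component analysis. A minimal first-component sketch via power iteration on the covariance matrix, with invented "spectra" (real chemometric practice would use a full PCA/PLS library and preprocessing such as scatter correction):

```python
import math

# 4 samples x 3 "wavelengths" of invented, nearly collinear spectra.
X = [
    [1.0, 2.0, 3.0],
    [2.0, 4.0, 6.1],
    [3.0, 6.1, 9.0],
    [4.0, 8.0, 12.2],
]

n, p = len(X), len(X[0])

# Mean-centre each column, as PCA requires.
means = [sum(row[j] for row in X) / n for j in range(p)]
Xc = [[row[j] - means[j] for j in range(p)] for row in X]

# Sample covariance matrix (divisor n - 1).
C = [[sum(Xc[k][i] * Xc[k][j] for k in range(n)) / (n - 1)
      for j in range(p)] for i in range(p)]

# Power iteration: converges to the dominant eigenvector, i.e. the
# first principal component (loading vector).
v = [1.0] * p
for _ in range(200):
    w = [sum(C[i][j] * v[j] for j in range(p)) for i in range(p)]
    norm = math.sqrt(sum(x * x for x in w))
    v = [x / norm for x in w]
```

Projecting each centred spectrum onto v gives the score used to monitor the process in a single dimension.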

  17. Straightforward statistics understanding the tools of research

    CERN Document Server

    Geher, Glenn

    2014-01-01

    Straightforward Statistics: Understanding the Tools of Research is a clear and direct introduction to statistics for the social, behavioral, and life sciences. Based on the author's extensive experience teaching undergraduate statistics, this book provides a narrative presentation of the core principles that provide the foundation for modern-day statistics. With step-by-step guidance on the nuts and bolts of computing these statistics, the book includes detailed tutorials on how to use state-of-the-art software, SPSS, to compute the basic statistics employed in modern academic and applied research.

  18. Contextual and Analytic Qualities of Research Methods Exemplified in Research on Teaching

    Science.gov (United States)

    Svensson, Lennart; Doumas, Kyriaki

    2013-01-01

    The aim of the present article is to discuss contextual and analytic qualities of research methods. The arguments are specified in relation to research on teaching. A specific investigation is used as an example to illustrate the general methodological approach. It is argued that research methods should be carefully grounded in an understanding of…

  19. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride (CBN) tools have been reviewed, in an attempt to utilize appropriate empirical and analytical models. Validation of steady-state flank and crater wear models, Usui's wear model, forces due to oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that an appropriate model can be selected according to user requirements in hard turning.
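
For reference, the Usui wear model validated in such reviews is commonly cited in the rate form below (the constants are empirical and must be fitted for each tool-workpiece pair; this is the commonly quoted form, not necessarily the exact notation of the paper):

```latex
\frac{dW}{dt} = C_1\,\sigma_t\,V_s \exp\!\left(-\frac{C_2}{\theta}\right)
```

where W is the wear volume per unit contact area, σt is the normal stress on the tool face, Vs is the sliding velocity at the tool-chip interface, θ is the absolute interface temperature, and C1, C2 are experimentally determined constants.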

  20. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple scales of space and time. To address this need, we have developed a suite of analytical tools to support an integrated, data-driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available, along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic federal policy decisions.

  1. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive industry, and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption and performance analysis, identification of emerging technologies, and investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes and allowing readers to implement the procedures and applications presented.

  2. Basic Research Tools for Earthworm Ecology

    Directory of Open Access Journals (Sweden)

    Kevin R. Butt

    2010-01-01

    Full Text Available Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.

  3. Basic Research Tools for Earthworm Ecology

    International Nuclear Information System (INIS)

    Butt, K.R.; Grigoropoulou, N.

    2010-01-01

    Earthworms are responsible for soil development, recycling organic matter and form a vital component within many food webs. For these and other reasons earthworms are worthy of investigation. Many technologically-enhanced approaches have been used within earthworm-focused research. These have their place, may be a development of existing practices or bring techniques from other fields. Nevertheless, let us not overlook the fact that much can still be learned through utilisation of more basic approaches which have been used for some time. New does not always equate to better. Information on community composition within an area and specific population densities can be learned using simple collection techniques, and burrowing behaviour can be determined from pits, resin-insertion or simple mesocosms. Life history studies can be achieved through maintenance of relatively simple cultures. Behavioural observations can be undertaken by direct observation or with low-cost webcam usage. Applied aspects of earthworm research can also be achieved through use of simple techniques to enhance population development and even population dynamics can be directly addressed with use of relatively inexpensive, effective marking techniques. This paper seeks to demonstrate that good quality research in this sphere can result from appropriate application of relatively simple research tools.

  4. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged that indicates that different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of that particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  5. ANALYTIC HIERARCHY PROCESS: AN APPLICATION IN GREEN BUILDING MARKET RESEARCH

    Directory of Open Access Journals (Sweden)

    Sharmin Attaran

    2013-01-01

    Full Text Available Sustainability has become a necessity in the building industry. In recent years, as the general public has become more informed and aware of sustainability-related issues, they are becoming major players in the decision-making process regarding their built environment. However, there are still challenges with how sustainability is communicated to occupants and owners of buildings. As the global economic crisis continues, the marketing of green buildings needs to be refined to communicate the lifetime benefits of sustainability. One way to develop effective marketing strategies is to understand what occupants value most among the many aspects of green buildings, and thus to develop focused marketing solutions. The authors present a conceptual methodology using the Analytic Hierarchy Process to identify consumer rankings and weights of a major green building rating system’s categories. The authors use sample non-representative data to illustrate the proposed methodology, while sharing preliminary qualitative data from the research in progress.
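As an illustration of the weighting step the abstract describes, the following sketch derives Analytic Hierarchy Process priority weights from a pairwise-comparison matrix via the principal eigenvector (Saaty's classic method). The category names and judgment values are hypothetical, not taken from the study.

```python
import numpy as np

# Hypothetical pairwise-comparison matrix for three green-building
# categories (say Energy, Water, Materials); values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],   # Energy vs Energy, Water, Materials
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# The principal right eigenvector of A gives the AHP priority weights.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = eigvecs[:, k].real
w = w / w.sum()                      # normalise so the weights sum to 1

# Consistency ratio: CR = ((lambda_max - n) / (n - 1)) / RI,
# using Saaty's random index RI = 0.58 for n = 3. CR < 0.1 is acceptable.
n = A.shape[0]
lambda_max = eigvals.real[k]
CR = (lambda_max - n) / (n - 1) / 0.58

print(w, CR)
```

With these judgments the first category dominates; in a real survey the matrix would be filled from aggregated respondent comparisons.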

  6. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
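The forecasting component such a tool needs can, at its simplest, be single exponential smoothing. The sketch below is our own illustration, not the study's prototype, and the daily throughput figures are hypothetical.

```python
# Single exponential smoothing: the one-step-ahead forecast is a
# running blend of the latest observation and the previous level.
def ses_forecast(series, alpha=0.4):
    """Return the one-step-ahead forecast after smoothing the series."""
    level = series[0]
    for x in series[1:]:
        level = alpha * x + (1 - alpha) * level
    return level

daily_exams = [132, 140, 128, 150, 145, 138, 155]  # hypothetical counts
print(round(ses_forecast(daily_exams), 1))
```

Larger `alpha` tracks recent throughput more aggressively; smaller values smooth out day-to-day noise.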

  7. Telerehabilitation: Policy Issues and Research Tools

    Directory of Open Access Journals (Sweden)

    Katherine D. Seelman

    2009-09-01

    Full Text Available The importance of public policy as a complementary framework for telehealth, telemedicine, and by association telerehabilitation, has been recognized by a number of experts. The purpose of this paper is to review literature on telerehabilitation (TR policy and research methodology issues in order to report on the current state of the science and make recommendations about future research needs. An extensive literature search was implemented using search terms grouped into main topics of telerehabilitation, policy, population of users, and policy specific issues such as cost and reimbursement. The availability of rigorous and valid evidence-based cost studies emerged as a major challenge to the field. Existing cost studies provided evidence that telehomecare may be a promising application area for TR. Cost studies also indicated that telepsychiatry is a promising telepractice area. The literature did not reference the International Classification of Functioning, Disability and Health (ICF. Rigorous and comprehensive TR assessment and evaluation tools for outcome studies are paramount to generating confidence among providers, payers, clinicians and end users. In order to evaluate consumer satisfaction and participation, assessment criteria must include medical, functional and quality of life items such as assistive technology and environmental factors. Keywords: Telerehabilitation, Telehomecare, Telepsychiatry, Telepractice

  8. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
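The system-level figures of merit the abstract names are linked by standard infrared-radiometry definitions. The sketch below is an assumption-laden illustration of those textbook relations, not ATTIRE's actual implementation, and the numbers are made up.

```python
# Textbook figure-of-merit relations for a thermal IR sensor:
#   SNR  = L / NER          (scene radiance over noise-equivalent radiance)
#   NETD = NER / (dL/dT)    (radiometric noise mapped into temperature)
def snr(radiance, ner):
    """Signal-to-noise ratio from scene radiance and noise-equivalent radiance."""
    return radiance / ner

def netd(ner, dL_dT):
    """Noise-equivalent temperature difference in kelvin."""
    return ner / dL_dT

# Hypothetical values: radiance and NER in W m^-2 sr^-1 um^-1,
# and the radiance-temperature derivative per kelvin.
L, NER, dLdT = 6.0, 0.015, 0.10
print(snr(L, NER), netd(NER, dLdT))  # SNR = 400, NETD = 0.15 K
```

In a full simulation the NER itself would be derived from detector, optics and electronics subsystem parameters, which is the integration the package performs.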

  9. Research Data Alliance: Understanding Big Data Analytics Applications in Earth Science

    Science.gov (United States)

    Riedel, Morris; Ramachandran, Rahul; Baumann, Peter

    2014-01-01

    The Research Data Alliance (RDA) enables data to be shared across barriers through focused working groups and interest groups, formed of experts from around the world - from academia, industry and government. Its Big Data Analytics (BDA) interest group seeks to develop community-based recommendations on feasible data analytics approaches to address scientific community needs for utilizing large quantities of data. BDA seeks to analyze different scientific domain applications (e.g. earth science use cases) and their potential use of various big data analytics techniques. These techniques range from hardware deployment models to various algorithms (e.g. machine learning algorithms such as support vector machines for classification). A systematic classification of feasible combinations of analysis algorithms, analytical tools, data and resource characteristics and scientific queries will be covered in these recommendations. This contribution outlines initial parts of such a classification and recommendations in the specific context of the field of Earth Sciences. The lessons learned and experiences given are based on a survey of use cases, with a few use cases examined in detail.
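As a concrete instance of one technique the interest group cites, here is a minimal linear SVM trained by hinge-loss subgradient descent on synthetic two-class data. This is a generic sketch of the classifier family, not an RDA deliverable, and the data are fabricated.

```python
import numpy as np

# Two well-separated 2-D Gaussian clusters, labelled -1 and +1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y = np.array([-1] * 50 + [1] * 50)

# Linear SVM via stochastic subgradient descent on the hinge loss
# with L2 regularization (weight lam).
w, b, lam, lr = np.zeros(2), 0.0, 0.01, 0.1
for epoch in range(200):
    for i in rng.permutation(len(y)):
        margin = y[i] * (X[i] @ w + b)
        if margin < 1:                  # point inside margin: hinge is active
            w += lr * (y[i] * X[i] - lam * w)
            b += lr * y[i]
        else:                           # only the regularization gradient
            w -= lr * lam * w

pred = np.sign(X @ w + b)
accuracy = (pred == y).mean()
print(accuracy)
```

On data this cleanly separated the classifier should reach essentially perfect training accuracy; real Earth-science use cases would of course use held-out evaluation.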

  10. Development of the oil spill response cost-effectiveness analytical tool

    International Nuclear Information System (INIS)

    Etkin, D.S.; Welch, J.

    2005-01-01

    Decision-making during oil spill response operations or contingency planning requires balancing the need to remove as much oil as possible from the environment with the desire to minimize the impact of response operations on the environment they are intended to protect. This paper discussed the creation of a computer tool developed to help in planning and decision-making during response operations. The Oil Spill Response Cost-Effectiveness Analytical Tool (OSRCEAT) was developed to compare the costs of response with the benefits of response in both hypothetical and actual oil spills. The computer-based analytical tool can assist responders and contingency planners in decision-making processes as well as act as a basis of discussion in the evaluation of response options. Using inputs on spill parameters, location and response options, OSRCEAT can calculate response cost, costs of environmental and socioeconomic impacts of the oil spill and response impacts. Oil damages without any response are contrasted to oil damages with response, with expected improvements. Response damages are subtracted from the difference in damages with and without response in order to derive a more accurate response benefit. An OSRCEAT user can test various response options to compare potential benefits in order to maximize response benefit. OSRCEAT is best used to compare and contrast the relative benefits and costs of various response options. 50 refs., 19 tabs., 2 figs
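The benefit calculus described above (damages avoided by responding, minus the damage the response itself causes) can be sketched directly. This is our reading of the abstract, not OSRCEAT's actual cost model, and the dollar figures and option names are hypothetical.

```python
# Net response benefit, per the abstract: take the damage avoided by
# responding (damages without response minus damages with response)
# and subtract the damage caused by the response operations themselves.
def response_benefit(damages_no_response, damages_with_response, response_damages):
    """Damage avoided by the response, net of response-induced damage."""
    return (damages_no_response - damages_with_response) - response_damages

# Hypothetical figures (millions of dollars) for two response options.
options = {
    "mechanical recovery": response_benefit(50.0, 30.0, 4.0),
    "dispersants":         response_benefit(50.0, 22.0, 9.0),
}
best = max(options, key=options.get)
print(options, best)
```

Comparing options this way mirrors the tool's intended use: testing response strategies against each other to maximize net benefit.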

  11. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  12. VAO Tools Enhance CANDELS Research Productivity

    Science.gov (United States)

    Greene, Gretchen; Donley, J.; Rodney, S.; LAZIO, J.; Koekemoer, A. M.; Busko, I.; Hanisch, R. J.; VAO Team; CANDELS Team

    2013-01-01

    The formation of galaxies and their co-evolution with black holes through cosmic time are prominent areas in current extragalactic astronomy. New methods in science research are building upon collaborations between scientists and archive data centers which span large volumes of multi-wavelength and heterogeneous data. A successful example of this form of teamwork is demonstrated by the CANDELS (Cosmic Assembly Near-infrared Deep Extragalactic Legacy Survey) and the Virtual Astronomical Observatory (VAO) collaboration. The CANDELS project archive data provider services are registered and discoverable in the VAO through an innovative web based Data Discovery Tool, providing a drill down capability and cross-referencing with other co-spatially located astronomical catalogs, images and spectra. The CANDELS team is working together with the VAO to define new methods for analyzing Spectral Energy Distributions of galaxies containing active galactic nuclei, and helping to evolve advanced catalog matching methods for exploring images of variable depths, wavelengths and resolution. Through the publication of VOEvents, the CANDELS project is publishing data streams for newly discovered supernovae that are bright enough to be followed from the ground.

  13. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    Full Text Available In this paper, we present an analytical model for lathe tool displacement calculus in the manufacturing process. We present the methodology for the displacement calculus step by step, then implement these relations in a program for automatic calculus and draw the conclusions. Only the effects of the bending moments are taken into account, because these produce the largest displacements. The simplifying assumptions and the calculus relations for the displacements (linear and angular ones) are presented in an original way.
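A minimal sketch of bending-induced tool displacements, assuming the standard cantilever relations delta = F·L³/(3·E·I) and theta = F·L²/(2·E·I) rather than the paper's own model; the cutting force and tool dimensions below are hypothetical.

```python
# Cantilever approximation of a lathe tool of overhang L, Young's
# modulus E and second moment of area I, loaded by a transverse
# cutting force F at the tip.
def tool_displacements(F, L, E, I):
    delta = F * L**3 / (3 * E * I)   # linear tip deflection [m]
    theta = F * L**2 / (2 * E * I)   # angular tip rotation [rad]
    return delta, theta

# Hypothetical steel tool: F = 900 N, overhang L = 60 mm, E = 210 GPa,
# square 16 mm shank, so I = b*h^3/12.
I = 0.016 * 0.016**3 / 12
delta, theta = tool_displacements(900, 0.060, 210e9, I)
print(delta, theta)
```

Even this crude model shows why overhang matters: deflection grows with the cube of L, which is the kind of sensitivity an automatic calculus program makes easy to explore.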

  14. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

    ... Hasse (2015) and Holland (1998) have inspired our study; i.e., learning is conceptualized as a social phenomenon, implying that contexts of learning are decisive for learner identity. The concept of Figured Worlds is used to understand the development and the social constitution of emergent interactions ... (Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds, both as an analytic and methodological tool for empirical studies in kindergarten and school, based on data sources such as field notes ...

  15. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Information such as an estimated processing time or the likelihood of a system outage (abnormal behaviour) helps in monitoring system performance and predicting its next state. The current cyber-infrastructure presents computing conditions in which contention for resources among high-priority data analyses happens routinely, which can lead to significant workload and data-handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and the state of the system itself motivated the design of the built-in situational awareness analytic tools.
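At its simplest, the processing-time estimation described above is a regression of observed wall time against task characteristics. The sketch below fits a least-squares line to hypothetical task records; it is an illustration of the idea, not the ATLAS Production System code.

```python
import numpy as np

# Hypothetical past task records: input dataset size [TB] and the
# observed wall time [h]. Fit t = a*size + b by least squares.
sizes = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
times = np.array([2.1, 3.9, 8.2, 11.8, 16.1])

A = np.vstack([sizes, np.ones_like(sizes)]).T
(a, b), *_ = np.linalg.lstsq(A, times, rcond=None)

# Estimate the processing time of a new 5 TB task.
predicted = a * 5.0 + b
print(round(a, 2), round(predicted, 1))
```

A production predictor would add more features (queue, site, software release) and a model of outage probability, but the monitoring principle is the same: compare the prediction with the observed duration and flag large deviations.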

  16. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)
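Kalisphera computes the partial volume analytically, but the effect it reproduces can be approximated numerically: assign each voxel the fraction of its volume lying inside the sphere. The subsampling sketch below is our own illustration of that idea, not the authors' implementation.

```python
import numpy as np

def sphere_image(shape, centre, radius, sub=4):
    """3D raster of a sphere: each voxel holds its inside-volume fraction,
    estimated by testing a sub x sub x sub grid of points per voxel."""
    img = np.zeros(shape)
    offs = (np.arange(sub) + 0.5) / sub          # subsample positions in [0, 1)
    for idx in np.ndindex(shape):
        zz, yy, xx = np.meshgrid(idx[0] + offs, idx[1] + offs, idx[2] + offs,
                                 indexing="ij")
        d2 = (zz - centre[0])**2 + (yy - centre[1])**2 + (xx - centre[2])**2
        img[idx] = (d2 <= radius**2).mean()      # inside fraction of this voxel
    return img

img = sphere_image((16, 16, 16), centre=(8.0, 8.0, 8.0), radius=5.0)
volume = img.sum()                               # should approach 4/3*pi*r^3
print(round(volume, 1))
```

Because the ground-truth volume is known in closed form, such synthetic images are exactly what allows the metrological performance of image-based measurements to be benchmarked.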

  17. Some tooling for manufacturing research reactor fuel plates

    International Nuclear Information System (INIS)

    Knight, R.W.

    1999-01-01

    This paper will discuss some of the tooling necessary to manufacture aluminum-based research reactor fuel plates. Most of this tooling is intended for use in a high-production facility. Some of the tools shown have manufactured more than 150,000 pieces. The only maintenance has been sharpening. With careful design, tools can be made to accommodate the manufacture of several different fuel elements, thus, reducing tooling costs and maintaining tools that the operators are trained to use. An important feature is to design the tools using materials with good lasting quality. Good tools can increase return on investment. (author)

  19. Research opportunities in a reactor-based nuclear analytical laboratory

    International Nuclear Information System (INIS)

    Robinson, L.; Brown, D.H.

    1994-01-01

    Although considered by many to be a "mature" science, neutron activation analysis (NAA) continues to be a valuable elemental analysis tool. Examples of the applicability of NAA can be found in a variety of areas including archaeology, environmental science, epidemiology, forensic science, and materials science to name a few. The major components of neutron activation are sample preparation, irradiation, counting, and data analysis. Each one of these stages provides opportunities to share numerous practical and fundamental scientific principles with high school teachers. This paper presents an overview of these opportunities. In addition, a specific example of the collaboration with a high school teacher whose research involved the automation of a gamma-ray spectroscopy counting system using a laboratory robot is discussed

  20. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals, or quantum dots (QDs), are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding important new fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis, and food quality control. Despite the enormous variety of applications that have been developed, the automation of QD-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would allow particular features of the nanocrystals to be exploited: their versatile surface chemistry and ligand-binding ability, their aptitude to generate reactive species, and the possibility of encapsulation in different materials while retaining native luminescence, providing the means to implement renewable chemosensors or even to use more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  1. Thermo Techno Modern Analytical Equipment for Research and Industrial Laboratories

    Directory of Open Access Journals (Sweden)

    Khokhlov, S.V.

    2014-03-01

    Full Text Available A brief overview of some models of Thermo Techno analytical equipment and possible areas of their application is given. Thermo Techno was created in 2000 as part of the representative office of the international corporation Thermo Fisher Scientific, the world leader in manufacturing analytical equipment. Thermo Techno is unique in its integrated approach to solving the user's problems, which includes a series of steps: setting the analytical task, selection of effective analysis methods, sample delivery and preparation, and data transmission and archiving.

  2. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision...

  3. Streamlining Research by Using Existing Tools

    OpenAIRE

    Greene, Sarah M.; Baldwin, Laura-Mae; Dolor, Rowena J.; Thompson, Ella; Neale, Anne Victoria

    2011-01-01

    Over the past two decades, the health research enterprise has matured rapidly, and many recognize an urgent need to translate pertinent research results into practice, to help improve the quality, accessibility, and affordability of U.S. health care. Streamlining research operations would speed translation, particularly for multi-site collaborations. However, the culture of research discourages reusing or adapting existing resources or study materials. Too often, researchers start studies and...

  4. Forum Theater’s potential as a Research Tool

    Directory of Open Access Journals (Sweden)

    Andrea Calsamiglia Madurga

    2016-03-01

    Full Text Available We present a theoretical and epistemological reflection on Forum Theater's potential as a research tool. Our involvement in social action and research has led us to a double reflection: on the limitations of qualitative research in the study of affects, and on Forum Theater's potential as a tool to tackle research about affects. After some specific experiences in action research (qualitative research on romantic love and gender violence, and the creation process of the Forum Theater “Is it a joke?”), we explore Forum Theater's possibilities as a research tool within the feminist epistemology framework.

  5. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  7. Tools for Ephemeral Gully Erosion Process Research

    Science.gov (United States)

    Techniques to quantify ephemeral gully erosion have been identified by USDA Natural Resources Conservation Service (NRCS) as one of gaps in current erosion assessment tools. One reason that may have contributed to this technology gap is the difficulty to quantify changes in channel geometry to asses...

  8. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualizations with declarative querying capabilities is needed. We developed ProteoLens as a JAVA-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network represented data in standard Graph Modeling Language (GML) formats, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
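The two-phase design described above (define association rules mapping node IDs to data attributes, then apply them to annotate the network) can be mimicked in plain Python. Everything below is hypothetical, gene names and values included, and the dict merely stands in for the result of a database query; this is not ProteoLens's actual API.

```python
# A toy network: node IDs and edges between them.
network = {
    "nodes": ["TP53", "MDM2", "BRCA1"],
    "edges": [("TP53", "MDM2"), ("TP53", "BRCA1")],
}

# Phase 1: an association rule -- a mapping from node ID to a data
# attribute (here, a hypothetical expression level), standing in for
# the result of an SQL query against a tables/view.
expression_rule = {"TP53": 2.4, "MDM2": 0.7, "BRCA1": 1.1}

# Phase 2: apply the rule to build annotated network nodes.
annotated = {n: {"expression": expression_rule.get(n)} for n in network["nodes"]}
print(annotated["TP53"])
```

Decoupling the rule from its application is what lets the same network be re-annotated against different attributes (scores, synonyms, annotations) without rebuilding it.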

  9. Online Analytical Processing (OLAP: A Fast and Effective Data Mining Tool for Gene Expression Databases

    Directory of Open Access Journals (Sweden)

    Alkharouf Nadim W.

    2005-01-01

    Full Text Available Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD. A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
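The core OLAP operation the paper relies on is the roll-up: collapsing one or more dimensions of a fact table by aggregation. The study used Analysis Services 2000; the pure-Python stand-in below only illustrates the idea, with fabricated expression values.

```python
from collections import defaultdict

# Fact rows: (gene, timepoint, treatment, expression) -- hypothetical.
facts = [
    ("G1", "6h",  "infected", 2.0), ("G1", "6h",  "control", 1.0),
    ("G1", "12h", "infected", 4.0), ("G1", "12h", "control", 1.2),
    ("G2", "6h",  "infected", 0.9), ("G2", "12h", "infected", 1.1),
]

def roll_up(facts, keep):
    """Average expression over every dimension not listed in `keep`."""
    dims = {"gene": 0, "time": 1, "treatment": 2}
    acc = defaultdict(list)
    for row in facts:
        key = tuple(row[dims[d]] for d in keep)
        acc[key].append(row[3])
    return {k: sum(v) / len(v) for k, v in acc.items()}

by_gene = roll_up(facts, ["gene"])           # collapse time and treatment
by_gene_time = roll_up(facts, ["gene", "time"])
print(by_gene)
```

In a real OLAP cube these aggregates are precomputed, which is what makes slicing a large expression dataset interactive rather than a fresh scan per query.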

  10. Capturing student mathematical engagement through differently enacted classroom practices: applying a modification of Watson's analytical tool

    Science.gov (United States)

    Patahuddin, Sitti Maesuri; Puteri, Indira; Lowrie, Tom; Logan, Tracy; Rika, Baiq

    2018-04-01

    This study examined student mathematical engagement through the intended and enacted lessons taught by two teachers in two different middle schools in Indonesia. The intended lesson was developed using the ELPSA learning design to promote mathematical engagement. Based on the premise that students will react to the mathematical tasks in the forms of words and actions, the analysis focused on identifying the types of mathematical engagement promoted through the intended lesson and performed by students during the lesson. Using a modified version of Watson's (2007) analytical tool, students' engagement was captured from what the participants did or said mathematically. We found that teachers' enacted practices had an influence on student mathematical engagement. The teacher who demonstrated content in explicit ways tended to limit the richness of the engagement, whereas the teacher who presented activities in an open-ended manner fostered engagement.

  11. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQLServer2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments is stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.

  12. Using the Technology of the Confessional as an Analytical Resource: Four Analytical Stances Towards Research Interviews in Discourse Analysis

    Directory of Open Access Journals (Sweden)

    Brendan K. O'Rourke

    2007-05-01

    Full Text Available Among the various approaches that have developed from FOUCAULT's work is an Anglophone discourse analysis that has attempted to combine FOUCAULTian insights with the techniques of Conversation Analysis. An important current methodological issue in this discourse analytical approach is its theoretical preference for "naturally occurring" rather than research interview data. A FOUCAULTian perspective on the interview as a research instrument questions the idea of "naturally-occurring discourse". The "technology of the confessional" operates not only within research interviews but permeates other interactions as well. Drawing on FOUCAULT does not dismiss the problems of the interview as a research instrument; rather, it shows they cannot be escaped by simply switching to more "natural" interactions. Combining these insights with recent developments within discourse analysis can provide analytical resources for, rather than barriers to, the discourse analysis of research interviews. To aid such an approach, we develop a four-way categorisation of analytical stances towards the research interview in discourse analysis. A demonstration of how a research interview might be subjected to a discourse analysis using elements of this approach is then provided. URN: urn:nbn:de:0114-fqs070238

  13. X-ray fluorescence in Member States: Philippines. XRF activities at Analytical Measurements Research Group, Philippine Nuclear Research Institute

    International Nuclear Information System (INIS)

    Pabroa, Corazon B.; Castaneda, Soledad S.; Almoneda, Rosalina V.; Sucgang, Raymond J.; Racho, Joseph Michael D.; Morco, Ryan P.; Cuyco, Danilo; Jimenez, Gloria; Santos, Flora L.

    2008-01-01

    Full text: XRF analysis and activities of the Analytical Measurements Research (AMR) Group (see Fig.1) of the Philippine Nuclear Research Institute (PNRI) focus on both research and analytical services. Air pollution research, in particular source apportionment studies, requires multi-elemental data for a substantial number of samples. In the PNRI, energy-dispersive X-ray fluorescence (EDXRF) has been used as an effective tool for providing such multi-elemental data. With the latest acquisition of the Panalytical Epsilon 5 (E5) EDXRF system, the process of quantification has become easier and faster with the auto-quantify method. Other research involvements of the group are in the analysis of samples in relation to mineral explorations and the elemental characterization of water in support of isotope hydrology research. The AMR group, as part of its function to provide analytical services, offers qualitative or semi-quantitative analysis of solid samples using the auto-quantify method, quantitative analysis of environmental samples using the emission-transmission method, and quantitative analysis of air particulate matter collected on filters. Telephone wire materials sold in junkshops (alleged to have been pilfered from installed telephone lines of a major telecommunications company in the country) and materials being assessed in relation to patent claims are other examples of samples submitted for analytical services. As mentioned, a useful feature of the E5 system is the use of the auto-quantify (AQ) method. Calibration lines used for this type of application are obtained using the fundamental parameter (FP) model. For AQ applications, accurate results are obtained for samples prepared as fused glass beads in which the whole matrix is known. However, only qualitative or semi-quantitative analysis can be applied for other types of solid samples.
The AQ method was adapted for the multi-elemental analysis of air particulates using the MicroMatter standards to set

  14. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether it is effective, and if not why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  15. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios (¹³C/¹²C and ¹⁵N/¹⁴N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ¹³C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ¹⁵N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ¹⁵N values. Discriminant analysis showed δ¹³C and δ¹⁵N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
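As a rough illustration of the discriminant step, a nearest-centroid rule on (δ¹³C, δ¹⁵N) pairs separates two regional signatures. All values and region labels below are invented, and nearest-centroid classification is a simplified stand-in for the discriminant analysis used in the study:

```python
import math

# Hypothetical (d13C, d15N) training values per region; numbers are invented
train = {
    "ruens": [(-24.1, 4.2), (-23.8, 4.5), (-24.3, 4.0)],
    "karoo": [(-18.9, 9.1), (-19.2, 8.8), (-18.5, 9.4)],
}

# Mean isotope signature per region
centroids = {region: tuple(sum(v) / len(pts) for v in zip(*pts))
             for region, pts in train.items()}

def classify(sample):
    """Assign a sample to the region with the nearest centroid."""
    return min(centroids, key=lambda r: math.dist(sample, centroids[r]))

print(classify((-24.0, 4.3)))  # -> ruens
print(classify((-19.0, 9.0)))  # -> karoo
```

With well-separated regional diets, as the abstract reports for δ¹⁵N, even this crude rule classifies unseen samples correctly; the published work uses proper discriminant analysis on measured data.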

  16. Primer on consumer marketing research : procedures, methods, and tools

    Science.gov (United States)

    1994-03-01

    The Volpe Center developed a marketing research primer which provides a guide to the approach, procedures, and research tools used by private industry in predicting consumer response. The final two chapters of the primer focus on the challenges of do...

  17. The CATS Service: An Astrophysical Research Tool

    Directory of Open Access Journals (Sweden)

    O V Verkhodanov

    2009-03-01

    Full Text Available We describe the current status of CATS (astrophysical CATalogs Support system), a publicly accessible tool maintained at the Special Astrophysical Observatory of the Russian Academy of Sciences (SAO RAS; http://cats.sao.ru) allowing one to search hundreds of catalogs of astronomical objects discovered all along the electromagnetic spectrum. Our emphasis is mainly on catalogs of radio continuum sources observed from 10 MHz to 245 GHz, and secondly on catalogs of objects such as radio and active stars, X-ray binaries, planetary nebulae, HII regions, supernova remnants, pulsars, nearby and radio galaxies, AGN and quasars. CATS also includes the catalogs from the largest extragalactic surveys with non-radio waves. In 2008 CATS comprised a total of about 10⁹ records from over 400 catalogs in the radio, IR, optical and X-ray windows, including most source catalogs deriving from observations with the Russian radio telescope RATAN-600. CATS offers several search tools through different ways of access, e.g. via Web interface and e-mail. Since its creation in 1997 CATS has managed about 10⁵ requests. Currently CATS is used by external users about 1500 times per day and since its opening to the public in 1997 has received about 4000 requests for its selection and matching tasks.

  18. Moving research tools into practice: the successes and challenges in promoting uptake of classification tools.

    Science.gov (United States)

    Cunningham, Barbara Jane; Hidecker, Mary Jo Cooley; Thomas-Stonell, Nancy; Rosenbaum, Peter

    2018-05-01

    In this paper, we present our experiences - both successes and challenges - in implementing evidence-based classification tools into clinical practice. We also make recommendations for others wanting to promote the uptake and application of new research-based assessment tools. We first describe classification systems and the benefits of using them in both research and practice. We then present a theoretical framework from Implementation Science to report strategies we have used to implement two research-based classification tools into practice. We also illustrate some of the challenges we have encountered by reporting results from an online survey investigating 58 Speech-language Pathologists' knowledge and use of the Communication Function Classification System (CFCS), a new tool to classify children's functional communication skills. We offer recommendations for researchers wanting to promote the uptake of new tools in clinical practice. Specifically, we identify structural, organizational, innovation, practitioner, and patient-related factors that we recommend researchers address in the design of implementation interventions. Roles and responsibilities of both researchers and clinicians in making implementation science a success are presented. Implications for rehabilitation Promoting uptake of new and evidence-based tools into clinical practice is challenging. Implementation science can help researchers to close the knowledge-to-practice gap. Using concrete examples, we discuss our experiences in implementing evidence-based classification tools into practice within a theoretical framework. Recommendations are provided for researchers wanting to implement new tools in clinical practice. Implications for researchers and clinicians are presented.

  19. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed.

  20. Narratives and Activity Theory as Reflective Tools in Action Research

    Science.gov (United States)

    Stuart, Kaz

    2012-01-01

    Narratives and activity theory are useful as socially constructed data collection tools that allow a researcher access to the social, cultural and historical meanings that research participants place on events in their lives. This case study shows how these tools were used to promote reflection within a cultural-historical activity theoretically…

  1. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks occurred during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month as training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  2. Experimental and Analytical Research on Fracture Processes in ROck

    Energy Technology Data Exchange (ETDEWEB)

    Herbert H. Einstein; Jay Miller; Bruno Silva

    2009-02-27

    Experimental studies on fracture propagation and coalescence were conducted which, together with previous tests by this group on gypsum and marble, provide information on fracturing. Specifically, different fracture geometries were tested, which together with the different material properties will provide the basis for analytical/numerical modeling. Initial steps on the models were made, as were initial investigations on the effect of pressurized water on fracture coalescence.

  3. Data Integration Tool: From Permafrost Data Translation Research Tool to A Robust Research Application

    Science.gov (United States)

    Wilcox, H.; Schaefer, K. M.; Jafarov, E. E.; Strawhacker, C.; Pulsifer, P. L.; Thurmes, N.

    2016-12-01

    The United States National Science Foundation funded PermaData project led by the National Snow and Ice Data Center (NSIDC) with a team from the Global Terrestrial Network for Permafrost (GTN-P) aimed to improve permafrost data access and discovery. We developed a Data Integration Tool (DIT) to significantly speed up the time of manual processing needed to translate inconsistent, scattered historical permafrost data into files ready to ingest directly into the GTN-P. We leverage this data to support science research and policy decisions. DIT is a workflow manager that divides data preparation and analysis into a series of steps or operations called widgets. Each widget does a specific operation, such as read, multiply by a constant, sort, plot, and write data. DIT allows the user to select and order the widgets as desired to meet their specific needs. Originally it was written to capture a scientist's personal, iterative, data manipulation and quality control process of visually and programmatically iterating through inconsistent input data, examining it to find problems, adding operations to address the problems, and rerunning until the data could be translated into the GTN-P standard format. Iterative development of this tool led first to a Fortran/Python hybrid and then, with consideration of users, licensing, version control, packaging, and workflow, to a publicly available, robust, usable application. Transitioning to Python allowed the use of open source frameworks for the workflow core and integration with a JavaScript graphical workflow interface. DIT is targeted to automatically handle 90% of the data processing for field scientists, modelers, and non-discipline scientists. It is available as an open source tool on GitHub, packaged for a subset of Mac, Windows, and UNIX systems as a desktop application with a graphical workflow manager.
DIT was used to completely translate one dataset (133 sites) that was successfully added to GTN-P, nearly translate three datasets
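The widget-based workflow described above, with read, transform, sort, and write steps chained in a user-chosen order, might be sketched minimally as follows. The specific widget operations and the no-data flag are hypothetical illustrations, not taken from DIT itself:

```python
# A minimal sketch of a widget-style workflow manager in the spirit of DIT.
class Widget:
    """Wraps one named data operation, e.g. read, multiply, sort, write."""
    def __init__(self, name, fn):
        self.name = name
        self.fn = fn

    def run(self, data):
        return self.fn(data)

def pipeline(widgets, data):
    """Apply each widget in the order the user selected."""
    for w in widgets:
        data = w.run(data)
    return data

# Example: clean a small series of hypothetical ground-temperature readings,
# where -999.0 is an assumed no-data flag.
raw = [3.5, -999.0, 1.2, 2.8]
drop_nodata = Widget("drop_nodata", lambda d: [x for x in d if x != -999.0])
scale = Widget("scale", lambda d: [round(x * 1.0, 2) for x in d])
sort_vals = Widget("sort", sorted)

print(pipeline([drop_nodata, scale, sort_vals], raw))  # -> [1.2, 2.8, 3.5]
```

Reordering or swapping widgets reconfigures the processing without touching the pipeline code, which is what lets a tool like this capture an iterative, trial-and-error cleaning process.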

  4. Analytical Implications of Using Practice Theory in Workplace Information Literacy Research

    Science.gov (United States)

    Moring, Camilla; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely the aim of the paper is to discuss the translation of practice theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace…

  5. Meson facility. Powerful new research tool

    International Nuclear Information System (INIS)

    Lobashev, V.M.; Tavkhelidze, A.N.

    A meson facility is being built at the Institute of Nuclear Research, USSR Academy of Sciences, in Troitsk, where the Scientific Center of the USSR Academy of Sciences is located. The facility will include a linear accelerator for protons and negative hydrogen ions with 600 MeV energy and 0.5-1 mA beam current. Some fundamental problems that can be studied at a meson facility are described in the areas of elementary particles, neutron physics, solid state physics, and applied research. The characteristics of the linear accelerator are given and the meson facility's experimental complex is described.

  6. LITERATURE REVIEWING WITH RESEARCH TOOLS, Part 3: Writing Literature Review

    OpenAIRE

    Ebrahim, Nader Ale

    2017-01-01

    “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...

  7. LITERATURE REVIEWING WITH RESEARCH TOOLS, Part 2: Finding proper articles

    OpenAIRE

    Ebrahim, Nader Ale

    2017-01-01

    “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 700 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated periodically. “Research Tools” consists of a hierarchical set of nodes. It has four main nodes: (1)...

  8. XPS - an essential tool in biomaterial research

    Energy Technology Data Exchange (ETDEWEB)

    StJohn, H.A.W.; Greisser, H.J. [Commonwealth Scientific and Industrial Research Organization (CSIRO), Clayton, VIC (Australia). Molecular Science

    1999-12-01

    non-smooth materials such as contact lenses, coating thicknesses can be determined by incorporating slope histograms obtained by atomic force microscopy into XPS photoelectron intensity algorithms. A key limiting factor is the irreversible, uncontrolled accumulation of biological material on the surface of devices placed in contact with biological media ('biofouling'). XPS can be used to study quantitatively the rates of biofouling events and to assess the nature of adsorbing molecules; combined with other techniques such as ToF-SIMS and surface matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), it is possible to gain molecular understanding of which biological components adsorb on various surface chemistries. Copyright (1999) Australian X-ray Analytical Association Inc.

  9. XPS - an essential tool in biomaterial research

    International Nuclear Information System (INIS)

    StJohn, H.A.W.; Greisser, H.J.

    1999-01-01

    non-smooth materials such as contact lenses, coating thicknesses can be determined by incorporating slope histograms obtained by atomic force microscopy into XPS photoelectron intensity algorithms. A key limiting factor is the irreversible, uncontrolled accumulation of biological material on the surface of devices placed in contact with biological media ('biofouling'). XPS can be used to study quantitatively the rates of biofouling events and to assess the nature of adsorbing molecules; combined with other techniques such as ToF-SIMS and surface matrix-assisted laser desorption/ionization mass spectrometry (MALDI-MS), it is possible to gain molecular understanding of which biological components adsorb on various surface chemistries. Copyright (1999) Australian X-ray Analytical Association Inc

  10. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  11. Equity Audit: A Teacher Leadership Tool for Nurturing Teacher Research

    Science.gov (United States)

    View, Jenice L.; DeMulder, Elizabeth; Stribling, Stacia; Dodman, Stephanie; Ra, Sophia; Hall, Beth; Swalwell, Katy

    2016-01-01

    This is a three-part essay featuring six teacher educators and one classroom teacher researcher. Part one describes faculty efforts to build curriculum for teacher research, scaffold the research process, and analyze outcomes. Part two shares one teacher researcher's experience using an equity audit tool in several contexts: her teaching practice,…

  12. Simulation tools for robotics research and assessment

    Science.gov (United States)

    Fields, MaryAnne; Brewer, Ralph; Edge, Harris L.; Pusey, Jason L.; Weller, Ed; Patel, Dilip G.; DiBerardino, Charles A.

    2016-05-01

    The Robotics Collaborative Technology Alliance (RCTA) program focuses on four overlapping technology areas: Perception, Intelligence, Human-Robot Interaction (HRI), and Dexterous Manipulation and Unique Mobility (DMUM). In addition, the RCTA program has a requirement to assess progress of this research in standalone as well as integrated form. Since the research is evolving and the robotic platforms with unique mobility and dexterous manipulation are in the early development stage and very expensive, an alternate approach is needed for efficient assessment. Simulation of robotic systems, platforms, sensors, and algorithms, is an attractive alternative to expensive field-based testing. Simulation can provide insight during development and debugging unavailable by many other means. This paper explores the maturity of robotic simulation systems for applications to real-world problems in robotic systems research. Open source (such as Gazebo and Moby), commercial (Simulink, Actin, LMS), government (ANVEL/VANE), and the RCTA-developed RIVET simulation environments are examined with respect to their application in the robotic research domains of Perception, Intelligence, HRI, and DMUM. Tradeoffs for applications to representative problems from each domain are presented, along with known deficiencies and disadvantages. In particular, no single robotic simulation environment adequately covers the needs of the robotic researcher in all of the domains. Simulation for DMUM poses unique constraints on the development of physics-based computational models of the robot, the environment and objects within the environment, and the interactions between them. Most current robot simulations focus on quasi-static systems, but dynamic robotic motion places an increased emphasis on the accuracy of the computational models. In order to understand the interaction of dynamic multi-body systems, such as limbed robots, with the environment, it may be necessary to build component

  13. The National Shipbuilding Research Program, Analytical Quality Circles

    Science.gov (United States)

    1986-09-01

    standard tools for quality control, in English, see "Guide to Quality Control" by Dr. Kaoru Ishikawa, Asian Productivity Organization, Aoyama Dai-ichi...factors affect work evaluation is shown schematically by Characteristic-Factor Diagrams (also called Fishbone or Ishikawa Diagrams), see Figure 2-5

  14. Game Analytics for Game User Research, Part 1

    DEFF Research Database (Denmark)

    Seif El-Nasr, Magy; Desurvire, Heather; Aghabeigi, Bardia

    2013-01-01

    The emerging field of game user research (GUR) investigates interaction between players and games and the surrounding context of play. Game user researchers have explored methods from, for example, human-computer interaction, psychology, interaction design...

  15. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as analytical tool in an SEA process of municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  16. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  17. Time-resolved fluorescence microscopy (FLIM) as an analytical tool in skin nanomedicine.

    Science.gov (United States)

    Alexiev, Ulrike; Volz, Pierre; Boreham, Alexander; Brodwolf, Robert

    2017-07-01

    The emerging field of nanomedicine provides new approaches for the diagnosis and treatment of diseases, for symptom relief, and for monitoring of disease progression. Topical application of drug-loaded nanoparticles for the treatment of skin disorders is a promising strategy to overcome the stratum corneum, the upper layer of the skin, which represents an effective physical and biochemical barrier. The understanding of drug penetration into skin and enhanced penetration into skin facilitated by nanocarriers requires analytical tools that ideally allow one to visualize the skin, its morphology, the drug carriers, drugs, their transport across the skin and possible interactions, as well as effects of the nanocarriers within the different skin layers. Here, we review some recent developments in the field of fluorescence microscopy, namely Fluorescence Lifetime Imaging Microscopy (FLIM), for improved characterization of nanocarriers, their interactions and penetration into skin. In particular, FLIM allows for the discrimination of target molecules, e.g. fluorescently tagged nanocarriers, against the autofluorescent tissue background and, due to the environmental sensitivity of the fluorescence lifetime, also offers insights into the local environment of the nanoparticle and its interactions with other biomolecules. Thus, FLIM shows the potential to overcome several limits of intensity based microscopy. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Fourier transform ion cyclotron resonance mass spectrometry: the analytical tool for heavy oil and bitumen characterization

    Energy Technology Data Exchange (ETDEWEB)

    Oldenburg, Thomas B.P; Brown, Melisa; Hsieh, Ben; Larter, Steve [Petroleum Reservoir Group (prg), Department of Geoscience, University of Calgary, Alberta (Canada)

    2011-07-01

    The Fourier Transform Ion Cyclotron Resonance Mass Spectrometry (FTICRMS), developed in the 1970s by Marshall and Comisarow at the University of British Columbia, has become a commercially available tool capable of analyzing several hundred thousand components in a petroleum mixture at once. This analytical technology will probably usher in a dramatic revolution in geochemical capability, equal to or greater than the molecular revolution that occurred when GCMS technologies became cheaply available. The molecular resolution and information content given by the FTICRMS petroleum analysis can be compared to the information in the human genome. With current GCMS-based petroleum geochemical protocols perhaps a few hundred components can be quantitatively determined, but with FTICRMS, 1000 times this number of components could possibly be resolved. However, fluid and other properties depend on interaction of this multitude of hydrocarbon and non-hydrocarbon components, not the components themselves, and access to the full potential of this new petroleomics will depend on the definition of this interactome.
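A quick calculation hints at why FTICRMS resolution matters for petroleum mixtures: the classic SH4 vs C3 mass doublet at nominal mass 36 differs by only a few millidaltons, far below what lower-resolution instruments can separate. The sketch below uses standard atomic masses and is an illustration, not part of the cited work:

```python
# Monoisotopic atomic masses in daltons (standard isotope tables)
MASS = {"C": 12.000000, "H": 1.00782503, "S": 31.97207069}

def exact_mass(formula):
    """Exact mass of a formula given as {element: count}, e.g. {'S': 1, 'H': 4}."""
    return sum(MASS[el] * n for el, n in formula.items())

# SH4 and C3 both have nominal mass 36, yet differ in exact mass
split = exact_mass({"S": 1, "H": 4}) - exact_mass({"C": 3})
print(round(split * 1000, 2))  # -> 3.37 (millidaltons)
```

Resolving thousands of such near-isobaric pairs across a whole crude-oil spectrum is what pushes the component count from hundreds (GCMS) toward hundreds of thousands (FTICRMS).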

  19. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressure or impulses - are both graphic and analytic and integrate damage threshold data for common construction materials including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July of 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training

  20. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Full Text Available Carbon emission has increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emission, there is an urgent need to address how such factors as population and economic growth impact the emission of carbon dioxide in developing countries. In addition to total population, both the percentage of population living in urban areas (i.e., urbanization percentage) and the percentage of non-dependent population may also serve as limiting factors. On the other hand, the total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries are assessed to see their respective degree of impact on carbon emission. Therefore, based on national data for the period 1990–2014 in Taiwan, an analytic tool, Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT), was employed to see how well those aforementioned factors describe their individual potential impact on global warming, which is measured by the total amount of carbon emission into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for the significance of each proposed model. As a result, two models were suggested to predict the impact of carbon emission due to population and economic growth by the year 2025 in Taiwan.
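The STIRPAT model is the log-linear regression ln I = a + b ln P + c ln A + d ln T, where I is impact (emissions), P population, A affluence, and T technology. A minimal sketch of fitting it by least squares follows; the data are synthetic, generated from assumed exponents rather than the actual Taiwan series:

```python
import numpy as np

# Synthetic panel: I = const * P^1.2 * A^0.8 * T^0.5 (exponents are assumed)
rng = np.random.default_rng(0)
n = 25                               # one observation per year, 1990-2014
P = rng.uniform(20e6, 24e6, n)       # population
A = rng.uniform(10e3, 25e3, n)       # affluence (GDP per capita)
T = rng.uniform(0.2, 0.4, n)         # technology proxy (industry share of GDP)
I = 1e-9 * P**1.2 * A**0.8 * T**0.5  # emissions implied by the assumed model

# Fit ln I = a + b ln P + c ln A + d ln T by ordinary least squares
X = np.column_stack([np.ones(n), np.log(P), np.log(A), np.log(T)])
coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
print(np.round(coef[1:], 3))  # recovers b, c, d = 1.2, 0.8, 0.5
```

The fitted exponents are elasticities: b = 1.2 would mean a 1% rise in population raises emissions by about 1.2%, which is the kind of conclusion the seven tested scenarios are built to support.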

  1. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  2. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to electron beam-induced contamination or to the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by adapting the Raman spectrometer and SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using a SEM beam shutter during Raman acquisition; conversely, the interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in the geosciences. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Education Research as Analytic Claims: The Case of Mathematics

    Science.gov (United States)

    Hyslop-Margison, Emery; Rogers, Matthew; Oladi, Soudeh

    2017-01-01

    Despite widespread calls for evidence-based research in education, this strategy has heretofore generated a surprisingly small return on the related financial investment. Some scholars have suggested that the situation follows from a mismatch between education as an assumed field of study and applied empirical research methods. This article's…

  4. New evaluation tool now available to assess research quality | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2016-04-25

    Apr 25, 2016 ... New evaluation tool now available to assess research quality ... Ratings on a scale defined by rubrics, to indicate the level at which a project ... Report: The value-for-money discourse: risks and opportunities for research for development

  5. Reflective Drawing as a Tool for Reflection in Design Research

    Science.gov (United States)

    Calvo, Mirian

    2017-01-01

    This article explores the role of drawing as a tool for reflection. It reports on a PhD research project that aims to identify and analyse the value that co-design processes can bring to participants and their communities. The research is associated with Leapfrog, a three-year project funded by the UK Arts and Humanities Research Council (AHRC).…

  6. The Value of Open Source Software Tools in Qualitative Research

    Science.gov (United States)

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  7. Research Tools and Materials | NCI Technology Transfer Center | TTC

    Science.gov (United States)

    Research Tools can be found in TTC's Available Technologies and in scientific publications. They are freely available to non-profits and universities through a Material Transfer Agreement (or other appropriate mechanism), and available via licensing to companies.

  8. Socio-semantic Networks of Research Publications in the Learning Analytics Community

    NARCIS (Netherlands)

    Fazeli, Soude; Drachsler, Hendrik; Sloep, Peter

    2013-01-01

    Fazeli, S., Drachsler, H., & Sloep, P. B. (2013). Socio-semantic Networks of Research Publications in the Learning Analytics Community. In M. d'Aquin, S. Dietze, H. Drachsler, E. Herder, & D. Taibi (Eds.), Linked data challenge, Learning Analytic and Knowledge (LAK13) (pp. 6-10). Vol. 974, Leuven,

  9. Using smartphones in survey research: a multifunctional tool

    OpenAIRE

    Nathalie Sonck; Henk Fernee

    2013-01-01

    Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it feasible to use smartphones as a data collection tool for social research? What are the effects on data quality? Can we also incorporate reality mining tools in the smartphone app to replace traditional...

  10. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of these molecules. Mass spectrometry is also of great help, particularly matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  11. New method development in prehistoric stone tool research: evaluating use duration and data analysis protocols.

    Science.gov (United States)

    Evans, Adrian A; Macdonald, Danielle A; Giusca, Claudiu L; Leach, Richard K

    2014-10-01

    Lithic microwear is a research field of prehistoric stone tool (lithic) analysis that has been developed with the aim of identifying how stone tools were used. It has been shown that laser scanning confocal microscopy has the potential to be a useful quantitative tool in the study of prehistoric stone tool function. In this paper, two important lines of inquiry are investigated: (1) whether the texture of worn surfaces is constant under varying durations of tool use, and (2) the development of rapid, objective data analysis protocols. This study reports on efforts to further develop these areas and results in a better understanding of the complexities underlying the development of flexible analytical algorithms for surface analysis. The results show that when sampling is optimised, surface texture may be linked to contact material type, independent of use duration. Further research is needed to validate this finding and test an expanded range of contact materials. The use of automated analytical protocols has shown promise but is only reliable if sampling location and scale are defined. Results suggest that the sampling protocol reports on the degree of worn-surface invasiveness, complicating the ability to investigate duration-related textural characterisation. Copyright © 2014. Published by Elsevier Ltd.
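
    To make the idea of quantitative surface texture concrete, the sketch below computes one of the simplest areal texture parameters, the root-mean-square height Sq of a levelled height map. The actual study uses far richer, scale-sensitive texture measures; this is a generic illustration on an invented surface, not the authors' protocol.

```python
import numpy as np

def sq_roughness(z):
    """Root-mean-square areal height (Sq, as in ISO 25178) of a height
    map, after removing the best-fit mean plane by least squares."""
    ny, nx = z.shape
    yy, xx = np.mgrid[0:ny, 0:nx]
    A = np.column_stack([xx.ravel(), yy.ravel(), np.ones(z.size)])
    coef, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)
    residual = z.ravel() - A @ coef
    return float(np.sqrt(np.mean(residual**2)))

# A tilted but featureless plane: Sq is ~0 once the tilt is levelled out.
plane = np.add.outer(np.arange(32) * 0.05, np.arange(32) * 0.02)
```

    A worn surface measured by confocal microscopy would be passed in as `z` after the same levelling step; comparing Sq (and richer parameters) across known contact materials is the kind of analysis the automated protocols aim to standardise.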

  12. Aligning Web-Based Tools to the Research Process Cycle: A Resource for Collaborative Research Projects

    Science.gov (United States)

    Price, Geoffrey P.; Wright, Vivian H.

    2012-01-01

    Using John Creswell's Research Process Cycle as a framework, this article describes various web-based collaborative technologies useful for enhancing the organization and efficiency of educational research. Visualization tools (Cacoo) assist researchers in identifying a research problem. Resource storage tools (Delicious, Mendeley, EasyBib)…

  13. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
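
    One of the data-pretreatment steps such tutorials cover, mean-centering followed by unit-variance scaling (autoscaling) of each measured variable, can be sketched as follows. The data matrix is invented for illustration.

```python
import numpy as np

def autoscale(X):
    """Column-wise mean-centering and scaling to unit variance,
    a standard chemometric pretreatment before e.g. PCA or PLS."""
    mean = X.mean(axis=0)
    std = X.std(axis=0, ddof=1)
    return (X - mean) / std

# Rows = samples, columns = measured variables (e.g. peak areas)
X = np.array([[1.0, 100.0],
              [2.0, 110.0],
              [3.0, 130.0]])
Z = autoscale(X)
# Each column of Z now has mean 0 and sample standard deviation 1,
# so variables on very different scales contribute comparably.
```
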

  14. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever-bigger tables, where all but a few of the columns (i.e., variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know which coordinates are the interesting ones. Big data in the biology, analytical chemistry or physical chemistry lab is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and is even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise levels, with/without spectral preprocessing, with wavelength shift, with different spectral resolutions, with missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
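
    For a flavour of what TDA computes, the sketch below implements zero-dimensional persistent homology of a point cloud: it tracks the scale at which connected components merge as a distance threshold grows, using a union-find over sorted pairwise distances. Real TDA toolkits also compute higher-dimensional features (loops, voids); this minimal single-linkage formulation is only the simplest case.

```python
from itertools import combinations
import numpy as np

def zero_dim_persistence(points):
    """Death scales of connected components of a point cloud as a
    distance threshold grows (0-dimensional persistent homology,
    single-linkage / union-find formulation)."""
    n = len(points)
    parent = list(range(n))

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    edges = sorted(
        (float(np.linalg.norm(np.asarray(points[i]) - np.asarray(points[j]))), i, j)
        for i, j in combinations(range(n), 2)
    )
    deaths = []
    for d, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:
            parent[ri] = rj
            deaths.append(d)  # a component merges (dies) at scale d
    return deaths  # n-1 finite deaths; one component persists forever

deaths = zero_dim_persistence([[0.0], [1.0], [10.0]])
# deaths == [1.0, 9.0]: the long-lived gap at scale 9 reveals two clusters
```

    The persistence of a feature (how long it survives before dying) is what makes the method robust to noise: short-lived components are noise, long-lived ones are structure, which is exactly the property exploited on noisy spectroscopic data.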

  15. Antifungal proteins from moulds: analytical tools and potential application to dry-ripened foods.

    Science.gov (United States)

    Delgado, Josué; Owens, Rebecca A; Doyle, Sean; Asensio, Miguel A; Núñez, Félix

    2016-08-01

    Moulds growing on the surface of dry-ripened foods contribute to their sensory qualities, but some of them are able to produce mycotoxins that pose a hazard to consumers. Small cysteine-rich antifungal proteins (AFPs) from moulds are highly stable to pH and proteolysis and exhibit a broad inhibition spectrum against filamentous fungi, providing new chances to control hazardous moulds in fermented foods. The analytical tools for characterizing the cellular targets and affected pathways are reviewed. Strategies currently employed to study these mechanisms of action include 'omics' approaches that have come to the forefront in recent years, developing in tandem with genome sequencing of relevant organisms. These techniques contribute to a better understanding of the response of moulds against AFPs, allowing the design of complementary strategies to maximize or overcome the limitations of using AFPs on foods. AFPs alter chitin biosynthesis, and some fungi react inducing cell wall integrity (CWI) pathway. However, moulds able to increase chitin content at the cell wall by increasing proteins in either CWI or calmodulin-calcineurin signalling pathways will resist AFPs. Similarly, AFPs increase the intracellular levels of reactive oxygen species (ROS), and moulds increasing G-protein complex β subunit CpcB and/or enzymes to efficiently produce glutathione may evade apoptosis. Unknown aspects that need to be addressed include the interaction with mycotoxin production by less sensitive toxigenic moulds. However, significant steps have been taken to encourage the use of AFPs in intermediate-moisture foods, particularly for mould-ripened cheese and meat products.

  16. The Solid* toolset for software visual analytics of program structure and metrics comprehension : From research prototype to product

    NARCIS (Netherlands)

    Reniers, Dennie; Voinea, Lucian; Ersoy, Ozan; Telea, Alexandru

    2014-01-01

    Software visual analytics (SVA) tools combine static program analysis and fact extraction with information visualization to support program comprehension. However, building efficient and effective SVA tools is highly challenging, as it involves extensive software development in program analysis,

  17. Serious Games are a Serious Tool for Team Research

    Directory of Open Access Journals (Sweden)

    Michael D. Coovert

    2017-03-01

    Full Text Available Serious games are an attractive tool for education and training, but their utility is even broader. We argue that serious games provide a unique opportunity for research as well, particularly in areas where multiple players (groups or teams) are involved. In our paper we provide background in several substantive areas. First, we outline major constructs and challenges found in team research. Second, we discuss serious games, providing an overview and description of their role in education, training, and research. Third, we describe necessary characteristics for game engines utilized in team research, followed by a discussion of the value added by utilizing serious games. Our goal in this paper is to argue that serious games are an effective tool with demonstrated reliability and validity and should be part of a research program for those engaged in team research. Both team researchers and those involved in serious game development can benefit from a mutual, research-focused partnership.

  18. Validation of a new assessment tool for qualitative research articles

    DEFF Research Database (Denmark)

    Schou, Lone; Høstrup, Helle; Lyngsø, Elin

    2012-01-01

    Schou L., Høstrup H., Lyngsø E.E., Larsen S. & Poulsen I. (2011) Validation of a new assessment tool for qualitative research articles. Journal of Advanced Nursing 00(0), 000-000. doi: 10.1111/j.1365-2648.2011.05898.x ABSTRACT: Aim. This paper presents the development and validation of a new assessment tool for qualitative research articles, which could assess trustworthiness of qualitative research articles as defined by Guba and at the same time aid clinicians in their assessment. Background. There are more than 100 sets of proposals for quality criteria for qualitative research. However, we … is the Danish acronym for Appraisal of Qualitative Studies. Phase 1 was to develop the tool based on a literature review and on consultation with qualitative researchers. Phase 2 was an inter-rater reliability test in which 40 health professionals participated. Phase 3 was an inter-rater reliability test among...

  19. Advancement in Perfluoroalkyl Research Hampered by Analytical Challenges

    NARCIS (Netherlands)

    Martin, J.W.; Kannan, K.; Berger, U.; de Voogt, P.; Field, J.; Giesy, J.P.; Harner, T.; Muir, D.C.G.; Scott, B.; Kaiser, M.; Jarnberg, U.; Jones, K.C.; Mabury, S.A.; Schroeder, H.; Simcik, M.; Sottani, C.; Van Bavel, B.; Karrman, A.; Lindstrom, G.; Van Leeuwen, S.

    2004-01-01

    The growing concern over these organohalogens, some of which have been found in human blood and appear to be widespread in the environment, led researchers to gather in Hamburg, Germany, in 2003 to evaluate the current state of methods to analyze for the organic contaminants. Jonathan Martin of the

  20. Collaborative Research between Malaysian and Australian Universities on Learning Analytics: Challenges and Strategies

    OpenAIRE

    Z. Tasir; S. N. Kew; D. West; Z. Abdullah; D. Toohey

    2016-01-01

    Research on Learning Analytics is progressively developing in the higher education field by concentrating on the process of students' learning. Therefore, a research project between Malaysian and Australian Universities was initiated in 2015 to look at the use of Learning Analytics to support the development of teaching practice. The focal point of this article is to discuss and share the experiences of Malaysian and Australian universities in the process of developing the collaborative resea...

  1. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  2. Automation of analytical processes. A tool for higher efficiency and safety

    International Nuclear Information System (INIS)

    Groll, P.

    1976-01-01

    The analytical laboratory of a radiochemical facility is usually faced with the fact that numerous analyses of a similar type must be routinely carried out. Automation of such routine analytical procedures helps in increasing the efficiency and safety of the work. A review of the requirements for automation and its advantages is given and demonstrated on three examples. (author)

  3. Effect of virtual analytical chemistry laboratory on enhancing student research skills and practices

    Directory of Open Access Journals (Sweden)

    Boris Bortnik

    2017-12-01

    Full Text Available This article aims to determine the effect of a virtual chemistry laboratory on university student achievement. The article describes a model of a laboratory course that includes a virtual component. This virtual component is viewed as a tool for student pre-lab autonomous learning. It presents electronic resources designed for a virtual laboratory and outlines the methodology of e-resource application. To find out how a virtual chemistry laboratory affects student scientific literacy, research skills and practices, a pedagogical experiment was conducted. Student achievement was compared in two learning environments: traditional in-class hands-on learning (control group) and blended learning, i.e. online learning combined with in-person learning (experimental group). The effectiveness of integrating an e-lab in the laboratory study was measured by comparing the student lab reports of the two groups. For that purpose, a set of 10 criteria was developed. The experimental and control student groups were also compared in terms of test results and student portfolios. The study showed that the adopted approach, blending both virtual and hands-on learning environments, has the potential to enhance student research skills and practices in analytical chemistry studies.

  4. Research on the tool holder mode in high speed machining

    Science.gov (United States)

    Zhenyu, Zhao; Yongquan, Zhou; Houming, Zhou; Xiaomei, Xu; Haibin, Xiao

    2018-03-01

    High speed machining technology can improve processing efficiency and precision while also reducing processing cost, and it is therefore widely valued in industry. With the extensive application of high-speed machining technology, high-speed tool systems place ever higher requirements on the tool chuck. At present, several new kinds of chucks are used in high speed precision machining: heat-shrinkage tool-holders, high-precision spring chucks, hydraulic tool-holders, and three-rib deformation chucks. Among them, the heat-shrinkage tool-holder has the advantages of high precision, high clamping force, high bending rigidity and good dynamic balance, and is widely used. It is therefore of great significance to research the new requirements placed on the machining tool system. In order to meet the requirements of high-speed precision machining technology, this paper reviews the common tool holder technologies for high precision machining and proposes how to correctly select a tool clamping system in practice. The characteristics of, and existing problems in, tool clamping systems are analysed.

  5. Spec Tool; an online education and research resource

    Science.gov (United States)

    Maman, S.; Shenfeld, A.; Isaacson, S.; Blumberg, D. G.

    2016-06-01

    Education and public outreach (EPO) activities related to remote sensing, space, planetary and geo-physics sciences have been developed widely in the Earth and Planetary Image Facility (EPIF) at Ben-Gurion University of the Negev, Israel. These programs aim to motivate the learning of geo-scientific and technological disciplines. For over a decade, the facility has hosted research and outreach activities for researchers, the local community, school pupils, students and educators. As suitable software and data are often neither available nor affordable, the EPIF Spec tool was created as a web-based resource to assist researchers and students in initial spectral analysis. The tool is used both in academic courses and in outreach education programs and enables a better 'hands-on' understanding of the theory of spectroscopy and imaging spectroscopy. The tool is available online and provides spectra visualization tools and basic analysis algorithms, including spectral plotting, spectral angle mapping and linear unmixing. It enables visualization of spectral signatures from the USGS spectral library as well as additional spectra collected in the EPIF, such as those of dunes in southern Israel and from Turkmenistan. For researchers and educators, the tool also allows loading locally collected samples for further analysis.
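
    Two of the algorithms the tool exposes, spectral angle mapping and linear unmixing, reduce to a few lines of linear algebra. A minimal sketch with invented two-band spectra; this is the textbook formulation, not EPIF's actual implementation.

```python
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra treated as vectors;
    insensitive to overall illumination scaling (the core of SAM)."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

def sam_match(pixel, library):
    """Best-matching library spectrum = smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

def unmix(pixel, endmembers):
    """Unconstrained linear unmixing: least-squares abundances
    for pixel ≈ endmembers @ abundances."""
    ab, *_ = np.linalg.lstsq(endmembers, pixel, rcond=None)
    return ab

# Invented two-band reference spectra
library = {"quartz": np.array([0.9, 0.2]), "gypsum": np.array([0.3, 0.8])}
pixel = 0.5 * library["quartz"] + 0.5 * library["gypsum"]
best = sam_match(pixel, library)
ab = unmix(pixel, np.column_stack([library["quartz"], library["gypsum"]]))
# ab ≈ [0.5, 0.5]: the pixel is an equal mix of the two endmembers
```

    Real use would apply these per pixel across many bands, often with a sum-to-one or non-negativity constraint on the abundances.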

  6. Sustainability considerations for health research and analytic data infrastructures.

    Science.gov (United States)

    Wilcox, Adam; Randhawa, Gurvaneet; Embi, Peter; Cao, Hui; Kuperman, Gilad J

    2014-01-01

    The United States has made recent large investments in creating data infrastructures to support the important goals of patient-centered outcomes research (PCOR) and comparative effectiveness research (CER), with still more investment planned. These initial investments, while critical to the creation of the infrastructures, are not expected to sustain them much beyond the initial development. To provide the maximum benefit, the infrastructures need to be sustained through innovative financing models while providing value to PCOR and CER researchers. Based on our experience with creating flexible sustainability strategies (i.e., strategies that are adaptive to the different characteristics and opportunities of a resource or infrastructure), we define specific factors that are important considerations in developing a sustainability strategy. These factors include assets, expansion, complexity, and stakeholders. Each factor is described, with examples of how it is applied. These factors are dimensions of variation in different resources, to which a sustainability strategy should adapt. We also identify specific important considerations for maintaining an infrastructure, so that the long-term intended benefits can be realized. These observations are presented as lessons learned, to be applied to other sustainability efforts. We define the lessons learned, relating them to the defined sustainability factors as interactions between factors. Using perspectives and experiences from a diverse group of experts, we define broad characteristics of sustainability strategies and important observations, which can vary for different projects. Other descriptions of adaptive, flexible, and successful models of collaboration between stakeholders and data infrastructures can expand this framework by identifying other factors for sustainability, and give more concrete directions on how sustainability can be best achieved.

  7. Analytical tools for thermal infrared engineering: a thermal sensor simulation package

    Science.gov (United States)

    Jaggi, Sandeep

    1992-09-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration. To perform system design trade-offs and analyses and to establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package, called ATTIRE (Analytical Tools for Thermal InfraRed Engineering), simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system-level information such as SNR, NER, NETD etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters. In addition, ATTIRE can be used as a tutorial for understanding the distribution of thermal flux or solar irradiance over selected bandwidths of the spectrum. This spectrally distributed incident flux can then be analyzed as it propagates through the subsystems that constitute the entire sensor. ATTIRE provides a variety of functions, ranging from plotting black-body curves for varying bandwidths and computing the integral flux, to performing transfer-function analysis of the sensor system. The package runs from a menu-driven interface in a PC-DOS environment. Each sub-system of the sensor is represented by windows and icons. A user-friendly mouse-controlled point-and-click interface allows the user to simulate various aspects of a sensor. The package can simulate a theoretical sensor system. Trade-off studies can be easily done by changing the appropriate parameters and monitoring the effect on system performance. The package can provide plots of system performance versus any system parameter. A parameter (such as the entrance aperture of the optics) can be varied and its effect on another parameter (e.g., NETD) can be plotted. A third parameter (e.g., the
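
    The core computation behind such black-body plots, the in-band flux over a chosen bandwidth, can be sketched as a Planck-law integration. This is generic radiometry, not code from the ATTIRE package.

```python
import numpy as np

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s
KB = 1.380649e-23    # Boltzmann constant, J/K

def planck_radiance(wl_m, T):
    """Black-body spectral radiance, W / (m^2 sr m)."""
    return (2.0 * H * C**2 / wl_m**5) / np.expm1(H * C / (wl_m * KB * T))

def band_exitance(wl_lo_um, wl_hi_um, T, n=20001):
    """In-band radiant exitance (W/m^2): pi times the trapezoidal
    integral of the Planck curve over [wl_lo_um, wl_hi_um] microns."""
    wl = np.linspace(wl_lo_um, wl_hi_um, n) * 1e-6
    y = planck_radiance(wl, T)
    return float(np.pi * 0.5 * np.sum((y[1:] + y[:-1]) * np.diff(wl)))

# Sanity check: integrating essentially the whole spectrum at 300 K
# approaches the Stefan-Boltzmann total, sigma * T^4 ≈ 459 W/m^2.
m_total = band_exitance(0.1, 1000.0, 300.0, n=200001)
```

    Restricting the limits to a sensor's passband (e.g. 8-12 µm) gives the in-band flux that, propagated through optics and detector responsivity, ultimately yields figures of merit like NETD.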

  8. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972)] [G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977)] [Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010)] [M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2) = F_s(F_{s0}(x_0), G_0(x_0)) and G(x,Q^2) = G(F_{s0}(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions, found using LO DGLAP splitting functions, of the initial boundary conditions F_{s0}(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of alpha_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q
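
    For context, the LO singlet system being solved couples F_s and G through convolutions with the splitting functions. Schematically, in standard notation (a sketch of the textbook equations, not the authors' exact conventions):

```latex
\frac{\partial F_s(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}\Big[P_{qq}\otimes F_s + 2 n_f\, P_{qg}\otimes G\Big],
\qquad
\frac{\partial G(x,Q^2)}{\partial \ln Q^2}
  = \frac{\alpha_s(Q^2)}{2\pi}\Big[P_{gq}\otimes F_s + P_{gg}\otimes G\Big],
\quad\text{where } (f\otimes g)(x) = \int_x^1 \frac{dz}{z}\, f(z)\, g\!\left(\frac{x}{z}\right).
```

    A Laplace transform in v = ln(1/x) turns these convolutions into ordinary products, reducing the coupled system to algebra that can be solved in closed form and then inverted numerically, which is what makes the decoupled solutions quoted above possible.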

  9. Evaluation And Selection Process of Suppliers Through Analytical Framework: An Empirical Evidence of Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings drastic savings. Therefore, this paper addresses the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium-sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies utilize supplier evaluation criteria and which criteria, if any, are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
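
    The PCA step used in the study above can be sketched in a few lines. The data here are entirely hypothetical (simulated criterion ratings, not the paper's survey data); the point is only to show how the first principal component surfaces the dominant evaluation criteria:

    ```python
    import numpy as np

    # Hypothetical survey data: rows = companies, columns = ratings of four
    # evaluation criteria (vendor attitude, delivery time, quality, price).
    rng = np.random.default_rng(0)
    n_companies = 80
    base = rng.normal(size=(n_companies, 1))          # shared latent factor
    ratings = np.hstack([base + 0.3 * rng.normal(size=(n_companies, 1))
                         for _ in range(4)])

    # PCA via SVD of the mean-centred data matrix.
    centred = ratings - ratings.mean(axis=0)
    _, s, vt = np.linalg.svd(centred, full_matrices=False)
    explained = s**2 / np.sum(s**2)   # fraction of variance per component

    # Loadings of PC1 (rows of vt) show which criteria dominate.
    print("variance explained by PC1: %.2f" % explained[0])
    ```

    Because the simulated criteria share one latent factor, the first component captures most of the variance; on real survey data, the loading pattern across components is what identifies the key criteria.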

  10. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data.

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Stelzer, Gil; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-03-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ ( geneanalytics.genecards.org ), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface and proprietary unified data. GeneAnalytics employs LifeMap Science's GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human disease database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable postgenomics data analysis and interpretation tool for the translation of data into knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, nutrigenomics, pharmacogenomics, vaccinomics

  11. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques to manufacture high-aspect-ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear...... characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate...... (TWR) and the factors is poor. Thus, the individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  12. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

    The modelling of a curl of surfaces associated with a pair of rolling centrodes, when the profile of the rack-gear's teeth is known by direct measuring as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the generating geometrical error as a basis of the total error. The generation modelling allows highlighting the potential errors of the generating tool, in order to correct its profile prior to using the tool in the machining process. A method developed in CATIA is proposed, based on a new method, namely the method of "relative generating trajectories". The analytical foundation is presented, as well as some applications for known models of rack-gear type tools used on Maag teething machines.

  13. Learning analytics as a tool for closing the assessment loop in higher education

    OpenAIRE

    Karen D. Mattingly; Margaret C. Rice; Zane L. Berge

    2012-01-01

    This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and wha...

  14. A PART OF RESEARCH METHODOLOGY COURSE: Introduction to the Research Tools

    OpenAIRE

    Ebrahim, Nader Ale

    2016-01-01

    “Research Tools” can be defined as vehicles that broadly facilitate research and related activities. “Research Tools” enable researchers to collect, organize, analyze, visualize and publicize research outputs. Dr. Nader has collected over 800 tools that enable students to follow the correct path in research and to ultimately produce high-quality research outputs with more accuracy and efficiency. It is assembled as an interactive Web-based mind map, titled “Research Tools”, which is updated...

  15. Analytical Chemistry Section Chemistry Research Group, Winfrith. Report for 1982 and 1983

    International Nuclear Information System (INIS)

    Amey, M.D.H.; Capp, P.D.; James, H.

    1984-01-01

    This report reviews the principal activities of the Analytical Chemistry Section of Chemistry Research Group, Winfrith, during 1982 and 1983. The objectives of the report are to outline the range of chemical analysis support services available at Winfrith, indicate the research areas from which samples currently originate, and identify instrumental techniques where significant updating has occurred. (author)

  16. Research coordination meeting of the coordinated research project on analytical and experimental benchmark analyses of accelerator driven systems. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2006-07-01

    The CRP's overall objective is to make contributions towards the realization of a transmutation demonstration facility. The specific objective of the CRP is to improve the present understanding of the coupling of the ADS spallation source with the multiplicative sub-critical core. As an outcome, the CRP aims at advancing the efforts under way in the Member States towards the proof of practicality for ADS-based transmutation by providing an information exchange and collaborative research framework needed to ensure that the tools to perform detailed ADS calculations, namely from the high energy proton beam down to thermal neutron energies, are available. The CRP will address all major physics phenomena of the spallation source and its coupling to the sub-critical core. The participants will perform computational and experimental benchmark analyses using integrated calculation schemes and simulation methods. Apart from analytical benchmark exercises, the CRP will integrate some of the planned experimental demonstration projects of the coupling at power between a sub-critical core and a spallation source (e.g., YALINA Booster in Belarus and SAD at JINR, Dubna). The estimated duration of the CRP is 5 years. Following the establishment, during 2004, of the international CRP team by putting in place research agreements and contracts, and after convening this kick-off research RCM, the implementation plan of the CRP foresees three more RCMs (in 2007, 2008, and 2009, respectively), and the publication of the final report in 2010.

  17. Cross learning synergies between Operation Management content and the use of generic analytic tools

    Directory of Open Access Journals (Sweden)

    Frederic Marimon

    2017-06-01

    By presenting both objectives simultaneously, students are found to be more motivated to work deeply on both objectives. Students know that the theoretical content will be put into practice through certain tools, strengthening their interest in the conceptual issues of the chapter. In turn, because students know that they will use a generic tool in a known context, their interest in these tools is reinforced. The result is a cross-learning synergy.

  18. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  19. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    International Nuclear Information System (INIS)

    Silvestre-Brac, Bernard; Semay, Claude; Buisseret, Fabien

    2008-01-01

    We propose a new method to obtain approximate solutions for the Schroedinger equation with an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials with power-law and logarithmic forms. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae that can be found in the literature

  20. Auxiliary fields as a tool for computing analytical solutions of the Schroedinger equation

    Energy Technology Data Exchange (ETDEWEB)

    Silvestre-Brac, Bernard [LPSC Universite Joseph Fourier, Grenoble 1, CNRS/IN2P3, Institut Polytechnique de Grenoble, Avenue des Martyrs 53, F-38026 Grenoble-Cedex (France); Semay, Claude; Buisseret, Fabien [Groupe de Physique Nucleaire Theorique, Universite de Mons-Hainaut, Academie universitaire Wallonie-Bruxelles, Place du Parc 20, B-7000 Mons (Belgium)], E-mail: silvestre@lpsc.in2p3.fr, E-mail: claude.semay@umh.ac.be, E-mail: fabien.buisseret@umh.ac.be

    2008-07-11

    We propose a new method to obtain approximate solutions for the Schroedinger equation with an arbitrary potential that possesses bound states. This method, relying on the auxiliary field technique, allows one to find analytical solutions in many cases. It offers a convenient way to study the qualitative features of the energy spectrum of bound states in any potential. In particular, we illustrate our method by solving the case of central potentials with power-law and logarithmic forms. For these types of potentials, we propose very accurate analytical energy formulae which greatly improve the corresponding formulae that can be found in the literature.

  1. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in the chemical composition, density and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program

  2. Software Tools | Office of Cancer Clinical Proteomics Research

    Science.gov (United States)

    The CPTAC program develops new approaches to elucidate aspects of the molecular complexity of cancer made from large-scale proteogenomic datasets, and advance them toward precision medicine.  Part of the CPTAC mission is to make data and tools available and accessible to the greater research community to accelerate the discovery process.

  3. Managing complex research datasets using electronic tools: A meta-analysis exemplar

    Science.gov (United States)

    Brown, Sharon A.; Martin, Ellen E.; Garcia, Theresa J.; Winter, Mary A.; García, Alexandra A.; Brown, Adama; Cuevas, Heather E.; Sumlin, Lisa L.

    2013-01-01

    Meta-analyses of broad scope and complexity require investigators to organize many study documents and manage communication among several research staff. Commercially available electronic tools, e.g., EndNote, Adobe Acrobat Pro, Blackboard, Excel, and IBM SPSS Statistics (SPSS), are useful for organizing and tracking the meta-analytic process, as well as enhancing communication among research team members. The purpose of this paper is to describe the electronic processes we designed, using commercially available software, for an extensive quantitative model-testing meta-analysis we are conducting. Specific electronic tools improved the efficiency of (a) locating and screening studies, (b) screening and organizing studies and other project documents, (c) extracting data from primary studies, (d) checking data accuracy and analyses, and (e) communication among team members. The major limitation in designing and implementing a fully electronic system for meta-analysis was the requisite upfront time to: decide on which electronic tools to use, determine how these tools would be employed, develop clear guidelines for their use, and train members of the research team. The electronic process described here has been useful in streamlining the process of conducting this complex meta-analysis and enhancing communication and sharing documents among research team members. PMID:23681256

  4. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  5. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  6. Online programs as tools to improve parenting: A meta-analytic review

    NARCIS (Netherlands)

    Nieuwboer, C.C.; Fukkink, R.G.; Hermanns, J.M.A.

    2013-01-01

    Background: A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting

  7. I-HASTREAM : density-based hierarchical clustering of big data streams and its application to big graph analytics tools

    NARCIS (Netherlands)

    Hassani, M.; Spaus, P.; Cuzzocrea, A.; Seidl, T.

    2016-01-01

    Big Data Streams are very popular nowadays, stirred up by a plethora of modern applications such as sensor networks, scientific computing tools, Web intelligence, social network analysis and mining tools, and so forth. Here, the main research issue consists in how to effectively and efficiently

  8. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CV_a) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CV_a is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CV_a. On the other hand, the movSD and movSO approaches were able to detect an increased CV_a at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CV_a. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CV_a for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risks of an increase in analytical imprecision are attenuated for these measurands, as an increased analytical imprecision will only add marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
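
    The movSD idea described above is simple to sketch: slide a window over consecutive patient results and flag when the windowed standard deviation exceeds a control limit. This is a minimal illustration on simulated data (the window size, control limit, and simulated CVs are our assumptions, not the paper's tuned parameters):

    ```python
    import numpy as np

    def moving_sd(values, window):
        """Standard deviation over a sliding window of patient results."""
        v = np.asarray(values, dtype=float)
        return np.array([v[i - window:i].std(ddof=1)
                         for i in range(window, len(v) + 1)])

    # Simulated patient results: a stable period, then doubled imprecision.
    rng = np.random.default_rng(1)
    stable = rng.normal(100.0, 2.0, size=500)     # baseline analytical SD = 2
    degraded = rng.normal(100.0, 4.0, size=500)   # imprecision has doubled
    results = np.concatenate([stable, degraded])

    sd = moving_sd(results, window=50)
    # Flag when the moving SD exceeds an upper control limit derived from the
    # stable period (here, arbitrarily, 1.5x the baseline SD).
    limit = 1.5 * 2.0
    first_flag = int(np.argmax(sd > limit))
    print("first window flagged at index:", first_flag)
    ```

    In a real deployment the baseline SD and limit would be derived from historical stable data per measurand, and, as the abstract notes, the approach works best when biological variation is small relative to CV_a.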

  9. Integrating information technologies as tools for surgical research.

    Science.gov (United States)

    Schell, Scott R

    2005-10-01

    Surgical research is dependent upon information technologies. Selection of the computer, operating system, and software tool that best support the surgical investigator's needs requires careful planning before research commences. This manuscript presents a brief tutorial on how surgical investigators can best select these information technologies, with comparisons and recommendations between existing systems, software, and solutions. Privacy concerns, based upon HIPAA and other regulations, now require careful proactive attention to avoid legal penalties, civil litigation, and financial loss. Security issues are included as part of the discussions related to selection and application of information technology. This material was derived from a segment of the Association for Academic Surgery's Fundamentals of Surgical Research course.

  10. Opening Digital Archives and Collections with Emerging Data Analytics Technology: A Research Agenda

    Directory of Open Access Journals (Sweden)

    Ahmed Elragal

    2017-02-01

    Full Text Available In the public sector, EU legislation requires the preservation and opening of increasing amounts of heterogeneous digital information that should be utilized by citizens and businesses. While technologies such as big data analytics (BDA) have emerged, the opening of digital archives and collections at a large scale is in its infancy. Opening archives and collections also involves particular requirements for recognizing and managing issues of privacy and digital rights. As well, ensuring the sustainability of the opened materials and the economical appraisal of digital materials for preservation requires robust digital preservation practices. We need to proceed beyond the state of the art in opening digital archives and collections by means of emerging big data analytics, validating a novel concept for analytics which then enables delivering knowledge for citizens and society. We set out an agenda for using BDA as our strategy for research and enquiry and for demonstrating the benefit of BDA for the opening of digital archives by civil servants and for citizens. That will, eventually, transform the preservation practices, and the delivery and use opportunities, of public digital archives. Our research agenda suggests a framework integrating four domains of inquiry (analytics-enhanced appraisal, analytics-prepared preservation, analytics-enhanced opening, and analytics-enhanced use) for utilizing BDA technologies in the domain of digital archives and collections. The suggested framework and research agenda initially identify particular BDA technologies to be utilized in each of the four domains, and contribute by highlighting the need for an integrated "public understanding of big data" in the domain of digital preservation.

  11. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine from medical geography to laboratory medicine wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda legitimating new medical concepts, public health interventions, and political structures governing over human and non-human populations.

  12. A Review on the Design Structure Matrix as an Analytical Tool for Product Development Management

    OpenAIRE

    Mokudai, Takefumi

    2006-01-01

    This article reviews fundamental concepts and analytical techniques of design structure matrix (DSM) as well as recent development of DSM studies. The DSM is a matrix representation of relationships between components of a complex system, such as products, development organizations and processes. Depending on targets of analysis, there are four basic types of DSM: Component-based DSM, Team-based DSM, Task-based DSM, and Parameter-based DSM. There are two streams of recent DSM studies: 1) ...

  13. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection was achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay was evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L^-1 and an EC50 of 0.079 µg L^-1 were obtained, allowing the detection of the zearalenone mycotoxin to be assessed. In addition, excellent accuracy, with a high recovery yield ranging between 95 and 108%, was obtained. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  14. Potential of Near-Infrared Chemical Imaging as Process Analytical Technology Tool for Continuous Freeze-Drying.

    Science.gov (United States)

    Brouckaert, Davinia; De Meyer, Laurens; Vanbillemont, Brecht; Van Bockstal, Pieter-Jan; Lammens, Joris; Mortier, Séverine; Corver, Jos; Vervaet, Chris; Nopens, Ingmar; De Beer, Thomas

    2018-04-03

    Near-infrared chemical imaging (NIR-CI) is an emerging tool for process monitoring because it combines the chemical selectivity of vibrational spectroscopy with spatial information. Whereas traditional near-infrared spectroscopy is an attractive technique for water content determination and solid-state investigation of lyophilized products, chemical imaging opens up possibilities for assessing the homogeneity of these critical quality attributes (CQAs) throughout the entire product. In this contribution, we aim to evaluate NIR-CI as a process analytical technology (PAT) tool for at-line inspection of continuously freeze-dried pharmaceutical unit doses based on spin freezing. The chemical images of freeze-dried mannitol samples were resolved via multivariate curve resolution, allowing us to visualize the distribution of mannitol solid forms throughout the entire cake. Second, a mannitol-sucrose formulation was lyophilized with variable drying times for inducing changes in water content. Analyzing the corresponding chemical images via principal component analysis, vial-to-vial variations as well as within-vial inhomogeneity in water content could be detected. Furthermore, a partial least-squares regression model was constructed for quantifying the water content in each pixel of the chemical images. It was hence concluded that NIR-CI is inherently a most promising PAT tool for continuously monitoring freeze-dried samples. Although some practicalities are still to be solved, this analytical technique could be applied in-line for CQA evaluation and for detecting the drying end point.
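
    The partial least-squares step mentioned above (regressing water content on each pixel's NIR spectrum) can be sketched with a minimal PLS1/NIPALS implementation. Everything here is simulated and simplified (a single Gaussian "water band", a flat baseline, hypothetical content values), so it is an illustration of the regression idea, not the authors' calibration:

    ```python
    import numpy as np

    def pls1_fit(X, y, n_components):
        """Minimal PLS1 (NIPALS): returns regression coefficients and centring terms."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xc, yc = X - x_mean, y - y_mean
        W, P, Q = [], [], []
        for _ in range(n_components):
            w = Xc.T @ yc                 # weight vector: covariance direction
            w /= np.linalg.norm(w)
            t = Xc @ w                    # scores
            tt = t @ t
            p = Xc.T @ t / tt             # X loadings
            q = (yc @ t) / tt             # y loading
            Xc = Xc - np.outer(t, p)      # deflate
            yc = yc - q * t
            W.append(w); P.append(p); Q.append(q)
        W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
        B = W @ np.linalg.solve(P.T @ W, Q)   # coefficients in original X space
        return B, x_mean, y_mean

    def pls1_predict(X, B, x_mean, y_mean):
        return (X - x_mean) @ B + y_mean

    # Simulated pixel spectra: water content scales a hypothetical absorption band.
    rng = np.random.default_rng(2)
    n_pixels, n_wavelengths = 200, 50
    wl = np.linspace(0.0, 1.0, n_wavelengths)
    water_band = np.exp(-((wl - 0.6) / 0.08) ** 2)
    content = rng.uniform(0.5, 5.0, n_pixels)           # water content per pixel
    spectra = (np.outer(content, water_band)
               + 0.2 * np.outer(rng.normal(1, 0.05, n_pixels), np.ones(n_wavelengths))
               + 0.01 * rng.normal(size=(n_pixels, n_wavelengths)))

    B, xm, ym = pls1_fit(spectra, content, n_components=2)
    pred = pls1_predict(spectra, B, xm, ym)
    rmse = np.sqrt(np.mean((pred - content) ** 2))
    ```

    Mapping `pred` back onto the pixel grid is what turns the regression into a water-content image; in practice the model would of course be validated on held-out vials rather than the training pixels.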

  15. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-time prognostic risk estimates of patients with MS (final n = 110 and their physicians (n = 6 and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  16. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.
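The study's finding that patients, physicians, and the OLAP tool "ranked patients similarly" is the kind of agreement that a rank correlation makes concrete. The sketch below computes Spearman correlations between invented risk estimates; all numbers are hypothetical and stand in for the study's cohort data.

```python
# Hypothetical sketch: comparing patient, physician, and tool risk
# rankings with Spearman rank correlation. All estimates below are
# invented for illustration, not taken from the study.

def rankdata(values):
    """Assign average ranks to values (ties share the mean rank)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1                     # group tied values
        avg = (i + j) / 2.0 + 1.0
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman correlation = Pearson correlation of the ranks."""
    rx, ry = rankdata(x), rankdata(y)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Invented short-term progression-risk estimates (0-1) for 6 patients.
patient_est   = [0.2, 0.5, 0.1, 0.8, 0.4, 0.6]
physician_est = [0.3, 0.4, 0.2, 0.9, 0.5, 0.7]
tool_est      = [0.25, 0.45, 0.15, 0.85, 0.35, 0.65]

print(round(spearman(patient_est, tool_est), 3))    # → 1.0
print(round(spearman(physician_est, tool_est), 3))  # → 0.943
```

A high rank correlation, as here, would indicate that the groups order patients similarly even when their absolute risk estimates differ.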

  17. Scientific Visualization Tools for Enhancement of Undergraduate Research

    Science.gov (United States)

    Rodriguez, W. J.; Chaudhury, S. R.

    2001-05-01

    Undergraduate research projects that utilize remote sensing satellite instrument data to investigate atmospheric phenomena pose many challenges. A significant challenge is processing large amounts of multi-dimensional data. Remote sensing data initially requires mining; filtering of undesirable spectral, instrumental, or environmental features; and subsequently sorting and reformatting to files for easy and quick access. The data must then be transformed according to the needs of the investigation(s) and displayed for interpretation. These multidimensional datasets require views that can range from two-dimensional plots to multivariable-multidimensional scientific visualizations with animations. Science undergraduate students generally find these data processing tasks daunting. Typically, researchers must fully understand the intricacies of the dataset and write computer programs or rely on commercially available software, which may not be trivial to use. In the time that undergraduate researchers have available for their research projects, learning the data formats, programming languages, and/or visualization packages is impractical. When dealing with large multi-dimensional data sets, appropriate Scientific Visualization tools are imperative in allowing students to have a meaningful and pleasant research experience, while producing valuable scientific research results. The BEST Lab at Norfolk State University has been creating tools for multivariable-multidimensional analysis of Earth Science data. EzSAGE and SAGE4D have been developed to sort, analyze and visualize SAGE II (Stratospheric Aerosol and Gas Experiment) data with ease. Three- and four-dimensional visualizations in interactive environments can be produced. EzSAGE provides atmospheric slices in three dimensions, where the researcher can interactively change the scales in the three dimensions, the color tables and the degree of smoothing to focus on particular phenomena. SAGE4D provides a navigable

  18. The use of web2 tools in action research

    DEFF Research Database (Denmark)

    Kolbæk, Raymond; Steensgaard, Randi; Angel, Sanne

    2017-01-01

    Abstract Content: Major challenges occur when trying to implement research in clinical practice. In the West Danish Center for Spinal Cord Injury, we are conducting a practice-based PhD project that involves the practice field's own members as co-researchers. In the management of the project we use... Furthermore, we try to evidence-base the concept of "Sample handlings" and examine whether this concept can be used as a flexible methodological tool for developing workflows that promote patient participation in their own rehabilitation. We use an action research design to identify actual problems and to develop, test, evaluate and implement specific actions that promote patient participation in rehabilitation. Four nurses and four social and health assistants have an active "co-researcher" role. The interaction with the researchers creates a reflexive and dynamic process with a learning and competence...

  19. Integrative Mixed Methods Data Analytic Strategies in Research on School Success in Challenging Circumstances

    Science.gov (United States)

    Jang, Eunice E.; McDougall, Douglas E.; Pollon, Dawn; Herbert, Monique; Russell, Pia

    2008-01-01

    There are both conceptual and practical challenges in dealing with data from mixed methods research studies. There is a need for discussion about various integrative strategies for mixed methods data analyses. This article illustrates integrative analytic strategies for a mixed methods study focusing on improving urban schools facing challenging…

  20. Random Forest as a Predictive Analytics Alternative to Regression in Institutional Research

    Science.gov (United States)

    He, Lingjun; Levine, Richard A.; Fan, Juanjuan; Beemer, Joshua; Stronach, Jeanne

    2018-01-01

    In institutional research, modern data mining approaches are seldom considered to address predictive analytics problems. The goal of this paper is to highlight the advantages of tree-based machine learning algorithms over classic (logistic) regression methods for data-informed decision making in higher education problems, and stress the success of…
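The advantage claimed here for tree-based methods over logistic regression shows up most clearly when the outcome depends on an interaction between predictors, which a single linear boundary cannot represent. The sketch below is a toy, dependency-free miniature of a random forest (bagged shallow trees with majority vote) on simulated interaction-driven data; it stands in for a full implementation such as scikit-learn's, and all data are invented.

```python
import random

# Toy illustration (not the paper's data): an outcome driven by an
# interaction between two predictors, which a linear logistic model
# cannot capture but shallow bagged trees can.
random.seed(0)
X = [[random.random(), random.random()] for _ in range(400)]
y = [1 if (a > 0.5) != (b > 0.5) else 0 for a, b in X]

def gini(labels):
    if not labels:
        return 0.0
    p = sum(labels) / len(labels)
    return 2.0 * p * (1.0 - p)

def best_split(X, y):
    """Pick the (feature, threshold) pair with lowest weighted Gini."""
    best = None
    for f in (0, 1):
        for t in (0.3, 0.5, 0.7):
            left = [yi for xi, yi in zip(X, y) if xi[f] <= t]
            right = [yi for xi, yi in zip(X, y) if xi[f] > t]
            score = (len(left) * gini(left) + len(right) * gini(right)) / len(y)
            if best is None or score < best[0]:
                best = (score, f, t)
    return best[1], best[2]

def grow(X, y, depth):
    if depth == 0 or len(set(y)) == 1:
        return round(sum(y) / len(y))            # leaf: majority class
    f, t = best_split(X, y)
    L = [(xi, yi) for xi, yi in zip(X, y) if xi[f] <= t]
    R = [(xi, yi) for xi, yi in zip(X, y) if xi[f] > t]
    if not L or not R:
        return round(sum(y) / len(y))
    return (f, t,
            grow([a for a, _ in L], [b for _, b in L], depth - 1),
            grow([a for a, _ in R], [b for _, b in R], depth - 1))

def predict(node, x):
    while isinstance(node, tuple):
        f, t, left, right = node
        node = left if x[f] <= t else right
    return node

# A miniature "random forest": 15 depth-3 trees grown on bootstrap
# samples and combined by majority vote.
forest = []
for _ in range(15):
    idx = [random.randrange(len(X)) for _ in range(len(X))]
    forest.append(grow([X[i] for i in idx], [y[i] for i in idx], 3))

def forest_predict(x):
    return 1 if sum(predict(t, x) for t in forest) * 2 > len(forest) else 0

# A single linear boundary scores near 50% on this interaction; the
# shallow-tree ensemble recovers it almost perfectly.
acc = sum(forest_predict(xi) == yi for xi, yi in zip(X, y)) / len(y)
print(round(acc, 2))
```

The design point mirrors the paper's argument: each tree is individually weak and noisy, but bagging plus voting yields a stable nonlinear decision rule without any hand-specified interaction terms.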

  1. Nuclear analytical methods in teaching and research at the university of Illinois

    International Nuclear Information System (INIS)

    Landsberger, S.

    1994-01-01

    An overview of the nuclear analytical methods available at the University of Illinois at Urbana-Champaign is given. Undergraduate and graduate level teaching and research are highlighted. The TRIGA reactor and neutron activation analysis facilities are described in the context of their role within an inter-disciplinary environment. (author)

  2. Island Explorations: Discovering Effects of Environmental Research-Based Lab Activities on Analytical Chemistry Students

    Science.gov (United States)

    Tomasik, Janice Hall; LeCaptain, Dale; Murphy, Sarah; Martin, Mary; Knight, Rachel M.; Harke, Maureen A.; Burke, Ryan; Beck, Kara; Acevedo-Polakovich, I. David

    2014-01-01

    Motivating students in analytical chemistry can be challenging, in part because of the complexity and breadth of topics involved. Some methods that help encourage students and convey real-world relevancy of the material include incorporating environmental issues, research-based lab experiments, and service learning projects. In this paper, we…

  3. Script Towards Research 2.0: The Influence of Digital and Online Tools in Academic Research

    Directory of Open Access Journals (Sweden)

    Gabriela Grosseck

    2016-07-01

    Full Text Available The new Internet technologies have infiltrated in a stunning way the academic environment, both at individual and at institutional level. Therefore, more and more teachers have started educational blogs, librarians are active on Twitter, other educational actors curate web content, students post on Instagram or Flickr, and university departments have Facebook pages and/or YouTube accounts etc. Today, the use of web technology has become “a legitimate activity in many areas of higher education” (Waycott, 2010) and a considerable shift to digital academic research has gradually occurred. Teachers are encouraging students to take up digital tools for research and writing, thus revealing new ways of using information and communication technologies for academic purposes and not just for socializing. The main objective of this paper is to investigate the effects of integrating diverse digital, Web 2.0 tools and resources and OERs/MOOCs in research and in the construction of students’ academic texts. We aim to stress the increasing influence of digital and online tools in academic research and writing. Teachers, specialists, and students alike are affected by this process. In order to show how, we explore the following issues: What is Research 2.0? Which digital/online tools have we used to assist our students? What are the challenges for academic research using digital/Web 2.0 tools? And how do digital tools shape academic research?

  4. Program to develop analytical tools for environmental and safety assessment of nuclear material shipping container systems

    International Nuclear Information System (INIS)

    Butler, T.A.

    1978-11-01

    This paper describes a program for developing analytical techniques to evaluate the response of nuclear material shipping containers to severe accidents. Both lumped-mass and finite element techniques are employed to predict shipping container and shipping container-carrier response to impact. The general impact problem is computationally expensive because of its nonlinear, three-dimensional nature. This expense is minimized by using approximate models to parametrically identify critical cases before more exact analyses are performed. The computer codes developed for solving the problem are being experimentally substantiated with test data from full-scale and scale-model container drop tests. 6 figures, 1 table

  5. Genephony: a knowledge management tool for genome-wide research

    Directory of Open Access Journals (Sweden)

    Riva Alberto

    2009-09-01

    Full Text Available Abstract Background One of the consequences of the rapid and widespread adoption of high-throughput experimental technologies is an exponential increase of the amount of data produced by genome-wide experiments. Researchers increasingly need to handle very large volumes of heterogeneous data, including both the data generated by their own experiments and the data retrieved from publicly available repositories of genomic knowledge. Integration, exploration, manipulation and interpretation of data and information therefore need to become as automated as possible, since their scale and breadth are, in general, beyond the limits of what individual researchers and the basic data management tools in normal use can handle. This paper describes Genephony, a tool we are developing to address these challenges. Results We describe how Genephony can be used to manage large datasets of genomic information, integrating them with existing knowledge repositories. We illustrate its functionalities with an example of a complex annotation task, in which a set of SNPs coming from a genotyping experiment is annotated with genes known to be associated to a phenotype of interest. We show how, thanks to the modular architecture of Genephony and its user-friendly interface, this task can be performed in a few simple steps. Conclusion Genephony is an online tool for the manipulation of large datasets of genomic information. It can be used as a browser for genomic data, as a high-throughput annotation tool, and as a knowledge discovery tool. It is designed to be easy to use, flexible and extensible. Its knowledge management engine provides fine-grained control over individual data elements, as well as efficient operations on large datasets.
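The annotation task described above reduces to an interval lookup: map each SNP position onto the genes whose genomic coordinates contain it. The sketch below illustrates the idea with invented gene coordinates and SNP identifiers; a real tool like Genephony would of course pull these from knowledge repositories rather than hard-code them.

```python
# Hypothetical sketch of the SNP-to-gene annotation task: gene
# coordinates and SNP ids below are invented for illustration.

genes = [  # (chromosome, start, end, name)
    ("chr1", 1000, 5000, "GENE_A"),
    ("chr1", 7000, 9000, "GENE_B"),
    ("chr2", 2000, 4000, "GENE_C"),
]

snps = [("rs0001", "chr1", 1500), ("rs0002", "chr1", 6500),
        ("rs0003", "chr2", 3500)]

def annotate(snps, genes):
    """Return {snp_id: [names of genes whose interval contains the SNP]}."""
    by_chrom = {}
    for chrom, start, end, name in genes:
        by_chrom.setdefault(chrom, []).append((start, end, name))
    result = {}
    for snp_id, chrom, pos in snps:
        result[snp_id] = [name for start, end, name in by_chrom.get(chrom, [])
                          if start <= pos <= end]
    return result

print(annotate(snps, genes))
# → {'rs0001': ['GENE_A'], 'rs0002': [], 'rs0003': ['GENE_C']}
```

For genome-scale inputs one would replace the linear scan with a sorted index or interval tree, which is presumably the kind of efficiency the tool's knowledge management engine provides.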

  6. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state-of-the-art in RTP and a discussion on future perspectives

  7. From digital positivism and administrative big data analytics towards critical digital and social media research!

    OpenAIRE

    Fuchs, Christian

    2017-01-01

    This essay argues for a paradigm shift in the study of the Internet and digital/social media. Big data analytics is the dominant paradigm. It receives large amounts of funding, is administrative and a form of digital positivism. Critical social media research is an alternative approach that combines critical social media theory, critical digital methods and critical-realist social media research ethics. Strengthening the second approach is a material question of power in academia.

  8. FOSS Tools for Research Infrastructures - A Success Story?

    Science.gov (United States)

    Stender, V.; Schroeder, M.; Wächter, J.

    2015-12-01

    Established initiatives and mandated organizations, e.g. the Initiative for Scientific Cyberinfrastructures (NSF, 2007) or the European Strategy Forum on Research Infrastructures (ESFRI, 2008), promote and foster the development of sustainable research infrastructures. The basic idea behind these infrastructures is the provision of services supporting scientists to search, visualize and access data, to collaborate and exchange information, as well as to publish data and other results. Especially the management of research data is gaining more and more importance. In geosciences these developments have to be merged with the enhanced data management approaches of Spatial Data Infrastructures (SDI). The Centre for GeoInformationTechnology (CeGIT) at the GFZ German Research Centre for Geosciences has the objective to establish concepts and standards of SDIs as an integral part of research infrastructure architectures. In different projects, solutions to manage research data for land- and water management or environmental monitoring have been developed based on a framework consisting of Free and Open Source Software (FOSS) components. The framework provides basic components supporting the import and storage of data, discovery and visualization as well as data documentation (metadata). In our contribution, we present our data management solutions developed in three projects, Central Asian Water (CAWa), Sustainable Management of River Oases (SuMaRiO) and Terrestrial Environmental Observatories (TERENO), where FOSS components build the backbone of the data management platform. The multiple use and validation of tools helped to establish a standardized architectural blueprint serving as a contribution to Research Infrastructures. We examine the question of whether FOSS tools are really a sustainable choice and whether the increased efforts of maintenance are justified. Finally, it should help answer the question of whether the use of FOSS for Research Infrastructures is a

  9. Heat as a groundwater tracer in shallow and deep heterogeneous media: Analytical solution, spreadsheet tool, and field applications

    Science.gov (United States)

    Kurylyk, Barret L.; Irvine, Dylan J.; Carey, Sean K.; Briggs, Martin A.; Werkema, Dale D.; Bonham, Mariah

    2017-01-01

    Groundwater flow advects heat, and thus, the deviation of subsurface temperatures from an expected conduction‐dominated regime can be analysed to estimate vertical water fluxes. A number of analytical approaches have been proposed for using heat as a groundwater tracer, and these have typically assumed a homogeneous medium. However, heterogeneous thermal properties are ubiquitous in subsurface environments, both at the scale of geologic strata and at finer scales in streambeds. Herein, we apply the analytical solution of Shan and Bodvarsson (2004), developed for estimating vertical water fluxes in layered systems, in 2 new environments distinct from previous vadose zone applications. The utility of the solution for studying groundwater‐surface water exchange is demonstrated using temperature data collected from an upwelling streambed with sediment layers, and a simple sensitivity analysis using these data indicates the solution is relatively robust. Also, a deeper temperature profile recorded in a borehole in South Australia is analysed to estimate deeper water fluxes. The analytical solution is able to match observed thermal gradients, including the change in slope at sediment interfaces. Results indicate that not accounting for layering can yield errors in the magnitude and even direction of the inferred Darcy fluxes. A simple automated spreadsheet tool (Flux‐LM) is presented to allow users to input temperature and layer data and solve the inverse problem to estimate groundwater flux rates from temperature profiles in both shallow (e.g., streambed) and deeper (e.g., borehole) regimes.
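The inverse problem that the Flux-LM spreadsheet solves can be illustrated with the simpler homogeneous (single-layer) case: under steady vertical flow between two fixed boundary temperatures, the classic Bredehoeft-Papadopulos solution gives T(z) = T0 + (TL - T0)·(exp(Pe·z/L) - 1)/(exp(Pe) - 1), where the Peclet number Pe is proportional to the Darcy flux. The sketch below is a hedged, minimal stand-in (not the layered Shan and Bodvarsson solution, and not the tool's actual code): it generates a synthetic profile for a known Pe and recovers it by brute-force least squares. All numbers are illustrative only.

```python
import math

# Single-layer steady-state heat-tracer sketch (illustrative only):
#   T(z) = T0 + (TL - T0) * (exp(Pe*z/L) - 1) / (exp(Pe) - 1)
# Positive Pe corresponds to flow from the z = 0 boundary toward z = L.

def temperature(z, L, T0, TL, Pe):
    if abs(Pe) < 1e-12:                      # pure conduction: linear profile
        return T0 + (TL - T0) * z / L
    return T0 + (TL - T0) * (math.expm1(Pe * z / L) / math.expm1(Pe))

# Synthetic "observed" profile for a true Peclet number of 2.0.
L, T0, TL, true_Pe = 1.0, 10.0, 20.0, 2.0
depths = [0.1 * i for i in range(11)]
observed = [temperature(z, L, T0, TL, true_Pe) for z in depths]

# Invert by brute-force least squares over candidate Pe values,
# mimicking the spreadsheet tool's inverse problem.
def misfit(Pe):
    return sum((temperature(z, L, T0, TL, Pe) - T) ** 2
               for z, T in zip(depths, observed))

best_Pe = min((i * 0.01 for i in range(-500, 501)), key=misfit)
print(round(best_Pe, 2))   # recovers Pe = 2.0
```

The abstract's warning about layering corresponds to the case where thermal conductivity changes with depth: fitting this single-layer curve to a layered profile would mistake the conductivity-driven change in gradient for advection, biasing the inferred flux.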

  10. The Systems Biology Research Tool: evolvable open-source software

    Directory of Open Access Journals (Sweden)

    Wright Jeremiah

    2008-06-01

    Full Text Available Abstract Background Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions at the system level. Results We introduce a free, easy-to-use, open-source, integrated software platform called the Systems Biology Research Tool (SBRT) to facilitate the computational aspects of systems biology. The SBRT currently performs 35 methods for analyzing stoichiometric networks and 16 methods from fields such as graph theory, geometry, algebra, and combinatorics. New computational techniques can be added to the SBRT via process plug-ins, providing a high degree of evolvability and a unifying framework for software development in systems biology. Conclusion The Systems Biology Research Tool represents a technological advance for systems biology. This software can be used to make sophisticated computational techniques accessible to everyone (including those with no programming ability), to facilitate cooperation among researchers, and to expedite progress in the field of systems biology.
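Stoichiometric-network methods of the kind the SBRT bundles operate on the stoichiometric matrix S (metabolites × reactions); the most basic computation is finding steady-state flux vectors v with S·v = 0. As a hedged, minimal illustration (not the SBRT's actual code), the sketch below computes a null-space basis by Gauss-Jordan elimination for an invented toy pathway.

```python
# Minimal sketch of one stoichiometric-network computation: finding
# steady-state flux vectors v with S v = 0. The toy network
# (uptake -> A -> B -> secretion) is invented for illustration.

def null_space(S, tol=1e-10):
    """Basis for {v : S v = 0} via Gauss-Jordan elimination."""
    m, n = len(S), len(S[0])
    A = [row[:] for row in S]
    pivots, r = [], 0
    for c in range(n):
        pivot = next((i for i in range(r, m) if abs(A[i][c]) > tol), None)
        if pivot is None:
            continue
        A[r], A[pivot] = A[pivot], A[r]
        A[r] = [x / A[r][c] for x in A[r]]          # normalize pivot row
        for i in range(m):
            if i != r and abs(A[i][c]) > tol:       # eliminate column c
                A[i] = [a - A[i][c] * b for a, b in zip(A[i], A[r])]
        pivots.append(c)
        r += 1
    basis = []
    for free in (c for c in range(n) if c not in pivots):
        v = [0.0] * n
        v[free] = 1.0
        for row, pc in zip(A, pivots):
            v[pc] = -row[free]
        basis.append(v)
    return basis

# Metabolites A, B; reactions: v1 (-> A), v2 (A -> B), v3 (B ->).
S = [[1.0, -1.0, 0.0],    # A balance
     [0.0, 1.0, -1.0]]    # B balance
print(null_space(S))      # → [[1.0, 1.0, 1.0]]: one mode, v1 = v2 = v3
```

The single basis vector says the only steady state of the toy pathway carries equal flux through uptake, conversion, and secretion; richer analyses (elementary modes, flux balance) build on the same matrix.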

  11. Identifying problems and generating recommendations for enhancing complex systems: applying the abstraction hierarchy framework as an analytical tool.

    Science.gov (United States)

    Xu, Wei

    2007-12-01

    This study adopts J. Rasmussen's (1985) abstraction hierarchy (AH) framework as an analytical tool to identify problems and pinpoint opportunities to enhance complex systems. The process of identifying problems and generating recommendations for complex systems using conventional methods is usually conducted based on incompletely defined work requirements. As the complexity of systems rises, the sheer mass of data generated from these methods becomes unwieldy to manage in a coherent, systematic form for analysis. There is little known work on adopting a broader perspective to fill these gaps. AH was used to analyze an aircraft-automation system in order to further identify breakdowns in pilot-automation interactions. Four steps follow: developing an AH model for the system, mapping the data generated by various methods onto the AH, identifying problems based on the mapped data, and presenting recommendations. The breakdowns lay primarily with automation operations that were more goal directed. Identified root causes include incomplete knowledge content and ineffective knowledge structure in pilots' mental models, lack of effective higher-order functional domain information displayed in the interface, and lack of sufficient automation procedures for pilots to effectively cope with unfamiliar situations. The AH is a valuable analytical tool to systematically identify problems and suggest opportunities for enhancing complex systems. It helps further examine the automation awareness problems and identify improvement areas from a work domain perspective. Applications include the identification of problems and generation of recommendations for complex systems as well as specific recommendations regarding pilot training, flight deck interfaces, and automation procedures.

  12. Analytical tools for managing rock fall hazards in Australian coal mine roadways

    Energy Technology Data Exchange (ETDEWEB)

    Ross Seedsman; Nick Gordon; Naj Aziz [University of Wollongong (Australia)

    2009-03-15

    This report provides a reference source for the design of ground control measures in coal mine roadways using analytical methods. Collapse models are provided for roof and rib. The roof models recognise that different collapse modes can apply in different stress fields - high, intermediate, and zero compressive stresses. The rib models draw analogies to rock slope stability and also the impact of high vertical stresses. Methods for determining support or reinforcement requirements are provided. Suspension of collapsed masses is identified as the basis for roof support in both very high and zero compressive stress regimes. Reinforcement of bedding discontinuities is advocated for intermediate compressive stresses. For the ribs, restraint of coal blocks defined by pre-existing joints or by mining induced fractures is required.

  13. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    Intelligent use of modern process analysers allows process technicians and engineers to look deep into the dynamic behaviour of production systems. This opens up a plurality of new possibilities with respect to process optimisation. Oftentimes, several instruments representing different technologies and price classes are able to decipher relevant process information simultaneously. The question then is: how to choose between available technologies without compromising the quality and usability of the data? We apply PLS2 modelling to quantify the relative merits of competing, or complementing, analytical modalities. We here present results from a feasibility study, where Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models...
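The modality-selection idea can be sketched in miniature: fit a latent-variable regression of a reference property on each instrument's spectra and compare how well each modality predicts it. The sketch below uses a one-component PLS1 fit (the simplest NIPALS case) on two simulated "modalities" of differing noise level; the actual study used multi-component PLS1/PLS2 models on real FT-NIR, FT-MIR, and Raman spectra, and all data here are invented.

```python
import random

# Compare two hypothetical instrument "modalities" by how well a
# one-component PLS1 model of each predicts a reference property
# (e.g. an analyte concentration). All data are simulated.
random.seed(1)
y = [random.uniform(0, 10) for _ in range(30)]   # reference values

def simulate_spectra(y, noise):
    """Each 'spectrum' is 5 channels linearly related to y plus noise."""
    return [[(c + 1) * yi + random.gauss(0, noise) for c in range(5)]
            for yi in y]

X_good = simulate_spectra(y, noise=1.0)    # informative modality
X_noisy = simulate_spectra(y, noise=25.0)  # poorly informative modality

def pls1_r2(X, y):
    """R^2 of a one-component PLS1 fit (NIPALS, mean-centred)."""
    n, m = len(X), len(X[0])
    xm = [sum(row[j] for row in X) / n for j in range(m)]
    ym = sum(y) / n
    Xc = [[row[j] - xm[j] for j in range(m)] for row in X]
    yc = [yi - ym for yi in y]
    w = [sum(Xc[i][j] * yc[i] for i in range(n)) for j in range(m)]
    norm = sum(wj * wj for wj in w) ** 0.5
    w = [wj / norm for wj in w]                       # loading weights
    t = [sum(Xc[i][j] * w[j] for j in range(m)) for i in range(n)]
    tt = sum(ti * ti for ti in t)
    b = sum(yc[i] * t[i] for i in range(n)) / tt      # inner regression
    resid = sum((yc[i] - b * t[i]) ** 2 for i in range(n))
    total = sum(v * v for v in yc)
    return 1 - resid / total

print(round(pls1_r2(X_good, y), 3), round(pls1_r2(X_noisy, y), 3))
```

The modality with the higher (ideally cross-validated) R² would be the better-value choice, which is the decision the PLS2 comparison in the study formalizes across several responses at once.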

  14. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
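The model above is judged by out-of-sample area under the ROC curve (about 0.87). AUC has a direct probabilistic reading: the chance that a randomly chosen positive case (a candidate who did receive an offer) is scored above a randomly chosen negative one. The sketch below computes it that way on invented scores and outcomes, not the study's data.

```python
# Hedged sketch: computing AUC as the pairwise win rate of positive
# over negative scores (ties count half). Scores and labels below are
# invented for illustration.

def auc(scores, labels):
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented predicted probabilities of receiving a qualifying offer
# within the timeframe, with the observed outcomes.
scores = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3, 0.2, 0.1]
labels = [1,   1,   0,   1,   0,   1,   0,   0]

print(auc(scores, labels))   # → 0.8125
```

An AUC of 0.87, as reported, therefore means the model ranks a true offer-receiver above a non-receiver about 87% of the time, which is the property a candidate weighing a current offer against future ones would care about.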

  15. Analytical implications of using practice theory in workplace information literacy research

    DEFF Research Database (Denmark)

    Moring, Camilla Elisabeth; Lloyd, Annemaree

    2013-01-01

    Introduction: This paper considers practice theory and the analytical implications of using this theoretical approach in information literacy research. More precisely, the aim of the paper is to discuss the translation of practice-theoretical assumptions into strategies that frame the analytical focus and interest when researching workplace information literacy. Two practice-theoretical perspectives are selected, one by Theodore Schatzki and one by Etienne Wenger, and their general commonalities and differences are analysed and discussed. Analysis: The two practice theories and their main ideas of what constitutes a practice, how practices frame social life, and the central concepts used to explain this, are presented. Then the application of the theories within workplace information literacy research is briefly explored. Results and Conclusion: The two theoretical perspectives share some...

  16. Application of metabonomic analytical techniques in the modernization and toxicology research of traditional Chinese medicine.

    Science.gov (United States)

    Lao, Yong-Min; Jiang, Jian-Guo; Yan, Lu

    2009-08-01

    In recent years, a wide range of metabonomic analytical techniques has come into use in the modern research of traditional Chinese medicine (TCM). At the same time, the international community has attached increasing importance to TCM toxicity problems, and many studies have been implemented to investigate the toxicity mechanisms of TCM. Among these studies, many metabonomic-based methods have been implemented to facilitate TCM toxicity investigation. At present, the prevailing methods for TCM toxicity research are mainly single-analysed techniques using only one analytical means, such as nuclear magnetic resonance (NMR), gas chromatography-mass spectrometry (GC-MS), and liquid chromatography-mass spectrometry (LC-MS); with these techniques, some favourable outcomes have been gained in studies of toxic reactions to TCM, such as assaying the target organs of action, establishing the action pattern, elucidating the action mechanism and exploring the material foundation of action. However, every analytical technique has its advantages and drawbacks, and no single analytical technique is versatile enough for every purpose. Multi-analysed techniques can partially overcome the shortcomings of single-analysed techniques: combining GC-MS and LC-MS metabolic profiling approaches has unravelled the pathological outcomes of aristolochic acid-induced nephrotoxicity, which could not be achieved by single-analysed techniques. It is believed that with the further development of metabonomic analytical techniques, especially multi-analysed techniques, metabonomics will greatly promote TCM toxicity research and benefit the modernization of TCM by extending the application of modern means in TCM safety assessment, assisting the formulation of TCM safety norms and establishing international standard indicators.

  17. New Tools for New Research in Psychiatry: A Scalable and Customizable Platform to Empower Data Driven Smartphone Research.

    Science.gov (United States)

    Torous, John; Kiang, Mathew V; Lorme, Jeanette; Onnela, Jukka-Pekka

    2016-05-05

    A longstanding barrier to progress in psychiatry, both in clinical settings and research trials, has been the persistent difficulty of accurately and reliably quantifying disease phenotypes. Mobile phone technology combined with data science has the potential to offer medicine a wealth of additional information on disease phenotypes, but the large majority of existing smartphone apps are not intended for use as biomedical research platforms and, as such, do not generate research-quality data. Our aim is not the creation of yet another app per se but rather the establishment of a platform to collect research-quality smartphone raw sensor and usage pattern data. Our ultimate goal is to develop statistical, mathematical, and computational methodology to enable us and others to extract biomedical and clinical insights from smartphone data. We report on the development and early testing of Beiwe, a research platform featuring a study portal, smartphone app, database, and data modeling and analysis tools designed and developed specifically for transparent, customizable, and reproducible biomedical research use, in particular for the study of psychiatric and neurological disorders. We also outline a proposed study using the platform for patients with schizophrenia. We demonstrate the passive data capabilities of the Beiwe platform and early results of its analytical capabilities. Smartphone sensors and phone usage patterns, when coupled with appropriate statistical learning tools, are able to capture various social and behavioral manifestations of illnesses, in naturalistic settings, as lived and experienced by patients. The ubiquity of smartphones makes this type of moment-by-moment quantification of disease phenotypes highly scalable and, when integrated within a transparent research platform, presents tremendous opportunities for research, discovery, and patient health.

  18. Accounting Research as a didactic tool for a accounting teaching

    Directory of Open Access Journals (Sweden)

    Valeria Gisela Perez

    2016-06-01

    Full Text Available This paper develops a reflection on the importance of research on accounting subjects in the training of professional accountants. This importance stems from the capacity of research to increase the body of knowledge of the discipline under investigation, which can be converted into a skill and/or competence that accountants are required to demonstrate in their professional practice. Furthermore, accounting is recognized by the authors as a science in constant development, and therefore open to investigation. This constant change in knowledge motivates professionals to stay up to date, and constant updating is precisely the skill and competence that research can bring to professional training in university classrooms. The reflection is based on the study of documents developed by prestigious authors in accounting theory, teaching and research. Therefore, this paper concludes that research is a useful tool for professional accounting training and rewards skills and competencies important for professional practice; it can also be conceived as a strategy for technical and educational activities that allows students to recreate knowledge, preparing them for the future updating that their professional practice will require. Key words: Accounting research, university teaching, accounting education.

  19. Electric gun: a new tool for ultrahigh-pressure research

    International Nuclear Information System (INIS)

    Weingart, R.C.; Chau, H.H.; Goosman, D.R.; Hofer, W.W.; Honodel, C.A.; Lee, R.S.; Steinberg, D.J.; Stroud, J.R.

    1979-01-01

    We have developed a new tool for ultrahigh-pressure research at LLL. This system, which we call the electric gun, has already achieved thin flyer plate velocities in excess of 20 km/s and pressures of the order of 2 TPa in tantalum. We believe that the electric gun is competitive with laser- and nuclear-driven methods of producing shocks in the 1-to-5 TPa range because of its precision and ease and economy of operation. Its development is recommended for shock initiation studies, dry runs for Site 300 hydroshots, and as a shock wave generator for surface studies

  20. An analytical method on the surface residual stress for the cutting tool orientation

    Science.gov (United States)

    Li, Yueen; Zhao, Jun; Wang, Wei

    2010-03-01

    In the experiments reported here, residual stress was measured for eight cutting-tool orientations when machining H13 die steel by high-speed milling (HSM). The measured data show that the residual stress varies periodically with the rake angle (β) and side rake angle (θ); further study finds that tool orientation is closely related to the residual stress. Since the machined surface residual stress originates from the cutting force and the axial force, a simple tool-workpiece force model can be derived, and from it a residual stress model with which the magnitude of the residual stress can be calculated. Because almost all of the measured residual stresses are compressive, the magnitude and direction of the compressive stress can be determined from the input data for H13 under HSM. As a result, the residual stress model is the key to optimizing the rake angle (β) and side rake angle (θ) in theory, and with it more of the cutting mechanism can be explained.

  1. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry for monitoring the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future. Copyright © 2011 Wiley-Liss, Inc.

  2. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  3. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    International Nuclear Information System (INIS)

    Canto, Fabrice; Couston, Laurent; Magnaldo, Alastair; Broquin, Jean-Emmanuel; Signoret, Philippe

    2008-01-01

    Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. Twenty years ago the CEA already worked on TLS, but technological limitations impeded its development at the time. Today, the need for sensitive analytical methods coupled with very low sample volumes (for example, traces of Np in the COEX™ process), together with the drive to reduce nuclear waste, encourages us to revisit this method in light of improved optoelectronic technologies. One can also imagine coupling TLS with micro-fluidic technologies, significantly decreasing the cost of experiments. Generally, two laser beams are used for TLS: one for selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be combined in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many ways of detecting the thermal lens signal have also been studied: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurement schemes have been theoretically evaluated. For single-photodiode detection (z-scan), the limit of detection is calculated to be near 5×10⁻⁶ mol·L⁻¹ for Np(IV) in dodecane. (authors)

  4. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szõkefalvi-Nagy, Z. [Wigner Research Centre for Physics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material to be intensively traded, even over long distances. Elemental concentration ratios can help a lot in determining the geological provenance of obsidian and in identifying prehistoric long-range trade routes and possible population migrations, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied to determine elemental concentration ratios in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and in a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west, Oradea (near the border with Hungary, Slovakia and Ukraine); centre, Cluj; and south-west, Banat (near the border with Serbia). A special case is the Iron Gates, with Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, which replaced the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered in the case of obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the former Yugoslavia or via Greece and Bulgaria, or from Central Europe (the Tokaj Mountains). As provenance 'fingerprints', we focused on Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beamline of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D

  5. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    International Nuclear Information System (INIS)

    Constantinescu, B.; Cristea-Stan, D.; Kovács, I.; Szõkefalvi-Nagy, Z.

    2013-01-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material to be intensively traded, even over long distances. Elemental concentration ratios can help a lot in determining the geological provenance of obsidian and in identifying prehistoric long-range trade routes and possible population migrations, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied to determine elemental concentration ratios in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and in a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west, Oradea (near the border with Hungary, Slovakia and Ukraine); centre, Cluj; and south-west, Banat (near the border with Serbia). A special case is the Iron Gates, with Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, which replaced the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered in the case of obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the former Yugoslavia or via Greece and Bulgaria, or from Central Europe (the Tokaj Mountains). As provenance 'fingerprints', we focused on Ti to Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beamline of the 5 MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D

  6. Evaluating research impact: the development of a ‘RESEARCH for IMPACT’ TOOL

    Directory of Open Access Journals (Sweden)

    Komla Tsey

    2016-08-01

    Full Text Available Introduction: This paper describes the development of a ‘Research for Impact’ Tool against a background of concerns about the over-researching of Aboriginal and Torres Strait Islander people’s issues without demonstrable benefits. Material and Methods: A combination of literature reviews, workshops with researchers, and reflections by project team members and partners using participatory snowball techniques. Results: Assessing research impact is difficult, akin to a so-called ‘wicked problem’, but not impossible. A heuristic and collaborative approach to research that takes in the expectations of research users, those being researched, and the funders of research offers a pragmatic solution to evaluating research impact. The proposed ‘Research for Impact’ Tool is based on the understanding that the value of research is to create evidence and/or products to support smarter decisions so as to improve the human condition. Research is of limited value unless the evidence produced is used to inform smarter decisions. A practical way of approaching research impact is therefore to start with the decisions confronting decision makers, whether they are government policymakers, professional practitioners or households, and the extent to which the research supports smarter decisions and the knock-on consequences of such smart decisions. Embedded at each step in the impact planning, monitoring and evaluation process is the need for Indigenous leadership and participation, capacity enhancement, collaborative partnerships, and participatory learning-by-doing approaches across partners. Discussion: The tool is designed in the context of Indigenous research, but the basic idea, that the way to assess research impact is to start upfront by defining the users of research and their information needs, the decisions confronting them, and the extent to which research informs smarter decisions, is equally applicable to research in other settings, both applied and

  7. An analytical tool to support the pedestrianisation process: The case of via Roma, Cagliari

    Directory of Open Access Journals (Sweden)

    Alfonso Annunziata

    2018-04-01

    Full Text Available The article focuses on a case of modification of an urban road network: the transformation of a portion of an important distributor road in the urban area of Cagliari into a pedestrian space. By means of this case study, the article aims to point out how pedestrianisation interventions have not been completely defined within a theoretical system that clearly establishes their modes and conditions of implementation. This lack of theorization has led to a common understanding of pedestrianisation as a good operation in and of itself and, as such, exportable and meant to produce the same effects everywhere (Bianchetti, 2016). This analysis uses the fundamental conditions of hierarchy as a tool to assess to what extent the modification of the road network articulation has resulted in conditions of lesser inter-connectivity, legibility and functionality. In this perspective, the article proposes a system of criteria, founded on the principles of hierarchy, meant to be a theoretical support for processes of pedestrianisation.

  8. Puzzle test: A tool for non-analytical clinical reasoning assessment.

    Science.gov (United States)

    Monajemi, Alireza; Yaghmaei, Minoo

    2016-01-01

    Most contemporary clinical reasoning tests typically assess non-automatic thinking. Therefore, a test is needed to measure automatic reasoning or pattern recognition, which has been largely neglected in clinical reasoning tests. The Puzzle Test (PT) is dedicated to assessing automatic clinical reasoning in routine situations. The test was first introduced in 2009 by Monajemi et al. in the Olympiad for Medical Sciences Students. PT is an item format that has gained acceptance in medical education, but no detailed guidelines exist for the test's format, construction and scoring. In this article, a format is described and the steps to prepare and administer valid and reliable PTs are presented. PT examines a specific clinical reasoning task: pattern recognition. PT does not replace other clinical reasoning assessment tools. However, it complements them in strategies for assessing comprehensive clinical reasoning.

  9. Thermodynamics and structure of liquid surfaces investigated directly with surface analytical tools

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Gunther [Flinders Univ., Adelaide, SA (Australia). Centre for NanoScale Science and Technology; Morgner, Harald [Leipzig Univ. (Germany). Wilhelm Ostwald Inst. for Physical and Theoretical Chemistry

    2017-06-15

    Measuring directly the composition, the distribution of constituents as a function of depth, and the orientation of molecules at liquid surfaces is essential for determining the physicochemical properties of liquid surfaces. While the experimental tools developed for analyzing solid surfaces can in principle be applied to liquid surfaces, it turned out that they had to be adjusted to the particular challenges imposed by liquid samples, e.g. the unavoidable vapor pressure and the mobility of the constituting atoms/molecules. In the present work it is shown how electron spectroscopy and ion scattering spectroscopy have been used for analyzing liquid surfaces. The emphasis of this review is on using the structural information gained to determine the physicochemical properties of liquid surfaces. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  10. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Science.gov (United States)

    Nepusz, Tamás; Petróczi, Andrea; Naughton, Declan P

    2012-01-01

    The globalization of food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis relating to food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to (i) capture complexity, (ii) analyze trends, and (iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with the number of reports and impact, simultaneously into consideration. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and transgressors, instantly revealed by our approach, could supplement the intelligence
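
    The filter-then-tally logic described above can be sketched as follows (countries, counts, and field names are invented for illustration and do not reproduce the platform's code or data):

```python
# Tally transgressor and detector roles per country for one contaminant type.
from collections import Counter

# Each notification names the detecting country, the transgressing country,
# and the contaminant category (all values here are hypothetical).
notifications = [
    {"detector": "DE", "transgressor": "TR", "contaminant": "mycotoxin"},
    {"detector": "NL", "transgressor": "TR", "contaminant": "mycotoxin"},
    {"detector": "DE", "transgressor": "CN", "contaminant": "heavy metals"},
    {"detector": "IT", "transgressor": "TR", "contaminant": "mycotoxin"},
]

def tallies(records, contaminant):
    """Return (transgressor counts, detector counts) for one contaminant."""
    subset = [r for r in records if r["contaminant"] == contaminant]
    return (Counter(r["transgressor"] for r in subset),
            Counter(r["detector"] for r in subset))

trans, det = tallies(notifications, "mycotoxin")
print(trans.most_common(1))  # → [('TR', 3)]
```

    The network analysis the authors describe goes further, weighting transgressor-detector relationships simultaneously, but per-country indices start from tallies of this kind.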

  11. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available The globalization of food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis relating to food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to (i) capture complexity, (ii) analyze trends, and (iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%) or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, along with the number of reports and impact, simultaneously into consideration. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detectors and transgressors, instantly revealed by our approach, could supplement the

  12. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
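
    The ill-conditioning the authors mention can be made concrete with a toy computation (grid sizes and parameters are arbitrary illustration values; this is not the authors' software):

```python
# The fermionic kernel K(tau, omega) = exp(-tau*omega) / (1 + exp(-beta*omega))
# maps a spectral function to imaginary-time data, G = K @ A. Its singular
# values decay exponentially, so naive inversion amplifies noise enormously,
# which is why regularization such as maximum entropy is required.
import numpy as np

beta = 10.0
tau = np.linspace(0.0, beta, 20, endpoint=False)
omega = np.linspace(-5.0, 5.0, 40)
K = np.exp(-tau[:, None] * omega[None, :]) / (1.0 + np.exp(-beta * omega[None, :]))

print(f"condition number of K: {np.linalg.cond(K):.2e}")
```

    Even this small 20×40 kernel has a condition number far above 10⁸, so the choice of the entropy weight α, which the paper derives from the behavior of χ² with respect to α, effectively decides how much of the data is treated as information versus noise.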

  13. Basic statistical tools in research and data analysis

    Directory of Open Access Journals (Sweden)

    Zulfiqar Ali

    2016-01-01

    Full Text Available Statistical methods involved in carrying out a study include planning, designing, collecting data, analysing, drawing meaningful interpretation and reporting of the research findings. The statistical analysis gives meaning to the meaningless numbers, thereby breathing life into a lifeless data. The results and inferences are precise only if proper statistical tests are used. This article will try to acquaint the reader with the basic research tools that are utilised while conducting various studies. The article covers a brief outline of the variables, an understanding of quantitative and qualitative variables and the measures of central tendency. An idea of the sample size estimation, power analysis and the statistical errors is given. Finally, there is a summary of parametric and non-parametric tests used for data analysis.
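
    The parametric versus non-parametric distinction the abstract ends on can be illustrated with a small example (sample values are invented and scipy is assumed available; this is not code from the article):

```python
# Compare two invented samples with a parametric test and its
# rank-based non-parametric counterpart.
from scipy import stats

group_a = [5.1, 4.9, 5.4, 5.0, 5.2, 4.8, 5.3]
group_b = [5.6, 5.8, 5.5, 6.0, 5.7, 5.9, 5.4]

# Parametric: independent-samples t-test (assumes roughly normal data).
t_stat, t_p = stats.ttest_ind(group_a, group_b)

# Non-parametric: Mann-Whitney U test (rank-based, no normality assumption).
u_stat, u_p = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")

print(f"t-test p = {t_p:.4f}, Mann-Whitney p = {u_p:.4f}")
```

    On heavily skewed or ordinal data only the rank-based test would be appropriate, which is exactly the kind of choice the article's summary of parametric and non-parametric tests is meant to guide.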

  14. Integrating research tools to support the management of social-ecological systems under climate change

    Science.gov (United States)

    Miller, Brian W.; Morisette, Jeffrey T.

    2014-01-01

    Developing resource management strategies in the face of climate change is complicated by the considerable uncertainty associated with projections of climate and its impacts and by the complex interactions between social and ecological variables. The broad, interconnected nature of this challenge has resulted in calls for analytical frameworks that integrate research tools and can support natural resource management decision making in the face of uncertainty and complex interactions. We respond to this call by first reviewing three methods that have proven useful for climate change research, but whose application and development have been largely isolated: species distribution modeling, scenario planning, and simulation modeling. Species distribution models provide data-driven estimates of the future distributions of species of interest, but they face several limitations and their output alone is not sufficient to guide complex decisions for how best to manage resources given social and economic considerations along with dynamic and uncertain future conditions. Researchers and managers are increasingly exploring potential futures of social-ecological systems through scenario planning, but this process often lacks quantitative response modeling and validation procedures. Simulation models are well placed to provide added rigor to scenario planning because of their ability to reproduce complex system dynamics, but the scenarios and management options explored in simulations are often not developed by stakeholders, and there is not a clear consensus on how to include climate model outputs. We see these strengths and weaknesses as complementarities and offer an analytical framework for integrating these three tools. We then describe the ways in which this framework can help shift climate change research from useful to usable.

  15. ARM Climate Research Facility: Outreach Tools and Strategies

    Science.gov (United States)

    Roeder, L.; Jundt, R.

    2009-12-01

    Sponsored by the Department of Energy, the ARM Climate Research Facility is a global scientific user facility for the study of climate change. To publicize progress and achievements and to reach new users, the ACRF uses a variety of Web 2.0 tools and strategies that build off of the program’s comprehensive and well established News Center (www.arm.gov/news). These strategies include: an RSS subscription service for specific news categories; an email “newsletter” distribution to the user community that compiles the latest News Center updates into a short summary with links; and a Facebook page that pulls information from the News Center and links to relevant information in other online venues, including those of our collaborators. The ACRF also interacts with users through field campaign blogs, like Discovery Channel’s EarthLive, to share research experiences from the field. Increasingly, field campaign Wikis are established to help ACRF researchers collaborate during the planning and implementation phases of their field studies and include easy to use logs and image libraries to help record the campaigns. This vital reference information is used in developing outreach material that is shared in highlights, news, and Facebook. Other Web 2.0 tools that ACRF uses include Google Maps to help users visualize facility locations and aircraft flight patterns. Easy-to-use comment boxes are also available on many of the data-related web pages on www.arm.gov to encourage feedback. To provide additional opportunities for increased interaction with the public and user community, future Web 2.0 plans under consideration for ACRF include: evaluating field campaigns for Twitter and microblogging opportunities, adding public discussion forums to research highlight web pages, moving existing photos into albums on FlickR or Facebook, and building online video archives through YouTube.

  16. Analytic Models of Domain-Averaged Fermi Holes: A New Tool for the

    Czech Academy of Sciences Publication Activity Database

    Ponec, Robert; Cooper, D.L.; Savin, A.

    2008-01-01

    Roč. 14, č. 11 (2008), s. 3338-3345 ISSN 0947-6539 R&D Projects: GA AV ČR(CZ) IAA4072403 Institutional research plan: CEZ:AV0Z40720504 Keywords : bond theory * fermi hole analysis * lewis model Subject RIV: CF - Physical ; Theoretical Chemistry Impact factor: 5.454, year: 2008

  17. Big Data & Learning Analytics: A Potential Way to Optimize eLearning Technological Tools

    Science.gov (United States)

    García, Olga Arranz; Secades, Vidal Alonso

    2013-01-01

    In the information age, one of the most influential institutions is education. The recent emergence of MOOCS [Massively Open Online Courses] is a sample of the new expectations that are offered to university students. Basing decisions on data and evidence seems obvious, and indeed, research indicates that data-driven decision-making improves…

  18. Analytical tools and functions of GIS in the process control and decision support of mining company

    Directory of Open Access Journals (Sweden)

    Semrád Peter

    2001-12-01

    calculate their total quantity of reserves, and distance and network analysis (modelling protection pillars as buffer zones for operating objects above ground, calculating times for the transport of mineral resources, and using optimal routes combined with cost calculation) from the analytical apparatus and functions of GIS used in the process control and decision support of a mining company. Modern mining ranks among the specific group of fields with high information intensity. Because of the high financial demands of mining processes and technologies, the basic strategy of all mining companies is to use information technologies to reduce expenses. Given its options and functions, the implementation of GIS in this area is ideal.

  19. Research prioritization using the Analytic Hierarchy Process: basic methods. Volume 1

    International Nuclear Information System (INIS)

    Vesely, W.E.; Shafaghi, A.; Gary, I. Jr.; Rasmuson, D.M.

    1983-08-01

    This report describes a systematic approach for prioritizing research needs and research programs. The approach is formally called the Analytic Hierarchy Process, which was developed by T.L. Saaty and is described in several of his texts referenced in the report. The Analytic Hierarchy Process, or AHP for short, has been applied to a wide variety of prioritization problems and has a good record of success, as documented in Saaty's texts. The report develops specific guidelines for constructing the hierarchy and for prioritizing the research programs. Specific examples are given to illustrate the steps in the AHP. As part of the work, a computer code has been developed and the use of the code is described. The code allows the prioritizations to be done in a codified and efficient manner; sensitivity and parametric studies can also be straightforwardly performed to gain a better understanding of the prioritization results. Finally, as an important part of the work, an approach is developed which utilizes probabilistic risk analyses (PRAs) to systematically identify and prioritize research needs and research programs. When utilized in an AHP framework, the PRAs which have been performed to date provide a powerful information source for focusing research on those areas most impacting risk and risk uncertainty
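
    The core AHP prioritization step can be sketched as follows (the 3×3 pairwise-comparison matrix is a made-up example, not one from the report, and numpy is assumed): priorities are the normalized principal eigenvector of the reciprocal judgment matrix, and Saaty's consistency index flags incoherent judgments.

```python
import numpy as np

# A[i, j] = judged importance of research program i relative to program j;
# the matrix is reciprocal (A[j, i] = 1 / A[i, j]). Values are illustrative.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)           # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                          # normalized priority weights

n = A.shape[0]
ci = (eigvals.real[k] - n) / (n - 1)  # Saaty's consistency index
print("priorities:", np.round(w, 3), "consistency index:", round(ci, 4))
```

    A consistency index near zero indicates nearly coherent judgments; Saaty suggests re-examining the comparisons when the consistency ratio (CI divided by a random-matrix baseline) exceeds about 0.1.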

  20. Experimental anti-GBM nephritis as an analytical tool for studying spontaneous lupus nephritis.

    Science.gov (United States)

    Du, Yong; Fu, Yuyang; Mohan, Chandra

    2008-01-01

    Systemic lupus erythematosus (SLE) is an autoimmune disease that results in immune-mediated damage to multiple organs. Among these, kidney involvement is the most common and fatal. Spontaneous lupus nephritis (SLN) in mouse models has provided valuable insights into the underlying mechanisms of human lupus nephritis. However, SLN in mouse models takes 6-12 months to manifest; hence there is clearly the need for a mouse model that can be used to unveil the pathogenic processes that lead to immune nephritis over a shorter time frame. In this article more than 25 different molecules are reviewed that have been studied both in the anti-glomerular basement membrane (anti-GBM) model and in SLN and it was found that these molecules influence both diseases in a parallel fashion, suggesting that the two disease settings share common molecular mechanisms. Based on these observations, the authors believe the experimental anti-GBM disease model might be one of the best tools currently available for uncovering the downstream molecular mechanisms leading to SLN.

  1. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    Science.gov (United States)

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as the current state-of-the-art monitoring techniques. The whole freeze drying process including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying was considered. We found that direct measurement of the transferred heat enables more insights into the thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying in the way that monitoring the residual moisture content does. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.
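The vial heat transfer coefficient mentioned above relates the measured heat flux to the shelf-to-product temperature difference; once calibrated, the relation can be inverted to infer product temperature from heat flux alone. A hedged sketch with illustrative numbers (the function names and all values are assumptions, not data from the study):

```python
# Kv = q / (T_shelf - T_product): calibrate once against a thermocouple,
# then invert to monitor product temperature non-invasively.

def vial_heat_transfer_coefficient(q_w_m2, t_shelf_c, t_product_c):
    """Kv in W/(m^2*K) from measured flux and a one-off thermocouple reading."""
    return q_w_m2 / (t_shelf_c - t_product_c)

def product_temperature(q_w_m2, t_shelf_c, kv):
    """Infer T_product from the heat flux once Kv is known."""
    return t_shelf_c - q_w_m2 / kv

kv = vial_heat_transfer_coefficient(q_w_m2=150.0, t_shelf_c=-10.0, t_product_c=-35.0)
print(product_temperature(q_w_m2=120.0, t_shelf_c=-10.0, kv=kv))  # -> -30.0
```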

  2. DR-Integrator: a new analytic tool for integrating DNA copy number and gene expression data.

    Science.gov (United States)

    Salari, Keyan; Tibshirani, Robert; Pollack, Jonathan R

    2010-02-01

    DNA copy number alterations (CNA) frequently underlie gene expression changes by increasing or decreasing gene dosage. However, only a subset of genes with altered dosage exhibit concordant changes in gene expression. This subset is likely to be enriched for oncogenes and tumor suppressor genes, and can be identified by integrating these two layers of genome-scale data. We introduce DNA/RNA-Integrator (DR-Integrator), a statistical software tool to perform integrative analyses on paired DNA copy number and gene expression data. DR-Integrator identifies genes with significant correlations between DNA copy number and gene expression, and implements a supervised analysis that captures genes with significant alterations in both DNA copy number and gene expression between two sample classes. DR-Integrator is freely available for non-commercial use from the Pollack Lab at http://pollacklab.stanford.edu/ and can be downloaded as a plug-in application to Microsoft Excel and as a package for the R statistical computing environment. The R package is available under the name 'DRI' at http://cran.r-project.org/. An example analysis using DR-Integrator is included as supplemental material. Supplementary data are available at Bioinformatics online.
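The correlation analysis at the heart of such a tool can be illustrated as a per-gene Pearson correlation between copy number and expression across paired samples. A minimal sketch on synthetic data (this is the general idea only, not the DRI implementation):

```python
import numpy as np

# Synthetic paired data: 100 genes x 20 samples, with expression partly
# driven by copy number (gene dosage) plus noise.
rng = np.random.default_rng(0)
n_genes, n_samples = 100, 20
cn = rng.normal(size=(n_genes, n_samples))              # copy number (log ratio)
expr = 0.8 * cn + rng.normal(scale=0.5, size=cn.shape)  # dosage-driven expression

def rowwise_pearson(x, y):
    """Pearson r between matching rows of x and y."""
    xc = x - x.mean(axis=1, keepdims=True)
    yc = y - y.mean(axis=1, keepdims=True)
    num = (xc * yc).sum(axis=1)
    den = np.sqrt((xc**2).sum(axis=1) * (yc**2).sum(axis=1))
    return num / den

r = rowwise_pearson(cn, expr)
print(r.mean())  # strongly positive when expression tracks gene dosage
```

Genes with high r (and significance after multiple-testing correction) would be the candidates enriched for dosage-driven oncogenes and tumor suppressors.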

  3. Demonstration of FBRM as process analytical technology tool for dewatering processes via CST correlation.

    Science.gov (United States)

    Cobbledick, Jeffrey; Nguyen, Alexander; Latulippe, David R

    2014-07-01

    The current challenges associated with the design and operation of net-energy positive wastewater treatment plants demand sophisticated approaches for the monitoring of polymer-induced flocculation. In anaerobic digestion (AD) processes, the dewaterability of the sludge is typically assessed from off-line lab-bench tests - the capillary suction time (CST) test is one of the most common. Focused beam reflectance measurement (FBRM) is a promising technique for real-time monitoring of critical performance attributes in large scale processes and is ideally suited for dewatering applications. The flocculation performance of twenty-four cationic polymers, that spanned a range of polymer size and charge properties, was measured using both the FBRM and CST tests. Analysis of the data revealed a decreasing monotonic trend; the samples that had the highest percent removal of particles less than 50 microns in size as determined by FBRM had the lowest CST values. A subset of the best performing polymers was used to evaluate the effects of dosage amount and digestate sources on dewatering performance. The results from this work show that FBRM is a powerful tool that can be used for optimization and on-line monitoring of dewatering processes. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Analytical tool for the periodic safety analysis of NPP according to the PSA guideline. Vol. 1

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Boehme, E.; Musekamp, W.; Hussels, U.; Becker, G.; Behr, H.; Luettgert, H.

    1994-01-01

    The SAIS (Safety Analysis and Informationssystem) Programme System is based on an integrated data base, which consists of a plant-data and a PSA related data part. Using SAIS analyses can be performed by special tools, which are connected directly to the data base. Two main editors, RISA+ and DEDIT, are used for data base management. The access to the data base is done via different types of pages, which are displayed on a displayed on a computer screen. The pages are called data sheets. Sets of input and output data sheets were implemented, such as system or component data sheets, fault trees or event trees. All input information, models and results needed for updated results of PSA (Living PSA) can be stored in the SAIS. The programme system contains the editor KVIEW which guarantees consistency of the stored data, e.g. with respect to names and codes of components and events. The information contained in the data base are called in by a standardized users guide programme, called Page Editor. (Brunsbuettel on reference NPP). (orig./HP) [de

  5. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  6. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of a biochemical marker butyrylcholinesterase (BChE). The work should demonstrate suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and coloration was photographed using the phone's integrated camera. A RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman's assay. The smartphone assay was proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results' relevance.
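The RGB-channel analysis described above can be sketched as follows; a small synthetic image array stands in for the smartphone photograph, and all values are illustrative:

```python
import numpy as np

# Synthetic "photo" of a test strip: indigo blue product dominates the
# blue channel, with some red; a real assay would load the camera image.
h, w = 40, 40
img = np.zeros((h, w, 3), dtype=np.uint8)
img[..., 0] = 60     # red channel
img[..., 2] = 180    # blue channel

# Average each RGB channel over a region of interest on the strip.
roi = img[10:30, 10:30]
mean_rgb = roi.reshape(-1, 3).mean(axis=0)
print(mean_rgb)      # mean R, G, B of the ROI

# A calibration curve would then map, e.g., the blue-channel value to
# BChE activity measured by the reference (Ellman's) assay.
```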

  7. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay

    Directory of Open Access Journals (Sweden)

    Miroslav Pohanka

    2015-06-01

    Full Text Available Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of a biochemical marker butyrylcholinesterase (BChE). The work should demonstrate suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman’s assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and coloration was photographed using the phone’s integrated camera. A RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated using Ellman’s assay. The smartphone assay was proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests; some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the results’ relevance.

  8. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. Analytical performance of the electrochemical immunoassay has been evaluated by analysis of maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained allowing the assessment of the detection of zearalenone mycotoxin. In addition, an excellent accuracy with a high recovery yield ranging between 95 and 108% has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  9. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool at the analytical laboratory since they allow sensitive measurements of pollutants to be carried out by means of easily available instrumentation. There is a large number of such procedures involving miniaturized liquid-liquid or liquid-solid extractions with the common denominator of using very low amounts (only a few microliters) or even none of organic solvents. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful both for inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need of using ICP-MS since this instrument can be replaced by a simple AAS spectrometer which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of such a methodology, and gives examples for the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support.

  10. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    Science.gov (United States)

    2011-03-28

    literatures including nursing and applied health care, management sciences, legal studies, and political science. In these fields, theory generally...Erlbaum Associates 87 Grant, J. & Davis, L. (1997). ―Selection and Use of Content Experts for Instrument Development,‖ Research in Nursing and...June 6, 2009. Due to boycotts by opposition groups, the elections were rescheduled for July 18, 2009. Following the resignation of General Aziz in

  11. Review: Andrew Bennett & Jeffrey T. Checkel (Eds.) (2015). Process Tracing: From Metaphor to Analytic Tool

    Directory of Open Access Journals (Sweden)

    Felix Anderl

    2015-09-01

    Full Text Available In this review, I argue that this textbook edited by BENNETT and CHECKEL is exceptionally valuable in at least four aspects. First, with regards to form, the editors provide a paragon of how an edited volume should look: well-connected articles "speak to" and build on each other. The contributors refer to and grapple with the theoretical framework of the editors who, in turn, give heed to the conclusions of the contributors. Second, the book is packed with examples from research practice. These are not only named but thoroughly discussed and evaluated for their methodological potential in all chapters. Third, the book aims at improving and popularizing process tracing, but does not shy away from systematically considering the potential weaknesses of the approach. Fourth, the book combines and bridges various approaches to (mostly qualitative) methods and still manages to provide abstract and easily accessible standards for making "good" process tracing. As such, it is a must-read for scholars working with qualitative methods. However, BENNETT and CHECKEL struggle with fulfilling their promise of bridging positivist and interpretive approaches, for while they do indeed take the latter into account, their general research framework remains largely unchanged by these considerations. On these grounds, I argue that, especially for scholars in the positivist camp, the book can function as a "how-to" guide for designing and implementing research. Although this may not apply equally to interpretive researchers, the book is still a treasure chest for them, providing countless conceptual clarifications and potential pitfalls of process tracing practice. URN: http://nbn-resolving.de/urn:nbn:de:0114-fqs1503187

  12. LitPathExplorer: a confidence-based visual text analytics tool for exploring literature-enriched pathway models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2018-04-15

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e. events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: (i) extracting events from the literature that corroborate existing models with evidence; (ii) discovering new events which can update models; and (iii) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61 and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  13. Software Tools for Battery Design | Transportation Research | NREL

    Science.gov (United States)

    Under the Computer-Aided Engineering for Electric Drive Vehicle Batteries (CAEBAT) project, NREL has developed software tools to support battery design. Knowledge of the interplay of multi-physics at varied scales is imperative when using the CAEBAT software tools.

  14. An Analytical Framework for Delirium Research in Palliative Care Settings: Integrated Epidemiologic, Clinician-Researcher, and Knowledge User Perspectives

    Science.gov (United States)

    Ansari, Mohammed; Hosie, Annmarie; Kanji, Salmaan; Momoli, Franco; Bush, Shirley H.; Watanabe, Sharon; Currow, David C.; Gagnon, Bruno; Agar, Meera; Bruera, Eduardo; Meagher, David J.; de Rooij, Sophia E.J.A.; Adamis, Dimitrios; Caraceni, Augusto; Marchington, Katie; Stewart, David J.

    2014-01-01

    Context Delirium often presents difficult management challenges in the context of goals of care in palliative care settings. Objectives The aim was to formulate an analytical framework for further research on delirium in palliative care settings, prioritize the associated research questions, discuss the inherent methodological challenges associated with relevant studies, and outline the next steps in a program of delirium research. Methods We combined multidisciplinary input from delirium researchers and knowledge users at an international delirium study planning meeting, relevant literature searches, focused input of epidemiologic expertise, and a meeting participant and coauthor survey to formulate a conceptual research framework and prioritize research questions. Results Our proposed framework incorporates three main groups of research questions: the first was predominantly epidemiologic, such as delirium occurrence rates, risk factor evaluation, screening, and diagnosis; the second covers pragmatic management questions; and the third relates to the development of predictive models for delirium outcomes. Based on aggregated survey responses to each research question or domain, the combined modal ratings of “very” or “extremely” important confirmed their priority. Conclusion Using an analytical framework to represent the full clinical care pathway of delirium in palliative care settings, we identified multiple knowledge gaps in relation to the occurrence rates, assessment, management, and outcome prediction of delirium in this population. The knowledge synthesis generated from adequately powered, multicenter studies to answer the framework’s research questions will inform decision making and policy development regarding delirium detection and management and thus help to achieve better outcomes for patients in palliative care settings. PMID:24726762

  15. Prompt nuclear analytical techniques for material research in accelerator driven transmutation technologies: Prospects and quantitative analyses

    International Nuclear Information System (INIS)

    Vacik, J.; Hnatowicz, V.; Cervena, J.; Perina, V.; Mach, R.

    1998-01-01

    Accelerator driven transmutation technology (ADTT) is a promising way toward liquidation of spent nuclear fuel, nuclear wastes and weapon-grade Pu. The ADTT facility comprises a high current (proton) accelerator supplying a sub-critical reactor assembly with spallation neutrons. The reactor part is supposed to be cooled by molten fluorides or metals which serve, at the same time, as a carrier of nuclear fuel. The assumed high working temperature (400-600 °C) and high radiation load in the subcritical reactor and spallation neutron source put forward the problem of optimal choice of ADTT construction materials, especially from the point of their radiation and corrosion resistance when in contact with liquid working media. The use of prompt nuclear analytical techniques in ADTT related material research is considered, and examples of preliminary analytical results obtained using the neutron depth profiling method are shown for illustration. (orig.)

  16. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique involving simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed using non-previously used stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies components of interest, mainly steroids, were isolated from biological samples like fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by the freeze-drying method. Low-molecular mass compounds with polarity ranging from estetrol to progesterone derived from the environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania in the northern part of Poland can be easily observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that the micro-TLC based analytical approach can be applied as an effective method for the internal standard (IS) substance search. Generally, the described methodology can be applied for fast fractionation or screening of the steroid fraction in biological and environmental samples.

  17. Astonishing advances in mouse genetic tools for biomedical research.

    Science.gov (United States)

    Kaczmarczyk, Lech; Jackson, Walker S

    2015-01-01

    The humble house mouse has long been a workhorse model system in biomedical research. The technology for introducing site-specific genome modifications led to Nobel Prizes for its pioneers and opened a new era of mouse genetics. However, this technology was very time-consuming and technically demanding. As a result, many investigators continued to employ easier genome manipulation methods, though resulting models can suffer from overlooked or underestimated consequences. Another breakthrough, invaluable for the molecular dissection of disease mechanisms, was the invention of high-throughput methods to measure the expression of a plethora of genes in parallel. However, the use of samples containing material from multiple cell types could obfuscate data, and thus interpretations. In this review we highlight some important issues in experimental approaches using mouse models for biomedical research. We then discuss recent technological advances in mouse genetics that are revolutionising human disease research. Mouse genomes are now easily manipulated at precise locations thanks to guided endonucleases, such as transcription activator-like effector nucleases (TALENs) or the CRISPR/Cas9 system, both also having the potential to turn the dream of human gene therapy into reality. Newly developed methods of cell type-specific isolation of transcriptomes from crude tissue homogenates, followed by detection with next generation sequencing (NGS), are vastly improving gene regulation studies. Taken together, these amazing tools simplify the creation of much more accurate mouse models of human disease, and enable the extraction of hitherto unobtainable data.

  18. Development of dosimetry tools for proton therapy research

    International Nuclear Information System (INIS)

    Kim, Jong-Won; Kim, Dogyun

    2010-01-01

    Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large area parallel-plate ionization chamber and a multi-layer Faraday cup to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam. However, correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.

  19. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    Science.gov (United States)

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle-by-particle. Optical methods continue to provide a powerful support for research into metamaterials.

  20. Real-time imaging as an emerging process analytical technology tool for monitoring of fluid bed coating process.

    Science.gov (United States)

    Naidu, Venkata Ramana; Deshpande, Rucha S; Syed, Moinuddin R; Wakte, Pravin S

    2018-07-01

    A direct imaging system (Eyecon™) was used as a Process Analytical Technology (PAT) tool to monitor a fluid bed coating process. Eyecon™ generated real-time onscreen images, particle size and shape information of two identically manufactured laboratory-scale batches. Eyecon™ can measure particle size increases with an accuracy of ±1 μm on particles in the size range of 50-3000 μm, and captured data every 2 s during the entire process. The moving average of D90 particle size values recorded by Eyecon™ was calculated every 30 min to calculate the radial coating thickness of coated particles. After the completion of the coating process, the radial coating thickness was found to be 11.3 and 9.11 μm, with a standard deviation of ±0.68 and ±1.8 μm for Batch 1 and Batch 2, respectively. The coating thickness was also correlated with percent weight build-up by gel permeation chromatography (GPC) and dissolution. GPC indicated weight build-up of 10.6% and 9.27% for Batch 1 and Batch 2, respectively. In conclusion, a weight build-up of 10% can also be correlated with a 10 ± 2 μm increase in the coating thickness of pellets, indicating the potential applicability of real-time imaging as an endpoint determination tool for fluid bed coating processes.
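The coating-thickness estimate described above follows from the growth of the smoothed D90: half the increase in particle diameter over the run is the radial coating thickness. A hedged sketch with illustrative numbers (not the batch data):

```python
# Smooth the D90 readings with a moving average, then take half the
# diameter growth as the radial coating thickness.

def moving_average(xs, window):
    return [sum(xs[i:i + window]) / window for i in range(len(xs) - window + 1)]

d90 = [1000.0, 1004.0, 1008.0, 1012.0, 1016.0, 1020.0]  # um, sampled over the run
smoothed = moving_average(d90, window=3)
radial_thickness = (smoothed[-1] - smoothed[0]) / 2.0    # diameter growth / 2
print(radial_thickness)  # -> 6.0
```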

  1. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

    The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method that used direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic process of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were discovered rapidly; the adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development. Copyright © 2014 Elsevier B.V. All rights reserved.
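    The at-line calibration check reported above (a correlation coefficient of 0.9520 between DART-MS and HPLC concentrations) is a plain Pearson correlation between the two assays. A minimal sketch, with invented concentration values standing in for the paper's data:

    ```python
    import numpy as np

    # Hypothetical paired concentration measurements (mg/mL, invented):
    # the fast PAT method vs. the reference HPLC assay.
    dart_ms = np.array([0.12, 0.25, 0.41, 0.55, 0.71, 0.90])
    hplc    = np.array([0.10, 0.27, 0.39, 0.58, 0.69, 0.93])

    # Pearson correlation coefficient between the two assays;
    # a value near 1 supports substituting the fast method at-line.
    r = np.corrcoef(dart_ms, hplc)[0, 1]
    print(round(r, 4))
    ```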

  2. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-07-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  3. Applied research on air pollution using nuclear-related analytical techniques. Report on the second research co-ordination meeting

    International Nuclear Information System (INIS)

    1995-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which started in 1992, and is scheduled to run until early 1997. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in air particulate matter. The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for research and monitoring studies on air pollution, ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural area). This document reports the discussions held during the second Research Co-ordination Meeting (RCM) for the CRP which took place at ANSTO in Menai, Australia. (author)

  4. Analytic tools for investigating the structure of network reliability measures with regard to observation correlations

    Science.gov (United States)

    Prószyński, W.; Kwaśniak, M.

    2018-03-01

    A global measure of observation correlations in a network is proposed, together with auxiliary indices related to the non-diagonal elements of the correlation matrix. Based on this global measure, a specific representation of the correlation matrix is presented, the result of a rigorously proven theorem formulated within the present research. According to the theorem, each positive definite correlation matrix can be expressed by a scale factor and a so-called internal weight matrix. This representation made it possible to investigate the structure of the basic reliability measures with regard to observation correlations. Numerical examples carried out for two test networks illustrate the structure of those measures, which proved to depend on the global correlation index. Levels of global correlation are also proposed. It is shown that one can readily find an approximate value of the global correlation index, and hence the correlation level, when the expected values of the auxiliary indices are the only available knowledge about the correlation matrix of interest. The paper is an extended continuation of the authors' previous study, which was confined to the elementary case termed uniform correlation. The extension covers arbitrary correlation matrices and the structure of the correlation effect.

  5. Environmental research organizations and climate change policy analytical capacity : an assessment of the Canadian case

    International Nuclear Information System (INIS)

    Howlett, M.; Oliphant, S.

    2010-01-01

    Climate change is a topic of increasing interest to contemporary decision makers. In order for governments to make informed decisions on climate change mitigation and adaptation efforts, environmental policy makers require strong research and analytical capabilities to design and implement effective policies for wide-ranging and complex policy issues. This article presented a 7-criteria model of policy analytical capacity (PAC) and applied it to 3 prominent Canadian environmental policy research organizations. The 2 governmental organizations examined in this study were Environment Canada and the British Columbia Ministry of the Environment, while the non-governmental organization was the David Suzuki Foundation. Following the 7 principles that determine the PAC of an organization, each case study examined the education/training of the organization's employees; the types and mix of policy analysis techniques used by the organization; the culture and structure of decision making in the organization; the nature and source of demand for the organization's research; and the organization's access to the data and information necessary to conduct work at a high level of competence. Interview data provided information on the status of each organization's current research capacity and the effect this has on overall government policy-making capability in the face of climate change challenges. 75 refs.

  6. Applied research on air pollution using nuclear-related analytical techniques

    International Nuclear Information System (INIS)

    1994-01-01

    A co-ordinated research programme (CRP) on applied research on air pollution using nuclear-related techniques is a global CRP which will run from 1992-1996, and will build upon the experience gained by the Agency from the laboratory support that it has been providing for several years to BAPMoN - the Background Air Pollution Monitoring Network programme organized under the auspices of the World Meteorological Organization. The purpose of this CRP is to promote the use of nuclear analytical techniques in air pollution studies, e.g. NAA, XRF, and PIXE for the analysis of toxic and other trace elements in suspended particulate matter (including air filter samples), rainwater and fog-water samples, and in biological indicators of air pollution (e.g. lichens and mosses). The main purposes of the core programme are i) to support the use of nuclear and nuclear-related analytical techniques for practically-oriented research and monitoring studies on air pollution ii) to identify major sources of air pollution affecting each of the participating countries with particular reference to toxic heavy metals, and iii) to obtain comparative data on pollution levels in areas of high pollution (e.g. a city centre or a populated area downwind of a large pollution source) and low pollution (e.g. rural areas). This document reports the discussions held during the first Research Co-ordination Meeting (RCM) for the CRP which took place at the IAEA Headquarters in Vienna. Refs, figs and tabs

  7. Born analytical or adopted over time? a study investigating if new analytical tools can ensure the survival of market oriented startups.

    OpenAIRE

    Skogen, Hege Janson; De la Cruz, Kai

    2017-01-01

    Masteroppgave (MSc) in Master of Science in Strategic Marketing Management - Handelshøyskolen BI, 2017 This study investigates whether the prevalence of technological advances within quantitative analytics moderates the effect market orientation has on firm performance, and if startups can take advantage of the potential opportunities to ensure their own survival. For this purpose, the authors review previous literature in marketing orientation, startups, marketing analytics, an...

  8. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Skougaard Kaspersen, P.; Halsnaes, K.; Gregg, J.; Drews, M.

    2012-12-15

    In this report we provide recommendations on how more consistent studies and data can be produced from available modelling tools and data for integrated assessment of climate change risks and adaptation options. It is concluded that integrated assessments in this area require a wide range of data and models in order to cover the full chain of elements, including climate modelling, impacts, risks, costs, social issues, and decision making. As an outcome of this activity, a comprehensive data and modelling tool named the Danish Integrated Assessment System (DIAS) has been developed, which may be used by researchers within the field. DIAS has been implemented and tested in a case study on urban flooding caused by extreme precipitation in Aarhus, and this study highlights the usefulness of integrating data, models, and methods from several disciplines into a common framework. DIAS is an attempt to describe such a framework for integrated analysis of climate impacts and adaptation. The final product of the DTU KFT project "Tool for Vulnerability Analysis" is not a user-friendly climate adaptation tool that decision makers and consultants can apply directly to various types of analysis. Rather, the developed methodology and the collected/available data can serve as a starting point for case-specific analyses. For this reason alone, this work should very much be viewed as an attempt to coordinate research, data, and model outputs between research institutes from various disciplines. It is unquestionable that there is a future need to integrate information for areas not yet included, and it is very likely that such efforts will depend on research projects conducted in different climate change adaptation areas and sectors in Denmark. (Author)

  9. SmartR: an open-source platform for interactive visual analytics for translational research data.

    Science.gov (United States)

    Herzinger, Sascha; Gu, Wei; Satagopam, Venkata; Eifes, Serge; Rege, Kavita; Barbosa-Silva, Adriano; Schneider, Reinhard

    2017-07-15

    In translational research, efficient knowledge exchange between the different fields of expertise is crucial. An open platform that is capable of storing a multitude of data types, such as clinical, pre-clinical or OMICS data, combined with strong visual analytical capabilities, will significantly accelerate scientific progress by making data more accessible and hypothesis generation easier. The open data warehouse tranSMART is capable of storing a variety of data types and has a growing user community including both academic institutions and pharmaceutical companies. tranSMART, however, currently lacks interactive and dynamic visual analytics and does not permit any post-processing interaction or exploration. For this reason, we developed SmartR, a plugin for tranSMART that not only equips the platform with several dynamic visual analytical workflows, but also provides its own framework for the addition of new custom workflows. Modern web technologies such as D3.js and AngularJS were used to build a set of standard visualizations that were heavily improved with dynamic elements. The source code is licensed under the Apache 2.0 License and is freely available on GitHub: https://github.com/transmart/SmartR. reinhard.schneider@uni.lu. Supplementary data are available at Bioinformatics online. © The Author(s) 2017. Published by Oxford University Press.

  10. The GATO gene annotation tool for research laboratories

    Directory of Open Access Journals (Sweden)

    A. Fujita

    2005-11-01

    Large-scale genome projects have generated a rapidly increasing number of DNA sequences. Therefore, development of computational methods to rapidly analyze these sequences is essential for progress in genomic research. Here we present an automatic annotation system for preliminary analysis of DNA sequences. The gene annotation tool (GATO) is a bioinformatics pipeline designed to facilitate routine functional annotation and easy access to annotated genes. It was designed in view of the frequent need of genomic researchers to access data pertaining to a common set of genes. In the GATO system, annotation is generated by querying Web-accessible resources, and the information is stored in a local database, which keeps a record of all previous annotation results. GATO may be accessed from anywhere through the internet, or may be run locally if a large number of sequences are to be annotated. It is implemented in PHP and Perl and may be run on any suitable Web server. Installation and application of annotation systems usually require experience and are time consuming, but GATO is simple and practical, allowing anyone with basic skills in informatics to use it without special training. GATO can be downloaded at [http://mariwork.iq.usp.br/gato/]. Minimum free disk space required is 2 MB.

  11. Applied Analytical Combustion/emissions Research at the NASA Lewis Research Center - a Progress Report

    Science.gov (United States)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  12. Applied analytical combustion/emissions research at the NASA Lewis Research Center

    Science.gov (United States)

    Deur, J. M.; Kundu, K. P.; Nguyen, H. L.

    1992-01-01

    Emissions of pollutants from future commercial transports are a significant concern. As a result, the Lewis Research Center (LeRC) is investigating various low emission combustor technologies. As part of this effort, a combustor analysis code development program was pursued to guide the combustor design process, to identify concepts having the greatest promise, and to optimize them at the lowest cost in the minimum time.

  13. Applications of nuclear analytical techniques in human nutrition research as exemplified by research programmes of the International Atomic Energy Agency

    International Nuclear Information System (INIS)

    Parr, R.M.

    1987-01-01

    In human nutrition research, nuclear analytical techniques, particularly neutron activation analysis (NAA), are used mainly for the in vitro study of trace elements. According to work sponsored by the IAEA, up to 15 trace elements and 5 minor elements of nutritional interest may be determined in biological materials by NAA with good accuracy and precision. A programme is described in which NAA was used for the determination of 14 trace elements and one minor element in human milk. NAA also plays an important role in the certification of reference materials for nutritional studies. (author) 17 refs.; 6 tables

  14. Employment of Questionnaire as Tool for Effective Business Research Outcome: Problems and Challenges

    Directory of Open Access Journals (Sweden)

    ADENIYI AKINGBADE WAIDI

    2016-06-01

    Questionnaires consist of questions designed to gather information or data for analysis. A questionnaire has to be adequate, simple, focused, and related to the subject the research sets out to address, and must test the hypotheses and questions formulated for the study. However, many questionnaires are constructed and administered without following proper guidelines, which undermines their end results. This paper assesses some of the guidelines for constructing questionnaires, as well as their uses and the extent to which they enhance managers' access to reliable data and information. A descriptive method is employed for the study. Findings revealed that poorly prepared questionnaires do not produce effective results; managers and researchers who use such questionnaires hardly achieve their organisational and research objectives. The need for good, well-prepared, and adequate questionnaires is exemplified by their being a primary tool for analytical research. The study recommends that questionnaires be properly prepared for effective research outcomes.

  15. Role of nuclear analytical probe techniques in biological trace element research

    International Nuclear Information System (INIS)

    Jones, K.W.; Pounds, J.G.

    1985-01-01

    Many biomedical experiments require the qualitative and quantitative localization of trace elements with high sensitivity and good spatial resolution. The feasibility of measuring the chemical form of the elements, of following the time course of trace element metabolism, and of conducting experiments in living biological systems are also important requirements for biological trace element research. Nuclear analytical techniques that employ ion or photon beams have grown in importance in the past decade and have led to several new experimental approaches. Some of the important features of these methods are reviewed here along with their role in trace element research, and examples of their use are given to illustrate potential new research directions. It is emphasized that the effective application of these methods necessitates a closely integrated multidisciplinary scientific team. 21 refs., 4 figs., 1 tab

  16. Application of the partitive analytical forecasting (PAF) technique to the United States controlled thermonuclear research effort

    International Nuclear Information System (INIS)

    Nichols, S.P.

    1975-01-01

    The Partitive Analytical Forecasting (PAF) technique is applied to the overall long-term program plans for the Division of Controlled Thermonuclear Research (DCTR) of the United States Energy Research and Development Administration (ERDA). As part of the PAF technique, the Graphical Evaluation and Review Technique (GERTS) IIIZ computer code is used to perform simulations on a logic network describing the DCTR long-term program plan. Logic networks describing the tokamak, mirror, and theta-pinch developments are simulated individually and then together to form an overall DCTR program network. The results of the simulation of the overall network using various funding schemes and strategies are presented. An economic sensitivity analysis is provided for the tokamak logic networks. An analysis is also performed of the fusion-fission hybrid concept in the context of the present DCTR goals. The results mentioned above as well as the PAF technique itself are evaluated, and recommendations for further research are discussed

  17. Analytical Methods for Quantification of Vitamin D and Implications for Research and Clinical Practice.

    Science.gov (United States)

    Stokes, Caroline S; Lammert, Frank; Volmer, Dietrich A

    2018-02-01

    A plethora of contradictory research surrounds vitamin D and its influence on health and disease. This may, in part, result from analytical difficulties in measuring vitamin D metabolites in serum. Indeed, variation exists between the analytical techniques and assays used for the determination of serum 25-hydroxyvitamin D. Research studies into the effects of vitamin D on clinical endpoints rely heavily on the accurate assessment of vitamin D status. This has important implications, as findings from vitamin D-related studies to date may have been hampered by the quantification techniques used. Likewise, healthcare professionals are increasingly incorporating vitamin D testing and supplementation regimens into their practice, and measurement errors may also be confounding clinical decisions. Importantly, the Vitamin D Standardisation Programme is an initiative that aims to standardise the measurement of vitamin D metabolites. Such a programme is anticipated to eliminate the inaccuracies surrounding vitamin D quantification. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  18. Research on Marketing Channel of Mobile Manufacturer Based on Analytic Hierarchy Process

    Institute of Scientific and Technical Information of China (English)

    XIONG Hui; LI Shi-ming; LAN Yong

    2006-01-01

    Research on the "marketing channel" of mobile phones has attracted much attention in recent years, but only a few articles address how mobile manufacturers should optimize the disposition of channel resources. Based on a typical multiplex marketing channel system of a mobile manufacturer, an analytic hierarchy process (AHP) structure model is established. Through the judgment matrix, simple and total hierarchy arrangement, and consistency test, this paper obtains the weight of each kind of marketing channel of the mobile manufacturer. It provides a practical reference for mobile manufacturers in distributing marketing channel resources.
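    The AHP weighting step the abstract describes, deriving channel weights from a pairwise judgment matrix via its principal eigenvector and then checking consistency, can be sketched as follows. This is an illustrative example only: the 3×3 judgment matrix and the three channel types are invented, not taken from the paper.

    ```python
    import numpy as np

    # Hypothetical pairwise judgment matrix for three channel types
    # (e.g. direct stores, distributors, online), on Saaty's 1-9 scale.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1/3, 1.0, 2.0],
        [1/5, 1/2, 1.0],
    ])

    # The principal eigenvector of A gives the channel weights.
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)
    w = eigvecs[:, k].real
    w = w / w.sum()  # normalize so the weights sum to 1

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
    n = A.shape[0]
    lambda_max = eigvals.real[k]
    CI = (lambda_max - n) / (n - 1)
    RI = 0.58  # Saaty's random index for n = 3
    CR = CI / RI
    print(w, CR)  # CR < 0.1 indicates acceptable consistency
    ```

    The "total hierarchy arrangement" mentioned in the abstract then combines such local weight vectors level by level, multiplying each channel's weight by the weight of its parent criterion.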

  19. Enabling laboratory EUV research with a compact exposure tool

    Science.gov (United States)

    Brose, Sascha; Danylyuk, Serhiy; Tempeler, Jenny; Kim, Hyun-su; Loosen, Peter; Juschkin, Larissa

    2016-03-01

    In this work we present the capabilities of the extreme ultraviolet laboratory exposure tool (EUVLET) designed and realized at RWTH Aachen University, Chair for the Technology of Optical Systems (TOS), in cooperation with the Fraunhofer Institute for Laser Technology (ILT) and Bruker ASC GmbH. The main purpose of this laboratory setup is direct application in research facilities and companies with small batch production, where the fabrication of high-resolution periodic arrays over large areas is required. The setup can also be utilized for resist characterization and for evaluation of pre- and post-exposure resist processing. The tool utilizes a partially coherent discharge-produced plasma (DPP) source, follows the Talbot lithography approach, and minimizes the number of other critical components to a transmission grating, the photoresist-coated wafer, and the positioning system for wafer and grating. To identify the limits of this approach, each component is first analyzed and optimized separately, and the relations between components are identified. The EUV source has been optimized to achieve the best values for spatial and temporal coherence. Phase-shifting and amplitude transmission gratings have been fabricated and exposed. Several commercially available electron beam resists and one EUV resist have been characterized by open-frame exposures to determine their contrast under EUV radiation. A cold development procedure has been performed to further increase the resist contrast. Analysis of the exposure results demonstrates that amplitude masks can only fully resolve a 1:1 copy of the mask structure, whereas the phase-shift masks offer higher first-order diffraction efficiency and allow a demagnification of the mask structure in the achromatic Talbot plane.

  20. Haystack, a web-based tool for metabolomics research.

    Science.gov (United States)

    Grace, Stephen C; Embry, Stephen; Luo, Heng

    2014-01-01

    Liquid chromatography coupled to mass spectrometry (LCMS) has become a widely used technique in metabolomics research for differential profiling, the broad screening of biomolecular constituents across multiple samples to diagnose phenotypic differences and elucidate relevant features. However, a significant limitation in LCMS-based metabolomics is the high-throughput data processing required for robust statistical analysis and data modeling for large numbers of samples with hundreds of unique chemical species. To address this problem, we developed Haystack, a web-based tool designed to visualize, parse, filter, and extract significant features from LCMS datasets rapidly and efficiently. Haystack runs in a browser environment with an intuitive graphical user interface that provides both display and data processing options. Total ion chromatograms (TICs) and base peak chromatograms (BPCs) are automatically displayed, along with time-resolved mass spectra and extracted ion chromatograms (EICs) over any mass range. Output files in the common .csv format can be saved for further statistical analysis or customized graphing. Haystack's core function is a flexible binning procedure that converts the mass dimension of the chromatogram into a set of interval variables that can uniquely identify a sample. Binned mass data can be analyzed by exploratory methods such as principal component analysis (PCA) to model class assignment and identify discriminatory features. The validity of this approach is demonstrated by comparison of a dataset from plants grown under two light conditions with manual and automated peak detection methods. Haystack successfully predicted class assignment based on PCA and cluster analysis, and identified discriminatory features based on analysis of EICs of significant bins. Haystack, a new online tool for rapid processing and analysis of LCMS-based metabolomics data, is described. It offers users a range of data visualization options and supports non
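    The binning-then-PCA workflow the abstract describes can be sketched in a few lines of NumPy. Everything here is a hypothetical illustration: the m/z range, bin width, and peak lists are invented, and Haystack's actual binning options may differ.

    ```python
    import numpy as np

    # Bin a sample's (mz, intensity) peak list into a fixed-length
    # feature vector, so every sample shares the same variables.
    def bin_spectrum(peaks, mz_min=100.0, mz_max=1000.0, width=1.0):
        n_bins = int((mz_max - mz_min) / width)
        v = np.zeros(n_bins)
        for mz, inten in peaks:
            if mz_min <= mz < mz_max:
                v[int((mz - mz_min) / width)] += inten
        return v

    # Invented peak lists for three samples.
    samples = [
        [(150.2, 5.0), (320.7, 2.0)],
        [(150.3, 4.5), (410.1, 3.0)],
        [(660.9, 6.0), (410.0, 2.5)],
    ]
    X = np.vstack([bin_spectrum(p) for p in samples])

    # PCA via SVD on mean-centered data; each row of `scores` places
    # a sample in principal-component space for class assignment.
    Xc = X - X.mean(axis=0)
    U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U * S
    print(scores[:, :2])
    ```

    Bins whose loadings (rows of `Vt`) dominate the separating components are the candidates whose EICs would then be inspected as discriminatory features.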

  1. The MOOC and Learning Analytics Innovation Cycle (MOLAC): A Reflective Summary of Ongoing Research and Its Challenges

    Science.gov (United States)

    Drachsler, H.; Kalz, M.

    2016-01-01

    The article deals with the interplay between learning analytics and massive open online courses (MOOCs) and provides a conceptual framework to situate ongoing research in the MOOC and learning analytics innovation cycle (MOLAC framework). The MOLAC framework is organized on three levels: On the micro-level, the data collection and analytics…

  2. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    Science.gov (United States)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

    The toxicity of arsenic and its wide distribution in nature need not be emphasized nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs), provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and in extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily obtained in the laboratory. After the sample is treated with a low amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidity. The results for total arsenic were verified using certified reference materials. The authors are grateful to the Comunidad Autonóma de la

  3. Interaction Matrices as a Tool for Prioritizing Radioecology Research

    Energy Technology Data Exchange (ETDEWEB)

    Mora, J.C.; Robles, Beatriz [Centro de Investigaciones Energeticas, Medioambientales y Tecnologicas - CIEMAT (Spain); Bradshaw, Clare; Stark, Karolina [Stockholm University (Sweden); Sweeck, Liev; Vives i Batlle, Jordi [Belgian Nuclear Research Centre SCK-CEN (Belgium); Beresford, Nick [Centre for Ecology and Hydrology - CEH (United Kingdom); Thoerring, Havard; Dowdall, Mark [Norwegian Radiation Protection Authority - NRPA (Norway); Outola, Iisa; Turtiainen, Tuukka; Vetikko, Virve [STUK - Radiation and Nuclear Safety Authority (Finland); Steiner, Martin [Federal Office for Radiation Protection - BfS (Germany); Beaugelin-Seiller, Karine; Fevrier, Laureline; Hurtevent, Pierre; Boyer, Patrick [Institut de Radioprotection et de Surete Nucleaire - IRSN (France)

    2014-07-01

    In 2010 the Strategy for Allied Radioecology (STAR) was launched with several objectives aimed at integrating the radioecology research efforts of nine institutions in Europe. One of these objectives was the creation of European Radioecology Observatories. The Chernobyl Exclusion Zone (CEZ) and the Upper Silesian Coal Basin (USCB), a coal mining area in Poland, were chosen after a selection process. A second objective was to develop a system for improving and validating the capability to predict the behaviour of the main radionuclides present at these observatories. Interaction Matrices (IMs) have been used since the 1990s as a tool for developing ecological conceptual models, including within radioecology. The Interaction Matrix system relies on expert judgement to structure knowledge of a given ecosystem at the conceptual level, and was selected for use in the STAR project. A group of experts, selected from each institution of STAR, designed two matrices with the main compartments for each ecosystem (a forest in the CEZ and a lake in the USCB). All the features, events and processes (FEPs) that could affect the behaviour of the considered radionuclides, focusing on radiocaesium in the Chernobyl forest and radium in the Rontok-Wielki lake, were also included in each IM. Two new sets of experts were appointed to review, improve and prioritize the processes included in each IM. A first processing of the various candidate interaction matrices produced a single interaction matrix for each ecosystem that incorporated the experts' combined knowledge. The prioritization of processes in the IMs, directed towards developing a complete predictive model of radionuclide behaviour in those ecosystems, raised interesting issues concerning the processes and parameters involved and the existing knowledge about them. This exercise revealed several processes

  4. IT Tools for Teachers and Scientists, Created by Undergraduate Researchers

    Science.gov (United States)

    Millar, A. Z.; Perry, S.

    2007-12-01

    Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns divided themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics ranging from elementary plate tectonics to earthquake risk mitigation, and with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. SCEC-VDO enables the user to create animated movies during a session, and is now part

  5. Applications of instrumental neutron activation analysis in the Analytical Division of the Research Center Juelich (KFA)

    International Nuclear Information System (INIS)

    Erdtmann, G.

    1991-12-01

    The Radioanalytical Chemistry Section, as part of the Central Division of Chemical Analysis of the Research Center KFA Juelich, is tasked with providing and applying nuclear methods in the analytical service for the institutes and projects of the KFA and for outside customers. A great part of this service is trace element determination by neutron activation analysis using the research reactor FRJ-2. The procedure for the instrumental technique is described and mainly practical aspects are reported in detail. It is based on the k0-method developed by Simonits and De Corte. The results are calculated from the peak areas of the γ-lines and the corresponding k0-factors. A new variant of this procedure is required if the program used for the deconvolution of the γ-spectra provides absolute decay rates of the radionuclides instead of the γ-emission rates. This variant is also described. Some examples of analyses carried out in the analytical service are presented and discussed, mainly with respect to the accuracy of the results and detection limits. (orig.) [de

  6. Research tools application for female fashion underwear comfort assessment

    Directory of Open Access Journals (Sweden)

    Andreia Salvan Pagnan

    2016-06-01

    Full Text Available Within the universe of women's clothing, underwear was long an afterthought with regard to the development of new textile materials, shapes and colors. Panties, once known as breeches or long underwear, only became a necessity around the twentieth century, with the vaporous dresses of Christian Dior in the 1950s. Technological advances in the textile industry brought spandex, created by the American laboratory DuPont and better known as Lycra. The elasticity of the fabric gave comfort to women's lingerie, and this attribute came to be considered a quality factor in lingerie. To understand the desires of users, a qualitative study was conducted with women aged 18-45, collecting opinions on the perceived comfort of existing models compared to a new one to be launched. Through the Quality Function Deployment (QFD) tool, the answers obtained from users were interpreted to prioritize targets for developing a product based on analysis of the desired characteristics, which were converted into technical attributes.

  7. Facebook: an effective tool for participant retention in longitudinal research.

    Science.gov (United States)

    Mychasiuk, R; Benzies, K

    2012-09-01

    Facebook is currently one of the world's most visited websites, and home to millions of users who access their accounts on a regular basis. Owing to the website's ease of accessibility and free service, demographic characteristics of users span all domains. As such, Facebook may be a valuable tool for locating and communicating with participants in longitudinal research studies. This article outlines the benefit gained in a longitudinal follow-up study, of an intervention programme for at-risk families, through the use of Facebook as a search engine. Using Facebook as a resource, we were able to locate 19 participants that were otherwise 'lost' to follow-up, decreasing attrition in our study by 16%. Additionally, analysis indicated that hard-to-reach participants located with Facebook differed significantly on measures of receptive language and self-esteem when compared to their easier-to-locate counterparts. These results suggest that Facebook is an effective means of improving participant retention in a longitudinal intervention study and may help improve study validity by reaching participants that contribute differing results. © 2011 Blackwell Publishing Ltd.

  8. Big data, advanced analytics and the future of comparative effectiveness research.

    Science.gov (United States)

    Berger, Marc L; Doban, Vitalii

    2014-03-01

    The intense competition that accompanied the growth of internet-based companies ushered in the era of 'big data' characterized by major innovations in processing of very large amounts of data and the application of advanced analytics including data mining and machine learning. Healthcare is on the cusp of its own era of big data, catalyzed by the changing regulatory and competitive environments, fueled by growing adoption of electronic health records, as well as efforts to integrate medical claims, electronic health records and other novel data sources. Applying the lessons from big data pioneers will require healthcare and life science organizations to make investments in new hardware and software, as well as in individuals with different skills. For life science companies, this will impact the entire pharmaceutical value chain from early research to postcommercialization support. More generally, this will revolutionize comparative effectiveness research.

  9. Justice at the millennium: a meta-analytic review of 25 years of organizational justice research.

    Science.gov (United States)

    Colquitt, J A; Conlon, D E; Wesson, M J; Porter, C O; Ng, K Y

    2001-06-01

    The field of organizational justice continues to be marked by several important research questions, including the size of relationships among justice dimensions, the relative importance of different justice criteria, and the unique effects of justice dimensions on key outcomes. To address such questions, the authors conducted a meta-analytic review of 183 justice studies. The results suggest that although different justice dimensions are moderately to highly related, they contribute incremental variance explained in fairness perceptions. The results also illustrate the overall and unique relationships among distributive, procedural, interpersonal, and informational justice and several organizational outcomes (e.g., job satisfaction, organizational commitment, evaluation of authority, organizational citizenship behavior, withdrawal, performance). These findings are reviewed in terms of their implications for future research on organizational justice.

  10. The diffusion of renewable energy technology: an analytical framework and key issues for research

    International Nuclear Information System (INIS)

    Jacobsson, S.; Johnson, A.

    2000-01-01

    During the last two decades there has been a great deal of research on renewable energy technologies. It is commonly thought that very little has come out of this research in terms of commercially interesting technologies. The first objective of this paper is to demonstrate that this perception is no longer correct; in the 1990s there has been a double-digit growth rate in the market for some renewable energy technologies. The consequent alteration in the energy system is, however, a slow, painful and highly uncertain process. This process, we argue, needs to be studied using an innovation system perspective where the focus is on networks, institutions and firms' perceptions, competencies and strategies. The second objective of the paper is therefore to present the bare bones of such an analytical framework. A third objective is to identify a set of key issues related to the speed and direction of that transformation process which need to be studied further. (author)

  11. Giving raw data a chance to talk: a demonstration of exploratory visual analytics with a pediatric research database using Microsoft Live Labs Pivot to promote cohort discovery, research, and quality assessment.

    Science.gov (United States)

    Viangteeravat, Teeradache; Nagisetty, Naga Satya V Rao

    2014-01-01

    Secondary use of large and open data sets provides researchers with an opportunity to address high-impact questions that would otherwise be prohibitively expensive and time consuming to study. Despite the availability of data, generating hypotheses from huge data sets is often challenging, and the lack of complex analysis of data might lead to weak hypotheses. To overcome these issues and to assist researchers in building hypotheses from raw data, we are working on a visual and analytical platform called PRD Pivot. PRD Pivot is a de-identified pediatric research database designed to make secondary use of rich data sources, such as the electronic health record (EHR). The development of visual analytics using Microsoft Live Labs Pivot makes the process of data elaboration, information gathering, knowledge generation, and complex information exploration transparent to tool users and provides researchers with the ability to sort and filter by various criteria, which can lead to strong, novel hypotheses.

  12. The Application of State-of-the-Art Analytic Tools (Biosensors and Spectroscopy) in Beverage and Food Fermentation Process Monitoring

    Directory of Open Access Journals (Sweden)

    Shaneel Chandra

    2017-09-01

    Full Text Available The production of several agricultural products and foods is linked with fermentation. Traditional methods used to control and monitor the quality of the products and processes are based on simple chemical analysis. However, these methods are time-consuming and do not provide enough relevant information to track the chemical changes during the process. Commonly used methods applied in the agriculture and food industries to monitor fermentation are based on simple or single-point sensors, where only one parameter is measured (e.g., temperature or density). These sensors are read several times per day and are often the only source of data from which the conditions and rate of fermentation are monitored. In the modern food industry, an ideal method to control and monitor the fermentation process should enable a direct, rapid, precise, and accurate determination of several target compounds, with minimal to no sample preparation or reagent consumption. Here, state-of-the-art advancements in the application of both sensors and analytical tools to monitor beverage and food fermentation processes will be discussed.

  13. Conducting qualitative research in the British Armed Forces: theoretical, analytical and ethical implications.

    Science.gov (United States)

    Finnegan, Alan

    2014-06-01

    The aim of qualitative research is to produce empirical evidence with data collected through means such as interviews and observation. Qualitative research encourages diversity in the way of thinking and the methods used. Good studies produce a richness of data to provide new knowledge or address extant problems. However, qualitative research resulting in peer review publications within the Defence Medical Services (DMS) is a rarity. This article aims to help redress this balance by offering direction regarding qualitative research in the DMS with a focus on choosing a theoretical framework, analysing the data and ethical approval. Qualitative researchers need an understanding of the paradigms and theories that underpin methodological frameworks, and this article includes an overview of common theories in phenomenology, ethnography and grounded theory, and their application within the military. It explains qualitative coding: the process used to analyse data and shape the analytical framework. A popular four phase approach with examples from an operational nursing research study is presented. Finally, it tackles the issue of ethical approval for qualitative studies and offers direction regarding the research proposal and participant consent. The few qualitative research studies undertaken in the DMS have offered innovative insights into defence healthcare providing information to inform and change educational programmes and clinical practice. This article provides an extra resource for clinicians to encourage studies that will improve the operational capability of the British Armed Forces. It is anticipated that these guidelines are transferable to research in other Armed Forces and the military Veterans population. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  14. Research and learning opportunities in a reactor-based nuclear analytical laboratory

    International Nuclear Information System (INIS)

    Robinson, L.

    1994-01-01

    Although considered by many to be a mature science, neutron activation analysis (NAA) continues to be a valuable tool in trace-element research applications. Examples of the applicability of NAA can be found in a variety of areas including archaeology, environmental science, epidemiology, forensic science, and material science to name a few. Each stage of NAA provides opportunities to share numerous practical and fundamental scientific principles with high school teachers and students. This paper will present an overview of these opportunities and give a specific example from collaboration with a high school teacher whose research involved the automation of a gamma-ray spectroscopy counting system using a laboratory robot

  15. Vehicle Technology Simulation and Analysis Tools | Transportation Research

    Science.gov (United States)

    NREL developed the following modeling, simulation, and analysis tools to investigate novel design goals (e.g., fuel economy versus performance) and find cost-competitive solutions. ADOPT: a vehicle simulator to analyze the performance and fuel economy of conventional and advanced light- and

  16. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  17. Environmental Contaminants, Metabolites, Cells, Organ Tissues, and Water: All in a Day’s Work at the EPA Analytical Chemistry Research Core

    Science.gov (United States)

    The talk will highlight key aspects and results of analytical methods the EPA National Health and Environmental Effects Research Laboratory (NHEERL) Analytical Chemistry Research Core (ACRC) develops and uses to provide data on disposition, metabolism, and effects of environmenta...

  18. PARAMO: a PARAllel predictive MOdeling platform for healthcare analytic research using electronic health records.

    Science.gov (United States)

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R; Stewart, Walter F; Malin, Bradley; Sun, Jimeng

    2014-04-01

    Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: (1) cohort construction, (2) feature construction, (3) cross-validation, (4) feature selection, and (5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which (1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, (2) schedules the tasks in a topological ordering of the graph, and (3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed-up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines
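The pipeline pattern the abstract describes (a dependency graph of modeling tasks, a topological ordering, and parallel execution of independent tasks) can be sketched in a few lines. The task names and the thread-based executor below are illustrative assumptions, not PARAMO's actual Map-Reduce implementation:

```python
# Sketch: dependency graph -> topological ordering -> parallel execution.
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

# Each step depends on the steps whose outputs it consumes (hypothetical names).
deps = {
    "cohort": set(),
    "demographic_features": {"cohort"},
    "lab_features": {"cohort"},
    "cross_validation": {"demographic_features", "lab_features"},
    "feature_selection": {"cross_validation"},
    "classification": {"feature_selection"},
}

def run(task):
    # Stand-in for the real work (queries, model fitting, ...).
    return f"done:{task}"

ts = TopologicalSorter(deps)
ts.prepare()
results = {}
with ThreadPoolExecutor() as pool:
    while ts.is_active():
        ready = list(ts.get_ready())          # tasks whose dependencies are met
        for task, out in zip(ready, pool.map(run, ready)):
            results[task] = out
            ts.done(task)                     # unlocks downstream tasks
```

Here the two feature-construction tasks become ready in the same batch and run concurrently; in PARAMO, the Map-Reduce cluster plays the role the thread pool plays in this sketch.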

  19. Multidisciplinary design and analytic approaches to advance prospective research on the multilevel determinants of child health.

    Science.gov (United States)

    Johnson, Sara B; Little, Todd D; Masyn, Katherine; Mehta, Paras D; Ghazarian, Sharon R

    2017-06-01

    Characterizing the determinants of child health and development over time, and identifying the mechanisms by which these determinants operate, is a research priority. The growth of precision medicine has increased awareness and refinement of conceptual frameworks, data management systems, and analytic methods for multilevel data. This article reviews key methodological challenges in cohort studies designed to investigate multilevel influences on child health and strategies to address them. We review and summarize methodological challenges that could undermine prospective studies of the multilevel determinants of child health and ways to address them, borrowing approaches from the social and behavioral sciences. Nested data, variation in intervals of data collection and assessment, missing data, construct measurement across development and reporters, and unobserved population heterogeneity pose challenges in prospective multilevel cohort studies with children. We discuss innovations in missing data, innovations in person-oriented analyses, and innovations in multilevel modeling to address these challenges. Study design and analytic approaches that facilitate the integration across multiple levels, and that account for changes in people and the multiple, dynamic, nested systems in which they participate over time, are crucial to fully realize the promise of precision medicine for children and adolescents. Copyright © 2017 Elsevier Inc. All rights reserved.

  20. Analytical researches on the accelerating structures, wakefields, and beam dynamics for future linear colliders

    International Nuclear Information System (INIS)

    Gao, J.

    1996-01-01

    The research presented in this memoir is oriented not only towards the R and D programs for future linear colliders, but also to pedagogic purposes. The first part of this memoir (Chapter 2 to Chapter 9) establishes an analytical framework for disk-loaded slow-wave accelerating structures which can serve as an advanced course for students who have had basic training in linear accelerator theory. The analytical formulae derived in this part clearly describe the properties of the disk-loaded accelerating structures, such as group velocity, shunt impedance, coupling coefficients κ and β, loss factors, and wake fields. The second part (Chapter 11 to Chapter 13) gives the beam dynamics simulations and the final proposal of an S-Band Superconducting Linear Collider (SSLC) which is aimed at avoiding the dark current problem in the TESLA project. This memoir does not include all the work conducted since April 1992, such as beam dynamics simulations for the CLIC Test Facility (CTF-2) and the design of High Charge Structures (HCS) (11π/12 mode) for CTF-2, in order to keep the memoir harmonious, coherent and continuous. (author)

  1. Tool Efficiency Analysis model research in SEMI industry

    Directory of Open Access Journals (Sweden)

    Lei Ma

    2018-01-01

    Full Text Available One of the key goals in the SEMI industry is to improve equipment throughput and to maximize equipment production efficiency. This paper, based on SEMI standards for semiconductor equipment control, defines the transition rules between different tool states and presents a TEA system model which analyzes tool performance automatically based on a finite state machine. The system was applied to fab tools and its effectiveness was verified; it yielded the parameter values used to measure equipment performance, along with recommendations for improvement.
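As a sketch of what such a finite-state-machine tool-state model looks like, the fragment below records timestamped transitions between a few equipment states and derives a utilization figure. The state names and allowed transitions are illustrative assumptions, not the paper's actual SEMI-based rule set:

```python
# Hypothetical tool states and which transitions between them are legal.
ALLOWED = {
    "STANDBY": {"PRODUCTIVE", "SCHEDULED_DOWN"},
    "PRODUCTIVE": {"STANDBY", "UNSCHEDULED_DOWN"},
    "SCHEDULED_DOWN": {"STANDBY"},
    "UNSCHEDULED_DOWN": {"STANDBY"},
}

class ToolStateMachine:
    def __init__(self, state="STANDBY", t=0.0):
        self.state, self.t = state, t
        self.time_in = {s: 0.0 for s in ALLOWED}

    def transition(self, new_state, t):
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.time_in[self.state] += t - self.t   # close out the current state
        self.state, self.t = new_state, t

    def utilization(self, t_now):
        totals = dict(self.time_in)
        totals[self.state] += t_now - self.t     # include the open interval
        return totals["PRODUCTIVE"] / sum(totals.values())

fsm = ToolStateMachine()
fsm.transition("PRODUCTIVE", t=10)   # run starts at t=10
fsm.transition("STANDBY", t=70)      # run ends at t=70
# At t=100 the tool has been productive for 60 of 100 time units -> 0.6
```

Accumulating time per state this way is what lets a monitor report per-state totals and flag illegal state changes as they arrive from the equipment log.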

  2. A literature review of empirical research on learning analytics in medical education.

    Science.gov (United States)

    Saqr, Mohammed

    2018-01-01

    The number of learning analytics publications in the field of medical education is still markedly low, despite recognition of the discipline's value in the medical education literature and the exponential growth of publications in other fields. This necessitates raising awareness of the research methods and potential benefits of learning analytics (LA). The aim of this paper was to offer a methodological systematic review of empirical LA research in the field of medical education and a general overview of the common methods used in the field in general. The search was done in the Medline database using the term "learning analytics." Inclusion criteria included empirical original research articles investigating LA using qualitative, quantitative, or mixed methodologies. Articles were also required to be written in English, published in a scholarly peer-reviewed journal, and to have a dedicated section for methods and results. The Medline search resulted in only six articles fulfilling the inclusion criteria for this review. Most of the studies collected data about learners from learning management systems or online learning resources. Analysis used mostly quantitative methods, including descriptive statistics, correlation tests, and regression models in two studies. Patterns of online behavior and usage of the digital resources, as well as prediction of achievement, were the outcomes most studies investigated. Research about LA in the field of medical education is still in its infancy, with more questions than answers. The early studies are encouraging and showed that patterns of online learning can easily be revealed and students' performance predicted.

  3. A Clinical Reasoning Tool for Virtual Patients: Design-Based Research Study.

    Science.gov (United States)

    Hege, Inga; Kononowicz, Andrzej A; Adler, Martin

    2017-11-02

    Clinical reasoning is a fundamental process medical students have to learn during and after medical school. Virtual patients (VP) are a technology-enhanced learning method to teach clinical reasoning. However, VP systems do not exploit their full potential concerning the clinical reasoning process; for example, most systems focus on the outcome and less on the process of clinical reasoning. Keeping our concept grounded in a former qualitative study, we aimed to design and implement a tool to enhance VPs with activities and feedback, which specifically foster the acquisition of clinical reasoning skills. We designed the tool by translating elements of a conceptual clinical reasoning learning framework into software requirements. The resulting clinical reasoning tool enables learners to build their patient's illness script as a concept map when they are working on a VP scenario. The student's map is compared with the experts' reasoning at each stage of the VP, which is technically enabled by using Medical Subject Headings, which is a comprehensive controlled vocabulary published by the US National Library of Medicine. The tool is implemented using Web technologies, has an open architecture that enables its integration into various systems through an open application program interface, and is available under a Massachusetts Institute of Technology license. We conducted usability tests following a think-aloud protocol and a pilot field study with maps created by 64 medical students. The results show that learners interact with the tool but create fewer nodes and connections in the concept map than an expert. Further research and usability tests are required to analyze the reasons. The presented tool is a versatile, systematically developed software component that specifically supports the clinical reasoning skills acquisition. It can be plugged into VP systems or used as stand-alone software in other teaching scenarios. The modular design allows an extension with new
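A minimal sketch of one way such a map comparison could work is to treat the expert's and the learner's illness scripts as sets of directed concept-map edges and score their overlap. The node labels and the set-based scoring below are assumptions for illustration, not the tool's actual MeSH-based matching:

```python
# Each edge is a (source, target) pair; labels stand in for MeSH terms.
expert = {("fever", "infection"), ("infection", "antibiotics"), ("cough", "pneumonia")}
student = {("fever", "infection"), ("cough", "pneumonia")}

matched = student & expert          # links the learner shares with the expert
missing = expert - student          # expert links the learner has not drawn yet
extra = student - expert            # learner links absent from the expert map
coverage = len(matched) / len(expert)
```

Splitting the comparison into matched, missing, and extra edges is what allows stage-by-stage feedback rather than a single outcome score.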

  4. GEAS Spectroscopy Tools for Authentic Research Investigations in the Classroom

    Science.gov (United States)

    Rector, Travis A.; Vogt, Nicole P.

    2018-06-01

    Spectroscopy is one of the most powerful tools that astronomers use to study the universe. However relatively few resources are available that enable undergraduates to explore astronomical spectra interactively. We present web-based applications which guide students through the analysis of real spectra of stars, galaxies, and quasars. The tools are written in HTML5 and function in all modern web browsers on computers and tablets. No software needs to be installed nor do any datasets need to be downloaded, enabling students to use the tools in or outside of class (e.g., for online classes).Approachable GUIs allow students to analyze spectra in the same manner as professional astronomers. The stellar spectroscopy tool can fit a continuum with a blackbody and identify spectral features, as well as fit line profiles and determine equivalent widths. The galaxy and AGN tools can also measure redshifts and calcium break strengths. The tools provide access to an archive of hundreds of spectra obtained with the optical telescopes at Kitt Peak National Observatory. It is also possible to load your own spectra or to query the Sloan Digital Sky Survey (SDSS) database.We have also developed curricula to investigate these topics: spectral classification, variable stars, redshift, and AGN classification. We will present the functionality of the tools and describe the associated curriculum. The tools are part of the General Education Astronomy Source (GEAS) project based at New Mexico State University, with support from the National Science Foundation (NSF, AST-0349155) and the National Aeronautics and Space Administration (NASA, NNX09AV36G). Curriculum development was supported by the NSF (DUE-0618849 and DUE-0920293).

  5. IAEA coordinated research project on 'analytical and experimental benchmark analyses of accelerator driven systems'

    International Nuclear Information System (INIS)

    Ait-Abderrahim, H.; Stanculescu, A.

    2006-01-01

    This paper provides the general background and the main specifications of the benchmark exercises performed within the framework of the IAEA Coordinated Research Project (CRP) on Analytical and Experimental Benchmark Analyses of Accelerator Driven Systems. The overall objective of the CRP, performed within the framework of the Technical Working Group on Fast Reactors (TWG-FR) of IAEA's Nuclear Energy Dept., is to contribute to the generic R and D efforts in various fields common to innovative fast neutron system development, i.e. heavy liquid metal thermal hydraulics, dedicated transmutation fuels and associated core designs, theoretical nuclear reaction models, measurement and evaluation of nuclear data for transmutation, and development and validation of calculational methods and codes. (authors)

  6. Automating the Analytical Laboratories Section, Lewis Research Center, National Aeronautics and Space Administration: a feasibility study

    International Nuclear Information System (INIS)

    Boyle, W.G.; Barton, G.W.

    1979-01-01

    We studied the feasibility of computerized automation of the Analytical Laboratories Section at NASA's Lewis Research Center. Since that laboratory's duties are not routine, we set our automation goals with that in mind. We selected four instruments as the most likely automation candidates: an atomic absorption spectrophotometer, an emission spectrometer, an x-ray fluorescence spectrometer, and an x-ray diffraction unit. Our study describes two options for computer automation: a time-shared central computer and a system with microcomputers for each instrument connected to a central computer. A third option, presented for future planning, expands the microcomputer version. We determine costs and benefits for each option. We conclude that the microcomputer version best fits the goals and duties of the laboratory and that such an automated system is needed to meet the laboratory's future requirements

  7. Applying the Analytic Hierarchy Process in healthcare research: A systematic literature review and evaluation of reporting.

    Science.gov (United States)

    Schmidt, Katharina; Aumann, Ines; Hollander, Ines; Damm, Kathrin; von der Schulenburg, J-Matthias Graf

    2015-12-24

    The Analytic Hierarchy Process (AHP), developed by Saaty in the late 1970s, is one of the methods for multi-criteria decision making. The AHP disaggregates a complex decision problem into different hierarchical levels. The weights for each criterion and alternative are judged in pairwise comparisons, and priorities are calculated by the Eigenvector method. The slowly increasing application of the AHP was the motivation for this study to explore the current state of its methodology in the healthcare context. A systematic literature review was conducted by searching the Pubmed and Web of Science databases for articles with the following keywords in their titles or abstracts: "Analytic Hierarchy Process," "Analytical Hierarchy Process," "multi-criteria decision analysis," "multiple criteria decision," "stated preference," and "pairwise comparison." In addition, we developed reporting criteria to indicate whether the authors reported important aspects, and evaluated the resulting studies' reporting. The systematic review resulted in 121 articles. The number of studies applying the AHP has increased since 2005. Most studies were from Asia (almost 30%), followed by the US (25.6%). On average, the studies used 19.64 criteria across their hierarchical levels. Furthermore, we restricted a detailed analysis to those articles published within the last 5 years (n = 69). The mean number of participants in these studies was 109, and we identified major differences in how the surveys were conducted. The evaluation of reporting showed that the mean number of reported elements was about 6.75 out of 10. Thus, 12 out of 69 studies reported less than half of the criteria. The AHP has been applied inconsistently in healthcare research. A minority of studies described all the relevant aspects. Thus, the statements in this review may be biased, as they are restricted to the information available in the papers. Hence, further research is required to discover who should be interviewed and how, how
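The core calculation the abstract names, deriving priority weights from a pairwise comparison matrix via the principal eigenvector, can be sketched as follows. The 3x3 judgement matrix is a made-up example; real AHP studies also compare the consistency index against Saaty's random index to obtain a consistency ratio:

```python
# Sketch of Saaty's Eigenvector method using plain power iteration.
def ahp_priorities(m, iters=100):
    """Priority weights = principal eigenvector of the pairwise matrix."""
    n = len(m)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(m[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]                 # renormalize to sum to 1
    # Estimate the principal eigenvalue to derive Saaty's consistency index.
    lam = sum(sum(m[i][j] * w[j] for j in range(n)) / w[i] for i in range(n)) / n
    ci = (lam - n) / (n - 1)
    return w, ci

# Reciprocal matrix: criterion A judged 3x as important as B, 5x as important as C.
matrix = [[1, 3, 5],
          [1 / 3, 1, 2],
          [1 / 5, 1 / 2, 1]]
weights, ci = ahp_priorities(matrix)
```

A perfectly consistent matrix yields a consistency index of 0; the small positive value here reflects the slight disagreement between the direct A-to-C judgement and the product of A-to-B and B-to-C.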

  8. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 

  9. Land-use regime shifts: an analytical framework and agenda for future land-use research

    Directory of Open Access Journals (Sweden)

    Navin Ramankutty

    2016-06-01

    Full Text Available A key research frontier in global change research lies in understanding processes of land change to inform predictive models of future land states. We believe that significant advances in the field are hampered by limited attention being paid to critical points of change termed land-use regime shifts. We present an analytical framework for understanding land-use regime shifts. We survey historical events of land change and perform in-depth case studies of soy and shrimp development in Latin America to demonstrate the role of preconditions, triggers, and self-reinforcing processes in driving land-use regime shifts. Whereas the land-use literature demonstrates a good understanding of within-regime dynamics, our understanding of the drivers of land-use regime shifts is limited to ex post facto explications. Theoretical and empirical advances are needed to better understand the dynamics and implications of land-use regime shifts. We draw insights from the regime-shifts literature to propose a research agenda for studying land change.

  10. iPat: intelligent prediction and association tool for genomic research.

    Science.gov (United States)

    Chen, Chunpeng James; Zhang, Zhiwu

    2018-06-01

    The ultimate goal of genomic research is to effectively predict phenotypes from genotypes so that medical management can improve human health and molecular breeding can increase agricultural production. Genomic prediction or selection (GS) plays a complementary role to genome-wide association studies (GWAS), which is the primary method to identify genes underlying phenotypes. Unfortunately, most computing tools cannot perform data analyses for both GWAS and GS. Furthermore, the majority of these tools are executed through a command-line interface (CLI), which requires programming skills. Non-programmers struggle to use them efficiently because of the steep learning curves and zero tolerance for data formats and mistakes when inputting keywords and parameters. To address these problems, this study developed a software package, named the Intelligent Prediction and Association Tool (iPat), with a user-friendly graphical user interface. With iPat, GWAS or GS can be performed using a pointing device to simply drag and/or click on graphical elements to specify input data files, choose input parameters and select analytical models. Models available to users include those implemented in third party CLI packages such as GAPIT, PLINK, FarmCPU, BLINK, rrBLUP and BGLR. Users can choose any data format and conduct analyses with any of these packages. File conversions are automatically conducted for specified input data and selected packages. A GWAS-assisted genomic prediction method was implemented to perform genomic prediction using any GWAS method such as FarmCPU. iPat was written in Java for adaptation to multiple operating systems including Windows, Mac and Linux. The iPat executable file, user manual, tutorials and example datasets are freely available at http://zzlab.net/iPat. zhiwu.zhang@wsu.edu.

  11. Research on Test-bench for Sonic Logging Tool

    Directory of Open Access Journals (Sweden)

    Xianping Liu

    2016-01-01

    Full Text Available In this paper, a test-bench for the sonic logging tool is proposed and designed to realize automatic calibration and testing of the sonic logging tool. The test-bench system consists of a Host Computer, an Embedded Controlling Board, and functional boards. The Host Computer serves as the Human Machine Interface (HMI) and processes uploaded data. The software running on the Host Computer is written in VC++ and built on multithreading, Dynamic Link Library (DLL) and Multiple Document Interface (MDI) techniques. The Embedded Controlling Board uses an ARM7 microcontroller and communicates with the Host Computer via Ethernet. The Embedded Controlling Board software is realized on the embedded uClinux operating system with a layered architecture. The functional boards are designed around Field Programmable Gate Arrays (FPGAs) and provide test interfaces for the logging tool. The functional board software is divided into independent sub-modules that can be reused by the various functional boards and are then integrated in the top layer. With the layered architecture and modularized design, the software system is highly reliable and extensible. With the help of the designed system, a test was conducted quickly and successfully on the electronic receiving cabin of the sonic logging tool, demonstrating that the system can greatly improve the production efficiency of the sonic logging tool.

  12. e-research: Changes and challenges in the use of digital tools in primary care research

    DEFF Research Database (Denmark)

    Bruun Larsen, Lars; Skonnord, Trygve; Gjelstad, Svein

    in primary care research. Examples of this are online randomisation, electronic questionnaires, automatic email scheduling, mobile phone applications and data extraction tools. The amount of data can be increased to a low cost, and this can help to reach adequate sample sizes. However, there are still...... challenges within the field. To secure a high response rate, you need to follow up manually or use another application. There are also practical and ethical problems, and the data security for sensitive data have to be followed carefully. Session content Oral presentations about some technological...

  13. Benchtop-NMR and MRI--a new analytical tool in drug delivery research.

    Science.gov (United States)

    Metz, Hendrik; Mäder, Karsten

    2008-12-08

    In recent years, NMR spectroscopy and NMR imaging (magnetic resonance imaging, MRI) have been increasingly used to monitor drug delivery systems in vitro and in vivo. However, the high installation and running costs of the commonly used superconducting magnet technology limit the application range and prevent the further spread of this non-invasive technology. Benchtop-NMR (BT-NMR) relaxometry uses permanent magnets and is much less cost intensive. BT-NMR relaxometry is commonly used in the food and chemical industry, but so far scarcely used in the pharmaceutical field. The paper shows, through several examples, that the application field of BT-NMR relaxometry can be extended into drug delivery, including the characterisation of emulsions and lipid ingredients (e.g. the amount and physicochemical state of the lipid) and the monitoring of adsorption characteristics (e.g. oil binding of porous ingredients). The most exciting possibilities of BT-NMR technology are linked with the new development of BT instruments with imaging capability. BT-MRI examples of the monitoring of hydration and swelling of HPMC-based monolayer and double-layer tablets are shown. BT-MRI opens new MRI opportunities for the non-invasive monitoring of drug delivery processes.

  14. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  15. The Value of Decision Analytical Modeling in Surgical Research: An Example of Laparoscopic Versus Open Distal Pancreatectomy.

    Science.gov (United States)

    Tax, Casper; Govaert, Paulien H M; Stommel, Martijn W J; Besselink, Marc G H; Gooszen, Hein G; Rovers, Maroeska M

    2017-11-02

    To illustrate how decision modeling may identify relevant uncertainty and can preclude or identify areas of future research in surgery. To optimize the use of research resources, a tool is needed that assists in identifying relevant uncertainties and the added value of reducing these uncertainties. The clinical pathway for laparoscopic distal pancreatectomy (LDP) versus open distal pancreatectomy (ODP) for nonmalignant lesions was modeled in a decision tree. Cost-effectiveness based on complications, hospital stay, costs, quality of life, and survival was analyzed. The effect of existing uncertainty on the cost-effectiveness was addressed, as well as the expected value of eliminating uncertainties. Based on 29 nonrandomized studies (3,701 patients), the model shows that LDP is more cost-effective than ODP. Scenarios in which LDP does not outperform ODP for cost-effectiveness seem unrealistic, e.g., a 30-day mortality rate 1.79 times higher after LDP than after ODP, conversion in 62.2%, surgical repair of incisional hernias in 21% after LDP, or an average hospital stay 2.3 days longer after LDP than after ODP. Taking all uncertainty into account, LDP remained more cost-effective. Minimizing these uncertainties did not change the outcome. The results show how decision analytical modeling can help to identify relevant uncertainty and guide decisions for future research in surgery. Based on the currently available evidence, a randomized clinical trial on complications, hospital stay, costs, quality of life, and survival is highly unlikely to change the conclusion that LDP is more cost-effective than ODP.
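
    At its core, a decision-tree comparison like the one above is expected-value arithmetic over branch probabilities, costs, and effects. A minimal sketch; all probabilities, costs, and QALY figures below are placeholders for illustration, not the study's estimates:

```python
def expected_outcomes(p_complication, base_cost, complication_cost,
                      base_qaly, qaly_loss):
    """Expected cost and QALYs for one strategy in a two-branch decision tree."""
    cost = base_cost + p_complication * complication_cost
    qaly = base_qaly - p_complication * qaly_loss
    return cost, qaly

# Placeholder inputs chosen purely for illustration (not the study's estimates):
# LDP: costlier surgery, fewer complications; ODP: the reverse.
ldp_cost, ldp_qaly = expected_outcomes(0.10, 12000.0, 8000.0, 0.80, 0.10)
odp_cost, odp_qaly = expected_outcomes(0.15, 10000.0, 8000.0, 0.78, 0.10)

delta_cost = ldp_cost - odp_cost   # incremental cost of LDP vs ODP
delta_qaly = ldp_qaly - odp_qaly   # incremental effectiveness
icer = delta_cost / delta_qaly     # incremental cost-effectiveness ratio
```

    Whether the incremental cost-effectiveness ratio (here, cost per QALY gained) is acceptable then depends on the willingness-to-pay threshold; sensitivity analyses vary these inputs to find the scenarios in which the conclusion would flip.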

  16. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    Full Text Available This article explores the purpose of the use of generalised audit software as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and by testing controls on a sample basis, is long overdue, and such practice in the present technological, data-driven era will soon render such an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes but that its frequency of use is not yet optimal and that there is still much room for improvement for tests of controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) for conducting full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or frequency of occurrence of specific events; and (5) to obtain audit evidence about control effectiveness.

  17. Bayes' theorem: A paradigm research tool in biomedical sciences

    African Journals Online (AJOL)

    STORAGESEVER

    2008-12-29

    Dec 29, 2008 ... It is on this premise that this article presents Bayes' theorem as a vital tool. A brief intuitive ... diseased individual will be selected or that a disease-free individual will be selected? ...... Ultrasound physics and. Instruction 3rd ed ...
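
    The diagnostic question posed in this snippet (given a positive test, what is the chance the selected individual is diseased?) is a direct application of Bayes' theorem. A minimal sketch with illustrative prevalence, sensitivity, and specificity values, not figures from the article:

```python
def positive_predictive_value(prevalence, sensitivity, specificity):
    """P(disease | positive test), by Bayes' theorem."""
    true_pos = prevalence * sensitivity                    # P(D) * P(+|D)
    false_pos = (1.0 - prevalence) * (1.0 - specificity)   # P(not D) * P(+|not D)
    return true_pos / (true_pos + false_pos)

# Illustrative screening numbers (placeholders, not values from the article):
# 1% disease prevalence, 90% sensitivity, 95% specificity.
ppv = positive_predictive_value(0.01, 0.90, 0.95)   # about 0.154
```

    Even with a fairly accurate test, low prevalence keeps the positive predictive value modest, which is exactly the kind of counterintuitive result Bayes' theorem exposes in biomedical screening.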

  18. Endothelial cell cultures as a tool in biomaterial research

    NARCIS (Netherlands)

    Kirkpatrick, CJ; Otto, M; van Kooten, T; Krump, [No Value; Kriegsmann, J; Bittinger, F

    1999-01-01

    Progress in biocompatibility and tissue engineering would today be inconceivable without the aid of in vitro techniques. Endothelial cell cultures represent a valuable tool not just in haemocompatibility testing, but also in the concept of designing hybrid organs. In the past endothelial cells (EC)

  19. THE STRUCTURE ANALYTICAL RESEARCH OF POROUS PERMEABLE WIRE MATERIAL (in Russian

    Directory of Open Access Journals (Sweden)

    Andrzej JAKUBOWSKI

    2016-04-01

    Full Text Available The details of the production technology of a porous permeable material made of wire allow an analytical study of the structure and structural characteristics of the wire winding body. This permits prediction of the final properties of the material produced by the subsequent deformation treatment (diameter reduction). Owing to the regular, organized arrangement of the wire, a coil of the winding body is considered as a multispan continuous beam, and the contact between coils as the interaction of two cylinders. The possibility of exact calculation of the contacts between coils allows single-fragment displacements to be translated into the deformation of the whole winding body. By studying the deformation processes with regard to the winding body geometry and the mechanical properties of the wire used, the structural characteristics of the porous permeable wire material are predicted. The optimal number of winding layers, eliminating distortion of the organized final structure, is established. The material's pressure-compactness relation is obtained in order to control the technological conditions of winding and drafting so as to guarantee the required properties of the product.

  20. Facility and application of nuclear and supplementary analytical techniques at Dalat Nuclear Research Institute

    International Nuclear Information System (INIS)

    Nguyen Mong Sinh; Ho Manh Dung; Nguyen Thanh Binh

    2006-01-01

    The main applications of the nuclear and supplementary analytical techniques (N and SATs) at the Dalat Nuclear Research Institute (DNRI) and the facilities for these techniques are presented. The NATs at DNRI include neutron activation analysis (NAA) with instrumental, radiochemical and prompt gamma methods (INAA, RNAA, PGNAA), X-ray fluorescence analysis (XRFA) and low-level counting and spectrometry. The sample irradiation sites for NAA, with automatic and manual pneumatic transfer systems, were installed at channels 7-1 and 13-2 and at the rotary rack of the Dalat research reactor. An ORTEC automatic sample changer (model ASC2) for γ-ray counting was installed. Computer software for NAA based on the k0-standardization method for calculation of elemental concentrations was developed. The low-level counting and spectrometry techniques have been set up. The devices required for sampling, sample preparation and data processing have also been acquired. The applications of N and SATs for the determination of elemental composition, particularly in providing data on trace elements, radionuclides and multi-element content, have been extended to objects of geology, archaeology, bio-agriculture, health-nutrition and the environment. The implementation of a quality system for N and SATs has been planned and initiated. (author)

  1. [Eating, nourishment and nutrition: instrumental analytic categories in the scientific research field].

    Science.gov (United States)

    da Veiga Soares Carvalho, Maria Cláudia; Luz, Madel Therezinha; Prado, Shirley Donizete

    2011-01-01

    Eating, nourishment or nutrition circulate in our culture as synonyms and thus do not account for the changes that occur in nourishment which, intended or unintended, follow a hybridization pattern that represents a change of rules and food preferences. This paper aims to take these common-sense conceptions as analytic categories for analyzing and interpreting research in the Humanities and Health Sciences from a theoretical perspective, through conceptualization. Food is associated with a natural (biological) function, a concept in which nature is opposed to culture, while nourishment takes on cultural (symbolic) meanings, expressing the division of labor and wealth, a historical and cultural creation through which one can study a society. Nutrition is attributed a sense of rational action, derived from the constitution of this science in modernity, inserted in a historical process of scientific rationalization of eating and nourishing. We believe that through the practice of conceptualization in interdisciplinary research, which involves a shared space of knowledge, we can be less constrained by a unified theoretical model of learning and freer to think about life issues.

  2. Design research in statistics education : on symbolizing and computer tools

    NARCIS (Netherlands)

    Bakker, A.

    2004-01-01

    The present knowledge society requires statistical literacy-the ability to interpret, critically evaluate, and communicate about statistical information and messages (Gal, 2002). However, research shows that students generally do not gain satisfactory statistical understanding. The research

  3. Research reactors: a tool for science and medicine

    International Nuclear Information System (INIS)

    Ordonez, Juan

    2001-01-01

    The types and uses of research reactors are reviewed. After an analysis of the world situation, demand for new research reactors of about 20 MW is foreseen. The experience and competitiveness of INVAP S.E. as a designer and constructor of research reactors is outlined, and the general specifications of the reactors designed by INVAP for Egypt and Australia are given.

  4. Tools for Monitoring Social Media: A Marketing Research Project

    Science.gov (United States)

    Veeck, Ann; Hoger, Beth

    2014-01-01

    Knowledge of how to effectively monitor social media is an increasingly valued marketing research skill. This study tests an approach for adding social media content to an undergraduate marketing research class team project. The revised project maintains the expected objectives and parameters of a traditional research project, while integrating…

  5. Development of reliable analytical tools for evaluating the influence of reductive winemaking on the quality of Lugana wines.

    Science.gov (United States)

    Mattivi, Fulvio; Fedrizzi, Bruno; Zenato, Alberto; Tiefenthaler, Paolo; Tempesta, Silvano; Perenzoni, Daniele; Cantarella, Paolo; Simeoni, Federico; Vrhovsek, Urska

    2012-06-30

    This paper presents methods for the definition of important analytical tools, such as the development of sensitive and rapid methods for analysing reduced and oxidised glutathione (GSH and GSSG), hydroxycinnamic acids (HCA), bound thiols (GSH-3MH and Cys-3MH) and free thiols (3MH and 3MHA), and their first application to evaluate the effect of reductive winemaking on the composition of Lugana juices and wines. Lugana is a traditional white wine from the Lake Garda region (Italy), produced using a local grape variety, Trebbiano di Lugana. An innovative winemaking procedure based on preliminary cooling of grape berries followed by crushing in an inert environment was implemented and explored on a winery scale. The effects of these procedures on hydroxycinnamic acids, GSH, GSSG, free and bound thiols and flavanols content were investigated. The juices and wines produced using different protocols were examined. Moreover, wines aged in tanks for 1, 2 and 3 months were analysed. The high level of GSH found in Lugana grapes, which can act as a natural antioxidant and be preserved in must and young wines, thus reducing the need for exogenous antioxidants, was particularly interesting. Moreover, it was clear that polyphenol concentrations (hydroxycinnamic acids and catechins) were strongly influenced by winemaking and pressing conditions, which required fine tuning of pressing. Above-threshold levels of 3-mercaptohexan-1-ol (3MH) and 3-mercaptohexyl acetate (3MHA) were found in the wines and changed according to the winemaking procedure applied. Interestingly, the evolution during the first three months also varied depending on the procedure adopted. Organic synthesis of cysteine and glutathione conjugates was carried out and juices and wines were subjected to LC-MS/MS analysis. These two molecules appeared to be strongly affected by the winemaking procedure, but did not show any significant change during the first 3 months of post-bottling ageing.

  6. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    Science.gov (United States)

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R(2), and the root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content. Absolute biases of 0.01 and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10 and 0.13% (w/w). The repeatability was assessed as sufficient for on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and the residual methanol contents, in order to control the seeding of an API crystallization at industrial scale.
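
    The PLS calibration workflow the abstract outlines (fit on calibration samples, judge candidates by RMSEP on held-out samples) can be sketched with a minimal NIPALS PLS1 implementation. The synthetic "spectra" below are stand-ins for real NIR data; this illustrates the technique, not the authors' model:

```python
import numpy as np

def pls1_fit(X, y, n_components):
    """Minimal NIPALS PLS1; returns regression vector and centering terms."""
    X = np.asarray(X, dtype=float)
    y = np.asarray(y, dtype=float)
    x_mean, y_mean = X.mean(axis=0), y.mean()
    Xc, yc = X - x_mean, y - y_mean
    W, P, Q = [], [], []
    for _ in range(n_components):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)            # weight vector
        t = Xc @ w                        # scores
        tt = t @ t
        p = Xc.T @ t / tt                 # X loadings
        q = (yc @ t) / tt                 # y loading
        Xc = Xc - np.outer(t, p)          # deflate X
        yc = yc - q * t                   # deflate y
        W.append(w); P.append(p); Q.append(q)
    W, P, Q = np.array(W).T, np.array(P).T, np.array(Q)
    B = W @ np.linalg.solve(P.T @ W, Q)   # coefficients in original X space
    return B, x_mean, y_mean

def pls1_predict(B, x_mean, y_mean, X):
    return (np.asarray(X, dtype=float) - x_mean) @ B + y_mean

# Synthetic stand-in data: one spectral band whose intensity tracks the
# analyte content (spanning the 9-12% API range quoted in the abstract).
rng = np.random.default_rng(0)
conc = rng.uniform(9.0, 12.0, size=40)
band = rng.normal(size=50)
spectra = np.outer(conc, band) + 0.05 * rng.normal(size=(40, 50))

# Calibrate on 30 samples, evaluate by RMSEP on the 10 held-out samples.
B, xm, ym = pls1_fit(spectra[:30], conc[:30], n_components=2)
pred = pls1_predict(B, xm, ym, spectra[30:])
rmsep = float(np.sqrt(np.mean((pred - conc[30:]) ** 2)))
```

    In practice the spectral range, pre-treatment, and number of PLS factors would be tuned by cross-validation, as described in the abstract.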

  7. Big Data Analytics and Machine Intelligence Capability Development at NASA Langley Research Center: Strategy, Roadmap, and Progress

    Science.gov (United States)

    Ambur, Manjula Y.; Yagle, Jeremy J.; Reith, William; McLarney, Edward

    2016-01-01

    In 2014, a team of researchers, engineers and information technology specialists at NASA Langley Research Center developed a Big Data Analytics and Machine Intelligence Strategy and Roadmap as part of Langley's Comprehensive Digital Transformation Initiative, with the goal of identifying the goals, objectives, initiatives, and recommendations needed to develop near-, mid- and long-term capabilities for data analytics and machine intelligence in aerospace domains. Since that time, significant progress has been made in developing pilots and projects in several research, engineering, and scientific domains by following the original strategy of collaboration between mission support organizations, mission organizations, and external partners from universities and industry. This report summarizes the work to date in the Data Intensive Scientific Discovery, Deep Content Analytics, and Deep Q&A projects, as well as the progress made in collaboration, outreach, and education. Recommendations for continuing this success into future phases of the initiative are also made.

  8. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition; overview of big data hardware and software architectures; presents a variety of te

  9. Research Game: an innovative educational tool for teachers and students

    Directory of Open Access Journals (Sweden)

    Franca Sangiorgio

    2014-12-01

    Full Text Available This contribution describes 'Research Game', a game produced in a Lifelong Learning Programme-Comenius Project (The European Scientific Research Game) which aims at motivating secondary school students through the experience of the excitement of scientific research. The project proposes practical and didactic work which combines theoretical activities with ICT in order to introduce students to scientific research. Students collaborated internationally across Europe to build hypotheses, carry out research, test the validity of their hypotheses and finalize a theory based on their findings. On the project platform (www.researchgame.eu/platform), teachers and students registered, created a team, interacted in a forum space, and played and learned science in a new, innovative way. There, the students shared their research findings with other groups from all over Europe, and finally competed online, playing a serious game and demonstrating their ability to apply the scientific method.

  10. Real-time particle size analysis using focused beam reflectance measurement as a process analytical technology tool for a continuous granulation-drying-milling process.

    Science.gov (United States)

    Kumar, Vijay; Taylor, Michael K; Mehrotra, Amit; Stagner, William C

    2013-06-01

    Focused beam reflectance measurement (FBRM) was used as a process analytical technology tool to perform inline real-time particle size analysis of a proprietary granulation manufactured using a continuous twin-screw granulation-drying-milling process. A significant relationship between D20, D50, and D80 length-weighted chord length and sieve particle size was observed (p < 0.05).

  11. Review of Research on Student-Facing Learning Analytics Dashboards and Educational Recommender Systems

    Science.gov (United States)

    Bodily, Robert; Verbert, Katrien

    2017-01-01

    This article is a comprehensive literature review of student-facing learning analytics reporting systems that track learning analytics data and report it directly to students. This literature review builds on four previously conducted literature reviews in similar domains. Out of the 945 articles retrieved from databases and journals, 93 articles…

  12. Hydropedology as a powerful tool for environmental policy research

    NARCIS (Netherlands)

    Bouma, J.

    2006-01-01

    Rather than produce clear-cut answers to well-defined problems, research on future environmental policy issues requires a different approach whereby researchers are partners in joint learning processes among stakeholders, policy makers, NGOs (Non-Governmental Organisations) and industry. This

  13. Systems Engineering-Based Tool for Identifying Critical Research Systems

    Science.gov (United States)

    Abbott, Rodman P.; Stracener, Jerrell

    2016-01-01

    This study investigates the relationship between the designated research project system independent variables of Labor, Travel, Equipment, and Contract total annual costs and the dependent variables of both the associated matching research project total annual academic publication output and thesis/dissertation number output. The Mahalanobis…

  14. Bibliometric mapping as a science policy and research management tool

    NARCIS (Netherlands)

    Noyons, Everard Christiaan Marie

    1999-01-01

    Bibliometric maps of science are landscapes of scientific research fields created by quantitative analysis of bibliographic data. In such maps the 'cities' are, for instance, research topics. Topics with a strong cognitive relation are in each other's vicinity and topics with a weak relation are

  15. Process models as tools in forestry research and management

    Science.gov (United States)

    Kurt Johnsen; Lisa Samuelson; Robert Teskey; Steve McNulty; Tom Fox

    2001-01-01

    Forest process models are mathematical representations of biological systems that incorporate our understanding of physiological and ecological mechanisms into predictive algorithms. These models were originally designed and used for research purposes, but are being developed for use in practical forest management. Process models designed for research...

  16. Somatic Sensitivity and Reflexivity as Validity Tools in Qualitative Research

    Science.gov (United States)

    Green, Jill

    2015-01-01

    Validity is a key concept in qualitative educational research. Yet, it is often not addressed in methodological writing about dance. This essay explores validity in a postmodern world of diverse approaches to scholarship, by looking at the changing face of validity in educational qualitative research and at how new understandings of the concept…

  17. Flood Risk Research and Warning Tools at the European Scale

    NARCIS (Netherlands)

    Roo, A.P.J. de; Thielen, J.; Feyen, L.; Burek, P.; Salamon, P.

    2012-01-01

    The floods in the rivers Meuse and Rhine in 1993 and 1995 made the European Commission realize that also at Commission level further research on floods – especially in transboundary river catchments - was necessary. This led to the start of a dedicated research project on floods at the European

  18. Fuzzy Analytic Hierarchy Process-based Chinese Resident Best Fitness Behavior Method Research.

    Science.gov (United States)

    Wang, Dapeng; Zhang, Lan

    2015-01-01

With the rapid development of the Chinese economy, science, and technology, the pursuit of health has intensified, and resident sports fitness activities have expanded accordingly. However, fitness events differ in popularity and in their effects on body energy consumption. On this basis, the paper studies fitness behaviors and derives an exercise guide for Chinese residents' sports fitness, providing guidance for implementing the national fitness plan and putting resident fitness on a more scientific footing. Starting from the perspective of energy consumption, the paper mainly adopts an empirical method: it determines the energy consumption of residents' favorite sports fitness events by observing the energy consumption of various fitness behaviors, and applies the fuzzy analytic hierarchy process to evaluate seven fitness events: bicycle riding, shadowboxing (tai chi), swimming, rope skipping, jogging, running, and aerobics. By calculating the membership values of a fuzzy rating model and comparing them, it identifies the fitness behaviors that are most helpful for resident health, most effective, and most popular. It concludes that swimming is the best exercise mode, with the highest membership value; the memberships of running, rope skipping, and shadowboxing are also relatively high. Residents should combine several of these fitness events according to their individual physical and living conditions to better achieve the purpose of fitness exercise.
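The abstract above describes ranking fitness events by comparing aggregate fuzzy membership values. The following is a minimal sketch of that comparison step; the criteria, weights, and membership values are invented for illustration, since the paper's actual data are not reproduced here.

```python
# Minimal sketch of the membership-comparison step: aggregate each event's
# per-criterion memberships with assumed criterion weights and rank.
# All numbers below are hypothetical, not taken from the paper.

CRITERIA = ["health benefit", "effectiveness", "popularity"]
WEIGHTS = [0.40, 0.35, 0.25]  # assumed criterion weights (sum to 1)

# Membership of each event in the fuzzy set "good fitness choice",
# one value per criterion (all values hypothetical).
MEMBERSHIPS = {
    "swimming":       [0.95, 0.90, 0.85],
    "running":        [0.85, 0.88, 0.80],
    "rope skipping":  [0.80, 0.85, 0.70],
    "shadowboxing":   [0.82, 0.75, 0.78],
    "jogging":        [0.78, 0.72, 0.82],
    "bicycle riding": [0.75, 0.70, 0.80],
    "aerobics":       [0.72, 0.74, 0.76],
}

def composite_membership(values, weights):
    """Weighted aggregate membership for one event."""
    return sum(v * w for v, w in zip(values, weights))

def rank_events(memberships, weights):
    """Return events sorted by composite membership, best first."""
    scores = {e: composite_membership(v, weights) for e, v in memberships.items()}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

ranking = rank_events(MEMBERSHIPS, WEIGHTS)
print(ranking[0][0])  # -> swimming (with these illustrative numbers)
```

With the hypothetical values above, swimming gets the highest composite membership, mirroring the paper's reported conclusion.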

  19. IAEA coordinated research project (CRP) on 'Analytical and experimental benchmark analyses of accelerator driven systems'

    International Nuclear Information System (INIS)

    Abanades, Alberto; Aliberti, Gerardo; Gohar, Yousry; Talamo, Alberto; Bornos, Victor; Kiyavitskaya, Anna; Carta, Mario; Janczyszyn, Jerzy; Maiorino, Jose; Pyeon, Cheolho; Stanculescu, Alexander; Titarenko, Yury; Westmeier, Wolfram

    2008-01-01

In December 2005, the International Atomic Energy Agency (IAEA) started a Coordinated Research Project (CRP) on 'Analytical and Experimental Benchmark Analyses of Accelerator Driven Systems'. The overall objective of the CRP, performed within the framework of the Technical Working Group on Fast Reactors (TWGFR) of the IAEA's Nuclear Energy Department, is to increase the capability of interested Member States in developing and applying advanced reactor technologies in the area of long-lived radioactive waste utilization and transmutation. The specific objective of the CRP is to improve the present understanding of the coupling of an external neutron source (e.g. spallation source) with a multiplicative sub-critical core. The participants are performing computational and experimental benchmark analyses using integrated calculation schemes and simulation methods. The CRP aims at integrating some of the planned experimental demonstration projects of the coupling between a sub-critical core and an external neutron source (e.g. YALINA Booster in Belarus, and Kyoto University's Critical Assembly (KUCA)). The objective of these experimental programs is to validate computational methods, obtain high-energy nuclear data, characterize the performance of sub-critical assemblies driven by external sources, and to develop and improve techniques for sub-criticality monitoring. The paper summarizes preliminary results obtained to date for some of the CRP benchmarks. (authors)

  20. Experimental and Analytical Research on Resonance Phenomena of Vibrating Head with MRE Regulating Element

    Science.gov (United States)

    Miedzińska, D.; Gieleta, R.; Osiński, J.

    2015-02-01

A vibratory pile hammer (VPH) is a mechanical device used to drive steel piles as well as tube piles into soil to provide foundation support for buildings or other structures. In order to increase the stability and the efficiency of VPH operation in the over-resonance frequency range, a new VPH construction was developed at the Military University of Technology. The new VPH contains a system of counter-rotating eccentric weights, powered by hydraulic motors, designed in such a way that horizontal vibrations cancel out while vertical vibrations are transmitted into the pile. This system is suspended from the static parts by adaptive variable-stiffness pillows based on a smart material, magnetorheological elastomer (MRE), whose rheological and mechanical properties can be reversibly and rapidly controlled by an external magnetic field. The work presented in the paper is part of the modified VPH construction design process. It concerns the experimental research on vibrations during the piling process and the analytical analysis of the acquired signal. The results will be applied in the VPH control system.

  1. Social media and the social sciences: How researchers employ Big Data analytics

    Directory of Open Access Journals (Sweden)

    Mylynn Felt

    2016-04-01

Social media posts are full of potential for data mining and analysis. Recognizing this potential, platform providers increasingly restrict free access to such data. This shift provides new challenges for social scientists and other non-profit researchers who seek to analyze public posts with a purpose of better understanding human interaction and improving the human condition. This paper seeks to outline some of the recent changes in social media data analysis, with a focus on Twitter, specifically. Using Twitter data from a 24-hour period following The Sisters in Spirit Candlelight Vigil, sponsored by the Native Women’s Association of Canada, this article compares three free-use Twitter application programming interfaces for capturing tweets and enabling analysis. Although recent Twitter data restrictions limit free access to tweets, there are many dynamic options for social scientists to choose from in the capture and analysis of Twitter and other social media platform data. This paper calls for critical social media data analytics combined with traditional, qualitative methods to address the developing ‘data gold rush.’
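Once tweets have been captured with whichever API tool is chosen, analysis typically starts with simple aggregation. The sketch below shows a generic post-capture step, counting hashtag frequencies, on invented placeholder records; it does not use any specific Twitter API.

```python
# A generic post-capture analysis step: counting hashtag frequencies in
# captured tweet texts. The records below are invented placeholders;
# real captures would come from whichever API tool is used.
from collections import Counter
import re

tweets = [
    "Remembering the vigil tonight #SistersInSpirit #MMIW",
    "Standing with families #SistersInSpirit",
    "Candlelight vigil photos #MMIW #SistersInSpirit",
]

def hashtag_counts(texts):
    """Extract hashtags (case-insensitive) and count occurrences."""
    tags = []
    for t in texts:
        tags.extend(tag.lower() for tag in re.findall(r"#(\w+)", t))
    return Counter(tags)

counts = hashtag_counts(tweets)
print(counts.most_common(1))  # the most frequent hashtag and its count
```

A frequency table like this is usually the first building block before the qualitative analysis the article advocates.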

  2. SMART II : the spot market agent research tool version 2.0.

    Energy Technology Data Exchange (ETDEWEB)

    North, M. J. N.

    2000-12-14

Argonne National Laboratory (ANL) has worked closely with Western Area Power Administration (Western) over many years to develop a variety of electric power marketing and transmission system models that are being used for ongoing system planning and operation as well as analytic studies. Western markets and delivers reliable, cost-based electric power from 56 power plants to millions of consumers in 15 states. The Spot Market Agent Research Tool Version 2.0 (SMART II) is an investigative system that partially implements some important components of several existing ANL linear programming models, including some used by Western. SMART II does not implement a complete model of the Western utility system, but it does include several salient features of this network for exploratory purposes. SMART II uses a Swarm agent-based framework. SMART II agents model bulk electric power transaction dynamics with recognition for marginal costs as well as transmission and generation constraints. SMART II uses a sparse graph of nodes and links to model the electric power spot market. The nodes represent power generators and consumers with distinct marginal decision curves and varying investment capital as well as individual learning parameters. The links represent transmission lines with individual capacities taken from a range of central distribution, outlying distribution and feeder line types. The application of SMART II to electric power systems studies has produced useful results different from those often found using more traditional techniques. Use of the advanced features offered by the Swarm modeling environment simplified the creation of the SMART II model.
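The generator/consumer agents described above ultimately settle transactions around marginal costs. A highly simplified single-node merit-order clearing step is sketched below; the marginal costs, capacities, and demand are invented, and transmission constraints and agent learning are omitted entirely, so this is an illustration of the economic idea rather than SMART II itself.

```python
# Single-node spot-market clearing by merit order: dispatch the cheapest
# generators first until demand is met; the marginal (last-dispatched)
# unit sets the clearing price. All numbers are invented.

generators = [
    # (name, marginal cost $/MWh, capacity MW)
    ("hydro", 12.0, 400.0),
    ("coal", 25.0, 300.0),
    ("gas_peaker", 60.0, 150.0),
]

def clear_market(generators, demand_mw):
    """Return (dispatch dict MW per generator, clearing price $/MWh)."""
    dispatch, remaining, price = {}, demand_mw, 0.0
    for name, cost, cap in sorted(generators, key=lambda g: g[1]):
        take = min(cap, remaining)
        if take > 0:
            dispatch[name] = take
            price = cost          # price tracks the marginal unit
            remaining -= take
    if remaining > 0:
        raise ValueError("demand exceeds total capacity")
    return dispatch, price

dispatch, price = clear_market(generators, demand_mw=550.0)
print(dispatch, price)  # hydro 400 MW, coal 150 MW; clearing price 25.0
```

Agent-based models like SMART II replace this one-shot optimization with repeated bidding and learning, which is where the "results different from traditional techniques" come from.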

  3. TRANSPORTATION RESEARCH IMPLEMENTATION MANAGEMENT : DEVELOPMENT OF PERFORMANCE BASED PROCESSES, METRICS, AND TOOLS

    Science.gov (United States)

    2018-02-02

The objective of this study is to develop an evidence-based research implementation database and tool to support research implementation at the Georgia Department of Transportation (GDOT). A review was conducted drawing from the (1) implementati...

  4. A Primer On Consumer Marketing Research, Procedures, Methods, And Tools

    Science.gov (United States)

    1994-03-01

The following is one of a series of papers developed or produced by the Economic Analysis Division of the John A. Volpe National Transportation Systems Center as part of its research project looking into issues surrounding: user response and market ...

  5. Applying Web-Based Tools for Research, Engineering, and Operations

    Science.gov (United States)

    Ivancic, William D.

    2011-01-01

Personnel in the NASA Glenn Research Center Network and Architectures branch have performed a variety of research related to space-based sensor webs, network centric operations, security and delay tolerant networking (DTN). Quality documentation and communications, real-time monitoring and information dissemination are critical in order to perform quality research while maintaining low cost and utilizing multiple remote systems. This has been accomplished using a variety of Internet technologies often operating simultaneously. This paper describes important features of various technologies and provides a number of real-world examples of how combining Internet technologies can enable a virtual team to act efficiently as one unit to perform advanced research in operational systems. Finally, real and potential abuses of power and manipulation of information and information access are addressed.

  6. Stable isotopes: essential tools in biological and medical research

    Energy Technology Data Exchange (ETDEWEB)

    Klein, P. D.; Hachey, D. L.; Kreek, M. J.; Schoeller, D. A.

    1977-01-01

Recent developments in the use of the stable isotopes ¹³C, ¹⁵N, ¹⁷O, and ¹⁸O as tracers in research studies in the fields of biology, medicine, pharmacology, and agriculture are briefly reviewed. (CH)

  7. Research articles as a didactic tool in undergraduate chemistry teaching

    OpenAIRE

    Massi, Luciana; Santos, Gelson Ribeiro dos; Ferreira, Jerino Queiroz; Queiroz, Salete Linhares

    2009-01-01

    Chemistry teachers increasingly use research articles in their undergraduate courses. This trend arises from current pedagogical emphasis on active learning and scientific process. In this paper, we describe some educational experiences on the use of research articles in chemistry higher education. Additionally, we present our own conclusions on the use of such methodology applied to a scientific communication course offered to undergraduate chemistry students at the University of São Paulo, ...

  8. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research. Copyright © 2016 the American Physiological Society.
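A notebook-based workflow like the one described typically begins with detecting spikes before sorting them. The sketch below shows a standard threshold-crossing detector with a median-absolute-deviation noise estimate, run on synthetic data; it is an illustrative first step, not the authors' actual pipeline.

```python
# Threshold-crossing spike detection on a synthetic trace: inject three
# large deflections into Gaussian noise, then recover them by thresholding
# at k times a robust (MAD-based) noise estimate. Illustrative sketch only.
import numpy as np

rng = np.random.default_rng(0)
fs = 10_000                       # sampling rate (Hz), assumed
trace = rng.normal(0, 1, fs)      # 1 s of unit-variance noise
spike_times = [1000, 4000, 7500]  # known spike sample indices
for t in spike_times:
    trace[t] += 10.0              # inject large deflections

def detect_spikes(x, k=5.0):
    """Return sample indices where x first crosses k * robust SD.
    The median-absolute-deviation estimate (median|x| / 0.6745) is a
    common noise proxy in extracellular spike detection."""
    sigma = np.median(np.abs(x)) / 0.6745
    above = x > k * sigma
    # keep only rising edges so each spike is counted once
    return np.flatnonzero(above & ~np.roll(above, 1))

detected = detect_spikes(trace)
print(detected)  # should recover the three injected spike indices
```

In a real recording, the detected indices would feed into waveform extraction and clustering (spike sorting), which the paper demonstrates with statistical validation.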

  9. Undergraduate Research in Physics as an Educational Tool

    Science.gov (United States)

    Hakim, Toufic M.; Garg, Shila

    2001-03-01

    The National Science Foundation's 1996 report "Shaping the Future: New Expectations for Undergraduate Education in Science, Mathematics, Engineering and Technology" urged that in order to improve SME&T education, decisive action must be taken so that "all students have access to excellent undergraduate education in science .... and all students learn these subjects by direct experience with the methods and processes of inquiry." Research-related educational activities that integrate education and research have been shown to be valuable in improving the quality of education and enhancing the number of majors in physics departments. Student researchers develop a motivation to continue in science and engineering through an appreciation of how science is done and the excitement of doing frontier research. We will address some of the challenges of integrating research into the physics undergraduate curriculum effectively. The departmental and institutional policies and infrastructure required to help prepare students for this endeavor will be discussed as well as sources of support and the establishment of appropriate evaluation procedures.

  10. Recorded peer video chat as a research and development tool

    DEFF Research Database (Denmark)

    Otrel-Cass, Kathrin; Cowie, Bronwen

    2016-01-01

When practising teachers take time to exchange their experiences and reflect on their teaching realities as critical friends, they add meaning and depth to educational research. When peer talk is facilitated through video chat platforms, teachers can meet (virtually) face to face even when… recordings were transcribed and used to prompt further discussion. The recording of the video chat meetings provided an opportunity for researchers to listen in and follow up on points they felt needed further unpacking or clarification. The recorded peer video chat conversations provided an additional… opportunity to stimulate and support teacher participants in a process of critical analysis and reflection on practice. The discussions themselves were empowering because in the absence of the researcher, the teachers, in negotiation with peers, choose what is important enough to them to take time to discuss…

  11. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented encompassing protein characterization prior to and after cloning of the corresponding gene.
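Mass-spectrometric peptide characterization of the kind this abstract describes rests on comparing measured masses with masses computed from sequence. The sketch below shows that basic arithmetic using the standard monoisotopic residue masses; it is illustrative and not taken from the paper.

```python
# Computing a peptide's neutral monoisotopic mass from its sequence:
# sum the standard monoisotopic residue masses and add one water for
# the termini. Illustrative sketch, not from the paper above.

MONO = {  # monoisotopic residue masses (Da)
    "G": 57.02146, "A": 71.03711, "S": 87.03203, "P": 97.05276,
    "V": 99.06841, "T": 101.04768, "C": 103.00919, "L": 113.08406,
    "I": 113.08406, "N": 114.04293, "D": 115.02694, "Q": 128.05858,
    "K": 128.09496, "E": 129.04259, "M": 131.04049, "H": 137.05891,
    "F": 147.06841, "R": 156.10111, "Y": 163.06333, "W": 186.07931,
}
WATER = 18.01056  # mass of H2O added for the N- and C-termini

def monoisotopic_mass(sequence):
    """Neutral monoisotopic mass of an unmodified linear peptide."""
    return sum(MONO[aa] for aa in sequence) + WATER

print(round(monoisotopic_mass("PEPTIDE"), 2))  # ≈ 799.36
```

Matching such computed masses against measured ESI or MALDI peaks is the core step in confirming that an expressed recombinant protein has the expected sequence.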

  12. Integrating agricultural research and policy analysis: analytical framework and policy applications for bio-economic modelling.

    NARCIS (Netherlands)

    Ruben, R.; Moll, H.; Kuyvenhoven, A.

    1998-01-01

    Interdisciplinary approaches to identify suitable incentives for enhancing sustainable natural resource use require an analytical framework that satisfies both practical purposes of policy support and disciplinary requirements regarding the specification of underlying technical and behavioural

  13. Research Tools for the Measurement of Pain and Nociception

    Directory of Open Access Journals (Sweden)

    Craig Johnson

    2016-11-01

There are many ways in which pain in animals can be measured and these are based on a variety of phenomena that are related to either the perception of pain or alterations in physical or behavioural features of the animal that are caused by that pain. The features of pain that are most useful for assessment in clinical environments are not always the best to use in a research environment. This is because the aims and objectives of the two settings are different and so whilst particular techniques will have the same advantages and disadvantages in clinical and research environments, these considerations may become more or less of a drawback when moving from one environment to the other. For example, a simple descriptive pain scale has a number of advantages and disadvantages. In a clinical setting the advantages are very useful and the disadvantages are less relevant, but in a research environment the advantages are less important and the disadvantages can become more problematic. This paper will focus on pain in the research environment and after a brief revision of the pathophysiological systems involved will attempt to outline the major advantages and disadvantages of the more commonly used measurement techniques that have been used for studies in the area of pain perception and analgesia. This paper is expanded from a conference proceedings paper presented at the International Veterinary Emergency and Critical Care Conference in San Diego, USA.

  14. The portuguese research reactor: A tool for the next century

    International Nuclear Information System (INIS)

    Ramalho, A.J.G.; Marques, J.G.; Cardeira, F.M.

    2000-01-01

A short presentation is made of the utilisation of the Portuguese Research Reactor, its problems, and the solutions found. Starting with the initial calibration and experiments, the routine operation at full power follows. The problems then encountered, which led to the refurbishment, are described. The present status of the system is then presented, and from that, conclusions for the future are derived. (author)

  15. Using smartphones in survey research: a multifunctional tool

    NARCIS (Netherlands)

    Nathalie Sonck; Henk Fernee

    2013-01-01

    Smartphones and apps offer an innovative means of collecting data from the public. The Netherlands Institute for Social Research | SCP has been engaged in one of the first experiments involving the use of a smartphone app to collect time use data recorded by means of an electronic diary. Is it

  16. Computers, Laptops and Tools. ACER Research Monograph No. 56.

    Science.gov (United States)

    Ainley, Mary; Bourke, Valerie; Chatfield, Robert; Hillman, Kylie; Watkins, Ian

    In 1997, Balwyn High School (Australia) instituted a class of 28 Year 7 students to use laptop computers across the curriculum. This report details findings from an action research project that monitored important aspects of what happened when this program was introduced. A range of measures was developed to assess the influence of the use of…

  17. Administrative Data Linkage as a Tool for Child Maltreatment Research

    Science.gov (United States)

    Brownell, Marni D.; Jutte, Douglas P.

    2013-01-01

    Linking administrative data records for the same individuals across services and over time offers a powerful, population-wide resource for child maltreatment research that can be used to identify risk and protective factors and to examine outcomes. Multistage de-identification processes have been developed to protect privacy and maintain…

  18. TPACK: An Emerging Research and Development Tool for Teacher Educators

    Science.gov (United States)

    Baran, Evrim; Chuang, Hsueh-Hua; Thompson, Ann

    2011-01-01

    TPACK (technological pedagogical content knowledge) has emerged as a clear and useful construct for researchers working to understand technology integration in learning and teaching. Whereas first generation TPACK work focused upon explaining and interpreting the construct, TPACK has now entered a second generation where the focus is upon using…

  19. "Intelligent" tools for workflow process redesign : a research agenda

    NARCIS (Netherlands)

    Netjes, M.; Vanderfeesten, I.T.P.; Reijers, H.A.; Bussler, C.; Haller, A.

    2006-01-01

    Although much attention is being paid to business processes during the past decades, the design of business processes and particularly workflow processes is still more art than science. In this workshop paper, we present our view on modeling methods for workflow processes and introduce our research

  20. Microelectrodes as novel research tools for environmental biofilm studies

    International Nuclear Information System (INIS)

    Yu, T.; Lu, R.; Bishop, L.

    2002-01-01

Biofilm processes are widely utilized in environmental engineering for biodegradation of contaminated waters, gases and soils. It is important to understand the structure and functions of biofilms. Microelectrodes are novel experimental tools for environmental biofilm studies. The authors reviewed oxygen, sulfide, redox potential and pH microelectrode techniques. These microelectrodes have tip diameters of 3 to 20 μm, resulting in high spatial resolution. They enable direct measurement of the chemical conditions that result from microbial activities in biofilms. The authors also reported laboratory and field studies of wastewater biofilms using microelectrode techniques. The results of these studies provided experimental evidence on the stratification of microbial processes and the associated redox potential changes in wastewater biofilms: (1) The oxygen penetration depth was only a fraction of the biofilm thickness. This observation, first made under laboratory conditions, has been confirmed under field conditions. (2) Biofilms with both aerobic oxidation and sulfate reduction had a clearly stratified structure, evidenced by a sharp decrease of redox potential near the interface between the aerobic zone and the sulfate reduction zone within the biofilm. In this type of biofilm, aerobic oxidation took place only in a shallow layer near the biofilm surface, and sulfate reduction occurred in the deeper anoxic zone. (3) The redox potential changed with shifts in the primary microbial process in biofilms, indicating that redox potential can help illustrate the structure and functions of biofilms. (author)

  1. Optimization of IC/HPLC as a rapid analytical tool for characterization of total impurities in UO2

    International Nuclear Information System (INIS)

    Kelkar, A.G.; Kapoor, Y.S.; Mahanty, B.N.; Fulzele, A.K.; Mallik, G.K.

    2007-01-01

    Use of ion chromatography in the determination of metallic and non metallic impurities has been studied and observed to be very satisfactory. In the present paper the total time was monitored in all these experiments and compared with the conventional analytical techniques. (author)

  2. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    DEFF Research Database (Denmark)

    Kaspersen, Per Skougaard; Halsnæs, Kirsten; Gregg, Jay Sterling

    . The project is one of seven initiatives proposed by KFT for 2012. The methodology report includes definitions of major concepts, an outline of an analytical structure, a presentation of models and their applicability, and the results of case studies. The work presented in this report draws on intensive...

  3. Tools for Reproducibility and Extensibility in Scientific Research

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    Open inquiry through reproducing results is fundamental to the scientific process. Contemporary research relies on software engineering pipelines to collect, process, and analyze data. The open source projects within Project Jupyter facilitate these objectives by bringing software engineering within the context of scientific communication. We will highlight specific projects that are computational building blocks for scientific communication, starting with the Jupyter Notebook. We will also explore applications of projects that build off of the Notebook such as Binder, JupyterHub, and repo2docker. We will discuss how these projects can individually and jointly improve reproducibility in scientific communication. Finally, we will demonstrate applications of Jupyter software that allow researchers to build upon the code of other scientists, both to extend their work and the work of others.    There will be a follow-up demo session in the afternoon, hosted by iML. Details can be foun...

  4. The Systems Biology Research Tool: evolvable open-source software

    OpenAIRE

    Wright, J; Wagner, A

    2008-01-01

Background: Research in the field of systems biology requires software for a variety of purposes. Software must be used to store, retrieve, analyze, and sometimes even to collect the data obtained from system-level (often high-throughput) experiments. Software must also be used to implement mathematical models and algorithms required for simulation and theoretical predictions on the system-level. Results: We introduce a free, easy-to-use, open-source, integrated software platform calle...

  5. A vibrating quartz fork - a tool for cryogenic helium research

    Czech Academy of Sciences Publication Activity Database

    Blažková, Michaela; Člověčko, M.; Eltsov, V. B.; Gažo, E.; de Graaf, R.; Hosio, J.J.; Krusius, M.; Schmoranzer, D.; Schoepe, W.; Skrbek, Ladislav; Skyba, P.; Solntsev, R.E.; Vinen, W. F.

    2008-01-01

Roč. 150, - (2008), s. 525-535 ISSN 0022-2291 R&D Projects: GA ČR GA202/05/0218 Grant - others: GAUK(CZ) 7953/2007; Transnational Access Programme(XE) RITA-CT-2003-505313 Institutional research plan: CEZ:AV0Z10100520 Keywords: normal 3He * superfluid 3He * superfluid 4He * turbulence * cavitation * quartz tuning fork Subject RIV: BK - Fluid Dynamics Impact factor: 1.034, year: 2008

  6. Modelling as an indispensible research tool in the information society.

    Science.gov (United States)

    Bouma, Johan

    2016-04-01

Science and society would be well advised to develop a different relationship as the information revolution penetrates all aspects of modern life. Rather than produce clear answers to clear questions in a top-down manner, land-use issues related to the UN Sustainable Development Goals (SDGs) present "wicked" problems involving different, strongly opinionated stakeholders with conflicting ideas and interests, and risk-averse politicians. The Dutch government has invited its citizens to develop a "science agenda" defining future research needs, implicitly suggesting that the research community is unable to do so. Time, therefore, for a pro-active approach to more convincingly define our "societal license to research". For soil science this could imply a focus on the SDGs, considering soils as living, characteristically different, dynamic bodies in a landscape, to be mapped in ways that allow generation of suitable modelling data. Models allow a dynamic characterization of water and nutrient regimes and plant growth in soils for both actual and future conditions, reflecting e.g. effects of climate or land-use change or alternative management practices. Engaging modern stakeholders in a bottom-up manner implies continuous involvement and "joint learning" from project initiation to completion, where modelling results act as building blocks to explore alternative scenarios. Modern techniques allow very rapid calculations and innovative visualization. Everything is possible, but only modelling can articulate the economic, social and environmental consequences of each scenario, demonstrating in a pro-active manner the crucial and indispensable role of research. But choices are to be made by stakeholders and reluctant policy makers, and certainly not by scientists, who should carefully guard their independence. Only clear results in the end are convincing proof of the impact of science, requiring continued involvement of scientists up to the very end of projects. To

  7. Use of nuclear and related analytical techniques in environmental research as exemplified by selected air pollution studies

    International Nuclear Information System (INIS)

    Smodis, B.; Jacimovic, R.; Jeran, Z.; Stropnik, B.; Svetina, M.

    2000-01-01

Among nuclear and nuclear-related analytical techniques, neutron activation analysis and X-ray fluorescence spectrometry proved to be particularly useful for environmental studies owing to their nondestructive character and multielement capability. This paper emphasizes their importance among other multielement analytical methods by discussing their specific role, due to a physical basis quite different from that of destructive non-nuclear methods, and by summarizing results obtained in several studies related to air pollution research, including analyses of airborne particulate matter, water samples, lichens and mosses. (author)

  8. NASA Global Hawk: A New Tool for Earth Science Research

    Science.gov (United States)

    Hall, Phill

    2009-01-01

This slide presentation reviews the Global Hawk, an unmanned aerial vehicle (UAV) that NASA plans to use for Earth science research. The Global Hawk is the world's first fully autonomous high-altitude, long-endurance aircraft, and is capable of conducting long duration missions. Plans are being made for the use of the aircraft on missions in the Arctic, Pacific and Western Atlantic Oceans. There are slides showing the Global Hawk Operations Center (GHOC), Flight Control and Air Traffic Control Communications Architecture, and Payload Integration and Accommodations on the Global Hawk. The first science campaign, planned for a study of the Pacific Ocean, is reviewed.

  9. Digital storytelling: an innovative tool for practice, education, and research.

    Science.gov (United States)

    Lal, Shalini; Donnelly, Catherine; Shin, Jennifer

    2015-01-01

    Digital storytelling is a method of using storytelling, group work, and modern technology to facilitate the creation of 2-3 minute multi-media video clips to convey personal or community stories. Digital storytelling is being used within the health care field; however, there has been limited documentation of its application within occupational therapy. This paper introduces digital storytelling and proposes how it can be applied in occupational therapy clinical practice, education, and research. The ethical and methodological challenges in relation to using the method are also discussed.

  10. The NASA Human Research Wiki - An Online Collaboration Tool

    Science.gov (United States)

    Barr, Yael; Rasbury, Jack; Johnson, Jordan; Barstend, Kristina; Saile, Lynn; Watkins, Sharmi

    2012-01-01

    The Exploration Medical Capability (ExMC) element is one of six elements of the Human Research Program (HRP). ExMC is charged with decreasing the risk of "inability to adequately recognize or treat an ill or injured crew member" for exploration-class missions. In preparation for such missions, ExMC has compiled a large evidence base, previously available only to persons within the NASA community. ExMC has developed the NASA Human Research Wiki in an effort to make this information available to the general public and to increase collaboration within and outside of NASA. The ExMC evidence base comprises several types of data, including: (1) information on more than 80 medical conditions that could occur during space flight, derived from several sources and including data on incidence and potential outcomes, as captured in the Integrated Medical Model's (IMM) Clinical Finding Forms (CliFFs); and (2) approximately 25 gap reports that identify "gaps" in knowledge and/or technology that would need to be addressed in order to provide adequate medical support for these novel missions.

  11. A tool for assessing the feasibility of comparative effectiveness research

    Directory of Open Access Journals (Sweden)

    Walker AM

    2013-01-01

    Full Text Available Alexander M Walker,1 Amanda R Patrick,2 Michael S Lauer,3 Mark C Hornbrook,4 Matthew G Marin,5 Richard Platt,6 Véronique L Roger,7 Paul Stang,8 Sebastian Schneeweiss2 1World Health Information Science Consultants, Newton, MA; 2Division of Pharmacoepidemiology and Pharmacoeconomics, Brigham and Women's Hospital, Boston, MA; 3National Heart, Lung, and Blood Institute, National Institutes of Health, Bethesda, MD; 4The Center for Health Research, Kaiser Permanente Northwest, Portland, OR; 5Department of Medicine, New Jersey Medical School, Newark, NJ; 6Department of Population Medicine, Harvard Pilgrim Health Care Institute and Harvard Medical School, Boston, MA; 7Department of Health Sciences Research, Mayo Clinic, Rochester, MN; 8Johnson and Johnson Pharmaceutical Research and Development, Titusville, NJ, USA. Background: Comparative effectiveness research (CER) provides actionable information for health care decision-making. Randomized clinical trials cannot provide the patients, time horizons, or practice settings needed for all required CER. The need for comparative assessments and the infeasibility of conducting randomized clinical trials in all relevant areas are leading researchers and policy makers to non-randomized, retrospective CER. Such studies are possible when rich data exist on large populations receiving alternative therapies that are used as if interchangeably in clinical practice. This setting we call "empirical equipoise." Objectives: This study sought to provide a method for the systematic identification of settings in which empirical equipoise offers the promise of valid non-randomized CER. Methods: We used a standardizing transformation of the propensity score called "preference" to assess pairs of common treatments for uncomplicated community-acquired pneumonia and new-onset heart failure in a population of low-income elderly people in Pennsylvania, for whom we had access to de-identified insurance records. Treatment
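
    The "preference" standardization mentioned in the abstract can be illustrated with a short sketch. The logit-shift form below (subtracting the logit of the overall treated fraction from the logit of the propensity score) is one published way to express it, but treat the exact formula and the numbers as assumptions rather than as this paper's definition:

```python
import math

def logit(p):
    return math.log(p / (1 - p))

def inv_logit(x):
    return 1 / (1 + math.exp(-x))

def preference_score(propensity, treated_fraction):
    """Shift the propensity score so that 0.5 means 'no preference',
    regardless of how common the treatment is overall (assumed form)."""
    return inv_logit(logit(propensity) - logit(treated_fraction))

# For a treatment given to 30% of patients, a propensity of 0.3 maps to 0.5:
# the clinician shows no preference beyond the overall market share.
print(round(preference_score(0.3, 0.3), 3))  # prints 0.5
```

    A preference near 0.5 indicates choice consistent with overall prescribing patterns; empirical equipoise then corresponds to most patients falling in the middle of the preference scale rather than at its extremes.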

  12. Performance calculations for battery power supplies as laboratory research tools

    International Nuclear Information System (INIS)

    Scanlon, J.J.; Rolader, G.E.; Jamison, K.A.; Petresky, H.

    1991-01-01

    Electromagnetic Launcher (EML) research at the Air Force Armament Laboratory, Hypervelocity Launcher Branch (AFATL/SAH), Eglin AFB, has focused on developing the technologies required for repetitively launching several-kilogram payloads to high velocities. Previous AFATL/SAH experiments have been limited by the available power supply, resulting in small muzzle energies on the order of hundreds of kilojoules. In an effort to advance the development of EMLs, AFATL/SAH has designed and constructed a battery power supply (BPS) capable of providing several megaamperes of current for several seconds. The system consists of six modules, each containing 2288 automotive batteries, which may be connected in two different series-parallel arrangements. In this paper the authors define the electrical characteristics of the AFATL battery power supply at the component level

  13. Electrostatic Levitation: A Tool to Support Materials Research in Microgravity

    Science.gov (United States)

    Rogers, Jan; SanSoucie, Mike

    2012-01-01

    Containerless processing represents an important topic for materials research in microgravity. Levitated specimens are free from contact with a container, which permits studies of deeply undercooled melts, and high-temperature, highly reactive materials. Containerless processing provides data for studies of thermophysical properties, phase equilibria, metastable state formation, microstructure formation, undercooling, and nucleation. The European Space Agency (ESA) and the German Aerospace Center (DLR) jointly developed an electromagnetic levitator facility (MSL-EML) for containerless materials processing in space. The electrostatic levitator (ESL) facility at the Marshall Space Flight Center provides support for the development of containerless processing studies for the ISS. Apparatus and techniques have been developed to use the ESL to provide data for phase diagram determination, creep resistance, emissivity, specific heat, density/thermal expansion, viscosity, surface tension and triggered nucleation of melts. The capabilities and results from selected ESL-based characterization studies performed at NASA's Marshall Space Flight Center will be presented.

  14. FACTORIAL CORRESPONDENCES ANALYSIS – A TOOL IN TOURISM MOTIVATION RESEARCH

    Directory of Open Access Journals (Sweden)

    Ion Danut I. JUGANARU

    2016-05-01

    Full Text Available This study analyzes the distribution of tourist flows in 2014 from 25 European countries across three main categories of trip purpose, and assumes that there are differences or similarities between the tourists' countries of residence and their trip purposes. "Purpose" is a multidimensional concept used in marketing research, most often for understanding consumer behavior and for identifying market segments or customer target groups defined by similar characteristics. Since the choice/purchase decision is based on purposes, knowing them proves useful in designing strategies to increase the satisfaction level provided to the customer. The statistical method used in this paper is factorial correspondences analysis. In our opinion, identifying by this method the existence of differences or similarities between the tourists' countries of residence and their trip purposes can be a useful step in studying the tourism market and in choosing or reformulating strategies.

  15. Consensus Coding as a Tool in Visual Appearance Research

    Directory of Open Access Journals (Sweden)

    D R Simmons

    2011-04-01

    Full Text Available A common problem in visual appearance research is how to quantitatively characterise the visual appearance of a region of an image that is categorised by human observers in the same way. An example of this is scarring in medical images (Ayoub et al, 2010, The Cleft Palate-Craniofacial Journal, in press). We have argued that "scarriness" is itself a visual appearance descriptor which summarises the distinctive combination of colour, texture and shape information that allows us to distinguish scarred from non-scarred tissue (Simmons et al, ECVP 2009). Other potential descriptors for other image classes would be "metallic", "natural", or "liquid". Having developed an automatic algorithm to locate scars in medical images, we then tested "ground truth" by asking untrained observers to draw around the region of scarring. The shape and size of the scar on the image was defined by building a contour plot of the agreement between observers' outlines and thresholding at the point above which 50% of the observers agreed: a consensus coding scheme. Based on the variability in the amount of overlap between the scar as defined by the algorithm and the consensus scar of the observers, we have concluded that the algorithm does not completely capture the putative appearance descriptor "scarriness". A simultaneous analysis of qualitative descriptions of the scarring by the observers revealed that image features other than those encoded by the algorithm (colour and texture) might be important, such as scar boundary shape. This approach to visual appearance research in medical imaging has potential applications in other areas, such as botany, geology and archaeology.
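
    The consensus coding scheme described above, building a per-pixel agreement map from the observers' outlines and keeping pixels where at least 50% agree, can be sketched in a few lines. The binary masks below are hypothetical stand-ins for observer tracings:

```python
import numpy as np

def consensus_region(masks, threshold=0.5):
    """Combine binary masks (one per observer) and keep pixels
    where at least `threshold` of observers marked the region."""
    agreement = np.mean(np.stack(masks), axis=0)  # per-pixel fraction agreeing
    return agreement >= threshold

# Three observers outline a 4x4 image; only pixels marked by >= 2 of 3 survive.
m1 = np.array([[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
m2 = np.array([[1, 1, 1, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
m3 = np.array([[0, 1, 0, 0], [1, 1, 0, 0], [0, 0, 0, 0], [0, 0, 0, 0]])
consensus = consensus_region([m1, m2, m3])
print(int(consensus.sum()))  # prints 4
```

    The same agreement map can be compared against an algorithm's mask (e.g. via overlap measures such as the Dice coefficient) to quantify how well the algorithm captures the consensus region.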

  16. A validated set of tool pictures with matched objects and non-objects for laterality research.

    Science.gov (United States)

    Verma, Ark; Brysbaert, Marc

    2015-01-01

    Neuropsychological and neuroimaging research has established that knowledge related to tool use and tool recognition is lateralized to the left cerebral hemisphere. Recently, behavioural studies with the visual half-field technique have confirmed the lateralization. A limitation of this research was that different sets of stimuli had to be used for the comparison of tools to other objects and objects to non-objects. Therefore, we developed a new set of stimuli containing matched triplets of tools, other objects and non-objects. With the new stimulus set, we successfully replicated the findings of no visual field advantage for objects in an object recognition task combined with a significant right visual field advantage for tools in a tool recognition task. The set of stimuli is available as supplemental data to this article.

  17. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder.

  18. Investigation of material systems in industry and research by organic analytical mass spectrometer

    International Nuclear Information System (INIS)

    Decsy, Z.

    1980-01-01

    The modern, versatile and efficient organic analytical mass spectrometer possesses all the structure- and composition-examination capabilities of a complex organic analytical laboratory. The article presents the advantages and possibilities of applying the mass spectrometer in different operating modes to the examination of a petrochemical synthesis product (ortho-phenylene-diamine), an experimental gas-odorizing material, a petroleum-production auxiliary material (petroleum sulfonate), a gasoline sample and a sulfur-containing standard substance. The useful operating modes include low- and high-resolution spectrum recording, the application of field ionization and field desorption ion sources, as well as the "mass fragmentographic" measuring method. (author)

  19. Raising Reliability of Web Search Tool Research through Replication and Chaos Theory

    OpenAIRE

    Nicholson, Scott

    1999-01-01

    Because the World Wide Web is a dynamic collection of information, the Web search tools (or "search engines") that index the Web are dynamic. Traditional information retrieval evaluation techniques may not provide reliable results when applied to the Web search tools. This study is the result of ten replications of the classic 1996 Ding and Marchionini Web search tool research. It explores the effects that replication can have on transforming unreliable results from one iteration into replica...

  20. Research on Key Technologies of Unit-Based CNC Machine Tool Assembly Design

    OpenAIRE

    Zhongqi Sheng; Lei Zhang; Hualong Xie; Changchun Liu

    2014-01-01

    Assembly accounts for the largest workload and the most time consumed during the product design and manufacturing process. CNC machine tools are key basic equipment in the manufacturing industry, and research on assembly design technologies for CNC machine tools has theoretical significance and practical value. This study established a simplified ASRG for a CNC machine tool. The connection between parts, semantic information of transmission, and geometric constraint information were quantified to as...

  1. Patient registries: useful tools for clinical research in myasthenia gravis.

    Science.gov (United States)

    Baggi, Fulvio; Mantegazza, Renato; Antozzi, Carlo; Sanders, Donald

    2012-12-01

    Clinical registries may facilitate research on myasthenia gravis (MG) in several ways: as a source of demographic, clinical, biological, and immunological data on large numbers of patients with this rare disease; as a source of referrals for clinical trials; and by allowing rapid identification of MG patients with specific features. Physician-derived registries have the added advantage of incorporating diagnostic and treatment data that may allow comparison of outcomes from different therapeutic approaches, which can be supplemented with patient self-reported data. We report the demographic analysis of MG patients in two large physician-derived registries, the Duke MG Patient Registry, at the Duke University Medical Center, and the INNCB MG Registry, at the Istituto Neurologico Carlo Besta, as a preliminary study to assess the consistency of the two data sets. These registries share a common structure, with an inner core of common data elements (CDE) that facilitate data analysis. The CDEs are concordant with the MG-specific CDEs developed under the National Institute of Neurological Disorders and Stroke Common Data Elements Project. © 2012 New York Academy of Sciences.

  2. Comparison of nuclear safety research reactor (TRIGA-ACPR) performance with analytical prediction

    International Nuclear Information System (INIS)

    West, G.B.; Whittemore, W.L.

    1976-01-01

    The NSRR was taken critical on June 30, 1975 at the Japan Atomic Energy Research Institute - Tokai-mura, Japan. Following initial core loading and control rod calibration, a series of pulsing tests was performed to characterize the performance of the reactor. A comparison has been made of performance parameters actually measured in the 157 element core versus predicted values based upon design analyses. The nuclear parameters measured were quite close to prediction. A $4.70 pulse produced a minimum period of 1.12 msec, a peak power of 20,500 MW and yielded a prompt energy release of 103 MW-sec. Pulse tests with experimental UO2 fuel pins in the central irradiation cavity have produced 320 cal/g, averaged at the axial center of 10% enriched UO2, for a 100 MW-sec pulse. The pulse rods for the NSRR contain B4C enriched to about 93 percent in Boron-10 in order to achieve maximum design performance with only three pulse rods. The total worth for the three transient rods was measured to be about $5.05 (vs $5.07 calculated for the 165 element core), thus verifying the effectiveness of the Boron-10 enrichment to achieve the desired result. Analysis of fuel temperature measurements made in the NSRR shows that, for fuel temperatures produced during pulsing greater than 900°C, heat transfer in the 0.010-inch gap between fuel and clad is enhanced by the minor outgassing of hydrogen which is characteristic of that temperature region. The hydrogen is normally all reabsorbed within about 100 sec of maximum temperature, at which time the heat transfer is characteristic of air (or argon) in the gap. In some of the temperature-instrumented elements, however, all of the hydrogen was not reabsorbed, and as a result these elements gave significantly lower temperatures for high power steady state operation than were recorded prior to pulsing. In general, the NSRR parameters measured during startup were quite close to analytical prediction and the overall performance of the

  3. Emerging imaging tools for use with traumatic brain injury research.

    Science.gov (United States)

    Hunter, Jill V; Wilde, Elisabeth A; Tong, Karen A; Holshouser, Barbara A

    2012-03-01

    This article identifies emerging neuroimaging measures considered by the inter-agency Pediatric Traumatic Brain Injury (TBI) Neuroimaging Workgroup. It attempts to address some of the potential uses of more advanced forms of imaging in TBI and to highlight some of the current considerations and unresolved challenges of using them. We summarize emerging elements likely to gain more widespread use in the coming years because of 1) their utility in diagnosis, prognosis, and understanding the natural course of degeneration or recovery following TBI, and their potential for evaluating treatment strategies; 2) the ability of many centers to acquire these data with scanners and equipment that are readily available in existing clinical and research settings; and 3) advances in software that provide more automated, readily available, and cost-effective methods for large-scale image analysis. These include multi-slice CT, volumetric MRI analysis, susceptibility-weighted imaging (SWI), diffusion tensor imaging (DTI), magnetization transfer imaging (MTI), arterial spin tag labeling (ASL), functional MRI (fMRI), including resting-state and connectivity MRI, MR spectroscopy (MRS), and hyperpolarization scanning. However, we also include brief introductions to other specialized forms of advanced imaging that currently do require specialized equipment, for example, single photon emission computed tomography (SPECT), positron emission tomography (PET), electroencephalography (EEG), and magnetoencephalography (MEG)/magnetic source imaging (MSI). Finally, we identify some of the challenges that users of the emerging imaging CDEs may wish to consider, including quality control, performing multi-site and longitudinal imaging studies, and MR scanning in infants and children.

  4. Data Linkage: A powerful research tool with potential problems

    Directory of Open Access Journals (Sweden)

    Scott Ian

    2010-12-01

    Full Text Available Abstract. Background: Policy makers, clinicians and researchers are demonstrating increasing interest in using data linked from multiple sources to support measurement of clinical performance and patient health outcomes. However, the utility of data linkage may be compromised by sub-optimal or incomplete linkage, leading to systematic bias. In this study, we synthesize the evidence identifying participant or population characteristics that can influence the validity and completeness of data linkage and may be associated with systematic bias in reported outcomes. Methods: A narrative review, using structured search methods, was undertaken. The key words "data linkage" and the MeSH term "medical record linkage" were applied to the Medline, EMBASE and CINAHL databases between 1991 and 2007. Abstract inclusion criteria were: the article attempted an empirical evaluation of methodological issues relating to data linkage and reported on patient characteristics; the study design included analysis of matched versus unmatched records; and the report was in English. Included articles were grouped thematically according to patient characteristics that were compared between matched and unmatched records. Results: The search identified 1810 articles, of which 33 (1.8%) met inclusion criteria. There was marked heterogeneity in study methods and factors investigated. Characteristics that were unevenly distributed among matched and unmatched records were: age (72% of studies), sex (50% of studies), race (64% of studies), geographical/hospital site (93% of studies), socio-economic status (82% of studies) and health status (72% of studies). Conclusion: A number of relevant patient or population factors may be associated with incomplete data linkage, resulting in systematic bias in reported clinical outcomes. Readers should consider these factors when interpreting the reported results of data linkage studies.
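
    The core analysis behind such reviews, comparing how a characteristic is distributed among matched versus unmatched records, can be sketched with pandas. The data frame below is entirely hypothetical:

```python
import pandas as pd

# Hypothetical cohort: 'linked' flags whether each record matched across sources.
records = pd.DataFrame({
    "age_group": ["<65", "<65", "65+", "65+", "65+", "<65"],
    "linked":    [True,   True,  False, True,  False, True],
})

# Cross-tabulate linkage success by characteristic; a skewed table signals
# that unmatched records are not missing at random (a source of bias).
table = pd.crosstab(records["age_group"], records["linked"], normalize="index")
print(table)
```

    Here all under-65 records link but only a third of the 65+ records do, so any outcome computed from linked records alone would under-represent older patients.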

  5. Evaluation of Four Different Analytical Tools to Determine the Regional Origin of Gastrodia elata and Rehmannia glutinosa on the Basis of Metabolomics Study

    Directory of Open Access Journals (Sweden)

    Dong-Kyu Lee

    2014-05-01

    Full Text Available Chemical profiles of medicinal plants can differ depending on the cultivation environment, which may influence their therapeutic efficacy. Accordingly, the regional origin of medicinal plants should be authenticated for correct evaluation of their medicinal and market values. Metabolomics has been found very useful for discriminating the origin of many plants. Choosing an adequate analytical tool is an essential step, because different chemical profiles with different detection ranges will be produced depending on the choice. In this study, four analytical tools, Fourier transform near-infrared spectroscopy (FT-NIR), 1H-nuclear magnetic resonance spectroscopy (1H-NMR), liquid chromatography-mass spectrometry (LC-MS), and gas chromatography-mass spectrometry (GC-MS), were applied in parallel to the same samples of two popular medicinal plants (Gastrodia elata and Rehmannia glutinosa) cultivated either in Korea or China. The classification abilities of the four discriminant models for each plant were evaluated on the basis of the misclassification rate and Q2 obtained from principal component analysis (PCA) and orthogonal projection to latent structures-discriminant analysis (OPLS-DA), respectively. 1H-NMR and LC-MS, which were the best techniques for G. elata and R. glutinosa, respectively, were generally preferable for origin discrimination over the others. Integrating all the results, 1H-NMR is the most prominent technique for discriminating the origins of the two plants. Nonetheless, this study suggests that preliminary screening is essential to determine the most suitable analytical tool and statistical method, which will ensure the dependability of metabolomics-based discrimination.
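
    As a rough illustration of the PCA-based step in such a workflow (not the authors' actual pipeline or data), the following sketch projects two synthetic "origin" groups of metabolite profiles onto principal components and reports a misclassification rate:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "metabolite profiles": two growing regions with shifted mean intensities.
korea = rng.normal(loc=0.0, scale=1.0, size=(20, 5))
china = rng.normal(loc=2.0, scale=1.0, size=(20, 5))
X = np.vstack([korea, china])
labels = np.array([0] * 20 + [1] * 20)

# Unsupervised projection: PCA via SVD on mean-centered data.
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T  # project onto the first two principal components

# Crude origin call: split on the first PC and report the misclassification rate.
predicted = (scores[:, 0] > 0).astype(int)
if predicted[:20].mean() >= 0.5:  # fix the arbitrary sign of the PC
    predicted = 1 - predicted
misclassification = float(np.mean(predicted != labels))
print(misclassification)
```

    Real metabolomics workflows would follow the unsupervised PCA view with a supervised model such as OPLS-DA and cross-validated Q2, as the abstract describes; the sketch only shows why well-separated origins yield a low misclassification rate.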

  6. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    Science.gov (United States)

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  7. The "Metaphorical Collage" as a Research Tool in the Field of Education

    Science.gov (United States)

    Russo-Zimet, Gila

    2016-01-01

    The aim of this paper is to propose a research tool in the field of education--the "metaphorical collage." This tool facilitates the understanding of concepts and processes in education through the analysis of metaphors in collage works that include pictorial images and verbal images. We believe the "metaphorical collage" to be…

  8. ENVIRONMENTAL RESEARCH BRIEF : ANALYTIC ELEMENT MODELING OF GROUND-WATER FLOW AND HIGH PERFORMANCE COMPUTING

    Science.gov (United States)

    Several advances in the analytic element method have been made to enhance its performance and facilitate three-dimensional ground-water flow modeling in a regional aquifer setting. First, a new public domain modular code (ModAEM) has been developed for modeling ground-water flow ...

  9. Satellite telemetry: A new tool for wildlife research and management

    Science.gov (United States)

    Fancy, Steven G.; Pank, Larry F.; Douglas, David C.; Curby, Catherine H.; Garner, Gerald W.; Amstrup, Steven C.; Regelin, Wayne L.

    1998-01-01

    operation, the UHF (ultra-high frequency) signal failed on three of 32 caribou transmitters and 10 of 36 polar bear transmitters.A geographic information system (GIS) incorporating other databases (e.g., land cover, elevation, slope, aspect, hydrology, ice distribution) was used to analyze and display detailed locational and behavioral data collected via satellite. Examples of GIS applications to research projects using satellite telemetry and examples of detailed movement patterns of caribou and polar bears are presented. This report includes documentation for computer software packages for processing Argos data and presents developments, as of March 1987, in transmitter design, data retrieval using a local user terminal, computer software, and sensor development and calibration.

  10. Advancing Research in Second Language Writing through Computational Tools and Machine Learning Techniques: A Research Agenda

    Science.gov (United States)

    Crossley, Scott A.

    2013-01-01

    This paper provides an agenda for replication studies focusing on second language (L2) writing and the use of natural language processing (NLP) tools and machine learning algorithms. Specifically, it introduces a range of the available NLP tools and machine learning algorithms and demonstrates how these could be used to replicate seminal studies…

  11. Coding for Language Complexity: The Interplay among Methodological Commitments, Tools, and Workflow in Writing Research

    Science.gov (United States)

    Geisler, Cheryl

    2018-01-01

    Coding, the analytic task of assigning codes to nonnumeric data, is foundational to writing research. A rich discussion of methodological pluralism has established the foundational importance of systematicity in the task of coding, but less attention has been paid to the equally important commitment to language complexity. Addressing the interplay…

  12. Welfare assessment in porcine biomedical research – Suggestion for an operational tool

    DEFF Research Database (Denmark)

    Søndergaard, Lene Vammen; Dagnæs-Hansen, Frederik; Herskin, Mette S

    2011-01-01

    of the extent of welfare assessment in pigs used in biomedical research and to suggest a welfare assessment standard for research facilities based on an exposition of ethological considerations relevant for the welfare of pigs in biomedical research. The tools for porcine welfare assessment presented suggest...

  13. Conceptualising the Use of Facebook in Ethnographic Research: As Tool, as Data and as Context

    Science.gov (United States)

    Baker, Sally

    2013-01-01

    This article proposes a three-part conceptualisation of the use of Facebook in ethnographic research: as a tool, as data and as context. Longitudinal research with young adults at a time of significant change provides many challenges for the ethnographic researcher, such as maintaining channels of communication and high rates of participant…

  14. The Value of Decision Analytical Modeling in Surgical Research: An Example of Laparoscopic Versus Open Distal Pancreatectomy

    NARCIS (Netherlands)

    Tax, Casper; Govaert, Paulien H. M.; Stommel, Martijn W. J.; Besselink, Marc G. H.; Gooszen, Hein G.; Rovers, Maroeska M.

    2017-01-01

    To illustrate how decision modeling may identify relevant uncertainty and can preclude or identify areas of future research in surgery. To optimize use of research resources, a tool is needed that assists in identifying relevant uncertainties and the added value of reducing these uncertainties. The

  15. MoDOT pavement preservation research program volume IV, pavement evaluation tools-data collection methods.

    Science.gov (United States)

    2015-10-01

    The overarching goal of the MoDOT Pavement Preservation Research Program, Task 3: Pavement Evaluation Tools - Data Collection Methods, was to identify and evaluate methods to rapidly obtain network-level and project-level information relevant to ...

  16. International Conference on Analytic and Algebraic Geometry held at the Tata Institute of Fundamental Research and the University of Hyderabad

    CERN Document Server

    Biswas, Indranil; Morye, Archana; Parameswaran, A

    2017-01-01

    This volume is an outcome of the international conference held at the Tata Institute of Fundamental Research and the University of Hyderabad. There are fifteen articles in this volume. The main purpose of the articles is to introduce recent and advanced techniques in the area of analytic and algebraic geometry. This volume presents recent developments in the area, targeting mainly young researchers who are new to it. Some research articles have also been added to give examples of how to use these techniques to prove new results.

  17. Analytical tools for solitons and periodic waves corresponding to phonons on Lennard-Jones lattices in helical proteins

    DEFF Research Database (Denmark)

    D'ovidio, Francesco; Bohr, Henrik; Lindgård, Per-Anker

    2005-01-01

    We study the propagation of solitons along the hydrogen bonds of an alpha helix. Modeling the hydrogen and peptide bonds with Lennard-Jones potentials, we show that the solitons can appear spontaneously and have long lifetimes. Remarkably, even though no explicit solution is known for the Lennard-Jones potential, the solitons can be characterized analytically with good quantitative agreement using formulas for a Toda potential with parameters fitted to the Lennard-Jones potential. We also discuss and show the robustness of the family of periodic solutions called cnoidal waves, corresponding to phonons...
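
    One standard way to fit Toda parameters to a Lennard-Jones potential, which may differ from the authors' exact procedure and is given here only as an assumed illustration, is to match the second and third derivatives of the two potentials at the minimum:

```latex
V_{\mathrm{LJ}}(r) = 4\varepsilon\!\left[\left(\tfrac{\sigma}{r}\right)^{12}
                     - \left(\tfrac{\sigma}{r}\right)^{6}\right],
\qquad
V_{\mathrm{T}}(x) = \frac{a}{b}\,e^{-bx} + a\,x ,
\quad x = r - r_m,\ r_m = 2^{1/6}\sigma .

% Near its minimum at x = 0, the Toda potential has
% V_T'(0) = 0,\quad V_T''(0) = ab,\quad V_T'''(0) = -ab^2 .
% Matching to the Lennard-Jones derivatives at r_m gives
ab = V_{\mathrm{LJ}}''(r_m),
\qquad
-ab^2 = V_{\mathrm{LJ}}'''(r_m)
\;\Longrightarrow\;
b = -\frac{V_{\mathrm{LJ}}'''(r_m)}{V_{\mathrm{LJ}}''(r_m)},
\quad
a = \frac{V_{\mathrm{LJ}}''(r_m)}{b}.
```

    With the fitted $a$ and $b$, the known Toda soliton and cnoidal-wave solutions can then serve as closed-form approximations to the lattice dynamics, as the abstract describes.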

  18. CORE SIM: A multi-purpose neutronic tool for research and education

    International Nuclear Information System (INIS)

    Demaziere, Christophe

    2011-01-01

    Highlights: A highly flexible neutronic core simulator was developed. The tool estimates the static neutron flux, the eigenmodes, and the neutron noise. It was successfully validated against many benchmark cases, can be used for research and education, and is freely available. - Abstract: This paper deals with the development, validation, and demonstration of an innovative neutronic tool. The novelty of the tool resides in its versatility, since many different systems can be investigated and different kinds of calculations can be performed. More precisely, both critical systems and subcritical systems with an external neutron source can be studied, and static and dynamic cases in the frequency domain (i.e. for stationary fluctuations) can be considered. In addition, the tool can determine the different eigenfunctions of any nuclear core. For each situation, the static neutron flux, the different eigenmodes and eigenvalues, the first-order neutron noise, and their adjoint functions are estimated, as well as the effective multiplication factor of the system. The main advantages of the tool, which is entirely MatLab based, lie in the robustness of the implemented numerical algorithms, its high portability between computer platforms and operating systems, and its ease of use, since no input deck writing is required. The present version of the tool, which is based on two-group diffusion theory, is mostly suited to investigating thermal systems. The definition of both the static and dynamic core configurations directly from the static macroscopic cross-sections and their fluctuations, respectively, makes the tool particularly well suited for research and education. Some of the many benchmark cases used to validate the tool are briefly reported. The static and dynamic capabilities of the tool are also demonstrated for the following configurations: a vibrating control rod, a perturbation traveling upwards
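
    To give a flavor of the kind of calculation such a core simulator performs, the sketch below solves a deliberately simplified one-group, one-dimensional diffusion eigenvalue problem by power iteration. CORE SIM itself uses two-group diffusion theory and is MatLab based; all cross-section numbers here are made up:

```python
import numpy as np

# One-group, 1-D diffusion eigenvalue problem on a bare slab.
n = 50              # mesh cells
h = 300.0 / n       # cell width (cm) for a 300 cm slab
D = 1.3             # diffusion coefficient (cm)
sigma_a = 0.010     # absorption cross-section (1/cm)
nu_sigma_f = 0.011  # fission production cross-section (1/cm)

# Finite-difference operator -D d2/dx2 + sigma_a with zero-flux boundaries.
main = np.full(n, 2 * D / h**2 + sigma_a)
off = np.full(n - 1, -D / h**2)
A = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Power iteration: repeatedly solve A * phi = (1/k) * nu_sigma_f * phi.
phi = np.ones(n)
k = 1.0
for _ in range(200):
    source = nu_sigma_f * phi
    phi_new = np.linalg.solve(A, source / k)
    k *= phi_new.sum() / phi.sum()   # update eigenvalue from flux growth
    phi = phi_new / phi_new.max()    # renormalize the flux shape

print(round(k, 4))
```

    For these made-up numbers k converges to a value slightly above 1, and phi to the fundamental (cosine-like) flux shape; a production tool solves the same kind of generalized eigenvalue problem with two energy groups, realistic geometry, and cross-sections from lattice calculations.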

  19. Impact of design research on industrial practice tools, technology, and training

    CERN Document Server

    Lindemann, Udo

    2016-01-01

    Showcasing exemplars of how various aspects of design research were successfully transitioned into, and influenced, design practice, this book features chapters written by eminent international researchers and practitioners from industry on the Impact of Design Research on Industrial Practice. Chapters written by internationally acclaimed researchers of design analyse the findings (guidelines, methods and tools), technologies/products and educational approaches that have been transferred as tools, technologies and people to transform the industrial practice of engineering design, whilst chapters written by industrial practitioners describe their experience of how various tools, technologies and training impacted design practice. The main benefit of this book, for educators, researchers and practitioners in (engineering) design, will be access to a comprehensive coverage of case studies of successful transfer of outcomes of design research into practice; as well as guidelines and platforms for successf...

  20. Big Data Analytics in Healthcare.

    Science.gov (United States)

    Belle, Ashwin; Thiagarajan, Raghuram; Soroushmehr, S M Reza; Navidi, Fatemeh; Beard, Daniel A; Najarian, Kayvan

    2015-01-01

    The rapidly expanding field of big data analytics has started to play a pivotal role in the evolution of healthcare practices and research. It has provided tools to accumulate, manage, analyze, and assimilate large volumes of disparate, structured, and unstructured data produced by current healthcare systems. Big data analytics has been recently applied towards aiding the process of care delivery and disease exploration. However, the adoption rate and research development in this space is still hindered by some fundamental problems inherent within the big data paradigm. In this paper, we discuss some of these major challenges with a focus on three upcoming and promising areas of medical research: image, signal, and genomics based analytics. Recent research which targets utilization of large volumes of medical data while combining multimodal data from disparate sources is discussed. Potential areas of research within this field which have the ability to provide meaningful impact on healthcare delivery are also examined.

  1. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
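
    For a model y = b0 + b1·x + b2·z + b3·x·z, the J-N boundaries are the roots of a quadratic in the moderator z. A minimal sketch (all coefficients, (co)variances, and the critical value below are invented for illustration, not taken from the article):

```python
import numpy as np

# Hypothetical regression y = b0 + b1*x + b2*z + b3*x*z
b1, b3 = 0.5, -0.2                      # focal slope and interaction coefficient
var_b1, var_b3, cov_b13 = 0.04, 0.01, -0.005
t_crit = 1.96                           # large-sample critical value

# Simple slope at moderator value z: s(z) = b1 + b3*z, with
# Var[s(z)] = var_b1 + 2*z*cov_b13 + z**2*var_b3.
# J-N boundaries solve s(z)**2 = t_crit**2 * Var[s(z)], a quadratic in z.
a = b3**2 - t_crit**2 * var_b3
b = 2 * (b1 * b3 - t_crit**2 * cov_b13)
c = b1**2 - t_crit**2 * var_b1
z_bounds = np.sort(np.roots([a, b, c]).real)
print(z_bounds)   # the simple slope is significant on one side of each bound
```

    The region of significance is then read off by checking the sign of the test statistic between and outside the two bounds.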

  2. Analytical tools and methodologies for evaluation of residual life of contacting pressure tubes in the early generation of Indian PHWRs

    International Nuclear Information System (INIS)

    Sinha, S.K.; Madhusoodanan, K.; Rupani, B.B.; Sinha, R.K.

    2002-01-01

    The in-service life of a contacting Zircaloy-2 pressure tube (PT) in the earlier generation of Indian PHWRs is limited mainly by accelerated hydrogen pick-up and the nucleation and growth of hydride blister(s) at the cold spot(s) formed on the outside surface of the pressure tube as a result of its contact with the calandria tube (CT). The development of analytical models for simulating the degradation mechanisms leading to PT-CT contact, and of methodologies for the re-evaluation of safe life under such conditions, forms an important part of our extensive programme for the life management of contacting pressure tubes. Since, after PT-CT contact, the rate of hydrogen pick-up and the nucleation and growth of hydride blisters govern the safe residual life of the pressure tube, two analytical models, (a) a hydrogen pick-up model ('HYCON') and (b) a model for the nucleation and growth of a hydride blister at the contact spot ('BLIST-2D'), have been developed in-house to estimate the extent of degradation caused by them. Along with these, a methodology has also been formulated for evaluating the safe residual life of the contacting channels. This paper gives a brief description of the models and the methodologies relevant to contacting Zircaloy-2 pressure tubes. (author)

  3. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test.

    Science.gov (United States)

    Yang, Hannah P; Walmer, David K; Merisier, Delson; Gage, Julia C; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S; Castle, Philip E

    2011-09-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women; such a test might be used to decide who needs immediate colposcopy in low-resource settings ("triage test"). We found that the HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay at a cutpoint of 5000 viral copies (Kappa = 0.87). DNA sequencing on a subset of 16 HPV16/18/45-positive and 16 HPV16/18/45-negative specimens verified the analytic specificity of the research test. It is concluded that the HPV16/18/45 assay is a promising triage test with a minimum detection of approximately 5000 viral copies, the clinically relevant threshold. Published by Elsevier B.V.
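
    The Kappa statistics quoted above measure agreement between two tests beyond chance. A minimal sketch of Cohen's kappa from a 2x2 agreement table (the counts below are invented for illustration, not the study's data):

```python
import numpy as np

# Rows: test A positive/negative; columns: test B positive/negative
table = np.array([[40, 5],
                  [3, 52]])
n = table.sum()
p_obs = np.trace(table) / n                        # observed agreement
p_exp = (table.sum(0) @ table.sum(1)) / n**2       # agreement expected by chance
kappa = (p_obs - p_exp) / (1 - p_exp)
print(round(float(kappa), 3))
```

    Values of 0.77 and 0.87, as reported, correspond to substantial to almost-perfect agreement on the usual Landis-Koch scale.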

  4. miRQuest: integration of tools on a Web server for microRNA research.

    Science.gov (United States)

    Aguiar, R R; Ambrosio, L A; Sepúlveda-Hermosilla, G; Maracaja-Coutinho, V; Paschoal, A R

    2016-03-28

    This report describes miRQuest - a novel middleware, available on a Web server, that allows the end user to carry out miRNA research in a user-friendly way. Many prediction tools for microRNA (miRNA) identification exist, using different programming languages and methods, and it is difficult to understand each tool and apply it to the diverse datasets and organisms available for miRNA analysis. miRQuest can easily be used by biologists and researchers with limited experience in bioinformatics. We built it using a middleware architecture on a Web platform for miRNA research that performs two main functions: i) integration of different miRNA prediction tools for miRNA identification in a user-friendly environment; and ii) comparison of these prediction tools. In both cases, the user provides sequences (in FASTA format) as an input set for the analyses and comparisons. All the tools were selected on the basis of a survey of the literature on available tools for miRNA prediction. Three use cases of the tools are also described, one of which is miRNA identification analysis in 30 different species. Finally, miRQuest seems to be a novel and useful tool; it is freely available for both benchmarking and miRNA identification at http://mirquest.integrativebioinformatics.me/.

  5. New research at Paisley Caves:applying new integrated analytical approaches to understanding stratigraphy, taphonomy, and site formation processes

    OpenAIRE

    Shillito, Lisa-Marie; Blong, John C; Jenkins, Dennis L; Stafford Jr, Thomas W; Whelton, Helen; McDonough, Katelyn; Bull, Ian

    2018-01-01

    Paisley Caves in Oregon has become well known due to early dates, and human presence in the form of coprolites, found to contain ancient human DNA. Questions remain over whether the coprolites themselves are human, or whether the DNA is mobile in the sediments. This brief introduces new research applying an integrated analytical approach combining sediment micromorphology and lipid biomarker analysis, which aims to resolve these problems.

  6. A Visualization Tool for Integrating Research Results at an Underground Mine

    Science.gov (United States)

    Boltz, S.; Macdonald, B. D.; Orr, T.; Johnson, W.; Benton, D. J.

    2016-12-01

    Researchers with the National Institute for Occupational Safety and Health are conducting research at a deep, underground metal mine in Idaho to develop improvements in ground control technologies that reduce the effects of dynamic loading on mine workings, thereby decreasing the risk to miners. This research is multifaceted and includes: photogrammetry, microseismic monitoring, geotechnical instrumentation, and numerical modeling. When managing research involving such a wide range of data, understanding how the data relate to each other and to the mining activity quickly becomes a daunting task. In an effort to combine this diverse research data into a single, easy-to-use system, a three-dimensional visualization tool was developed. The tool was created using the Unity3d video gaming engine and includes the mine development entries, production stopes, important geologic structures, and user-input research data. The tool provides the user with a first-person, interactive experience where they are able to walk through the mine as well as navigate the rock mass surrounding the mine to view and interpret the imported data in the context of the mine and as a function of time. The tool was developed using data from a single mine; however, it is intended to be a generic tool that can be easily extended to other mines. For example, a similar visualization tool is being developed for an underground coal mine in Colorado. The ultimate goal is for NIOSH researchers and mine personnel to be able to use the visualization tool to identify trends that may not otherwise be apparent when viewing the data separately. This presentation highlights the features and capabilities of the mine visualization tool and explains how it may be used to more effectively interpret data and reduce the risk of ground fall hazards to underground miners.

  7. The National Shipbuilding Research Program. Development of a Quick TBT Analytical Method

    Science.gov (United States)

    2000-08-16

    Executive Summary: Concern about the toxic effects of tributyltin has caused the... Antifouling Paints on the Environment: Tributyltin (TBT) has been shown to be highly toxic to certain aquatic organisms at concentrations measured in the... paints, developed in the 1960s, contain the organotin tributyltin (TBT), which has been proven to cause deformations in oysters and sex changes in...

  8. Research on bathymetry estimation by Worldview-2 based with the semi-analytical model

    Science.gov (United States)

    Sheng, L.; Bai, J.; Zhou, G.-W.; Zhao, Y.; Li, Y.-C.

    2015-04-01

    The South Sea Islands of China lie far from the mainland; reefs make up more than 95% of the South Sea, and most are scattered over sensitive disputed areas. Methods for obtaining reef bathymetry accurately are therefore urgently needed. Commonly used methods, including sonar, airborne laser, and remote sensing estimation, are limited by the long distances, large areas, and sensitive locations involved. Remote sensing data provide an effective way to estimate bathymetry over large areas without physical contact, via the relationship between spectral information and water depth. Considering the water quality of the South Sea of China, this paper develops a bathymetry estimation method that requires no measured water depths. First, a semi-analytical optimization model of the theoretical interpretation models was studied, with a genetic algorithm used to optimize the model. An OpenMP parallel computing algorithm was also introduced to greatly increase the speed of the semi-analytical optimization model. One island in the South Sea of China was selected as the study area, and measured water depths were used to evaluate the accuracy of bathymetry estimated from Worldview-2 multispectral images. The results show that the genetic-algorithm-based semi-analytical optimization model performs well in the study area, and the accuracy of the estimated bathymetry in shallow water (0-20 m) is acceptable. The model thus solves the problem of bathymetry estimation without water depth measurements. Overall, the paper provides a new bathymetry estimation method for sensitive reefs far from the mainland.
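
    The paper's semi-analytical, genetic-algorithm-optimized model is not reproduced here; as a simpler illustration of retrieving depth from multispectral reflectance, the empirical Stumpf log-ratio method can be sketched (all reflectances and calibration coefficients below are invented):

```python
import numpy as np

# Blue- and green-band water-leaving reflectance for three hypothetical pixels
blue  = np.array([0.080, 0.060, 0.050])
green = np.array([0.050, 0.045, 0.042])

# Stumpf log-ratio model: depth = m1 * ln(n*R_blue)/ln(n*R_green) + m0,
# where m1, m0 are tuned against known depths (values assumed here).
m1, m0 = 60.0, -58.0
depth = m1 * np.log(1000 * blue) / np.log(1000 * green) + m0
print(np.round(depth, 2))   # metres; the band ratio grows with depth
```

    In practice m1 and m0 would be calibrated against in-situ soundings, which is precisely the dependence on measured depths that the paper's semi-analytical approach seeks to remove.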

  9. The use of selective adsorbents in capillary electrophoresis-mass spectrometry for analyte preconcentration and microreactions: a powerful three-dimensional tool for multiple chemical and biological applications.

    Science.gov (United States)

    Guzman, N A; Stubbs, R J

    2001-10-01

    Much attention has recently been directed to the development and application of online sample preconcentration and microreactions in capillary electrophoresis using selective adsorbents based on chemical or biological specificity. The basic principle involves two interacting chemical or biological systems with high selectivity and affinity for each other. These molecular interactions in nature usually involve noncovalent and reversible chemical processes. Properly bound to a solid support, an "affinity ligand" can selectively adsorb a "target analyte" found in a simple or complex mixture at a wide range of concentrations. As a result, the isolated analyte is enriched and highly purified. When this affinity technique, allowing noncovalent chemical interactions and biochemical reactions to occur, is coupled on-line to high-resolution capillary electrophoresis and mass spectrometry, a powerful tool of chemical and biological information is created. This paper describes the concept of biological recognition and affinity interaction on-line with high-resolution separation, the fabrication of an "analyte concentrator-microreactor", optimization conditions of adsorption and desorption, the coupling to mass spectrometry, and various applications of clinical and pharmaceutical interest.

  10. Chloride present in biological samples as a tool for enhancement of sensitivity in capillary zone electrophoretic analysis of anionic trace analytes

    Czech Academy of Sciences Publication Activity Database

    Křivánková, Ludmila; Pantůčková, Pavla; Gebauer, Petr; Boček, Petr; Caslavska, J.; Thormann, W.

    2003-01-01

    Roč. 24, č. 3 (2003), s. 505-517 ISSN 0173-0835 R&D Projects: GA ČR GA203/02/0023; GA ČR GA203/01/0401; GA AV ČR IAA4031103 Institutional research plan: CEZ:AV0Z4031919 Keywords : acetoacetate * capillary zone electrophoresis * chloride stacking effects Subject RIV: CB - Analytical Chemistry, Separation Impact factor: 4.040, year: 2003

  11. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Full Text Available Abstract Background: Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. Methods and results: This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent; reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion: We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time, or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.
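
    The survival figures quoted above imply a strong response-time dependence. A purely illustrative sketch (NOT the paper's GIS model): assume an exponential decay of survival with response time and pick parameters so that an 8-minute response roughly reproduces the abstract's 3.9% baseline and a one-minute saving gives roughly 4.6%:

```python
import math

# Assumed toy model: P(survival) = p0 * exp(-k * t_minutes).
# p0 and k are invented to echo the abstract's 3.9% -> 4.6% per-minute gain.
p0, k = 0.146, 0.165
for t_min in (8, 7, 6):
    print(t_min, round(p0 * math.exp(-k * t_min), 4))
```

    Such a curve makes clear why both trimming call-handling time and dual dispatch (which shortens the effective response time) raise predicted survival.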

  12. Are EM's communication tools effective? Evaluation research of two EM publications

    International Nuclear Information System (INIS)

    Wight, Evelyn; Gardner, Gene; Harvey, Tony

    1992-01-01

    As a reflection of its growing culture of openness, and in response to the public's need for accurate information about its activities, the U.S. Department of Energy (DOE) Office of the Assistant Secretary for Environmental Restoration and Waste Management (EM) has increased the amount of information available to the public through communication tools such as brochures, fact sheets, and a travelling exhibit with an interactive computer display. Our involvement with this effort has been to design, develop, and critique booklets, brochures, fact sheets and other communication tools for EM. This paper presents an evaluation of the effectiveness of two communication tools we developed: the EM Booklet and the EM Fact Sheets. We measured effectiveness using non-parametric testing. This paper describes DOE's culture change, EM's communication tools and their context within DOE's new open culture, our research, test methods and results, the significance of our research, and our plans for future research. (author)

  13. Leningrad NPP full scope and analytical simulators as tools for MMI improvement and operator support systems development and testing

    International Nuclear Information System (INIS)

    Rakitin, I.D.; Malkin, S.D.; Shalia, V.V.; Fedorov, E.M.; Lebedev, N.N.; Khoudiakov, M.M.

    1999-01-01

    The Training Support Center (TSC) created at the Leningrad NPP (LNPP), Sosnovy Bor, Russia, incorporates full-scope and analytical simulators working in parallel with prototypes of expert and interactive systems, providing a new scope of R and D MMI-improvement work for the developer as well as for the user. The possibilities for developing, adjusting, and testing any new or upgraded operator support system before its installation in the reference unit's control room are described in the paper. These simulators model a wide range of accidents and transients and communicate with the operator support systems' prototypes through special software and ETHERNET data-process links. As an example, the development and adjustment of two state-of-the-art operator support systems using the simulators are described. These systems have been developed jointly by the RRC KI and LNPP team. (author)

  14. Single-cell MALDI-MS as an analytical tool for studying intrapopulation metabolic heterogeneity of unicellular organisms.

    Science.gov (United States)

    Amantonico, Andrea; Urban, Pawel L; Fagerer, Stephan R; Balabin, Roman M; Zenobi, Renato

    2010-09-01

    Heterogeneity is a characteristic feature of all populations of living organisms. Here we make an attempt to validate a single-cell mass spectrometric method for detection of changes in metabolite levels occurring in populations of unicellular organisms. Selected metabolites involved in central metabolism (ADP, ATP, GTP, and UDP-Glucose) could readily be detected in single cells of Closterium acerosum by means of negative-mode matrix-assisted laser desorption/ionization (MALDI) mass spectrometry (MS). The analytical capabilities of this approach were characterized using standard compounds. The method was then used to study populations of individual cells with different levels of the chosen metabolites. With principal component analysis and support vector machine algorithms, it was possible to achieve a clear separation of individual C. acerosum cells in different metabolic states. This study demonstrates the suitability of mass spectrometric analysis of metabolites in single cells to measure cell-population heterogeneity.
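
    The abstract's analysis pipeline (dimensionality reduction followed by classification of single-cell metabolite profiles) can be sketched with plain numpy; the per-cell intensities for the four metabolites below are invented for illustration, and only the PCA step is shown:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical per-cell intensities for (ADP, ATP, GTP, UDP-Glc) in two
# metabolic states with different mean levels
state_a = rng.normal([1.0, 3.0, 0.5, 1.2], 0.1, size=(20, 4))
state_b = rng.normal([1.8, 1.5, 0.6, 0.9], 0.1, size=(20, 4))
X = np.vstack([state_a, state_b])

# PCA via SVD on mean-centered data; project each cell onto the first two PCs
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ vt[:2].T
print(scores.shape)   # (40, 2): one score pair per cell
```

    With states this distinct, the two populations separate cleanly along the first principal component; a support vector machine, as used in the study, would then draw the decision boundary.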

  15. Schedule Analytics

    Science.gov (United States)

    2016-04-30

    Warfare, Naval Sea Systems Command Acquisition Cycle Time: Defining the Problem David Tate, Institute for Defense Analyses Schedule Analytics Jennifer...research comprised the following high-level steps: Identify and review primary data sources 1...research. However, detailed reviews of the OMB IT Dashboard data revealed that schedule data is highly aggregated. Program start date and program end date

  16. Embodying resistance : a discourse analytical study of the selfie as political tool within the fourth wave of feminism

    OpenAIRE

    Barbala, Astri Moksnes

    2017-01-01

    This Master’s thesis explores whether the selfie can be utilised as a political tool to challenge the stereotypical ideas of femininity and female beauty that currently dominate the visual social media landscape. Focusing on the photo-sharing application Instagram, the emphasis here is on how the selfie can position the portrayed subject’s body as a site of resistance. By publishing images depicting their non-normative physical appearances, social media-participating feminists ar...

  17. [From oral history to the research film: the audiovisual as a tool of the historian].

    Science.gov (United States)

    Mattos, Hebe; Abreu, Martha; Castro, Isabel

    2017-01-01

    An analytical essay on the process of image production, the formation of an audiovisual archive, the analysis of sources, and the creation of the filmic narrative of the four historiographic films that make up the DVD set Passados presentes (Present pasts), from the Oral History and Image Laboratory of Universidade Federal Fluminense (Labhoi/UFF). Drawing on excerpts from the audiovisual archive of Labhoi and the films made, the article analyzes: how the research problem (the memory of slavery and the legacy of the slave song in the agrofluminense region) led us to the production of images in a research situation; the analytical shift in relation to the cinematographic documentary and the ethnographic film; and the specificities of revisiting the audiovisual collection prompted by the formulation of new research problems.

  18. An Analytical Autoethnographical Account of Using Inquiry-Based Learning in a Graduate Research Methods Course

    Science.gov (United States)

    Woolf, Jules

    2017-01-01

    Increased emphasis is being placed on integrating research and teaching in higher education because of the numerous benefits accrued by students. In accordance, research methods courses are ubiquitously contained in curricula, ostensibly to promote research training and the research-teaching nexus. Students may not appreciate the inclusion,…

  19. Visual Tools for Eliciting Connections and Cohesiveness in Mixed Methods Research

    Science.gov (United States)

    Murawska, Jaclyn M.; Walker, David A.

    2017-01-01

    In this commentary, we offer a set of visual tools that can assist education researchers, especially those in the field of mathematics, in developing cohesiveness from a mixed methods perspective, commencing at a study's research questions and literature review, through its data collection and analysis, and finally to its results. This expounds…

  20. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    Science.gov (United States)

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  1. Effect of Virtual Analytical Chemistry Laboratory on Enhancing Student Research Skills and Practices

    Science.gov (United States)

    Bortnik, Boris; Stozhko, Natalia; Pervukhina, Irina; Tchernysheva, Albina; Belysheva, Galina

    2017-01-01

    This article aims to determine the effect of a virtual chemistry laboratory on university student achievement. The article describes a model of a laboratory course that includes a virtual component. This virtual component is viewed as a tool of student pre-lab autonomous learning. It presents electronic resources designed for a virtual laboratory…

  2. Policy-Making Theory as an Analytical Framework in Policy Analysis: Implications for Research Design and Professional Advocacy.

    Science.gov (United States)

    Sheldon, Michael R

    2016-01-01

    Policy studies are a recent addition to the American Physical Therapy Association's Research Agenda and are critical to our understanding of various federal, state, local, and organizational policies on the provision of physical therapist services across the continuum of care. Policy analyses that help to advance the profession's various policy agendas will require relevant theoretical frameworks to be credible. The purpose of this perspective article is to: (1) demonstrate the use of a policy-making theory as an analytical framework in a policy analysis and (2) discuss how sound policy analysis can assist physical therapists in becoming more effective change agents, policy advocates, and partners with other relevant stakeholder groups. An exploratory study of state agency policy responses to address work-related musculoskeletal disorders is provided as a contemporary example to illustrate key points and to demonstrate the importance of selecting a relevant analytical framework based on the context of the policy issue under investigation. © 2016 American Physical Therapy Association.

  3. NIR spectroscopy as a process analytical technology (PAT) tool for monitoring and understanding of a hydrolysis process.

    Science.gov (United States)

    Wu, Zhisheng; Peng, Yanfang; Chen, Wei; Xu, Bing; Ma, Qun; Shi, Xinyuan; Qiao, Yanjiang

    2013-06-01

    The use of near infrared spectroscopy was investigated as a process analytical technology to monitor the amino acid concentration profile during the hydrolysis process of Cornu Bubali. A protocol was followed that included outlier selection using a plot of residuals versus leverage, and calibration models built using interval partial least squares and synergy interval partial least squares (SiPLS). A strategy based on four robust root mean square error of prediction (RMSEP) values was developed to assess the calibration models by means of a desirability index. Furthermore, multivariate quantification limit (MQL) values of the optimum model were determined using two types of error. The SiPLS(3) models for L-proline, L-tyrosine, L-valine, L-phenylalanine and L-lysine provided excellent accuracies, with RMSEP values of 0.0915 mg/mL, 0.1605 mg/mL, 0.0515 mg/mL, 0.0586 mg/mL and 0.0613 mg/mL, respectively. The MQL ranged from 90 ppm to 810 ppm, which confirmed that these models are suitable for most applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
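
    The RMSEP figures quoted above are the root mean square error between model predictions and reference values on an external prediction set. A minimal sketch (the reference and predicted concentrations below are invented, in mg/mL):

```python
import numpy as np

# Reference (wet-chemistry) vs. NIR-predicted concentrations, mg/mL
y_ref  = np.array([0.50, 0.72, 0.91, 1.10, 1.34])
y_pred = np.array([0.48, 0.75, 0.88, 1.15, 1.30])

# Root mean square error of prediction
rmsep = np.sqrt(np.mean((y_pred - y_ref) ** 2))
print(round(float(rmsep), 4))
```

    Lower RMSEP on an independent set indicates better predictive accuracy, which is why the study compares candidate SiPLS models on this metric.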

  4. Analytical solution of concentric two-pole Halbach cylinders as a preliminary design tool for magnetic refrigeration systems

    Science.gov (United States)

    Fortkamp, F. P.; Lozano, J. A.; Barbosa, J. R.

    2017-12-01

    This work presents a parametric analysis of the performance of nested permanent-magnet Halbach cylinders intended for applications in magnetic refrigeration and heat pumping. An analytical model for the magnetic field generated by the cylinders is used to systematically investigate the influence of their geometric parameters. The proposed configuration generates two poles in the air gap between the cylinders, where active magnetic regenerators are positioned for conversion of magnetic work into cooling capacity or heat power. A sample geometry based on previous designs of magnetic refrigerators is investigated, and the results show that the magnetic field in the air gap oscillates between 0 and approximately 1 T, forming a rectified cosine profile along the circumference of the gap. Calculations of the energy density of the magnets indicate the need to operate at low energy (particularly for the inner cylinder) in order to generate a magnetic profile suitable for a magnetic cooler. In practice, these low-energy regions of the magnet can potentially be replaced by soft ferromagnetic material. A parametric analysis of the air gap height shows that there are optimal values which maximize the magnet efficiency parameter Λcool. Some combinations of cylinder radii resulted in magnetic field changes that were too small for practical purposes. No demagnetization of the cylinders was found for the range of parameters considered.
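
    A back-of-envelope estimate of the field magnitudes involved can be made from the classic result for an ideal, infinitely long Halbach dipole cylinder, B = B_r ln(r_out/r_in); this is a sketch with assumed remanence and radii, not the paper's two-pole analytical model:

```python
import math

# Ideal Halbach dipole cylinder: flux density in the bore.
# b_r is the remanence (T); radii in metres. All values are assumed.
def halbach_bore_field(b_r, r_in, r_out):
    return b_r * math.log(r_out / r_in)

b_inner = halbach_bore_field(1.4, 0.015, 0.025)   # inner cylinder, T
b_outer = halbach_bore_field(1.4, 0.035, 0.055)   # outer cylinder, T
# Superposing the two contributions bounds the attainable field range
print(round(abs(b_inner - b_outer), 3), round(b_inner + b_outer, 3))
```

    With NdFeB-like remanence and centimetre-scale radii, each cylinder contributes several tenths of a tesla, consistent with an air-gap field sweeping between roughly 0 and 1 T.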

  5. Static analytical and experimental research of shock absorber to safeguard the nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Dundulis, Gintautas [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos Street 3, LT-44403 Kaunas (Lithuania)], E-mail: gintas@mail.lei.lt; Grybenas, Albertas [Laboratory of Materials Research and Testing, Lithuanian Energy Institute, Breslaujos Street 3, LT-44403 Kaunas (Lithuania); Karalevicius, Renatas [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos Street 3, LT-44403 Kaunas (Lithuania); Makarevicius, Vidas [Laboratory of Materials Research and Testing, Lithuanian Energy Institute, Breslaujos Street 3, LT-44403 Kaunas (Lithuania); Rimkevicius, Sigitas; Uspuras, Eugenijus [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos Street 3, LT-44403 Kaunas (Lithuania)

    2009-01-15

    The Ignalina Nuclear Power Plant (NPP) has two RBMK-1500 graphite-moderated boiling-water multi-channel reactors. Ignalina NPP Unit 1 was shut down at the end of 2004, while Unit 2 is foreseen to be shut down at the end of 2009. Approximately 1000 spent fuel assemblies with low burn-up depth remain at Ignalina NPP Unit 1. A special set of equipment was developed to reuse these assemblies in the reactor of Unit 2. One of the most important items of this set is a container used for the transportation of spent fuel assemblies between the reactors of Unit 1 and Unit 2. A special shock absorber was designed to avoid failure of fuel assemblies in case of a hypothetical spent-fuel-assembly drop accident during loading/unloading of spent fuel assemblies to/from the container. This shock absorber was examined using scaled experiments. The objective of this article is to assess whether the proposed design of the shock absorber fulfils its function, and to optimize its geometrical parameters using the results of the performed investigations. Static analytical and experimental investigations are presented. The finite element code BRIGADE/Plus was used for the analytical analysis. The calculation model was verified by comparing experimental and simulation results, for further employment of this finite element model in developing an optimum shock absorber design. Static simulation was used to perform primary optimization of the design and dimensions of the shock absorber.

  6. Action Research on a WebQuest as an Instructional Tool for Writing Abstracts of Research Articles

    Directory of Open Access Journals (Sweden)

    Krismiyati Latuperissa

    2012-08-01

    Full Text Available The massive growth of and access to information technology (IT has enabled the integration of technology into classrooms. One such integration is the use of WebQuests as an instructional tool in teaching targeted learning activities such as writing abstracts of research articles in English for English as a Foreign Language (EFL learners. In the academic world, writing an abstract of a research paper or final project in English can be challenging for EFL students. This article presents an action research project on the process and outcomes of using a WebQuest designed to help 20 Indonesian university IT students write a research article’s abstract in English. Findings reveal that despite positive feedback, changes need to be made to make the WebQuest a more effective instructional tool for the purpose it was designed.

  7. The Soil and Water Assessment Tool: Historical Development, Applications, and Future Research Directions

    OpenAIRE

    Philip W. Gassman; Manuel R. Reyes; Colleen H. Green; Jeffrey G. Arnold

    2007-01-01

    The Soil and Water Assessment Tool (SWAT) model is a continuation of nearly 30 years of modeling efforts conducted by the U.S. Department of Agriculture (USDA), Agricultural Research Service. SWAT has gained international acceptance as a robust interdisciplinary watershed modeling tool, as evidenced by international SWAT conferences, hundreds of SWAT-related papers presented at numerous scientific meetings, and dozens of articles published in peer-reviewed journals. The model has also been ad...

  8. Experimental research on the durability of cutting tools for cutting-off steel profiles

    Directory of Open Access Journals (Sweden)

    Cristea Alexandru

    2017-01-01

    Full Text Available The production lines used for manufacturing U-shaped profiles are very complex and they must have high productivity. One of the most important stages of the fabrication process is the cutting-off. This paper presents the experimental research and analysis of the durability of the cutting tools used for cutting-off U-shaped metal steel profiles. The results of this work can be used to predict the durability of the cutting tools.

  9. The Research-Teaching Nexus: Using a Construction Teaching Event as a Research Tool

    Science.gov (United States)

    Casanovas-Rubio, Maria del Mar; Ahearn, Alison; Ramos, Gonzalo; Popo-Ola, Sunday

    2016-01-01

    In principle, the research-teaching nexus should be seen as a two-way link, showing not only ways in which research supports teaching but also ways in which teaching supports research. In reality, the discussion has been limited almost entirely to the first of these practices. This paper presents a case study in which some student field-trip…

  10. Interactive Data Visualization for HIV Cohorts: Leveraging Data Exchange Standards to Share and Reuse Research Tools.

    Directory of Open Access Journals (Sweden)

    Meridith Blevins

    Full Text Available To develop and disseminate tools for interactive visualization of HIV cohort data. If a picture is worth a thousand words, then an interactive video, composed of a long string of pictures, can produce an even richer presentation of HIV population dynamics. We developed an HIV cohort data visualization tool using open-source software (the R statistical language). The tool requires that the data structure conform to the HIV Cohort Data Exchange Protocol (HICDEP), and our implementation utilized Caribbean, Central and South America network (CCASAnet) data. This tool currently presents patient-level data in three classes of plots: (1) longitudinal plots showing changes in measurements viewed alongside event probability curves, allowing for simultaneous inspection of outcomes by relevant patient classes; (2) bubble plots showing changes in indicators over time, allowing for observation of group-level dynamics; (3) heat maps of levels of indicators changing over time, allowing for observation of spatial-temporal dynamics. Examples of each class of plot are given using CCASAnet data investigating trends in CD4 count and AIDS at antiretroviral therapy (ART) initiation, CD4 trajectories after ART initiation, and mortality. We invite researchers interested in this data visualization effort to use these tools and to suggest new classes of data visualization. We aim to contribute additional shareable tools in the spirit of open scientific collaboration and hope that these tools further participation in open data standards like HICDEP by the HIV research community.
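
    Event probability curves of the kind described are typically Kaplan-Meier estimates. The original tool is written in R against HICDEP-formatted data; purely as an illustration of the underlying estimator, a minimal Kaplan-Meier sketch in Python might look like this (function name and toy data are hypothetical):

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimates at each event time.

    durations: follow-up time per patient; observed: 1 if the event
    (e.g. death) occurred, 0 if the patient was censored.
    Returns (event_times, survival_probabilities)."""
    # Sort by time; at ties, process events before censorings (convention).
    data = sorted(zip(durations, observed), key=lambda r: (r[0], -r[1]))
    n_at_risk = len(data)
    s = 1.0
    times, surv = [], []
    for t, event in data:
        if event:
            s *= (n_at_risk - 1) / n_at_risk  # multiply conditional survival
            times.append(t)
            surv.append(s)
        n_at_risk -= 1  # patient leaves the risk set either way
    return times, surv


# Toy cohort: 4 patients, one censored at t = 3
times, surv = kaplan_meier([1, 2, 3, 4], [1, 1, 0, 1])
# times = [1, 2, 4], surv = [0.75, 0.5, 0.0]
```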

  11. Direct numerical control of machine tools in a nuclear research center by the CAMAC system

    International Nuclear Information System (INIS)

    Zwoll, K.; Mueller, K.D.; Becks, B.; Erven, W.; Sauer, M.

    1977-01-01

    The production of mechanical parts in research centers can be improved by connecting several numerically controlled machine tools to a central process computer via a data link. The CAMAC Serial Highway, with its expandable structure, yields an economic and flexible system for this purpose. The CAMAC system also facilitates the development of modular components controlling the machine tools themselves. A CAMAC installation controlling three different machine tools connected to a central computer (PDP11) via the CAMAC Serial Highway is described. Besides this application, part of the CAMAC hardware and software can also be used for a great variety of scientific experiments

  12. A Tool for Measuring NASA's Aeronautics Research Progress Toward Planned Strategic Community Outcomes

    Science.gov (United States)

    Tahmasebi, Farhad; Pearce, Robert

    2016-01-01

    Description of a tool for portfolio analysis of NASA's Aeronautics research progress toward planned community strategic Outcomes is presented. For efficiency and speed, the tool takes advantage of a function developed in Excel's Visual Basic for Applications. The strategic planning process for determining the community Outcomes is also briefly discussed. Stakeholder buy-in, partnership performance, progress of supporting Technical Challenges, and enablement forecast are used as the criteria for evaluating progress toward Outcomes. A few illustrative examples of using the tool are also presented.

  13. Research and Development of Powder Brazing Filler Metals for Diamond Tools: A Review

    Directory of Open Access Journals (Sweden)

    Fei Long

    2018-05-01

    Full Text Available Powder brazing filler metals (PBFMs) offer a number of comparative advantages, including low energy consumption, accurate dosage, good brazeability, short production time, and high production efficiency. These filler metals have been used in the aerospace, automobile, and electric appliance industries. PBFMs are especially suitable for bonding diamond tools, which involves complex workpiece shapes and requires accurate dosage. This paper reviews recent research on PBFMs for diamond tools and discusses current applications. CuSnTi and Ni-Cr-based PBFMs have been the two most commonly used monolayer PBFMs, so the bonding mechanisms at the interface between these monolayer PBFMs and diamond are summarized first, and ways to improve their performance for diamond tools are analyzed. Next, research on PBFMs for impregnated diamond tools is reviewed, and the technical problems that urgently need solutions are discussed. Finally, the challenges and opportunities in PBFM research and development for diamond tools are summarized, and corresponding prospects are suggested.

  14. Isotopes in global change science: from isotope analytics to Earth system research

    International Nuclear Information System (INIS)

    Oeschger, H.

    1998-01-01

    The aim of this paper is to emphasize some of the studies of Jean Charles Fontes and his role in our scientific community. Isotopes represent a powerful tool for understanding the Earth's past environment and for defining the envelope of natural environmental variability within which we can assess anthropogenic impact on the Earth's biosphere, geosphere and atmosphere. Reconstructions of the impacts of past climatic change on the Earth system provide a basis for validating models of the possible impacts of future climate change. Oceanic sediments, polar ice caps, continental sedimentary sequences and groundwater are archives of past climate. Their quantitative study is developed within the IGBP (International Geosphere-Biosphere Program) Pages project, which strongly emphasizes an optimum use of isotope tools. (author)

  15. Research on the analytical method about influence of gas leakage and explosion on subway

    Science.gov (United States)

    Ji, Wendong; Yang, Ligong; Chen, Lin

    2018-05-01

    With the construction and development of city subways, the cross-impact between underground rail transit and the gas pipe network is becoming more and more serious, but there has been no analytical method for assessing the impact of gas explosions on the subway. In this paper, the gas leakage is converted to a TNT explosion equivalent, on the basis of which the explosive impact load is calculated. Given the concrete manifestation of a gas explosion, subsequent calculation is made more convenient by treating the explosive impact load as equivalent to a uniform load within a certain range. The overlying soil of the subway station plays a protective role, significantly reducing the displacement of the subway structure during the explosion. Analysis of an actual case shows that this method can be successfully applied to the quantitative analysis of such accidents.
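
    The TNT-equivalence step described in the abstract can be sketched numerically. The heating values and the 0.2 yield factor below are illustrative assumptions commonly quoted for vapour-cloud explosions, not values taken from the paper:

```python
def tnt_equivalent_kg(gas_mass_kg, heat_of_combustion_j_per_kg=45.0e6,
                      tnt_energy_j_per_kg=4.5e6, yield_factor=0.2):
    """TNT-equivalent mass of a leaked-gas explosion.

    yield_factor is the fraction of combustion energy assumed to
    contribute to the blast (gas clouds are far less efficient
    than a condensed explosive)."""
    return (gas_mass_kg * yield_factor
            * heat_of_combustion_j_per_kg / tnt_energy_j_per_kg)


def scaled_distance(standoff_m, tnt_mass_kg):
    """Hopkinson-Cranz scaled distance Z = R / W**(1/3), in m/kg^(1/3),
    the usual entry point to empirical blast-overpressure charts."""
    return standoff_m / tnt_mass_kg ** (1.0 / 3.0)


w = tnt_equivalent_kg(100.0)   # 100 kg of leaked gas -> 200 kg TNT equivalent
z = scaled_distance(10.0, w)   # scaled distance at a 10 m standoff
```

    The overpressure read from a blast chart at the scaled distance would then be applied as the uniform load on the affected portion of the subway structure, as the abstract describes.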

  16. [Analysis of researchers' implication in a research-intervention in the Stork Network: a tool for institutional analysis].

    Science.gov (United States)

    Fortuna, Cinira Magali; Mesquita, Luana Pinho de; Matumoto, Silvia; Monceau, Gilles

    2016-09-19

    This qualitative study is based on institutional analysis as the methodological theoretical reference with the objective of analyzing researchers' implication during a research-intervention and the interferences caused by this analysis. The study involved researchers from courses in medicine, nursing, and dentistry at two universities and workers from a Regional Health Department in follow-up on the implementation of the Stork Network in São Paulo State, Brazil. The researchers worked together in the intervention and in analysis workshops, supported by an external institutional analysis. Two institutions stood out in the analysis: the research, established mainly with characteristics of neutrality, and management, with Taylorist characteristics. Differences between researchers and difficulties in identifying actions proper to network management and research were some of the interferences that were identified. The study concludes that implication analysis is a powerful tool for such studies.

  17. The Scottish Government's Rural and Environmental Science and Analytical Services Strategic Research Programme

    Science.gov (United States)

    Dawson, Lorna; Bestwick, Charles

    2013-04-01

    The Strategic Research Programme focuses on delivering outputs and outcomes within the major policy agenda areas of climate change, land use and food security, and on contributing to the 'Wealthier', 'Healthier' and 'Greener' strategic objectives of the Scottish Government. The research is delivered through two programmes, 'Environmental Change' and 'Food, Land and People', whose core strength is the collaboration between the Scottish Government's Main Research Providers: The James Hutton Institute, the Moredun Research Institute, the Rowett Institute of Nutrition and Health at the University of Aberdeen, Scotland's Rural College, Biomathematics and Statistics Scotland, and The Royal Botanic Gardens Edinburgh. The research actively seeks to inform and be informed by stakeholders from policy, farming, land use, water and energy supply, food production and manufacturing, non-governmental organisations, voluntary organisations, community groups and the general public. This presentation will provide an overview of the programme's interdisciplinary research, with examples from across the programme's themes illustrating impact within the Strategic Programme's priorities of supporting policy and practice, contributing to economic growth and innovation, enhancing collaborative and multidisciplinary research, growing scientific resilience and delivering scientific excellence. http://www.scotland.gov.uk/Topics/Research/About/EBAR/StrategicResearch/future-research-strategy/Themes/ http://www.knowledgescotland.org/news.php?article_id=295

  18. Analytical research of vibration and far-field acoustic radiation of cylindrical shell immersed at finite depth

    Directory of Open Access Journals (Sweden)

    GUO Wenjie

    2017-08-01

    Full Text Available Addressing the current lack of analytical research on the coupled vibration and sound radiation of a cylindrical shell in a flow field under the influence of a free surface, this paper proposes an analytical method for the vibration response and far-field acoustic radiation of a finite cylindrical shell immersed at finite depth. Based on the image method and the Graf addition theorem, an analytical expression for the fluid velocity potential is obtained; combined with the energy functional of the variational method, it yields the shell-liquid coupled vibration equation, from which the forced vibration response is solved. The research shows that, compared with an infinite fluid, a free surface increases the resonance frequencies of a given order; but as the immersion depth increases, the mean square vibration velocity tends toward that in an infinite fluid. Comparison with numerical results from Nastran software shows that the present method is accurate and reliable, with the advantages of simplicity and a small amount of calculation. The far-field radiated pressure is obtained from the vibration response using the Fourier transform and the stationary phase method. The results indicate that, owing to the free surface, the directivity and oscillatory behaviour of the far-field acoustic pressure of a cylindrical shell resemble those of an acoustic dipole. However, the far-field acoustic pressure differs markedly from the vibration characteristics and does not tend toward the infinite-fluid result as the submersion depth increases. Compared with the numerical method, the method in this paper is simpler and has higher computational efficiency. It enables the far-field acoustic radiation of an underwater cylindrical shell under external excitation and the influence of the free surface to be predicted quickly, providing guidance for acoustic research into half-space structural vibration
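
    The image-method idea behind the dipole-like directivity can be shown with a bare monopole: a pressure-release free surface is modelled by a negative image source, which forces the pressure to vanish on the surface. This sketch omits the shell coupling and the Graf addition theorem entirely; the function and all parameters are hypothetical:

```python
import cmath
import math


def pressure_with_free_surface(field_point, source_depth, k):
    """Acoustic pressure of a unit monopole at depth d below a
    pressure-release free surface (z = 0), via the image method:
    p = exp(ik*r1)/r1 - exp(ik*r2)/r2, where r1 is the distance to
    the real source at (0, -d) and r2 to the negative image at (0, +d).

    field_point: (x, z) with z <= 0; k: acoustic wavenumber."""
    x, z = field_point
    r1 = math.hypot(x, z + source_depth)   # to the real source
    r2 = math.hypot(x, z - source_depth)   # to the image source
    return cmath.exp(1j * k * r1) / r1 - cmath.exp(1j * k * r2) / r2


# On the free surface the two contributions cancel exactly (p = 0);
# below it, the monopole + negative image pair radiates like a dipole.
p_surface = pressure_with_free_surface((3.0, 0.0), 2.0, 5.0)
p_below = pressure_with_free_surface((3.0, -1.0), 2.0, 5.0)
```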

  19. The Identity Politics of Qualitative Research. A Discourse Analytic Inter-text

    Directory of Open Access Journals (Sweden)

    Pascal Dey

    2006-09-01

    Full Text Available We intend this paper to be read as an inter-text between selected FQS articles, which in one way or another engage in the identity politics of qualitative research, and the broader discussion of quality in the social sciences. Subjecting those texts to a discursive investigation, we highlight how the semantic scope of what is called "qualitative research" is decisively delimited by the positivist associations of "good research". Our overall aim is to take issue with two binaries that are commonly employed by qualitative researchers and thus collide with the evolution of creative/aesthetic science. Simultaneously, however, we seek to enliven qualitative research by providing an (ancient and thus marginal) understanding of research and of approaches that are still outside the prevailing canon. To this end, we start by investigating the binary between quantitative and qualitative research that is perpetually reified as our colleagues invoke the positivist quality criteria, subordinating the qualitative pole to an inferior position. Second, we provide examples of the ways that qualitative research is habitually separated from "non-research" such as the arts, journalism and fiction, ostensibly to justify calling it "scientific". Pondering how these binaries endow qualitative research with a limited identity and a supplementary status, we draw on some postmodern works so as to elaborate on alternative understandings of "science" and scientific quality. Finally, we argue for a "politics of difference" which we envision as a point of origin for extending qualitative research by multiplying its genres, styles and tropes. URN: urn:nbn:de:0114-fqs0604286

  20. EasyInterface: A toolkit for rapid development of GUIs for research prototype tools

    OpenAIRE

    Doménech, Jesús; Genaim, Samir; Johnsen, Einar Broch; Schlatte, Rudolf

    2017-01-01

    In this paper we describe EasyInterface, an open-source toolkit for rapid development of web-based graphical user interfaces (GUIs). This toolkit addresses the need of researchers to make their research prototype tools available to the community, and integrating them in a common environment, rapidly and without being familiar with web programming or GUI libraries in general. If a tool can be executed from a command-line and its output goes to the standard output, then in few minutes one can m...

  1. Nexusing Charcoal in South Mozambique: A Proposal To Integrate the Nexus Charcoal-Food-Water Analysis With a Participatory Analytical and Systemic Tool

    Directory of Open Access Journals (Sweden)

    Ricardo Martins

    2018-06-01

    Full Text Available Nexus analysis identifies and explores the synergies and trade-offs between energy, food and water systems, considered as interdependent systems interacting with contextual drivers (e.g., climate change, poverty). The nexus is, thus, a valuable analytical and policy-design-supporting tool to address the widely discussed links between bioenergy, food and water. In fact, the Nexus provides a more integrative and broad approach than the single, isolated-system approach that has characterized many bioenergy analyses and policies of the last decades. In particular, for the South of Mozambique, charcoal production, food insecurity and water scarcity have been related in separate studies, and thus it would be expected that Nexus analysis has the potential to provide the basis for integrated policies and strategies focused on charcoal as a development factor. However, to date there is no Nexus analysis focused on charcoal in Mozambique, nor is there an assessment of the comprehensiveness and relevance of Nexus analysis when applied to charcoal energy systems. To address these gaps, this work applies the Nexus to the charcoal-food-water system in Mozambique, integrating national, regional and international studies analysing the isolated, or pairs of, systems. This integration results in a novel Nexus analysis graphic for the charcoal-food-water relationship. Then, to assess the comprehensiveness and depth of analysis, this Nexus analysis is critically compared with the 2MBio-A, a systems analytical and design framework based on a design tool specifically developed for Bioenergy (the 2MBio). The results reveal that Nexus analysis is “blind” to specific fundamental social, ecological and socio-historical dynamics of charcoal energy systems. The critical comparison also suggests the need to integrate the high-level systems analysis of Nexus with non-deterministic, non-prescriptive participatory analysis tools, like the 2MBio-A, as a means to

  2. The International Conference on Intelligent Biology and Medicine (ICIBM) 2016: from big data to big analytical tools.

    Science.gov (United States)

    Liu, Zhandong; Zheng, W Jim; Allen, Genevera I; Liu, Yin; Ruan, Jianhua; Zhao, Zhongming

    2017-10-03

    The 2016 International Conference on Intelligent Biology and Medicine (ICIBM 2016) was held on December 8-10, 2016 in Houston, Texas, USA. ICIBM included eight scientific sessions, four tutorials, one poster session, four highlighted talks and four keynotes that covered topics on 3D genomics structural analysis, next generation sequencing (NGS) analysis, computational drug discovery, medical informatics, cancer genomics, and systems biology. Here, we present a summary of the nine research articles selected from ICIBM 2016 program for publishing in BMC Bioinformatics.

  3. Supporting Imagers' VOICE: A National Training Program in Comparative Effectiveness Research and Big Data Analytics.

    Science.gov (United States)

    Kang, Stella K; Rawson, James V; Recht, Michael P

    2017-12-05

    Provided methodologic training, more imagers can contribute to the evidence basis on improved health outcomes and value in diagnostic imaging. The Value of Imaging Through Comparative Effectiveness Research Program was developed to provide hands-on, practical training in five core areas for comparative effectiveness and big biomedical data research: decision analysis, cost-effectiveness analysis, evidence synthesis, big data principles, and applications of big data analytics. The program's mixed format consists of web-based modules for asynchronous learning as well as in-person sessions for practical skills and group discussion. Seven diagnostic radiology subspecialties and cardiology are represented in the first group of program participants, showing the collective potential for greater depth of comparative effectiveness research in the imaging community. Copyright © 2017 American College of Radiology. Published by Elsevier Inc. All rights reserved.
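
    One of the core skills listed, cost-effectiveness analysis, rests on the incremental cost-effectiveness ratio (ICER): the extra cost per extra unit of health effect when moving from one strategy to another. A minimal sketch with hypothetical numbers:

```python
def icer(cost_new, effect_new, cost_old, effect_old):
    """Incremental cost-effectiveness ratio: additional cost per
    additional unit of health effect (e.g. dollars per
    quality-adjusted life year, QALY) of the new strategy."""
    return (cost_new - cost_old) / (effect_new - effect_old)


# Hypothetical imaging strategies:
# new strategy: $12,000 and 8.2 QALYs; old strategy: $10,000 and 8.0 QALYs
ratio = icer(12000, 8.2, 10000, 8.0)  # $10,000 per QALY gained
```

    The ratio is then compared against a willingness-to-pay threshold to judge whether the added diagnostic benefit is worth the added cost.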

  4. The ABCs of incentive-based treatment in health care: a behavior analytic framework to inform research and practice

    Directory of Open Access Journals (Sweden)

    Meredith SE

    2014-03-01

    Full Text Available Steven E Meredith,1 Brantley P Jarvis,2 Bethany R Raiff,3 Alana M Rojewski,2 Allison Kurti,2 Rachel N Cassidy,2 Philip Erb,2 Jolene R Sy,4 Jesse Dallery2 1Department of Psychiatry and Behavioral Sciences, The Johns Hopkins University School of Medicine, Baltimore, MD, USA; 2Department of Psychology, University of Florida, Gainesville, FL, USA; 3Department of Psychology, Rowan University, Glassboro, NJ, USA; 4Saint Louis University School of Social Work, St Louis, MO, USA Abstract: Behavior plays an important role in health promotion. Exercise, smoking cessation, medication adherence, and other healthy behavior can help prevent, or even treat, some diseases. Consequently, interventions that promote healthy behavior have become increasingly common in health care settings. Many of these interventions award incentives contingent upon preventive health-related behavior. Incentive-based interventions vary considerably along several dimensions, including who is targeted in the intervention, which behavior is targeted, and what type of incentive is used. More research on the quantitative and qualitative features of many of these variables is still needed to inform treatment. However, extensive literature on basic and applied behavior analytic research is currently available to help guide the study and practice of incentive-based treatment in health care. In this integrated review, we discuss how behavior analytic research and theory can help treatment providers design and implement incentive-based interventions that promote healthy behavior. Keywords: incentives, contingency management, conditional cash transfer, pay-for-performance, wellness

  5. A pilot analytic study of a research-level, lower-cost human papillomavirus 16, 18, and 45 test

    OpenAIRE

    Yang, Hannah P.; Walmer, David K.; Merisier, Delson; Gage, Julia C.; Bell, Laura; Rangwala, Sameera; Shrestha, Niwashin; Kobayashi, Lori; Eder, Paul S.; Castle, Philip E.

    2011-01-01

    The analytic performance of a low-cost, research-stage DNA test for the most carcinogenic human papillomavirus (HPV) genotypes (HPV16, HPV18, and HPV45) in aggregate was evaluated among carcinogenic HPV-positive women, which might be used to decide who needs immediate colposcopy in low-resource settings (“triage test”). We found that HPV16/18/45 test agreed well with two DNA tests, a GP5+/6+ genotyping assay (Kappa = 0.77) and a quantitative PCR assay (at a cutpoint of 5000 viral copies) (Kap...

  6. A QUALITATIVE RESEARCH REGARDING THE MARKETING COMMUNICATION TOOLS USED IN THE ONLINE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    POP Nicolae Al.

    2011-07-01

    Full Text Available Starting from the meaning of the communication process in marketing, the authors try to identify its role in ensuring the continuity of the management process with respect to the company's long-term relationships with all its partners. Emphasis is placed on the role of online communication and its tools in relationship marketing. To validate some of these ideas, the authors undertook qualitative marketing research among the managers of Romanian tourism companies. The qualitative part of the study aimed to identify the main tools underlying communication with the beneficiaries of tourism services, and the ways in which companies use online communication tools to attract, keep and develop long-term relationships with their customers in the virtual environment. The following tools were analyzed: websites, email marketing campaigns, e-newsletters, online advertising, search engines, sponsored links, blogs, RSS feeds, social networks, forums, online discussion groups, portals, infomediaries and instant messaging. The chosen investigation method was the selective survey, the research technique explorative interrogation, and the research instrument a semi-structured in-depth interview based on a conversation guide. A very important result is the classification obtained when respondents were asked to name the most efficient tools for attracting customers and for maintaining relationships with them. Although the notoriety of online marketing tools is high, some tools are known by definition but not used at all or not used correctly, while others are not known by definition but are used in practice. 
The authors contributed by validating a performing methodology of qualitative research, a study which will open new ways and means for making the online communication tools used for touristic services in

  7. Multimedia Informed Consent Tool for a Low Literacy African Research Population: Development and Pilot-Testing

    Science.gov (United States)

    Afolabi, Muhammed Olanrewaju; Bojang, Kalifa; D’Alessandro, Umberto; Imoukhuede, Egeruan Babatunde; Ravinetto, Raffaella M; Larson, Heidi Jane; McGrath, Nuala; Chandramohan, Daniel

    2014-01-01

    Background International guidelines recommend the use of appropriate informed consent procedures in low literacy research settings because written information is not known to guarantee comprehension of study information. Objectives This study developed and evaluated a multimedia informed consent tool for people with low literacy in an area where a malaria treatment trial was being planned in The Gambia. Methods We developed the informed consent document of the malaria treatment trial into a multimedia tool integrating video, animations and audio narrations in three major Gambian languages. Acceptability and ease of use of the multimedia tool were assessed using quantitative and qualitative methods. In two separate visits, the participants’ comprehension of the study information was measured by using a validated digitised audio questionnaire. Results The majority of participants (70%) reported that the multimedia tool was clear and easy to understand. Participants had high scores on the domains of adverse events/risk, voluntary participation and study procedures, while the lowest scores were recorded on the question items on randomisation. The differences in mean scores for participants’ ‘recall’ and ‘understanding’ between the first and second visits were statistically significant (F(1,41) = 25.38). Conclusions The multimedia tool was acceptable and easy to administer among low literacy participants in The Gambia. It also proved to be effective in delivering and sustaining comprehension of study information across a diverse group of participants. Additional research is needed to compare the tool to the traditional consent interview, both in The Gambia and in other sub-Saharan settings. PMID:25133065

  8. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  9. State Health Mapper: An Interactive, Web-Based Tool for Physician Workforce Planning, Recruitment, and Health Services Research.

    Science.gov (United States)

    Krause, Denise D

    2015-11-01

    Health rankings in Mississippi are abysmal. Mississippi also has fewer physicians to serve its population compared with all other states. Many residents of this predominately rural state do not have access to healthcare providers. To better understand the demographics and distribution of the current health workforce in Mississippi, the main objective of the study was to design a Web-based, spatial, interactive application to visualize and explore the physician workforce. A Web application was designed to assist in health workforce planning. Secondary datasets of licensure and population information were obtained, and live feeds from licensure systems are being established. Several technologies were used to develop an intuitive, user-friendly application. Custom programming was completed in JavaScript so the application could run on most platforms, including mobile devices. The application allows users to identify and query geographic locations of individual or aggregated physicians based on attributes included in the licensure data, to perform drive time or buffer analyses, and to explore sociodemographic population data by geographic area of choice. This Web-based application with analytical tools visually represents the physician workforce licensed in Mississippi and its attributes, and provides access to much-needed information for statewide health workforce planning and research. The success of the application is not only based on the practicality of the tool but also on its ease of use. Feedback has been positive and has come from a wide variety of organizations across the state.
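
    The buffer analysis such an application offers can be approximated with a great-circle distance filter over provider locations. This is a generic sketch, not the application's JavaScript code; the data layout and function names are hypothetical:

```python
import math

EARTH_RADIUS_KM = 6371.0


def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in degrees."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))


def within_buffer(physicians, center, radius_km):
    """Physicians whose practice location falls inside a circular buffer
    around center = (lat, lon)."""
    lat0, lon0 = center
    return [p for p in physicians
            if haversine_km(p["lat"], p["lon"], lat0, lon0) <= radius_km]


# Hypothetical records around Jackson, MS
doctors = [{"lat": 32.30, "lon": -90.18}, {"lat": 34.26, "lon": -88.70}]
nearby = within_buffer(doctors, (32.30, -90.18), 50.0)  # first record only
```

    Drive-time analysis refines this by replacing straight-line distance with travel time over the road network, but the filtering pattern is the same.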

  10. Knowledge Translation Tools are Emerging to Move Neck Pain Research into Practice.

    Science.gov (United States)

    Macdermid, Joy C; Miller, Jordan; Gross, Anita R

    2013-01-01

    Development or synthesis of the best clinical research is in itself insufficient to change practice. Knowledge translation (KT) is an emerging field focused on moving knowledge into practice, which is a non-linear, dynamic process that involves knowledge synthesis, transfer, adoption, implementation, and sustained use. Successful implementation requires using KT strategies based on theory, evidence, and best practice, including tools and processes that engage knowledge developers and knowledge users. Tools can provide instrumental help in implementing evidence. A variety of theoretical frameworks underlie KT and provide guidance on how tools should be developed or implemented. A taxonomy that outlines different purposes for engaging in KT and target audiences can also be useful in developing or implementing tools. Theoretical frameworks that underlie KT typically take different perspectives on KT with differential focus on the characteristics of the knowledge, knowledge users, context/environment, or the cognitive and social processes that are involved in change. Knowledge users include consumers, clinicians, and policymakers. A variety of KT tools have supporting evidence, including: clinical practice guidelines, patient decision aids, and evidence summaries or toolkits. Exemplars are provided of two KT tools to implement best practice in the management of neck pain: a clinician implementation guide (toolkit) and a patient decision aid. KT frameworks, taxonomies, clinical expertise, and evidence must be integrated to develop clinical tools that implement best evidence in the management of neck pain.

  11. Incorporating ethical principles into clinical research protocols: a tool for protocol writers and ethics committees.

    Science.gov (United States)

    Li, Rebecca H; Wacholtz, Mary C; Barnes, Mark; Boggs, Liam; Callery-D'Amico, Susan; Davis, Amy; Digilova, Alla; Forster, David; Heffernan, Kate; Luthin, Maeve; Lynch, Holly Fernandez; McNair, Lindsay; Miller, Jennifer E; Murphy, Jacquelyn; Van Campen, Luann; Wilenzick, Mark; Wolf, Delia; Woolston, Cris; Aldinger, Carmen; Bierer, Barbara E

    2016-04-01

    A novel Protocol Ethics Tool Kit ('Ethics Tool Kit') has been developed by a multi-stakeholder group of the Multi-Regional Clinical Trials Center of Brigham and Women's Hospital and Harvard. The purpose of the Ethics Tool Kit is to facilitate effective recognition, consideration and deliberation of critical ethical issues in clinical trial protocols. The Ethics Tool Kit may be used by investigators and sponsors to develop a dedicated Ethics Section within a protocol to improve the consistency and transparency between clinical trial protocols and research ethics committee reviews. It may also streamline ethics review and may facilitate and expedite the review process by anticipating the concerns of ethics committee reviewers. Specific attention was given to issues arising in multinational settings. With the use of this Tool Kit, researchers have the opportunity to address critical research ethics issues proactively, potentially speeding the time and easing the process to final protocol approval. Published by the BMJ Publishing Group Limited.

  12. Quantitative Risk reduction estimation Tool For Control Systems, Suggested Approach and Research Needs

    Energy Technology Data Exchange (ETDEWEB)

    Miles McQueen; Wayne Boyer; Mark Flynn; Sam Alessi

    2006-03-01

    For the past year we have applied a variety of risk assessment technologies to evaluate the risk to critical infrastructure from cyber attacks on control systems. More recently, we identified the need for a stand-alone control system risk reduction estimation tool to provide owners and operators of control systems with a more usable, reliable, and credible method for managing the risks from cyber attack. Risk is defined as the probability of a successful attack times the value of the resulting loss, typically measured in lives and dollars. Qualitative and ad hoc techniques for measuring risk do not provide sufficient support for cost-benefit analyses associated with cyber security mitigation actions. To address the need for better quantitative risk reduction models we surveyed previous quantitative risk assessment research; evaluated currently available tools; developed new quantitative techniques [17] [18]; implemented a prototype analysis tool to demonstrate how such a tool might be used; used the prototype to test a variety of underlying risk calculation engines (e.g., attack tree, attack graph); and identified technical and research needs. We concluded that significant gaps still exist and difficult research problems remain for quantitatively assessing the risk to control system components and networks, but that a usable quantitative risk reduction estimation tool is not beyond reach.
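The risk definition used in this abstract (probability of a successful attack times the value of the loss) lends itself to a direct cost-benefit sketch for a mitigation action. The figures below are invented for illustration and do not come from the report:

```python
def risk(p_success, loss):
    # Annualized risk: probability of a successful attack times expected loss.
    return p_success * loss

def mitigation_net_benefit(p_before, p_after, loss, mitigation_cost):
    # Risk reduction delivered by a mitigation, minus its cost.
    reduction = risk(p_before, loss) - risk(p_after, loss)
    return reduction - mitigation_cost

# Hypothetical figures: a control that drops the attack success probability
# from 0.10 to 0.02 against a $5M loss, at a mitigation cost of $150k.
net = mitigation_net_benefit(0.10, 0.02, 5_000_000, 150_000)
```

A positive net value argues for the mitigation; qualitative rankings, as the abstract notes, cannot support this kind of comparison.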

  13. Competitive Benchmarking: An IS Research Approach to Address Wicked Problems with Big Data and Analytics

    NARCIS (Netherlands)

    W. Ketter (Wolfgang); M. Peters (Markus); J. Collins (John); A. Gupta (Alok)

    2015-01-01

    Wicked problems like sustainable energy and financial market stability are societal challenges that arise from complex socio-technical systems in which numerous social, economic, political, and technical factors interact. Understanding and mitigating them requires research methods that

  14. Medical informatics: an essential tool for health sciences research in acute care.

    Science.gov (United States)

    Li, Man; Pickering, Brian W; Smith, Vernon D; Hadzikadic, Mirsad; Gajic, Ognjen; Herasevich, Vitaly

    2009-10-01

    Medical informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance, and application of a customized, Microsoft SQL-based critical care research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms ("sniffers"), administrative reports, decision support, and clinical research applications are presented.
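The "sniffer" alarms mentioned here query the warehouse for patients matching clinical rules. A toy stand-in, with an invented rule and invented field names rather than the actual ICU DataMart schema, might look like:

```python
def sniffer(patients, spo2_max=88, fio2_min=0.5):
    # Flag patients whose latest observations meet a simple hypoxemia-style
    # rule: low oxygen saturation despite a high fraction of inspired oxygen.
    return [p["id"] for p in patients
            if p["spo2"] < spo2_max and p["fio2"] >= fio2_min]

# Invented observation records, not real EMR data.
patients = [
    {"id": "pt-1", "spo2": 85, "fio2": 0.6},   # meets both criteria -> alarm
    {"id": "pt-2", "spo2": 96, "fio2": 0.4},   # does not
]
alarms = sniffer(patients)
```

In the real system such a rule would run as a query against the SQL warehouse on a schedule; the filtering logic is the same.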

  15. Medical Informatics: An Essential Tool for Health Sciences Research in Acute Care

    Directory of Open Access Journals (Sweden)

    Man Li

    2009-10-01

    Medical informatics has become an important tool in modern health care practice and research. In the present article we outline the challenges and opportunities associated with the implementation of electronic medical records (EMR) in complex environments such as intensive care units (ICU). We share our initial experience in the design, maintenance, and application of a customized, Microsoft SQL-based critical care research warehouse, ICU DataMart. ICU DataMart integrates clinical and administrative data from heterogeneous sources within the EMR to support research and practice improvement in the ICUs. Examples of intelligent alarms ("sniffers"), administrative reports, decision support, and clinical research applications are presented.

  16. Dynamic analytical and experimental research of shock absorber to safeguard the nuclear fuel assemblies

    Energy Technology Data Exchange (ETDEWEB)

    Dundulis, Gintautas, E-mail: gintas@mail.lei.lt [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); Grybenas, Albertas [Laboratory of Materials Research and Testing, Lithuanian Energy Institute, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); Karalevicius, Renatas [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); Makarevicius, Vidas [Laboratory of Materials Research and Testing, Lithuanian Energy Institute, Breslaujos str. 3, LT-44403 Kaunas (Lithuania); Rimkevicius, Sigitas; Uspuras, Eugenijus [Laboratory of Nuclear Installation Safety, Lithuanian Energy Institute, Breslaujos str. 3, LT-44403 Kaunas (Lithuania)

    2013-07-15

    Highlights: • Plastic deformation of the shock absorber. • Dynamic testing of the scaled shock absorber. • Dynamic simulation of the shock absorber using the finite element method. • Strain-rate evaluation in dynamic analysis. • Variation of displacement, acceleration and velocity during dynamic impact. -- Abstract: The Ignalina Nuclear Power Plant (NPP) has two RBMK-1500 graphite-moderated boiling-water multi-channel reactors. Ignalina NPP Unit 1 was shut down at the end of 2004, while Unit 2 has been in operation for over 5 years. After the shutdown, spent fuel assemblies with low burn-up remained at Unit 1. In order to reuse these assemblies in the reactor of Unit 2, a special set of equipment was developed. One of the most important items of this set is a container used for the transportation of spent fuel assemblies between the reactors of Unit 1 and Unit 2. A special shock absorber was designed to avoid failure of fuel assemblies in case of a hypothetical drop accident during loading/unloading of spent fuel assemblies to/from the container. This shock absorber was examined using scaled experiments and finite element analysis. Static and dynamic investigations of the shock absorber were performed to estimate and optimize its geometrical parameters. The objective of this work is to determine whether the proposed shock absorber design can fulfil the stopping function for the spent fuel assemblies and withstand the dynamic load. Experimental testing of scaled shock absorber models and dynamic analytical investigations using the finite element code ABAQUS/Explicit were performed. The simulation model was verified by comparing the experimental and simulation results, and it was concluded that the shock absorber is capable of withstanding the dynamic load, i.e., it successfully performs its force-suppression function in case of an accident.
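Independent of the finite element work, the governing magnitudes of a drop accident can be estimated from first principles: the impact velocity follows v = sqrt(2gh), and the mean force the absorber must carry follows from dissipating the drop energy m·g·h over the absorber's plastic crush distance. The numbers below are purely illustrative, not the Ignalina design values:

```python
from math import sqrt

G = 9.81  # gravitational acceleration, m/s^2

def impact_velocity(drop_height_m):
    # Free-fall velocity at impact: v = sqrt(2 g h).
    return sqrt(2 * G * drop_height_m)

def mean_crush_force(mass_kg, drop_height_m, crush_distance_m):
    # Average force if the drop energy m*g*h is dissipated uniformly
    # over the absorber's plastic crush distance d: F = m*g*h / d.
    return mass_kg * G * drop_height_m / crush_distance_m

# Hypothetical load case: a 500 kg assembly dropped 2 m onto an
# absorber that crushes 0.25 m.
v = impact_velocity(2.0)                 # ~6.3 m/s
F = mean_crush_force(500.0, 2.0, 0.25)   # ~39 kN
```

The finite element analysis in the paper refines this by resolving strain-rate effects and the time histories of displacement, velocity, and acceleration, which a mean-force estimate cannot capture.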

  17. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed with a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R(2) and Q(2). The accuracy and diagnostic capability of the batch model were then validated on the remaining batches. Assisted by high-performance liquid chromatography (HPLC) determination, process faults were explained by the corresponding variable contributions. Furthermore, a batch-level model was developed to compare and assess model performance. The present study demonstrates that DART-MS is very promising for process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a direct account of the effective constituents and can potentially be used to improve batch quality and process consistency for samples in complex matrices.
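A full MPLS model is beyond a short sketch, but the core idea of control trajectories, a per-time-point center and spread derived from normal batches, against which new batches are screened, can be illustrated for a single monitored variable. The data below are invented, not from the study:

```python
from statistics import mean, stdev

def control_trajectory(normal_batches):
    # Per-time-point mean and standard deviation across the normal batches.
    n_points = len(normal_batches[0])
    centers = [mean(b[t] for b in normal_batches) for t in range(n_points)]
    spreads = [stdev(b[t] for b in normal_batches) for t in range(n_points)]
    return centers, spreads

def out_of_control(batch, centers, spreads, k=3.0):
    # Time points where the batch leaves the +/- k-sigma control band.
    return [t for t, (x, c, s) in enumerate(zip(batch, centers, spreads))
            if abs(x - c) > k * s]

# Hypothetical single-variable trajectories (e.g. one DART-MS ion intensity
# sampled at three time points in the percolation).
normal = [[1.0, 2.0, 3.1], [1.1, 2.1, 3.0], [0.9, 1.9, 2.9], [1.0, 2.0, 3.0]]
centers, spreads = control_trajectory(normal)
faulty = [1.0, 2.0, 6.0]   # large deviation at the final time point
flags = out_of_control(faulty, centers, spreads)
```

MPLS generalizes this from one variable to the full unfolded batch-by-variable-by-time array, with latent variables replacing the raw measurements, but the monitoring logic is the same: learn a normal trajectory, then flag excursions.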

  18. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven, web-based family health history collection and analysis tool. Health Heritage© guides users through the collection of their family health history by relative, generates a pedigree, and completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and the risk assessments derived from Health Heritage and Usual Care with those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated-risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care in the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine.
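The sensitivity comparison reported here is, in essence, the fraction of the genetic team's reference risk conditions that each method recovered. A minimal sketch with hypothetical condition labels (the proportions loosely mirror, and do not reproduce, the study's 60% vs. 24% figures):

```python
def sensitivity(identified_risks, reference_risks):
    # Fraction of the reference (genetic-team) risk conditions
    # that the screening method also identified.
    ref = set(reference_risks)
    return len(ref & set(identified_risks)) / len(ref)

# Hypothetical condition labels, not the study's actual risk list.
reference = {"cond-1", "cond-2", "cond-3", "cond-4", "cond-5"}
tool_found = {"cond-1", "cond-2", "cond-3"}
usual_care_found = {"cond-1"}

s_tool = sensitivity(tool_found, reference)
s_usual = sensitivity(usual_care_found, reference)
```

Conditions found by a method but absent from the reference list would instead affect specificity, which this sketch does not model.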

  19. Web-based management of research groups - using the right tools and an adequate integration strategy

    Energy Technology Data Exchange (ETDEWEB)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de, E-mail: barroso@ipen.b, E-mail: mario@ipen.b [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil). Grupo de Pesquisa em Gestao do Conhecimento Aplicada a Area Nuclear

    2011-07-01

    Nowadays, broad interest in a set of interlinked subject areas can make the configuration of a research group highly diversified, both in its membership and in the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity founded 7 years ago that has steadily attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present members of the group, and the diversified nature of their relations with the group and with IPEN. The technical competencies and current research lines (or programs) are then described, as well as the research projects and the management scheme of the group. Next, the web-based management and collaboration tools are described, along with our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, such as graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)

  20. Web-based management of research groups - using the right tools and an adequate integration strategy

    International Nuclear Information System (INIS)

    Barroso, Antonio Carlos de Oliveira; Menezes, Mario Olimpio de

    2011-01-01

    Nowadays, broad interest in a set of interlinked subject areas can make the configuration of a research group highly diversified, both in its membership and in the binding relationships that glue the group together. That is the case of the research group for knowledge management and its applications to nuclear technology - KMANT at IPEN, a living entity founded 7 years ago that has steadily attracted new collaborators. This paper describes the strategic planning of the group, its charter and credo, the present members of the group, and the diversified nature of their relations with the group and with IPEN. The technical competencies and current research lines (or programs) are then described, as well as the research projects and the management scheme of the group. Next, the web-based management and collaboration tools are described, along with our experience with their use. KMANT has experimented with over 20 systems and software packages in this area, but we focus on those aimed at: (a) web-based project management (RedMine, ClockinIT, Who does, PhProjekt and Dotproject); (b) teaching platforms (Moodle); (c) mapping and knowledge representation tools (Cmap, Freemind and VUE); (d) simulation tools (Matlab, Vensim and NetLogo); (e) social network analysis tools (ORA, MultiNet and UciNet); (f) statistical analysis and modeling tools (R and SmartPLS). Special emphasis is given to the coupling of the group's permanent activities, such as graduate courses and regular seminars, and to how newcomers are selected and trained to join the group. A global assessment of the role of the management strategy and the available tool set in the group's performance is presented. (author)