WorldWideScience

Sample records for analytical tool development

  1. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure that temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly call the real fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented both for convenient portability of PFSAT among a wide variety of potential users and for its user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control and orbital maneuvering systems, but it may be used to predict heat leak into ground-based transfer lines as well. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
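The per-component heat-leak bookkeeping a tool like PFSAT performs can be illustrated with a minimal sketch. The abstract does not give PFSAT's equations; the functions, component values, and material properties below are hypothetical first-order estimates (grey-body radiation through insulation, 1-D Fourier conduction through supports):

```python
# Hypothetical sketch of per-segment heat-leak bookkeeping for a cryogenic
# feed line; NOT PFSAT's actual formulation, just standard first-order physics.

def radiation_heat_leak(area_m2, emissivity_eff, t_hot_k, t_cold_k):
    """Radiative heat leak through insulation (W), grey-body approximation."""
    SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W/m^2/K^4
    return SIGMA * emissivity_eff * area_m2 * (t_hot_k**4 - t_cold_k**4)

def conduction_heat_leak(k_w_mk, area_m2, length_m, t_hot_k, t_cold_k):
    """Conductive heat leak through a support or penetration (W), 1-D Fourier law."""
    return k_w_mk * area_m2 * (t_hot_k - t_cold_k) / length_m

# Sum contributions over a feed line described as a list of components
# (illustrative insulation emissivity, strut conductivity, and geometry).
line = [
    ("MLI section", radiation_heat_leak(0.5, 0.02, 300.0, 90.0)),
    ("support strut", conduction_heat_leak(0.25, 1e-4, 0.05, 300.0, 90.0)),
]
total_w = sum(q for _, q in line)
```

The total then drives downstream sizing decisions such as the TVS orifice diameter mentioned in the abstract.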

  2. Single cell analytic tools for drug discovery and development

    Science.gov (United States)

    Heath, James R.; Ribas, Antoni; Mischel, Paul S.

    2016-01-01

The genetic, functional, or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development [1-3]. In cancers, heterogeneity may be essential for tumor stability [4], but its precise role in tumor biology is poorly resolved. This challenges the design of accurate disease models for use in drug development, and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of heterogeneous tissues has motivated the development of tools for single cell genomic, transcriptomic, and multiplex proteomic analysis. We review these tools, assess their advantages and limitations, and explore their potential applications in drug discovery and development. PMID:26669673

  3. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting the Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of, and social media conversations about, organizations. We report and discuss results from two demonstrative case studies that were conducted using SODATO and conclude with implications and future work.

  4. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    Science.gov (United States)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

This paper describes the on-going development of a computational tool to analyse architecture and interior space based on a multi-slice representation approach integrated with Building Information Modelling (BIM). Architecture and interior space are experienced as dynamic entities whose spatial properties may vary from one part of the space to another; the representation of space through standard architectural drawings is therefore sometimes not sufficient. Representing space as a series of slices, each with its own properties, becomes important, so that the different characteristics in each part of the space can inform the design process. The analytical tool is developed as a stand-alone application that utilises data exported from a generic BIM modelling tool. The tool is intended to assist design development processes that apply BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. It allows identification of how spatial properties change dynamically throughout the space and prediction of potential design problems. Integrating the multi-slice analytical tool into a BIM-based design process could thereby help architects generate better designs and avoid the unnecessary costs that are often caused by failure to identify problems during the design development stages.

  5. Development of computer-based analytical tool for assessing physical protection system

    Energy Technology Data Exchange (ETDEWEB)

    Mardhi, Alim, E-mail: alim-m@batan.go.id [National Nuclear Energy Agency Indonesia, (BATAN), PUSPIPTEK area, Building 80, Serpong, Tangerang Selatan, Banten (Indonesia); Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand); Pengvanich, Phongphaeth, E-mail: ppengvan@gmail.com [Chulalongkorn University, Faculty of Engineering, Nuclear Engineering Department, 254 Phayathai Road, Pathumwan, Bangkok Thailand. 10330 (Thailand)

    2016-01-22

Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear material and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are currently available for immediate use; however, for our research purposes it was more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network-based approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.
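The network-based adversary-path analysis described in this abstract can be sketched as a shortest-path search. The graph, node names, and detection probabilities below are hypothetical, not the authors' model; the idea is that the most critical path is the route minimizing the adversary's cumulative chance of being detected, found by Dijkstra's algorithm over edge weights of -log(1 - p_detect):

```python
import heapq
import math

# Hypothetical facility graph: each edge carries the detection probability
# for traversing that segment of the facility.
edges = {
    "outside": [("fence", 0.3), ("gate", 0.7)],
    "fence": [("yard", 0.2)],
    "gate": [("yard", 0.1)],
    "yard": [("vault", 0.9)],
    "vault": [],
}

def most_critical_path(graph, start, target):
    """Return (path, probability of evading detection along it).

    Minimising the sum of -log(1 - p_detect) maximises the product of
    non-detection probabilities, i.e. finds the adversary's best route.
    """
    pq = [(0.0, start, [start])]  # (accumulated -log non-detection, node, path)
    settled = set()
    while pq:
        cost, node, path = heapq.heappop(pq)
        if node == target:
            return path, math.exp(-cost)
        if node in settled:
            continue
        settled.add(node)
        for nxt, p_detect in graph[node]:
            heapq.heappush(pq, (cost - math.log(1.0 - p_detect), nxt, path + [nxt]))
    return None, 0.0

path, p_nondetect = most_critical_path(edges, "outside", "vault")
# System effectiveness along this worst-case path is roughly 1 - p_nondetect.
```

Here the fence route (0.7 × 0.8 × 0.1 non-detection) beats the gate route, so it is the critical path a designer would reinforce first.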

  6. Development of computer-based analytical tool for assessing physical protection system

    Science.gov (United States)

    Mardhi, Alim; Pengvanich, Phongphaeth

    2016-01-01

Assessment of physical protection system effectiveness is a priority for ensuring optimum protection against unlawful acts at a nuclear facility, such as unauthorized removal of nuclear material and sabotage of the facility itself. Since an assessment based on real exercise scenarios is costly and time-consuming, a computer-based analytical tool offers a practical way to evaluate likely threat scenarios. Several tools, such as EASI and SAPE, are currently available for immediate use; however, for our research purposes it was more suitable to have a tool that can be customized and further enhanced. In this work, we have developed a computer-based analytical tool that uses a network-based approach to model adversary paths. The inputs are the multiple security elements used to evaluate the effectiveness of the system's detection, delay, and response. The tool can identify the most critical path and quantify the probability of system effectiveness as a performance measure.

  7. Big Data Geo-Analytical Tool Development for Spatial Analysis Uncertainty Visualization and Quantification Needs

    Science.gov (United States)

    Rose, K.; Bauer, J. R.; Baker, D. V.

    2015-12-01

As big data computing capabilities are increasingly paired with spatial analytical tools and approaches, there is a need to ensure that uncertainty associated with the datasets used in these analyses is adequately incorporated and portrayed in results. Often the products of spatial analyses, big data and otherwise, are developed using discontinuous, sparse, and often point-driven data to represent continuous phenomena, and results are generally presented without clear explanations of the uncertainty associated with the interpolated values. The Variable Grid Method (VGM) offers users a flexible approach designed for analyses where there is a need to study, evaluate, and analyze spatial trends and patterns while maintaining connection to, and communicating, the uncertainty in the underlying spatial datasets. The VGM outputs a simultaneous visualization of the spatial data analyses and a quantification of the underlying uncertainties, which can be calculated from sample density, sample variance, interpolation error, or uncertainty derived from multiple simulations. In this presentation we will show how we are utilizing Hadoop to store and perform spatial analysis through the development of custom Spark and MapReduce applications that incorporate ESRI Hadoop libraries. The team will present custom 'Big Data' geospatial applications that run on the Hadoop cluster and integrate ESRI ArcMap with the team's probabilistic VGM approach. The VGM-Hadoop tool has been built as a multi-step MapReduce application running on the Hadoop cluster for the purpose of data reduction. This reduction is accomplished by generating multi-resolution, non-overlapping, attributed topology that is then further processed using ESRI's geostatistical analyst to convey a probabilistic model of a chosen study region. Finally, we will share our approach to the implementation of data reduction and topology generation.
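The variable-resolution idea behind the VGM can be shown with a minimal sketch (the function, cell size, and threshold below are hypothetical, not NETL's implementation): cells holding many samples are flagged for fine resolution and carry a variance-based uncertainty, while sparse cells stay coarse.

```python
from statistics import pvariance

def vgm_summary(points, extent, cell=2.0, dense_threshold=3):
    """Toy variable-grid summary (illustrative only, not the VGM code).

    points: [(x, y, value)]; extent: (xmin, ymin, xmax, ymax).
    Bins points into square cells and, per occupied cell, reports the sample
    count, a resolution flag driven by sample density, and the population
    variance of the values as a simple uncertainty proxy.
    """
    xmin, ymin, xmax, ymax = extent
    cells = {}
    for x, y, v in points:
        key = (int((x - xmin) // cell), int((y - ymin) // cell))
        cells.setdefault(key, []).append(v)
    return {
        key: {
            "n": len(vals),
            "resolution": "fine" if len(vals) >= dense_threshold else "coarse",
            "variance": pvariance(vals) if len(vals) > 1 else None,
        }
        for key, vals in cells.items()
    }

summary = vgm_summary(
    [(0.1, 0.1, 10.0), (0.4, 0.2, 12.0), (0.9, 0.8, 11.0), (3.5, 3.5, 20.0)],
    (0.0, 0.0, 4.0, 4.0),
)
```

A real implementation would add the interpolation-error and multi-simulation uncertainty sources the abstract lists; the point is that resolution and uncertainty travel together per cell.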

  8. The Rack-Gear Tool Generation Modelling. Non-Analytical Method Developed in CATIA, Using the Relative Generating Trajectories Method

    Science.gov (United States)

    Teodor, V. G.; Baroiu, N.; Susac, F.; Oancea, N.

    2016-11-01

Modelling the family of surfaces associated with a pair of rolling centrodes, when the rack-gear's tooth profile is known from direct measurement as a coordinate matrix, has as its goal determining the generating quality for an imposed kinematics of the relative motion of the tool with respect to the blank. In this way, it is possible to determine the geometrical generating error as a component of the total error. The generation modelling highlights potential errors of the generating tool, so that its profile can be corrected before the tool is used in the machining process. A method developed in CATIA is proposed, based on a new approach, namely the method of “relative generating trajectories”. The analytical foundation is presented, together with applications to known models of rack-gear type tools used on Maag gear-cutting machines.

  9. Rapid process development of chromatographic process using direct analysis in real time mass spectrometry as a process analytical technology tool.

    Science.gov (United States)

    Yan, Binjun; Chen, Teng; Xu, Zhilin; Qu, Haibin

    2014-06-01

The concept of quality by design (QbD) is widely applied in the process development of pharmaceuticals. However, the additional cost and time involved have caused some resistance to QbD implementation. To show a possible solution, this work proposed a rapid process development method that uses direct analysis in real time mass spectrometry (DART-MS) as a process analytical technology (PAT) tool, taking the chromatographic processing of Ginkgo biloba L. as an example. The breakthrough curves were rapidly determined at-line by DART-MS. A high correlation coefficient of 0.9520 was found between the concentrations of ginkgolide A determined by DART-MS and by HPLC. Based on the PAT tool, the impacts of process parameters on the adsorption capacity were rapidly discovered; in particular, adsorption capacity decreased as the flow rate increased. This work has shown the feasibility and advantages of integrating PAT into QbD implementation for rapid process development.
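The cross-validation step the abstract reports (a correlation coefficient between at-line DART-MS responses and offline HPLC concentrations) amounts to a Pearson correlation over paired measurements. A sketch with hypothetical numbers, not the paper's data:

```python
# Pearson correlation between paired at-line and offline measurements.
# The two series below are illustrative placeholders; the study itself
# reports r = 0.9520 for ginkgolide A.

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / (sxx * syy) ** 0.5

dart = [0.11, 0.25, 0.38, 0.52, 0.69]  # hypothetical DART-MS responses
hplc = [0.10, 0.27, 0.36, 0.55, 0.66]  # hypothetical HPLC concentrations
r = pearson_r(dart, hplc)
```

A high r on such paired data is what justifies substituting the fast at-line method for the slower reference assay during process development.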

  10. On the Design, Development and Use of the Social Data Analytics Tool (SODATO)

    DEFF Research Database (Denmark)

    Hussain, Abid

… Science, Computational Social Science and Information Systems, the PhD project addressed two general research questions about the technological architectures and design principles for big social data analytics in an organisational context. The PhD project is grounded in the theory of socio-technical interactions for better understanding perception of, and action on, the screen when individuals use social media platforms such as Facebook. Based on this theory, a conceptual model of social data was generated. This conceptual model of social data consists of two … contributions consisting of design propositions, design principles, and software design patterns for big social data analytics in general, and an analytical framework for set-theoretical computational social science.

  11. Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning

    Science.gov (United States)

    Kelly, Nick; Thompson, Kate; Yeoman, Pippa

    2015-01-01

    This paper describes theory-led design as a way of developing novel tools for learning analytics (LA). It focuses upon the domain of automated discourse analysis (ADA) of group learning activities to help an instructor to orchestrate online groups in real-time. The paper outlines the literature on the development of LA tools within the domain of…

  12. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    Science.gov (United States)

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  13. An analytical model for resistivity tools

    Energy Technology Data Exchange (ETDEWEB)

    Hovgaard, J.

    1991-04-01

An analytical model for resistivity tools is developed. It takes into account the effect of the borehole and the actual shape of the electrodes. The model is two-dimensional, i.e. it does not deal with eccentricity. The electrical potential around a current source satisfies Poisson's equation. The method used here to solve Poisson's equation is the expansion of the potential function in terms of a complete set of functions involving one of the coordinates, with coefficients which are undetermined functions of the other coordinate. Numerical examples of the use of the model are presented, and the results are compared with results given in the literature. (au).
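The expansion method described in this abstract can be illustrated with a generic sketch (not the paper's exact notation or boundary setup). In a source-free axisymmetric region of a borehole, the potential satisfies Laplace's equation in cylindrical coordinates, and expanding in a complete set of functions of one coordinate leaves ordinary differential equations for the coefficient functions of the other:

```latex
% Generic sketch of the expansion method (not the paper's exact formulation).
% Away from the current source the potential satisfies Laplace's equation in
% cylindrical coordinates (r, z), assuming axial symmetry:
\[
\frac{1}{r}\frac{\partial}{\partial r}\!\left(r\,\frac{\partial \Phi}{\partial r}\right)
+ \frac{\partial^2 \Phi}{\partial z^2} = 0 .
\]
% Expanding in a complete set of functions of z with r-dependent coefficients,
\[
\Phi(r, z) = \sum_{n} f_n(r)\,\cos(k_n z),
\]
% each coefficient function must satisfy the modified Bessel equation
\[
f_n'' + \frac{1}{r}\,f_n' - k_n^2\, f_n = 0
\quad\Longrightarrow\quad
f_n(r) = A_n\, I_0(k_n r) + B_n\, K_0(k_n r),
\]
% with the constants A_n, B_n fixed by the boundary conditions at the
% borehole wall and the electrode surfaces.
```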

  14. Internet promotion tools and techniques: analytical review

    Directory of Open Access Journals (Sweden)

    S.M. Illiashenko

    2015-09-01

Full Text Available The aim of the article. The aim of the article is to analyse and systematize modern communication tools of Internet marketing and to develop recommendations for their use in promoting products in a virtual environment and maintaining a high level of communication with economic partners and contact groups. The results of the analysis. A systematic analysis and systematization of the known Internet marketing tools was carried out. The authors divide them into 8 categories according to functionality: Search Engine Marketing, Internet advertising, Social Relationship Marketing, Viral Marketing, Video Marketing, E-mail Marketing, Innovative Marketing and Analytical Marketing. Recommendations for the use of these tools by companies of various sizes are proposed, and the most popular Internet instruments for product promotion are noted. Based on the results of the analysis, the communication instruments of Internet marketing are divided into 4 closely interrelated groups. Their combined use leads to a synergistic effect, which appears as profit growth, consumer interest, and the creation of a positive company image. Today the once-forgotten method of E-mail Marketing, as well as interactive infographics, communication in the form of stories, marketing in social networks, and Analytical Marketing, have seen unexpected development. These instruments satisfy the needs of companies (the possibility of a solid presentation, an active communication link, and its precise measurement) and of consumers (interesting content, supported by visual imagery and information on request). Conclusions and directions for future research. The results can serve as methodological assistance in choosing rational sets of Internet marketing instruments that take into account the specifics of a producing company (or seller), its products, market, and target audience. Future research should be directed to the detection of inexpensive but effective Internet communication tools …

  15. The development and application of mid-infrared spectroscopy as a process analytical technology (PAT) tool for cell culture applications

    OpenAIRE

    Foley, Roisin

    2013-01-01

The objective of this thesis was to investigate the use of mid-infrared (MIR) spectroscopy as a PAT tool in bioprocessing. This was achieved through the development of chemometric models from MIR spectroscopic data. Models were applied to both upstream and downstream bioprocess steps to evaluate the potential of MIR as a PAT tool in each scenario. The first study included a preliminary examination of 8 typical components found in a mammalian cell culture medium. A multivariate limit of detec...

  16. Analytical Web Tool for CERES Products

    Science.gov (United States)

    Mitrescu, C.; Chu, C.; Doelling, D.

    2012-12-01

The CERES project provides the community with climate-quality observed TOA fluxes, consistent cloud properties, and computed profile and surface fluxes. The 11-year data set has proved invaluable to the remote sensing and climate modeling communities for studies of annual global mean energy, meridional heat transport, consistent clouds and fluxes, and climate trends. Moreover, a broader audience interested in Earth's radiative properties, such as green energy, health, and environmental companies, has shown interest in CERES derived products. A few years ago, the CERES team started developing a new web-based Ordering Tool tailored to this wide diversity of users. Recognizing the potential that Web 2.0 technologies offer for both quality control (QC) and scientific data visualization and manipulation, the CERES team began introducing a series of specialized functions to address these needs. Presented in an attractive, easy-to-use, modern web-based format, the Ordering Tool added the following analytical functions: i) 1-D histograms that display the distribution of a data field to identify outliers, useful for QC purposes; ii) an "Anomaly" map that shows the regional differences between the current month and the climatological monthly mean; iii) a 2-D histogram that can either identify potential problems with the data (i.e., a QC function) or provide a global view of trends and/or correlations between various CERES flux, cloud, aerosol, and atmospheric properties. The large volume and diversity of data, together with the need for on-the-fly execution, were the main challenges that had to be tackled. Depending on the application, execution takes place on either the browser side or the server side with the help of auxiliary files. Additional challenges came from the use of various open source applications, the multitude of CERES products, and the seamless transition from previous developments. For the future, we plan to expand the analytical capabilities of the
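The "Anomaly" map function described in this abstract reduces to a per-cell subtraction of the climatological mean for the current calendar month. A trivial sketch (cell labels and flux values are illustrative, not CERES data):

```python
# Monthly anomaly per grid cell: current month minus the long-term
# climatological mean for the same calendar month. Cell labels and the
# flux values (W/m^2) below are illustrative placeholders.

def monthly_anomaly(current, climatology):
    """current: {cell: value} for one month; climatology: {cell: long-term
    mean for the same calendar month}. Returns {cell: anomaly}."""
    return {cell: current[cell] - climatology[cell] for cell in current}

anom = monthly_anomaly(
    {"(0N,0E)": 241.3, "(0N,1E)": 239.8},
    {"(0N,0E)": 240.0, "(0N,1E)": 240.5},
)
```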

  17. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

This paper is a sub-set of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description starting from an arbitrary reference orbit, with explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A.A. Kolomensky and A.N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L.D. Landau and E.M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  18. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant, i.e., 1) plant level; 2) process-group level, and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based on comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for use with the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and
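The plant-level benchmarking arithmetic behind a tool like BEST-Dairy can be illustrated in a few lines (all numbers and the reference intensity below are hypothetical, not from the tool's reference database): compute the plant's energy intensity per unit of product and compare it against a best-practice reference to estimate the savings potential.

```python
# Toy plant-level benchmark in the spirit of BEST-Dairy; the production
# figures and best-practice reference intensity are invented for illustration.

def savings_potential(actual_use_kwh, production_kg, reference_intensity):
    """actual_use_kwh: annual energy use; production_kg: annual output;
    reference_intensity: best-practice kWh per kg.
    Returns (plant intensity in kWh/kg, estimated savings in kWh)."""
    intensity = actual_use_kwh / production_kg
    savings = max(0.0, (intensity - reference_intensity) * production_kg)
    return intensity, savings

intensity, savings = savings_potential(1_200_000.0, 2_000_000.0, 0.45)
```

A multi-level tool repeats this comparison per process group and per process step, which is what lets it localize where the savings opportunity actually sits.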

  19. Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

    The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry – including four dairy processes – cheese, fluid milk, butter, and milk powder.

  20. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper summarizes guidelines, lessons learned, and existing research on what analysts want and how to better understand which tools they do and do not need.

  1. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2004-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its eighteenth month of development activities.

  2. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  3. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  4. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2005-02-01

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its thirtieth month of development activities.

  6. ‘Slag_Fun’ – A New Tool for Archaeometallurgy: Development of an Analytical (P)ED-XRF Method for Iron-Rich Materials

    Directory of Open Access Journals (Sweden)

    Harald Alexander Veldhuijzen

    2003-11-01

    Full Text Available This paper describes the development of a new analytical tool for bulk chemical analysis of iron-rich archaeometallurgical remains by Polarising Energy Dispersive X-ray Fluorescence ((P)ED-XRF). Prompted by the ongoing archaeological and archaeometric analyses of early first millennium BC iron smelting and smithing finds from Tell Hammeh (az-Zarqa), Jordan, the creation of this tool has already benefited several studies on iron-rich slag of widely varying provenance as well as age (Anguilano 2002; Chirikure 2002; Ige and Rehren 2003; Stanway 2003). Following an explanation of the archaeological background and importance of the Hammeh finds, the paper describes the technical foundations of XRF analysis and the design, development and application of the "slag_fun" calibration method.
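
    The core of an XRF calibration of this kind is a regression of certified concentrations against measured fluorescence intensities. The sketch below is a minimal, hypothetical illustration (all numbers invented; the actual "slag_fun" method calibrates many elements against certified reference materials, not the single linear fit shown here):

    ```python
    import numpy as np

    # Hypothetical illustration of an XRF calibration curve: fit certified Fe
    # concentrations of reference materials against measured line intensities,
    # then predict an unknown sample. All values are invented for the sketch.
    certified_fe = np.array([35.2, 48.7, 55.1, 62.3, 70.8])    # wt% Fe in CRMs
    intensity = np.array([141.0, 196.0, 223.0, 251.0, 287.0])  # counts/s (arb.)

    slope, intercept = np.polyfit(intensity, certified_fe, 1)  # least squares

    unknown_intensity = 210.0
    predicted_fe = slope * unknown_intensity + intercept
    print(f"predicted Fe content: {predicted_fe:.1f} wt%")     # → 52.1 wt%
    ```

    In practice the fit would be validated against held-out reference materials before any unknowns are reported.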

  7. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  8. DSAT: Data Storage and Analytics Tool Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The aim of this project is the development of a large data warehousing and analysis tool for air traffic management (ATM) research that can be accessed by users through...

  9. Chemometrics tools used in analytical chemistry: an overview.

    Science.gov (United States)

    Kumar, Naveen; Bansal, Ankit; Sarma, G S; Rawal, Ravindra K

    2014-06-01

    This article presents important chemometric tools used to evaluate the data generated by various hyphenated analytical techniques, covering their applications from the advent of these techniques to today. The work is divided into sections covering multivariate regression methods and multivariate resolution methods; the final section deals with the applicability of chemometric tools in analytical chemistry. The main objective of this article is to review the chemometric methods used in analytical chemistry (qualitative/quantitative) to determine the elution sequence, classify various data sets, assess peak purity and estimate the number of chemical components. The reviewed methods can further be used for treating n-way data obtained by hyphenating LC with multi-channel detectors. We provide a detailed view of the important methods, together with their algorithms, so that researchers less familiar with chemometrics can understand and employ them.
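
    As a toy illustration of one task such reviews cover (estimating the number of chemical components in hyphenated-technique data), the following sketch applies PCA via SVD to a simulated LC-DAD data matrix; the two-component mixture and the rank-estimation threshold are invented for the example:

    ```python
    import numpy as np

    # Simulate an LC-DAD data matrix (retention times x wavelengths) as the
    # product of two elution profiles and two spectra, plus noise, then
    # estimate the chemical rank from the singular value spectrum.
    rng = np.random.default_rng(0)
    times = np.linspace(0, 10, 200)
    waves = np.linspace(220, 400, 50)

    # Two Gaussian elution profiles (C) and two Gaussian spectra (S).
    C = np.column_stack([np.exp(-0.5 * ((times - 4) / 0.5) ** 2),
                         np.exp(-0.5 * ((times - 6) / 0.7) ** 2)])
    S = np.column_stack([np.exp(-0.5 * ((waves - 260) / 20) ** 2),
                         np.exp(-0.5 * ((waves - 320) / 25) ** 2)])
    D = C @ S.T + rng.normal(scale=1e-3, size=(200, 50))  # data + noise

    # Rank estimation: count singular values well above the noise floor
    # (the 10x-mean threshold is an illustrative heuristic).
    sv = np.linalg.svd(D, compute_uv=False)
    n_components = int(np.sum(sv > 10 * sv[5:].mean()))
    print(n_components)  # expected: 2
    ```

    Real chemometric rank estimation uses more principled criteria (e.g. cross-validation or Malinowski's indicator function), but the singular-value gap shown here is the underlying idea.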

  10. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Full Text Available Abstract Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm, through developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as location of emergency resources or which resources should carry a defibrillator.
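
    The model's logic can be sketched as follows, assuming an illustrative exponential survival-vs-response-time curve and random driving times standing in for the GIS output (the constants below are invented, not those of the study):

    ```python
    import numpy as np

    # Hypothetical sketch: combine simulated response times (random here,
    # standing in for GIS driving-time output) with a survival curve, then
    # compare baseline survival against a 1-minute faster response.
    rng = np.random.default_rng(42)
    response_times = rng.gamma(shape=4.0, scale=2.0, size=100_000)  # minutes

    def survival_prob(t_minutes, k=0.12, s0=0.25):
        """Illustrative model: survival decays exponentially with response time."""
        return s0 * np.exp(-k * t_minutes)

    baseline = survival_prob(response_times).mean()
    one_min_faster = survival_prob(np.maximum(response_times - 1.0, 0.0)).mean()

    print(f"baseline survival: {baseline:.3f}")
    print(f"with 1-min faster response: {one_min_faster:.3f}")
    ```

    The same structure lets a planner evaluate a proposed EMS change (new station location, dual dispatch) by re-simulating the response-time distribution before any investment is made.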

  11. Analytical and Decision Support Tools for Genomics-Assisted Breeding

    OpenAIRE

    Varshney, Rajeev K.; Singh, Vikas K; Hickey, John M.; Xun, Xu; Marshall, David F; Wang, Jun; Edwards, David; Ribaut, Jean-Marcel

    2016-01-01

    To successfully implement genomics-assisted breeding (GAB) in crop improvement programs, efficient and effective analytical and decision support tools (ADSTs) are 'must haves' to evaluate and select plants for developing next-generation crops. Here we review the applications and deployment of appropriate ADSTs for GAB, in the context of next-generation sequencing (NGS), an emerging source of massive genomic information. We discuss suitable software tools and pipelines for marker-based approac...

  12. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  13. Pre-analytical workstations: a tool for reducing laboratory errors.

    Science.gov (United States)

    Da Rin, Giorgio

    2009-06-01

    Laboratory testing, a highly complex process commonly called the total testing process (TTP), is usually subdivided into three traditional (pre-, intra-, and post-) analytical phases. The majority of errors in TTP originate in the pre-analytical phase, being due to individual or system design defects. In order to reduce errors in TTP, the pre-analytical phase should therefore be prioritized. In addition to developing procedures, providing training and improving interdepartmental cooperation, information technology and robotics may be tools for reducing errors in specimen collection and pre-analytical sample handling. It has been estimated that >2000 clinical laboratories worldwide use total or subtotal automation supporting pre-analytical activities, with a high rate of increase compared to 2007; the need to reduce errors seems to be the catalyst for the increasing use of robotics. Automated systems to prevent medical personnel from drawing blood from the wrong patient were introduced commercially in the early 1990s. Correct patient identification and test tube labelling before phlebotomy are of extreme importance for patient safety in TTP, but currently few laboratories are interested in such products. At San Bassiano hospital, the implementation of advanced information technology and robotics in the pre-analytical phase (specimen collection and pre-analytical sample handling) has improved the accuracy and clinical efficiency of the laboratory process and created a TTP that minimizes errors.

  14. Medical text analytics tools for search and classification.

    Science.gov (United States)

    Huang, Jimmy; An, Aijun; Hu, Vivian; Tu, Karen

    2009-01-01

    A text-analytic tool has been developed that accepts clinical medical data as input in order to produce patient details. The integrated tool has the following four characteristics. 1) It has a graphical user interface. 2) It has a free-text search tool that is designed to retrieve records using keywords such as "MI" for myocardial infarction. The result set is a display of those sentences in the medical records that contain the keywords. 3) It has three tools to classify patients based on the likelihood of being diagnosed with myocardial infarction or hypertension, or on their smoking status. 4) A summary is generated for each patient selected. Large medical data sets provided by the Institute for Clinical Evaluative Sciences were used during the project.
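
    Characteristic 2, the keyword-based sentence retrieval, can be sketched as below; the records, the abbreviation table and the sentence splitter are hypothetical stand-ins for the tool's actual components:

    ```python
    import re

    # Minimal sketch of free-text search over clinical notes: return the
    # sentences in each record that contain a query keyword such as "MI".
    # The abbreviation expansion table and record texts are invented examples.
    ABBREVIATIONS = {"MI": "myocardial infarction"}

    records = {
        "patient-001": "Pt admitted with chest pain. Hx of MI in 2005. BP stable.",
        "patient-002": "No smoking history. Hypertension controlled with meds.",
    }

    def search(keyword, records):
        # Match the keyword and its expansion as whole words, case-insensitively,
        # so "MI" does not match the "mi" inside "admitted".
        terms = {keyword, ABBREVIATIONS.get(keyword, "")} - {""}
        hits = {}
        for rid, text in records.items():
            sentences = re.split(r"(?<=[.!?])\s+", text)
            matched = [s for s in sentences
                       if any(re.search(rf"\b{re.escape(t)}\b", s, re.I)
                              for t in terms)]
            if matched:
                hits[rid] = matched
        return hits

    print(search("MI", records))  # → {'patient-001': ['Hx of MI in 2005.']}
    ```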

  15. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Drachsler, Hendrik; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  16. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The different levels of detail between simple back-of-the-envelope calculations and more complex numerical analysis are discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.

  17. Method for effective usage of Google Analytics tools

    Directory of Open Access Journals (Sweden)

    Ирина Николаевна Егорова

    2016-01-01

    Full Text Available Modern Google Analytics tools were investigated with respect to identifying effective channels for attracting users and detecting bottlenecks. The investigation made it possible to suggest a method for the effective use of Google Analytics tools. The method is based on the analysis of the main traffic indicators, as well as on deep analysis of goals and their consecutive tweaking. The method allows website conversion to be increased and may be useful for SEO and web analytics specialists.
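
    The per-channel indicator analysis such a method builds on can be sketched as below; the channel figures are invented, and Google Analytics itself reports these conversion rates directly:

    ```python
    # Hypothetical per-channel export of sessions and goal completions,
    # used to compute conversion rates and rank acquisition channels.
    channels = {
        "organic":  {"sessions": 12000, "goal_completions": 240},
        "referral": {"sessions": 3000,  "goal_completions": 90},
        "paid":     {"sessions": 5000,  "goal_completions": 75},
    }

    rates = {name: d["goal_completions"] / d["sessions"]
             for name, d in channels.items()}
    best = max(rates, key=rates.get)

    for name, r in sorted(rates.items(), key=lambda kv: -kv[1]):
        print(f"{name}: {r:.1%}")
    print(f"most effective channel: {best}")
    ```

    A low-traffic channel with a high conversion rate (here "referral") is exactly the kind of finding that drives the goal-tweaking step the method describes.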

  18. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  19. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  20. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Ivan Bekavac; Daniela Garbin Praničević

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and proper choice of web tools for particular business models are also reviewed. The research is divided in two sections. First, a qualitative focus is placed on reviewing web analytic...

  1. Using Business Intelligence Tools for Predictive Analytics in Healthcare System

    Directory of Open Access Journals (Sweden)

    Mihaela-Laura IVAN

    2016-05-01

    Full Text Available The scope of this article is to highlight how healthcare analytics can be improved using Business Intelligence tools. The healthcare system has learned from previous lessons the necessity of using healthcare analytics for improving patient care, hospital administration, population growth studies and many other aspects. The Business Intelligence solutions applied in the current analysis demonstrate the benefits brought by new tools such as SAP HANA, SAP Lumira, and SAP Predictive Analytics. As a detailed example, the birth rate is analyzed together with the contribution of different factors worldwide.

  2. Analytical tools for speciation in the field of toxicology

    Energy Technology Data Exchange (ETDEWEB)

    Bresson, C. [CEA, DEN, DPC, SEARS, Gif-sur-Yvette (France). Laboratoire de Developpement Analytique Nucleaire, Isotopique et Elementaire; Chartier, F. [CEA, DEN, Gif-sur-Yvette (France). Dept. de Physico-Chimie; Ansoborlo, E. [CEA, DEN, DRCP, CETAMA, Marcoule, Bagnols-sur-ceze (France); Ortega, R. [Bordeaux Univ., CENBG, UMR 5797, Gradignan (France); CNRS, IN2P3, CENBG, UMR 5797, Gradignan (France)

    2013-08-01

    The knowledge of the speciation of elements at trace and ultra-trace levels in biological and environmental media is essential to acquire a better understanding of the mechanisms of toxicity, transport and accumulation in which they are involved. Determining the speciation of an element in a given medium is challenging and requires knowledge of different methodological approaches: the calculation approach and the experimental approach through the use of dedicated analytical and spectroscopic tools. In this framework, this mini-review reports the approaches used to investigate the speciation of elements in biological and environmental media, as well as the experimental techniques of speciation analysis, illustrated by recent examples. The main analytical and spectroscopic techniques to obtain structural, molecular, elemental and isotopic information are described. A brief overview of separation techniques coupled with spectrometric techniques is given. Imaging and micro-localisation techniques, which aim at determining the in situ spatial distribution of elements and molecules in various solid samples, are also presented. The last part deals with the development of micro-analytical systems, since they open crucial perspectives for speciation analysis of small sample amounts and analysis in the field. (orig.)

  3. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Full Text Available Taste is an important organoleptic property governing acceptance of products for administration through the mouth. However, the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is very difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; problems in recruiting taste panelists, motivation and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, the analytical taste-sensing multichannel sensory system called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. Thus, the e-tongue includes benefits like reducing reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  4. Electronic tongue: An analytical gustatory tool.

    Science.gov (United States)

    Latha, Rewanthwar Swathi; Lakshmi, P K

    2012-01-01

    Taste is an important organoleptic property governing acceptance of products for administration through the mouth. However, the majority of drugs available are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents. Thus, taste assessment is one important quality control parameter for evaluating taste-masked formulations. The primary method for the taste measurement of drug substances and formulations is by human panelists. The use of sensory panelists is very difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; problems in recruiting taste panelists, motivation and panel maintenance are significantly difficult when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, the analytical taste-sensing multichannel sensory system called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. Thus, the e-tongue includes benefits like reducing reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  5. Deriving Earth Science Data Analytics Tools/Techniques Requirements

    Science.gov (United States)

    Kempler, S. J.

    2015-12-01

    Data analytics applications have made successful strides in the business world, where co-analyzing extremely large sets of independent variables has proven profitable. Today, most data analytics tools and techniques, sometimes applicable to Earth science, have targeted the business industry. In fact, the literature is nearly absent of discussion about Earth science data analytics. Earth science data analytics (ESDA) is the process of examining large amounts of data from a variety of sources to uncover hidden patterns, unknown correlations, and other useful information. ESDA is most often applied to data preparation, data reduction, and data analysis. Co-analysis of an increasing number and volume of Earth science data sets has become more prevalent, ushered in by the plethora of Earth science data sources generated by US programs, international programs, field experiments, ground stations, and citizen scientists. Through work associated with the Earth Science Information Partners (ESIP) Federation, ESDA types have been defined in terms of data analytics end goals, which are very different from those in business and require different tools and techniques. A sampling of use cases has been collected and analyzed in terms of data analytics end goal types, volume, specialized processing, and other attributes. The goal of collecting these use cases is to better understand and specify requirements for data analytics tools and techniques yet to be implemented. This presentation will describe the attributes and preliminary findings of ESDA use cases, as well as provide early analysis of data analytics tools/techniques requirements that would support specific ESDA type goals. Representative existing data analytics tools/techniques relevant to ESDA will also be addressed.

  6. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both integration of multiple data sources and development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and incorporated data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
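
    The project's detectors (space-time cluster search and GA-trained neural networks) are beyond a short sketch, but the underlying idea of flagging anomalous health-indicator counts can be illustrated with a much simpler one-sided CUSUM on a synthetic daily syndromic series; all constants below are invented:

    ```python
    # One-sided CUSUM over daily syndromic counts: accumulate excess above
    # the baseline (minus a slack k) and raise an alarm when the cumulative
    # excess crosses a decision threshold h. Data and parameters are synthetic.
    baseline_mean = 20.0
    counts = [19, 22, 18, 21, 20, 23, 17, 20, 22, 19,   # in-control days
              34, 36, 33, 38]                            # injected outbreak

    k, h = 4.0, 15.0  # slack and decision threshold (illustrative)
    s, alarm_day = 0.0, None
    for day, c in enumerate(counts):
        s = max(0.0, s + (c - baseline_mean - k))
        if s > h:
            alarm_day = day
            break

    print(alarm_day)  # → 11 (the second outbreak day)
    ```

    Real surveillance systems tune k and h against historical data to trade detection delay against false alarms, and extend the same idea across space as well as time.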

  7. Nanomaterials as Analytical Tools for Genosensors

    Directory of Open Access Journals (Sweden)

    Anees Ahmad Ansari

    2010-01-01

    Full Text Available Nanomaterials are being increasingly used for the development of electrochemical DNA biosensors, due to the unique electrocatalytic properties found in nanoscale materials. They offer excellent prospects for interfacing biological recognition events with electronic signal transduction and for designing a new generation of bioelectronic devices exhibiting novel functions. In particular, nanomaterials such as noble metal nanoparticles (Au, Pt), carbon nanotubes (CNTs), magnetic nanoparticles, quantum dots and metal oxide nanoparticles have been actively investigated for their applications in DNA biosensors, which have become a new interdisciplinary frontier between biological detection and material science. In this article, we address some of the main advances in this field over the past few years, discussing the issues and challenges with the aim of stimulating a broader interest in developing nanomaterial-based biosensors and improving their applications in disease diagnosis and food safety examination.

  8. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytics tools to explore their functionalities and ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts to obtain useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section takes a quantitative focus, shifting from theory to an empirical approach, and presents output data resulting from a study based on perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either an IT or a marketing branch. The paper contributes by highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  9. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    Science.gov (United States)

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further building recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework--or typology--for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions, such as the IRS having access to source separated waste. A novel rapid evaluation and visualisation tool is presented--the integration radar (diagram), or InteRa--aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step
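
    An evaluation of this kind can be sketched as a simple scoring of an intervention against the four categories named above; the scores and the balance threshold below are invented for illustration (the actual InteRa indicators are defined in the paper):

    ```python
    # Hypothetical InteRa-style check: score an intervention 0-1 on each of
    # the four interfaces and flag whether attention is balanced across them.
    scores = {
        "solid waste management system": 0.8,
        "materials and value chain":     0.6,
        "society as a whole":            0.3,
        "organisation and empowerment":  0.5,
    }

    mean_score = sum(scores.values()) / len(scores)
    weakest = min(scores, key=scores.get)
    balanced = max(scores.values()) - min(scores.values()) <= 0.3  # assumed cut-off

    print(f"mean integration score: {mean_score:.2f}")
    print(f"weakest interface: {weakest}")
    print(f"balanced across interfaces: {balanced}")
    ```

    The point of the radar visualisation is exactly this kind of imbalance: a high mean score can hide a neglected interface that undermines the whole intervention.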

  10. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    Directory of Open Access Journals (Sweden)

    Ramalingam Peraman

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the AQbD approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement AQbD in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).

  11. Analytical quality by design: a tool for regulatory flexibility and robust analytics.

    Science.gov (United States)

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDA) with regulatory flexibility for quality by design (QbD) based analytical approaches. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and process analytical technology (PAT).
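
The MODR concept above can be illustrated with a toy screening calculation: evaluate a grid of method parameters and keep the combinations meeting an acceptance criterion. This is a minimal sketch; the resolution response surface, the parameter ranges, and the acceptance limit are all hypothetical and are not taken from the paper.

```python
# Toy method operable design region (MODR) search: screen combinations of two
# method parameters and keep those meeting an acceptance criterion.
# The response surface and limits below are purely illustrative.

def resolution(ph, flow_ml_min):
    """Hypothetical chromatographic resolution model (illustrative only)."""
    return 2.5 - 0.8 * abs(ph - 3.0) - 1.2 * abs(flow_ml_min - 1.0)

def modr(ph_values, flow_values, min_resolution=2.2):
    """Return all (pH, flow) combinations inside the operable region."""
    return [(ph, fl) for ph in ph_values for fl in flow_values
            if resolution(ph, fl) >= min_resolution]

region = modr([2.6, 2.8, 3.0, 3.2, 3.4], [0.8, 1.0, 1.2])
print(sorted(region))
```

Within the region any parameter combination meets the criterion, which is the sense in which AQbD permits movement without revalidation.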

  12. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data and database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  13. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  14. Learning Analytics: drivers, developments and challenges

    Directory of Open Access Journals (Sweden)

    Rebecca Ferguson

    2014-12-01

    Learning analytics is a significant area of Technology-Enhanced Learning (TEL) that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.

  15. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom;

    2014-01-01

    Resonant microstrings show promise as a new analytical tool for thermal characterization of polymers with only a few nanograms of sample. The detection of the glass transition temperature (Tg) of an amorphous poly(d,l-lactide) (PDLLA) and a semicrystalline poly(l-lactide) (PLLA) is investigated. The polymers are spray coated on one side of the resonating microstrings. The resonance frequency and quality factor (Q) are measured simultaneously as a function of temperature. A change in the resonance frequency reflects a change in static tensile stress, which yields information about the Young’s modulus of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency-independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively.
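
The stress information mentioned above follows from the standard relation for a string resonator, f = (1/2L)·sqrt(σ/ρ), which can be inverted to convert a measured resonance frequency into a static tensile stress. A minimal sketch follows; the dimensions, density, and frequency are illustrative values, not data from the paper.

```python
import math

# Fundamental resonance of a string of length L and density rho under tensile
# stress sigma: f = (1 / (2 * L)) * sqrt(sigma / rho).
# Inverting gives a static stress estimate from a measured frequency.

def stress_from_frequency(f_hz, length_m, density_kg_m3):
    """Invert the string-resonance relation for the tensile stress (Pa)."""
    return density_kg_m3 * (2.0 * length_m * f_hz) ** 2

length = 500e-6   # 500 um long microstring (illustrative)
rho = 3000.0      # kg/m^3, a silicon-nitride-like density (illustrative)
f0 = 200e3        # 200 kHz measured fundamental (illustrative)
sigma = stress_from_frequency(f0, length, rho)
print(sigma / 1e6, "MPa")
```

Tracking this stress as a function of temperature is what reveals the kink at the glass transition.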

  16. TNO monitoring plan development tool

    NARCIS (Netherlands)

    Sijacic, D.; Wildenborg, T.; Steeghs, P.

    2014-01-01

    TNO has developed a software tool that supports the design of a risk-based monitoring plan for a CO2 storage site. The purpose of the tool is to aid storage site operators by facilitating a structured monitoring-technology selection or evaluation process. The tool makes a selection of recommended

  17. 33 CFR 385.33 - Revisions to models and analytical tools.

    Science.gov (United States)

    2010-07-01

    ... analytical tools. 385.33 Section 385.33 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE... Incorporating New Information Into the Plan § 385.33 Revisions to models and analytical tools. (a) In carrying... and other analytical tools for conducting analyses for the planning, design, construction,...

  18. Environmental tools in product development

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen

    1994-01-01

    A precondition for design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical University of Denmark, in cooperation with 5 major Danish companies aiming at the development and testing of such tools. These tools are presented in this paper.

  19. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Young; Kim, Eung Soo [Seoul National University, Seoul (Korea, Republic of)

    2014-10-15

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including the hydrogen production process. Thus, it is vital to trace tritium behavior in the VHTR system and the potential permeation rate to the industrial process. In other words, tritium is a crucial safety issue in the fission reactor system. Therefore, it is necessary to understand the behavior of tritium, and the development of a tool to enable this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. The code has several distinctive features, including a non-diluted assumption, flexible applications, and the adoption of a distributed permeation model. Due to these features, BOTANIC can analyze a wide range of tritium-level systems and achieves higher accuracy, as it has the capacity to solve distributed models. BOTANIC was successfully developed and verified using analytical solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL; the results showed very good agreement with the analytical solutions and the calculation results of TPAC and COMSOL. Future work will be focused on total system verification.
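
As an illustration of the kind of relation a tritium transport code must evaluate, the following sketch computes a diffusion-limited (Richardson-type) permeation flux through a wall with an Arrhenius-type permeability. The constants are placeholders for illustration only, not BOTANIC's actual correlations.

```python
import math

# Richardson-type permeation flux through a wall of thickness d:
#   J = phi(T) * (sqrt(p_up) - sqrt(p_down)) / d
# with an Arrhenius permeability phi(T). Constants are placeholders.

R_GAS = 8.314  # gas constant, J/(mol K)

def permeability(t_kelvin, phi0=1.0e-7, e_act=60000.0):
    """Arrhenius-type permeability in mol/(m s Pa^0.5) (placeholder constants)."""
    return phi0 * math.exp(-e_act / (R_GAS * t_kelvin))

def permeation_flux(t_kelvin, p_up_pa, p_down_pa, wall_m):
    """Flux in mol/(m^2 s) for upstream/downstream partial pressures in Pa."""
    return permeability(t_kelvin) * (math.sqrt(p_up_pa) - math.sqrt(p_down_pa)) / wall_m

# Permeation rises steeply with temperature, which is why tritium transport
# matters for a high-temperature reactor coupled to an industrial process.
j_hot = permeation_flux(900.0, 100.0, 0.0, 1e-3)
j_cold = permeation_flux(600.0, 100.0, 0.0, 1e-3)
print(j_hot > j_cold)
```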

  20. Trial analytics--a tool for clinical trial management.

    Science.gov (United States)

    Bose, Anindya; Das, Suman

    2012-01-01

    Prolonged timelines and large expenses associated with clinical trials have prompted a new focus on improving the operational efficiency of clinical trials by use of Clinical Trial Management Systems (CTMS) in order to improve managerial control in trial conduct. However, current CTMS are not able to meet expectations due to various shortcomings, such as the inability to provide timely reporting and trend visualization within/beyond an organization. To overcome these shortcomings of CTMS, clinical researchers can apply a business intelligence (BI) framework to create Clinical Research Intelligence (CLRI) for optimization of data collection and analytics. This paper proposes the usage of an innovative and collaborative visualization tool (CTA) as a CTMS add-on to help overcome these deficiencies of traditional CTMS, with suitable examples.

  1. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing.

  2. Development of Nuclear Analytical Technology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Joon; Kim, J. Y.; Sohn, S. C. (and others)

    2007-06-15

    The pre-treatment and handling techniques for the micro-particles in swipe samples were developed for safeguards purposes. A screening technique for the swipe samples has been established using the nuclear fission track method as well as the alpha track method. A laser ablation system to take a nuclear particle present in a swipe was designed and constructed for the determination of the enrichment factors for uranium or plutonium, and its performance was tested in atmosphere as well as in vacuum. The optimum conditions for the synthesis of silica-based micro-particles were obtained for mass production. The optimum ion exchange resin was selected and the optimum conditions for uranium adsorption in the resin bead technique were established for the determination of the enrichment factor for nuclear particles in swipes. The established technique was applied to swipes taken directly from a nuclear facility and also to archive samples of the IAEA's environmental swipes. Evaluations of the neutron and secondary gamma-ray dose rates for the radiation shields were carried out to design the NIPS system, as well as an evaluation of the thermal neutron concentration effect of various reflectors. A D-D neutron generator was introduced as a neutron source for the NIPS system, having advantages such as easier control and moderation capability compared with the {sup 252}Cf source. Simulated samples for explosives and chemical warfare agents were prepared to construct a prompt gamma-ray database. Based on the constructed database, a computer program for the detection of illicit chemical and nuclear materials was developed using the MATLAB software.

  3. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

    A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  4. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to assemble easily a distributed control system in software and immediately assess the overall impacts of the system on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration to all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating DCS (distributed engine control systems) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and hardware simulator provides the capability to simulate virtual subcomponents, as well as swap actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the CMAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  5. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn M.; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
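
The tier classification could, in spirit, look like the following sketch: a screening rule that combines measured occurrence with a toxicity flag. The thresholds and field names here are hypothetical and do not reproduce the USGS screening criteria.

```python
# Illustrative tier screening in the spirit of the procedure described above;
# thresholds and field names are hypothetical, not the USGS criteria.

def classify(compound):
    """Assign Tier 1 (high), 2 (moderate), or 3 (low) priority."""
    freq = compound["detection_frequency"]  # fraction of samples with detections
    toxic = compound["exceeds_benchmark"]   # any concentration above a toxicity benchmark?
    if toxic or freq >= 0.10:
        return 1
    if freq >= 0.01:
        return 2
    return 3

compounds = [
    {"name": "frequently_detected", "detection_frequency": 0.55, "exceeds_benchmark": True},
    {"name": "occasional", "detection_frequency": 0.03, "exceeds_benchmark": False},
    {"name": "rare", "detection_frequency": 0.001, "exceeds_benchmark": False},
]
tiers = {c["name"]: classify(c) for c in compounds}
print(tiers)
```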

  6. Enhancing simulation of efficiency with analytical tools. [combining computer simulation and analytical techniques for cost reduction

    Science.gov (United States)

    Seltzer, S. M.

    1974-01-01

    Some means of combining both computer simulation and analytical techniques are indicated in order to mutually enhance their efficiency as design tools and to motivate those involved in engineering design to consider using such combinations. While the idea is not new, heavy reliance on computers often seems to overshadow the potential utility of analytical tools. Although the example used is drawn from the area of dynamics and control, the principles espoused are applicable to other fields. In the example the parameter plane stability analysis technique is described briefly and extended beyond that reported in the literature to increase its utility (through a simple set of recursive formulas) and its applicability (through the portrayal of the effect of varying the sampling period of the computer). The numerical values that were rapidly selected by analysis were found to be correct for the hybrid computer simulation for which they were needed. This obviated the need for cut-and-try methods to choose the numerical values, thereby saving both time and computer utilization.

  7. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development of cancer, improve patients’ chances of recovery, and limit the physical, psychological and social side-effects of cancer. The website is the main channel for communication and knowledge sharing with patients, their relatives and professionals. The present study consists of two independent analyses; one provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice

  8. Analytical and Semi-Analytical Tools for the Design of Oscillatory Pumping Tests.

    Science.gov (United States)

    Cardiff, Michael; Barrash, Warren

    2015-01-01

    Oscillatory pumping tests, in which flow is varied in a periodic fashion, provide a method for understanding aquifer heterogeneity that is complementary to strategies such as slug testing and constant-rate pumping tests. During oscillatory testing, pressure data collected at non-pumping wells can be processed to extract metrics, such as signal amplitude and phase lag, from a time series. These metrics are robust against common sensor problems (including drift and noise) and have been shown to provide information about aquifer heterogeneity. Field implementations of oscillatory pumping tests for characterization, however, are not common and thus there are few guidelines for their design and implementation. Here, we use available analytical solutions from the literature to develop design guidelines for oscillatory pumping tests, while considering practical field constraints. We present two key analytical results for design and analysis of oscillatory pumping tests. First, we provide methods for choosing testing frequencies and flow rates which maximize the signal amplitude that can be expected at a distance from an oscillating pumping well, given design constraints such as maximum/minimum oscillator frequency and maximum volume cycled. Preliminary data from field testing helps to validate the methodology. Second, we develop a semi-analytical method for computing the sensitivity of oscillatory signals to spatially distributed aquifer flow parameters. This method can be quickly applied to understand the "sensed" extent of an aquifer at a given testing frequency. Both results can be applied given only bulk aquifer parameter estimates, and can help to optimize design of oscillatory pumping test campaigns.
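
The amplitude and phase-lag extraction described above can be sketched as a discrete Fourier projection of the pressure record onto sine and cosine at the known pumping frequency. The record below is synthetic and the code is an illustration of the general idea, not the authors' implementation.

```python
import math

# Project a uniformly sampled signal onto cos/sin at a known frequency to
# recover its amplitude and phase lag (synthetic data; illustrative only).

def amplitude_phase(times, values, freq_hz):
    """Discrete Fourier projection; valid over an integer number of periods."""
    n = len(values)
    w = 2.0 * math.pi * freq_hz
    a = 2.0 / n * sum(v * math.cos(w * t) for t, v in zip(times, values))
    b = 2.0 / n * sum(v * math.sin(w * t) for t, v in zip(times, values))
    return math.hypot(a, b), math.atan2(b, a)

# Synthetic record: 0.5 m head oscillation with a 0.3 rad phase lag at 0.1 Hz,
# sampled at 10 Hz over exactly 10 periods so the projection is unbiased.
f = 0.1
ts = [i * 0.1 for i in range(1000)]
ys = [0.5 * math.cos(2 * math.pi * f * t - 0.3) for t in ts]
amp, lag = amplitude_phase(ts, ys, f)
print(round(amp, 3), round(lag, 3))
```

Because only the amplitude and phase at the driving frequency are retained, sensor drift and broadband noise largely average out, which is the robustness property noted in the abstract.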

  9. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application to use the ultrasonic trap as an analytical tool. [Translated from German:] The ultrasonic trap offers a unique way of handling samples at the microlitre scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytics are investigated experimentally. By coupling the trap with typical contactless analysis methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on a variety of materials, from inorganic, organic and pharmaceutical substances to proteins, nano- and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-less sample mounting proved advantageous: the results are comparable to those obtained with conventional sample holders and partly surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups at the synchrotron beamlines. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the trap's potential for preconcentration was recognised and applied to the study of evaporation-controlled processes.

  11. Aspects of recent developments in analytical chemometrics

    Institute of Scientific and Technical Information of China (English)

    LIANG; Yizeng; WU; Hailong; SHEN; Guoli; JIANG; Jianhui; LIANG; Sheng

    2006-01-01

    Some aspects of recent developments in analytical chemometrics are discussed, in particular developments viewed from the angle of the research efforts undertaken in the authors' laboratories. The topics concerned include resolution of high-order chemical data, morphological theory and methodology for chemical signal processing, multivariate calibration and chemical pattern recognition for solving complex chemical problems, and resolution of two-way chemical data from hyphenated chromatographic instruments.

  12. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research nationally and internationally found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce competent graduates in employability skills needed by…

  13. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how organizations' orientations (process, innovation, and technology), as critical organizational factors, affect the attitude towards the use of the analytical tools of aCRM IS.

  14. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements was identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.

  15. A results-based process for evaluation of diverse visual analytics tools

    Science.gov (United States)

    Rubin, Gary; Berger, David H.

    2013-05-01

    With the pervasiveness of still and full-motion imagery in commercial and military applications, the need to ingest and analyze these media has grown rapidly in recent years. Additionally, video hosting and live camera websites provide a near real-time view of our changing world with unprecedented spatial coverage. To take advantage of these controlled and crowd-sourced opportunities, sophisticated visual analytics (VA) tools are required to accurately and efficiently convert raw imagery into usable information. Whether investing in VA products or evaluating algorithms for potential development, it is important for stakeholders to understand the capabilities and limitations of visual analytics tools. Visual analytics algorithms are being applied to problems related to Intelligence, Surveillance, and Reconnaissance (ISR), facility security, and public safety monitoring, to name a few. The diversity of requirements means that a one-size-fits-all approach to performance assessment will not work. We present a process for evaluating the efficacy of algorithms in real-world conditions, thereby allowing users and developers of video analytics software to understand software capabilities and identify potential shortcomings. The results-based approach described in this paper uses an analysis of end-user requirements and Concept of Operations (CONOPS) to define Measures of Effectiveness (MOEs), test data requirements, and evaluation strategies. We define metrics that individually do not fully characterize a system, but when used together, are a powerful way to reveal both strengths and weaknesses. We provide examples of data products, such as heatmaps, performance maps, detection timelines, and rank-based probability-of-detection curves.

  16. Revolutions in Neuroscience: Tool Development

    Directory of Open Access Journals (Sweden)

    John Bickle

    2016-03-01

    Full Text Available Thomas Kuhn’s famous model of the components and dynamics of scientific revolutions is still dominant to this day across science, philosophy, and history. The guiding philosophical theme of this paper is that, concerning actual revolutions in neuroscience over the past sixty years, Kuhn’s account is wrong. There have been revolutions, and new ones are brewing, but they do not turn on competing paradigms, anomalies, or the like. Instead, they turn exclusively on the development of new experimental tools. I adopt a metascientific approach and examine in detail the development of two recent neuroscience revolutions: the impact of engineered genetically mutated mammals in the search for causal mechanisms of higher cognitive functions; and the more recent impact of optogenetics (and DREADDs). The two key metascientific concepts I derive from these case studies are a revolutionary new tool’s motivating problem, and its initial and second-phase hook experiments. These concepts hardly exhaust a detailed metascience of Tool Development experiments in neuroscience, but they get that project off to a useful start and distinguish the subsequent account of neuroscience revolutions clearly from Kuhn’s famous model. I close with a brief remark about the general importance of molecular biology for a current philosophical understanding of science, as comparable to the place physics occupied when Kuhn formulated his famous theory of scientific revolutions.

  17. Analytic Tools for Evaluating Variability of Standard Errors in Large-Scale Establishment Surveys

    Directory of Open Access Journals (Sweden)

    Cho MoonJung

    2014-12-01

    Full Text Available Large-scale establishment surveys often exhibit substantial temporal or cross-sectional variability in their published standard errors. This article uses a framework defined by survey generalized variance functions to develop three sets of analytic tools for the evaluation of these patterns of variability. These tools are for (1) identification of predictor variables that explain some of the observed temporal and cross-sectional variability in published standard errors; (2) evaluation of the proportion of variability attributable to the abovementioned predictors, equation error and estimation error, respectively; and (3) comparison of equation error variances across groups defined by observable predictor variables. The primary ideas are motivated and illustrated by an application to the U.S. Current Employment Statistics program.
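Tool (1) above amounts to regressing published standard errors on candidate predictors. As a hedged sketch (the model form log(SE²) = a + b·log(Y) is one common generalized-variance-function specification, and the data below are synthetic, not from the article):

```python
import math

def fit_gvf(estimates, std_errors):
    """Fit the generalized variance function log(SE^2) = a + b*log(Y)
    by ordinary least squares over (estimate, standard error) pairs."""
    xs = [math.log(y) for y in estimates]
    ys = [math.log(se ** 2) for se in std_errors]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Synthetic survey cells whose variances follow SE^2 = 2 * Y^0.5 exactly,
# so the fit should recover a = ln 2 and b = 0.5.
est = [math.e ** k for k in range(3)]
ses = [math.sqrt(2 * y ** 0.5) for y in est]
a, b = fit_gvf(est, ses)
print(round(b, 6))  # → 0.5
```

With real survey data the residuals from such a fit would then be decomposed into equation error and estimation error, as in tool (2).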

  18. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft;

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD...
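One plausible reading of the patient-median approach can be sketched as follows; the tolerance value, the data layout, and the baseline are invented for illustration and are not taken from the paper.

```python
from statistics import median

def flag_unstable_days(daily_results, baseline, tolerance=0.05):
    """Flag days whose median patient result drifts more than
    `tolerance` (as a fraction) from the established baseline median."""
    flagged = []
    for day, results in sorted(daily_results.items()):
        m = median(results)
        if abs(m - baseline) / baseline > tolerance:
            flagged.append((day, round(m, 2)))
    return flagged

# Days 1-2 cluster near the baseline of 5.0; day 3 simulates drift.
data = {
    1: [4.9, 5.0, 5.1, 5.0],
    2: [5.0, 5.1, 4.8, 5.0],
    3: [5.4, 5.5, 5.6, 5.3],
}
print(flag_unstable_days(data, baseline=5.0))  # → [(3, 5.45)]
```

The appeal of the median here is its robustness: a few genuinely pathological patient results shift it far less than they would shift a mean, so sustained movement signals the analyzer rather than the patients.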

  19. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available Bollywood, the Hindi movie industry, is one of the fastest-growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, pubs and discothèques. A great deal of manpower, man-hours, creative talent, and money is put in to build a quality feature film, and Bollywood continuously works towards providing the world's seven billion people with something new. It is therefore important for the movie and production team to stand out and grab the attention of the maximum audience. Movie makers today employ various tools and techniques to market their movies, leaving no stone unturned: they roll out teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, the release itself, and post-release follow-up to pull viewers to the cineplex. Today's audience, which comprises mainly youth, wants photos, videos, meet-ups, gossip, debate, collaboration, and content creation; these requirements are most fully met through digital platforms. However, traditional media like newspapers, radio, and television are not obsolete: they reach a mass audience and play a major role in effective marketing. This study aims at analysing these tools for their effectiveness. The objectives are fulfilled through a consumer survey, which brings out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts.

  20. Galileo's Discorsi as a Tool for the Analytical Art.

    Science.gov (United States)

    Raphael, Renee Jennifer

    2015-01-01

    A heretofore overlooked response to Galileo's 1638 Discorsi is described by examining two extant copies of the text (one which has received little attention in the historiography, the other apparently unknown) which are heavily annotated. It is first demonstrated that these copies contain annotations made by Seth Ward and Sir Christopher Wren. This article then examines one feature of Ward's and Wren's responses to the Discorsi, namely their decision to re-write several of Galileo's geometrical demonstrations into the language of symbolic algebra. It is argued that this type of active reading of period mathematical texts may have been part of the regular scholarly and pedagogical practices of early modern British mathematicians like Ward and Wren. A set of Appendices contains a transcription and translation of the analytical solutions found in these annotated copies.

  1. Information and Analytic Maintenance of Nanoindustry Development

    Directory of Open Access Journals (Sweden)

    Glushchenko Aleksandra Vasilyevna

    2015-05-01

    Full Text Available The successful course of nanotechnological development depends in many respects on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways of satisfying them most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters of nanotech industry development. On the basis of the conducted research, the need for an information system for nanotech industry development is substantiated. The information interrelations among nanotech industry actors are revealed, supporting the communicative function of accounting, which becomes dominant over its control function. The information needs of users of financial and non-financial information are defined. The stages of introduction are described in detail, from determining the character, volume, list, and timeliness of information to creating a system of management reporting, analysis, and control. The information and analytical system is focused on a general assessment of efficiency and the major economic indicators, the general tendencies of nanotech industry development, and possible reserves for increasing the efficiency of its functioning. The authors develop a system of indicators characterizing the advancement of the nanotech industry that allows estimating innovative activity in the sphere of nanotech, calculating the intensity of nano-innovation costs, and determining the productivity and efficiency of the nanotech industry in a branch, a region, and the national economy in general.

  2. A Mobile Network Planning Tool Based on Data Analytics

    Directory of Open Access Journals (Sweden)

    Jessica Moysen

    2017-01-01

    Full Text Available Planning future mobile networks entails multiple challenges due to the high complexity of the network to be managed. Beyond-4G and 5G networks are expected to be characterized by a high densification of nodes and heterogeneity of layers, applications, and Radio Access Technologies (RATs). In this context, a network planning tool capable of dealing with this complexity is highly convenient. The objective is to exploit the information produced by and already available in the network to properly deploy, configure, and optimise network nodes. This work presents such a smart network planning tool, which exploits Machine Learning (ML) techniques. The proposed approach is able to predict the Quality of Service (QoS) experienced by the users based on the measurement history of the network. We select Physical Resource Blocks (PRB) per Megabit (Mb) as our main QoS indicator to optimise, since minimizing this metric allows offering the same service to users while consuming fewer resources, thus being more cost-effective. Two case studies are considered in order to evaluate the performance of the proposed scheme: one to smartly plan a small cell deployment in a dense indoor scenario, and a second to respond in a timely manner to a detected fault in a macrocell network.
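The prediction step described above could, for instance, be approximated by a nearest-neighbour regressor over the network's measurement history. This is a hedged sketch, not the paper's actual ML pipeline; the features (SINR and cell load), the distance metric, and the numbers are all assumptions for illustration.

```python
import math

def predict_prb_per_mb(history, query, k=2):
    """Predict PRB-per-Mb for a new measurement vector as the mean
    target value of its k nearest historical measurements."""
    by_distance = sorted(
        (math.dist(features, query), prb_per_mb)
        for features, prb_per_mb in history
    )
    return sum(value for _, value in by_distance[:k]) / k

# History of (features=(SINR in dB, cell load), observed PRB-per-Mb).
history = [
    ((10.0, 0.2), 1.0),
    ((12.0, 0.3), 0.9),
    ((5.0, 0.8), 2.0),
    ((4.0, 0.9), 2.2),
]
# A query resembling the good-signal, low-load observations should
# predict a low PRB-per-Mb cost.
print(round(predict_prb_per_mb(history, query=(11.0, 0.25)), 4))  # → 0.95
```

A planner would then rank candidate node configurations by predicted PRB-per-Mb and deploy the cheapest one that still meets the QoS target.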

  3. Ethnography in design: Tool-kit or analytic science?

    DEFF Research Database (Denmark)

    Bossen, Claus

    2002-01-01

    The role of ethnography in system development is discussed through the selective application of an ethnographic easy-to-use toolkit, Contextual Design, by a computer firm in the initial stages of the development of a health care system.

  4. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    Directory of Open Access Journals (Sweden)

    Shane Dawson

    2014-09-01

    Full Text Available The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

  5. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize the coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals a negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary, excited by a 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7.
The increase of p

  7. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; Clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in house. The tools covered in this report include: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing performed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.

  8. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, D. P.; Lucas, P. L.; Häyhä, T.; Cornell, S. E.; Stafford-Smith, M.

    2015-09-01

    There is a need for further integrated research on developing a set of sustainable development objectives, based on the proposed framework of planetary boundaries indicators. The relevant research questions are divided in this paper into four key categories, related to (1) the underlying processes and selection of key indicators, (2) understanding the impacts of different exposure levels and the influence of connections between different types of impacts, (3) a better understanding of different response strategies, and (4) the available options to implement changes. Clearly, different categories of scientific disciplines and associated models exist that can contribute to the necessary analysis, noting that the distinctions between them are fuzzy. In the paper, we indicate both how different models relate to the four categories of questions and how further insights can be obtained by connecting the different disciplines (without necessarily fully integrating them). Research on integration can support planetary boundary quantification in a credible way, linking human drivers and social and biophysical impacts.

  9. Monitoring automotive oil degradation: analytical tools and onboard sensing technologies.

    Science.gov (United States)

    Mujahid, Adnan; Dickert, Franz L

    2012-09-01

    Engine oil experiences a number of thermal and oxidative phases that yield acidic products in the matrix consequently leading to degradation of the base oil. Generally, oil oxidation is a complex process and difficult to elucidate; however, the degradation pathways can be defined for almost every type of oil because they mainly depend on the mechanical status and operating conditions. The exact time of oil change is nonetheless difficult to predict, but it is of great interest from an economic and ecological point of view. In order to make a quick and accurate decision about oil changes, onboard assessment of oil quality is highly desirable. For this purpose, a variety of physical and chemical sensors have been proposed along with spectroscopic strategies. We present a critical review of all these approaches and of recent developments to analyze the exact lifetime of automotive engine oil. Apart from their potential for degradation monitoring, their limitations and future perspectives have also been investigated.

  10. Chemometric classification techniques as a tool for solving problems in analytical chemistry.

    Science.gov (United States)

    Bevilacqua, Marta; Nescatelli, Riccardo; Bucci, Remo; Magrì, Andrea D; Magrì, Antonio L; Marini, Federico

    2014-01-01

    Supervised pattern recognition (classification) techniques, i.e., the family of chemometric methods whose aim is the prediction of a qualitative response on a set of samples, represent a very important assortment of tools for solving problems in several areas of applied analytical chemistry. This paper describes the theory behind the chemometric classification techniques most frequently used in analytical chemistry together with some examples of their application to real-world problems.
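As an illustrative sketch of one of the simpler members of this family, a nearest-centroid classifier assigns a new sample to the class whose mean "spectrum" it lies closest to. The two-variable data below are invented, and this is only a minimal example of the classification step, not any specific method from the paper.

```python
import math

def fit_centroids(samples, labels):
    """Compute the per-class mean vector (centroid) of training samples."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def classify(centroids, x):
    """Assign x to the class with the nearest centroid (Euclidean)."""
    return min(centroids, key=lambda y: math.dist(centroids[y], x))

# Two classes of two-variable measurements, e.g. intensities at two channels.
samples = [[1.0, 1.0], [1.2, 0.9], [5.0, 5.0], [5.1, 4.8]]
labels = ["a", "a", "b", "b"]
centroids = fit_centroids(samples, labels)
print(classify(centroids, [0.9, 1.1]))  # → a
```

In practice, chemometric workflows apply the same train-then-predict pattern to hundreds of spectral variables, usually after dimensionality reduction such as PCA or PLS.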

  11. Horses for courses: analytical tools to explore planetary boundaries

    Science.gov (United States)

    van Vuuren, Detlef P.; Lucas, Paul L.; Häyhä, Tiina; Cornell, Sarah E.; Stafford-Smith, Mark

    2016-03-01

    There is a need for more integrated research on sustainable development and global environmental change. In this paper, we focus on the planetary boundaries framework to provide a systematic categorization of key research questions in relation to avoiding severe global environmental degradation. The four categories of key questions are those that relate to (1) the underlying processes and selection of key indicators for planetary boundaries, (2) understanding the impacts of environmental pressure and connections between different types of impacts, (3) better understanding of different response strategies to avoid further degradation, and (4) the available instruments to implement such strategies. Clearly, different categories of scientific disciplines and associated model types exist that can accommodate answering these questions. We identify the strengths and weaknesses of different research areas in relation to the question categories, focusing specifically on different types of models. We argue that more interdisciplinary research is needed to increase our understanding by better linking human drivers and social and biophysical impacts. This requires better collaboration between the relevant disciplines (associated with the model types), either by exchanging information or by fully linking or integrating them. As fully integrated models can become too complex, the appropriate type of model (the racehorse) should be applied to answer the target research question (the race course).

  12. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    Science.gov (United States)

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, methods and tools used in research, published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools are needed to provide better evidence for policy-making to reduce environmental inequities.

  13. Program Development Tools and Infrastructures

    Energy Technology Data Exchange (ETDEWEB)

    Schulz, M

    2012-03-12

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  14. The Comparison of Learning Analytics Tools

    Institute of Scientific and Technical Information of China (English)

    孟玲玲; 顾小清; 李泽

    2014-01-01

    In recent years, with the rapid development of smart learning environments, massive, rich, diverse, and heterogeneous data have been accumulating at an astonishing rate. In the education field, students' interests, preferences, activities, and learning process information, such as their interactions with the learning platform as well as their implicit feedback to the e-learning platform, can all be recorded and traced. How to make effective use of these data has drawn great concern. The data of a single person may seem chaotic, but as data accumulate to a certain extent, order emerges, with stronger or weaker relations among the data. For example, what are the characteristics of students in different regions or countries? What are the characteristics of learning behavior at different ages? What are the learning habits of different students? Which courses are needed urgently for a successful career? For a particular course, which units need review? Which units need to be emphasized? Which students encounter difficulties and need help? There are thus remarkable insights behind the data: if we extract the rules or determine the relationships among the data, tremendous value can be created. This is where learning analytics techniques arise. According to the 2011 Horizon Report from the New Media Consortium's Horizon Project, learning analytics technology will become a hot topic in the next few years. It will contribute to improving the learning process and make learning more intelligent. As we can imagine, analytics tools play an important role in the process of learning analytics: good tools can make the research process more effective, and many analytics tools have been developed. For example, NVivo and Atlas.ti can be used to annotate text and multimedia content, Gephi, JUNG, and Guess can be used to analyze learning networks, and SPSS can perform statistical analysis of user data. However, a key issue is how to choose the appropriate tool because different tools

  15. Capitalizing on App Development Tools and Technologies

    Science.gov (United States)

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  16. Adequacy of surface analytical tools for studying the tribology of ceramics

    Science.gov (United States)

    Sliney, H. E.

    1986-01-01

    Surface analytical tools are very beneficial in tribological studies of ceramics. Traditional methods of optical microscopy, XRD, XRF, and SEM should be combined with newer surface-sensitive techniques, especially AES and XPS. ISS and SIMS can also be useful in providing additional composition details. Tunneling microscopy and electron energy loss spectroscopy are lesser-known techniques that may also prove useful.

  17. (Re)braiding to Tell: Using "Trenzas" as a Metaphorical-Analytical Tool in Qualitative Research

    Science.gov (United States)

    Quiñones, Sandra

    2016-01-01

    Metaphors can be used in qualitative research to illuminate the meanings of participant experiences and examine phenomena from insightful and creative perspectives. The purpose of this paper is to illustrate how I utilized "trenzas" (braids) as a metaphorical and analytical tool for understanding the experiences and perspectives of…

  18. Understanding Emotions as Situated, Embodied, and Fissured: Thinking with Theory to Create an Analytical Tool

    Science.gov (United States)

    Kuby, Candace R.

    2014-01-01

    An emerging theoretical perspective is that emotions are a verb or something we do in relation to others. Studies that demonstrate ways to analyze emotions from a performative stance are scarce. In this article, a new analytical tool is introduced; a critical performative analysis of emotion (CPAE) that draws upon three theoretical perspectives:…

  19. 40 CFR 766.16 - Developing the analytical test method.

    Science.gov (United States)

    2010-07-01

    40 CFR Protection of Environment (2010), Toxic Substances Control Act, Dibenzo-para-dioxins/Dibenzofurans, General Provisions, § 766.16 Developing the analytical test method: Because of the matrix differences of the chemicals listed for testing, no one...

  20. Developing a Code of Practice for Learning Analytics

    Science.gov (United States)

    Sclater, Niall

    2016-01-01

    Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…

  1. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The `recordr` library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
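The capture idea can be sketched generically. This is not the `recordr` or `matlab-dataone` API, just a minimal illustration of wrapping a processing step so that its inputs and output are linked by PROV-style `used` / `wasGeneratedBy` relations; the record field names are assumptions.

```python
import datetime
import hashlib
import json

def run_with_provenance(step_name, func, *inputs):
    """Execute a processing step and return (result, provenance record).

    Inputs and output are identified by content hashes, so the record
    links what the step `used` to what it `generated`.
    """
    def digest(obj):
        serialized = json.dumps(obj, sort_keys=True).encode()
        return hashlib.sha256(serialized).hexdigest()[:12]

    result = func(*inputs)
    record = {
        "activity": step_name,
        "used": [digest(x) for x in inputs],
        "generated": digest(result),
        "endedAtTime": datetime.datetime.now(datetime.timezone.utc).isoformat(),
    }
    return result, record

# A toy analysis step: the record ties the input data hash to the output hash.
result, record = run_with_provenance("mean", lambda xs: sum(xs) / len(xs), [1, 2, 3])
print(result, record["activity"])
```

Chaining such records across a workflow (each step's `generated` hash appearing in a later step's `used` list) yields exactly the kind of derivation graph the DataONE portal renders.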

  2. Observation Tools for Professional Development

    Science.gov (United States)

    Malu, Kathleen F.

    2015-01-01

    Professional development of teachers, including English language teachers, empowers them to change in ways that improve teaching and learning (Gall and Acheson 2011; Murray 2010). In their seminal research on staff development--professional development in today's terms--Joyce and Showers (2002) identify key factors that promote teacher change.…

  3. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate, mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of the acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and we do not necessarily know in advance which coordinates are the interesting ones. Big data in our labs of biology, analytical chemistry, or physical chemistry is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and extract value from such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (high noise level, with/without spectral preprocessing, wavelength shift, different spectral resolutions, missing data).
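    The core of a Mapper-style TDA pipeline, a common entry point to the field, can be sketched in a few lines: cover the range of a filter function with overlapping intervals, cluster the points in each interval's preimage, and connect clusters that share points. The sketch below is a minimal one-dimensional illustration under simplifying assumptions (identity filter, gap-threshold single-linkage clustering); it is not the algorithm as implemented in any particular TDA package:

```python
from itertools import combinations

def mapper_1d(points, filter_values, n_intervals=4, overlap=0.25, gap=1.0):
    """Minimal Mapper-style sketch: cover the filter range with overlapping
    intervals, cluster each interval's preimage (single-linkage: split a
    cluster at gaps larger than `gap`), and connect clusters sharing points."""
    lo, hi = min(filter_values), max(filter_values)
    length = (hi - lo) / n_intervals
    nodes = []  # each node is a frozenset of point indices
    for i in range(n_intervals):
        a = lo + i * length - overlap * length
        b = lo + (i + 1) * length + overlap * length
        idx = sorted((j for j, f in enumerate(filter_values) if a <= f <= b),
                     key=lambda j: points[j])
        cluster = []
        for j in idx:
            if cluster and points[j] - points[cluster[-1]] > gap:
                nodes.append(frozenset(cluster))
                cluster = []
            cluster.append(j)
        if cluster:
            nodes.append(frozenset(cluster))
    # edges of the Mapper graph: clusters that share at least one point
    edges = [(u, v) for u, v in combinations(range(len(nodes)), 2)
             if nodes[u] & nodes[v]]
    return nodes, edges

# two well-separated clusters on the line; the filter is the identity map
pts = [0.0, 0.2, 0.4, 5.0, 5.2, 5.4]
nodes, edges = mapper_1d(pts, pts)
```

    On this toy input the graph has two disconnected nodes, one per cluster; on real spectroscopic data the shape of the resulting graph (loops, flares, components) is what carries the topological signal.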

  4. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR) is an example of an adaptive response seen in depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested with 579 young adults, 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set, and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were high quality (0.81); there was evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07), with high reliability (rp = 0.86), an ordered response scale structure, and no item bias (gender, age, time). CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ) that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  5. GeneAnalytics: An Integrative Gene Set Analysis Tool for Next Generation Sequencing, RNAseq and Microarray Data

    Science.gov (United States)

    Ben-Ari Fuchs, Shani; Lieder, Iris; Mazor, Yaron; Buzhor, Ella; Kaplan, Sergey; Bogoch, Yoel; Plaschkes, Inbar; Shitrit, Alina; Rappaport, Noa; Kohn, Asher; Edgar, Ron; Shenhav, Liraz; Safran, Marilyn; Lancet, Doron; Guan-Golan, Yaron; Warshawsky, David; Shtrichman, Ronit

    2016-01-01

    Postgenomics data are produced in large volumes by life sciences and clinical applications of novel omics diagnostics and therapeutics for precision medicine. To move from "data-to-knowledge-to-innovation," a crucial missing step in the current era is, however, our limited understanding of the biological and clinical contexts associated with the data. Prominent among the emerging remedies to this challenge are gene set enrichment tools. This study reports on GeneAnalytics™ (geneanalytics.genecards.org), a comprehensive and easy-to-apply gene set analysis tool for rapid contextualization of expression patterns and functional signatures embedded in the postgenomics Big Data domains, such as Next Generation Sequencing (NGS), RNAseq, and microarray experiments. GeneAnalytics' differentiating features include in-depth evidence-based scoring algorithms, an intuitive user interface, and proprietary unified data. GeneAnalytics employs the LifeMap Science GeneCards suite, including GeneCards®, the human gene database; MalaCards, the human diseases database; and PathCards, the biological pathways database. Expression-based analysis in GeneAnalytics relies on LifeMap Discovery®, the embryonic development and stem cells database, which includes manually curated expression data for normal and diseased tissues, enabling an advanced matching algorithm for gene-tissue association. This assists in evaluating differentiation protocols and discovering biomarkers for tissues and cells. Results are directly linked to gene, disease, or cell "cards" in the GeneCards suite. Future developments aim to enhance the GeneAnalytics algorithm as well as its visualizations, employing varied graphical display items. Such attributes make GeneAnalytics a broadly applicable tool for postgenomics data analysis and interpretation, supporting the translation of data into knowledge-based innovation in various Big Data fields such as precision medicine, ecogenomics, and nutrigenomics.

  6. Analytical Quality by Design Approach to Test Method Development and Validation in Drug Substance Manufacturing

    Directory of Open Access Journals (Sweden)

    N. V. V. S. S. Raman

    2015-01-01

    Full Text Available The pharmaceutical industry has been evolving rapidly over the last decade by focusing on product quality, safety, and efficacy. Pharmaceutical firms have increased the number of product developments by using scientific tools such as QbD (Quality by Design) and PAT (Process Analytical Technology). ICH guidelines Q8 to Q11 discuss QbD implementation in API synthetic processes and formulation development; ICH Q11 clearly discusses the QbD approach for API synthesis with examples. Generic companies are implementing the QbD approach in formulation development, and it is effectively mandatory from the USFDA perspective. As of now, there are no specific requirements for AQbD (Analytical Quality by Design) and PAT in analytical development from the regulatory agencies. In this review, the authors discuss the simultaneous implementation of QbD and AQbD for API synthetic processes and analytical method development. The key AQbD tools are identification of the ATP (Analytical Target Profile), CQAs (Critical Quality Attributes) with risk assessment, method optimization and development with DoE, the MODR (method operable design region), a control strategy, AQbD method validation, and continuous method monitoring (CMM). Simultaneous implementation of QbD activities in synthetic and analytical development will provide the highest quality product by minimizing risks, and it also provides very good input for the PAT approach.

  7. Recent Advances in Algal Genetic Tool Development

    Energy Technology Data Exchange (ETDEWEB)

    Dahlin, Lukas R.; Guarnieri, Michael T.

    2016-06-24

    The goal of achieving cost-effective biofuels and bioproducts derived from algal biomass will require improvements along the entire value chain, including identification of robust, high-productivity strains and development of advanced genetic tools. Though there have been modest advances in development of genetic systems for the model alga Chlamydomonas reinhardtii, progress in development of algal genetic tools, especially as applied to non-model algae, has generally lagged behind that of more commonly utilized laboratory and industrial microbes. This is in part due to the complex organellar structure of algae, including robust cell walls and intricate compartmentalization of target loci, as well as prevalent gene silencing mechanisms, which hinder facile utilization of conventional genetic engineering tools and methodologies. However, recent progress in global tool development has opened the door for implementation of strain-engineering strategies in industrially-relevant algal strains. Here, we review recent advances in algal genetic tool development and applications in eukaryotic microalgae.

  8. Urban Development Tools in Denmark

    DEFF Research Database (Denmark)

    Aunsborg, Christian; Enemark, Stig; Sørensen, Michael Tophøj

    2005-01-01

    The article contains the following sections: 1. Urbax and the Danish Planning system 2. Main Challenges in the Urban Development 3. Coordination and Growth (Management) Policies and Spatial Planning Policies 4. Coordination of Market Events and Spatial Planning 5. The application of Urban Development Too...

  9. Hasse diagram as a green analytical metrics tool: ranking of methods for benzo[a]pyrene determination in sediments.

    Science.gov (United States)

    Bigus, Paulina; Tsakovski, Stefan; Simeonov, Vasil; Namieśnik, Jacek; Tobiszewski, Marek

    2016-05-01

    This study presents an application of the Hasse diagram technique (HDT) as an assessment tool to select the most appropriate analytical procedures according to their greenness or the best analytical performance. The dataset consists of analytical procedures for benzo[a]pyrene determination in sediment samples, described by 11 variables concerning their greenness and analytical performance. Two analyses with the HDT were performed: the first with metrological variables and the second with "green" variables as input data. The two HDT analyses ranked different analytical procedures as the most valuable, suggesting that green analytical chemistry is not in accordance with metrology when benzo[a]pyrene in sediment samples is determined. The HDT can be used as a good decision support tool to choose the proper analytical procedure with respect to both green analytical chemistry principles and analytical performance merits.

  10. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    Energy Technology Data Exchange (ETDEWEB)

    Brown, Forrest B. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States). Monte Carlo Codes Group

    2016-06-17

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.

  11. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians, and researchers working in public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality, and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc.) that runs on both PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses, including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts, from didactic use in public health workshops to serving as the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students in exploring time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit from a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
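    The kind of parameter extraction EPIPOI performs (level, seasonal amplitude, and timing of the seasonal peak) can be illustrated with a single annual Fourier harmonic fitted to a monthly series. This sketch is illustrative only and is not EPIPOI's actual implementation; all names are ours:

```python
import math

def annual_harmonic(series, period=12):
    """Estimate the mean level, amplitude, and peak timing of the annual
    cycle by projecting the series onto one Fourier harmonic. Assumes
    len(series) spans a whole number of periods."""
    n = len(series)
    mean = sum(series) / n
    a = 2 / n * sum((y - mean) * math.cos(2 * math.pi * t / period)
                    for t, y in enumerate(series))
    b = 2 / n * sum((y - mean) * math.sin(2 * math.pi * t / period)
                    for t, y in enumerate(series))
    amplitude = math.hypot(a, b)
    # timing (in months, 0-based) at which the fitted cycle peaks
    peak = (math.atan2(b, a) % (2 * math.pi)) * period / (2 * math.pi)
    return mean, amplitude, peak

# synthetic monthly incidence: level 10, seasonal amplitude 3, peak in month 2
series = [10 + 3 * math.cos(2 * math.pi * (t - 2) / 12) for t in range(36)]
mean, amp, peak = annual_harmonic(series)
```

    On the synthetic series the fit recovers the level (10), amplitude (3), and peak month (2) exactly; on real surveillance data, residuals from this fit are one simple way to flag anomalies such as outbreaks.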

  12. Comprehensive data resources and analytical tools for pathological association of aminoacyl tRNA synthetases with cancer

    Science.gov (United States)

    Lee, Ji-Hyun; You, Sungyong; Hyeon, Do Young; Kang, Byeongsoo; Kim, Hyerim; Park, Kyoung Mii; Han, Byungwoo; Hwang, Daehee; Kim, Sunghoon

    2015-01-01

    Mammalian cells have cytoplasmic and mitochondrial aminoacyl-tRNA synthetases (ARSs) that catalyze aminoacylation of tRNAs during protein synthesis. Despite their housekeeping functions in protein synthesis, ARSs and ARS-interacting multifunctional proteins (AIMPs) have recently been shown to play important roles in disease pathogenesis through their interactions with disease-related molecules. However, there is a lack of data resources and analytical tools that can be used to examine disease associations of ARSs/AIMPs. Here, we developed an Integrated Database for ARSs (IDA), a resource database including cancer genomic/proteomic and interaction data of ARSs/AIMPs. IDA includes mRNA expression, somatic mutation, copy number variation, and phosphorylation data of ARSs/AIMPs and their interacting proteins in various cancers. IDA further includes an array of analytical tools for exploration of disease associations of ARSs/AIMPs, identification of disease-associated ARS/AIMP interactors, and reconstruction of ARS-dependent disease-perturbed network models. Therefore, IDA provides both comprehensive data resources and analytical tools for understanding potential roles of ARSs/AIMPs in cancers. Database URL: http://ida.biocon.re.kr/, http://ars.biocon.re.kr/ PMID:25824651

  13. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    GPULab, a competence center and laboratory for research and collaboration within academia and partners in industry, was established in 2008 at the Section for Scientific Computing, DTU Informatics, Technical University of Denmark. In GPULab we focus on the utilization of Graphics Processing Units (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, and dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughput performance, improved performance profiling tools, and assimilation of results to academic and industrial partners in our network. Our approach calls for multi-disciplinary skills and an understanding of hardware, software development, profiling tools and tuning techniques, and analytical methods for analysis and development...

  14. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2016-06-01

    Machining of steel material with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models which should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride (CBN) tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear is depicted in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
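    For reference, Usui's wear characteristic equation mentioned above is commonly written in the form below, where the wear rate depends on the contact stress, sliding velocity, and cutting temperature (symbols as conventionally used; the constants A and B are determined experimentally for a given tool-workpiece pair):

```latex
\frac{dW}{dt} \;=\; A\,\sigma_t\,v_s\,\exp\!\left(-\frac{B}{\theta}\right)
```

    Here W is the wear volume per unit contact area, \sigma_t the normal stress on the tool face, v_s the sliding velocity, and \theta the absolute temperature at the tool-chip interface.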

  15. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms of the numerical-analytical boundary elements method were implemented as programs written in the MATLAB environment language. Each program had a local character, i.e. it was used to solve one particular problem: calculation of a beam, frame, arch, etc. Constructing matrices in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for developing a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research conducted shows that, among the wide variety of programming languages, the most efficient one for developing such a CAD system is Java. This language provides tools not only for developing the calculating part of the CAD system, but also for building the graphical interface for constructing geometrical models and interpreting calculated results.

  16. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive industry, and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption and performance analysis, identification of emerging new technologies, and investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes and allowing readers to implement the procedures and applications presented.

  17. Software Development Management: Empirical and Analytical Perspectives

    Science.gov (United States)

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  18. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    Science.gov (United States)

    2011-03-28

    ...their support to MEF staffs with Psychological Operations planning teams in the near future. 2.2.4 Observations on the Situation Context... (1998). "Organizational Consulting: A Gestalt Approach," Cambridge: GIC Press. ...cultural learning and past experiences. What we perceive is often based on our needs, our expectations, our projections, our psychological defenses, and...

  19. An analytic solution to LO coupled DGLAP evolution equations: a new pQCD tool

    CERN Document Server

    Block, Martin M; Ha, Phuoc; McKay, Douglas W

    2010-01-01

    We have analytically solved the LO pQCD singlet DGLAP equations using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2)={\cal F}_s(F_{s0}(x), G_0(x)) and G(x,Q^2)={\cal G}(F_{s0}(x), G_0(x)). Here {\cal F}_s and {\cal G} are known functions of the initial boundary conditions F_{s0}(x) = F_s(x,Q_0^2) and G_0(x) = G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G and F_s, we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy, a computational fractional precision of O(10^{-9}). Armed with this powerful new tool in the pQCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions, starting from their initial values at Q_0^2=1 GeV^2 and 1.69 GeV^2, respectively, using their ...
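    Schematically, the decoupling rests on the fact that, with v = ln(1/x), the LO DGLAP convolutions become ordinary products under a Laplace transform in v, turning the coupled integro-differential equations into a linear 2x2 ODE system in ln Q^2 (the notation below is ours, not the authors'; \Phi_{ij}(s) denote Laplace transforms of the splitting functions):

```latex
% with $v \equiv \ln(1/x)$ and $f(s,Q^2),\,g(s,Q^2)$ the Laplace transforms
% of the singlet and gluon distributions in $v$:
\frac{\partial}{\partial \ln Q^2}
\begin{pmatrix} f \\ g \end{pmatrix}
=
\frac{\alpha_s(Q^2)}{4\pi}
\begin{pmatrix}
\Phi_{ff}(s) & \Phi_{fg}(s) \\
\Phi_{gf}(s) & \Phi_{gg}(s)
\end{pmatrix}
\begin{pmatrix} f \\ g \end{pmatrix}
```

    The system is solved in closed form in s-space and then inverted numerically, which is where the highly accurate inverse Laplace transform algorithms mentioned in the abstract enter.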

  20. SHARAD Radargram Analysis Tool Development in JMARS

    Science.gov (United States)

    Adler, J. B.; Anwar, S.; Dickenshied, S.; Carter, S.

    2016-09-01

    New tools are being developed in JMARS, a free GIS software, for SHARAD radargram viewing and analysis. These capabilities are useful for the polar science community, and for constraining the viability of ice resource deposits for human exploration.

  1. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an engine condition monitoring system. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be made available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of the GA in searching for an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
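    A GA search of the kind described can be sketched in a few lines: tournament selection, one-point crossover, and bit-flip mutation over a population of candidate solutions. The sketch below uses a toy bit-counting objective standing in for an engine-condition score; all names and parameter values are illustrative, not the project's C++ implementation:

```python
import random

def evolve(fitness, n_bits=20, pop_size=30, generations=40,
           p_mut=0.02, seed=1):
    """Minimal generational GA sketch: tournament selection,
    one-point crossover, and per-bit mutation."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]

    def tournament():
        # pick two candidates at random, keep the fitter one
        a, b = rng.sample(pop, 2)
        return a if fitness(a) >= fitness(b) else b

    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)            # one-point crossover
            child = p1[:cut] + p2[cut:]
            child = [bit ^ (rng.random() < p_mut)     # bit-flip mutation
                     for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# toy objective: count of ones, standing in for an engine-condition score
onemax = sum
best = evolve(onemax)
```

    Because selection only compares fitness values, the same loop tolerates a noisy objective, which is the property the project relied on for sparse, non-standard datasets.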

  2. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information needed for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable for improving decision making. Perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply as a way of ranking research priorities.

  3. Culture and Development : An Analytical Framework

    NARCIS (Netherlands)

    Francois, P.; Zabojnik, J.

    2001-01-01

    This paper develops a framework which analyzes how a population's culture affects the decisions of rational profit-maximizing firms, while simultaneously exploring how the actions of these firms in turn affect the population's culture. By endogenizing culture as well as the more usual economic variables...

  4. Earlier development of analytical than holistic object recognition in adolescence.

    Directory of Open Access Journals (Sweden)

    Elley Wakui

    Full Text Available BACKGROUND: Previous research has shown that object recognition may develop well into late childhood and adolescence. The present study extends that research and reveals novel differences in holistic and analytic recognition performance in 7-12 year olds compared to that seen in adults. We interpret our data within a hybrid model of object recognition that proposes two parallel routes for recognition (analytic vs. holistic) modulated by attention. METHODOLOGY/PRINCIPAL FINDINGS: Using a repetition-priming paradigm, we found in Experiment 1 that children showed no holistic priming, but only analytic priming. Given that holistic priming might be thought to be more 'primitive', we confirmed in Experiment 2 that our surprising finding was not because children's analytic recognition was merely a result of name repetition. CONCLUSIONS/SIGNIFICANCE: Our results suggest a developmental primacy of analytic object recognition. By contrast, holistic object recognition skills appear to emerge with a much more protracted trajectory extending into late adolescence.

  5. Recent analytical developments for powder characterization

    Science.gov (United States)

    Brackx, E.; Pages, S.; Dugne, O.; Podor, R.

    2015-07-01

    Powders and divided solid materials are widely represented as finished or intermediate products in industries as varied as foodstuffs, cosmetics, construction, pharmaceuticals, electronics, and energy. Their optimal use requires mastery of the transformation process, based on knowledge of the different phenomena concerned (sintering, chemical reactivity, purity, etc.). Modelling and understanding these phenomena require the prior acquisition of sets of data and characteristics which are more or less challenging to obtain. The goal of this study is to present the use of different physico-chemical characterization techniques adapted to uranium-containing powders, analyzed either in a raw state or after a specific preparation (ionic polishing). The new developments discussed concern dimensional characterization of grains and pores by image analysis, chemical surface characterization, and characterization of powder chemical reactivity. The examples discussed are from materials used in fabrication processes of the nuclear fuel cycle.

  6. ANALYTICAL METHOD DEVELOPMENT AND VALIDATION FOR DIPYRIDAMOLE

    Directory of Open Access Journals (Sweden)

    Suresh Kumar; Latif D. Jamadar; Kris Krishnamurthy Bhat; Jagdish P. C.; Shriram Pathak

    2013-10-01

    Full Text Available A sensitive, specific, precise, and cost-effective high-performance liquid chromatographic method for the analysis of dipyridamole in the presence of its degradation products was developed and validated. The method employed a Targa C8 column (250 x 4.6 mm, 5 μm particle size) as the stationary phase. The mobile phase consisted of acetonitrile and pH 3.0 buffer in the ratio 35:65, pumped through the chromatographic system at a flow rate of 1.2 ml/min. The UV detector was operated at 282 nm. This system was found to give good resolution between dipyridamole and its degradation products. The method was validated as per ICH guidelines.

  7. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  8. Forty years of development in diamond tools

    Science.gov (United States)

    The growth of the diamond industry in Western countries since the First World War is surveyed. The articles described deal specifically with the development of the industrial diamond and diamond tool sector in different countries. All data point to continuing rapid expansion in the diamond tool sector. The West consumes 80 percent of world industrial diamond production. Diamond consumption increased sharply in the U.S. during the Second World War. There are 300 diamond manufacturers in the U.S. today; in 1940, there were 25. In Japan, consumption of industrial diamonds has increased severalfold. In Italy, there has been a 75-fold increase in the production of diamond tools since 1959.

  9. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Because little has been published on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and still needed, to support ESDA.

  10. Tiered analytics for purity assessment of macrocyclic peptides in drug discovery: Analytical consideration and method development.

    Science.gov (United States)

    Qian Cutrone, Jingfang Jenny; Huang, Xiaohua Stella; Kozlowski, Edward S; Bao, Ye; Wang, Yingzi; Poronsky, Christopher S; Drexler, Dieter M; Tymiak, Adrienne A

    2017-05-10

    Synthetic macrocyclic peptides with natural and unnatural amino acids have gained considerable attention from a number of pharmaceutical/biopharmaceutical companies in recent years as a promising approach to drug discovery, particularly for targets involving protein-protein or protein-peptide interactions. Analytical scientists charged with characterizing these leads face multiple challenges including dealing with a class of complex molecules with the potential for multiple isomers and variable charge states and no established standards for acceptable analytical characterization of materials used in drug discovery. In addition, due to the lack of intermediate purification during solid phase peptide synthesis, the final products usually contain a complex profile of impurities. In this paper, practical analytical strategies and methodologies were developed to address these challenges, including a tiered approach to assessing the purity of macrocyclic peptides at different stages of drug discovery. Our results also showed that successful progression and characterization of a new drug discovery modality benefited from active analytical engagement, focusing on fit-for-purpose analyses and leveraging a broad palette of analytical technologies and resources.

  11. Adsorptive micro-extraction techniques--novel analytical tools for trace levels of polar solutes in aqueous media.

    Science.gov (United States)

    Neng, N R; Silva, A R M; Nogueira, J M F

    2010-11-19

    A novel enrichment technique, adsorptive μ-extraction (AμE), is proposed for trace analysis of polar solutes in aqueous media. The preparation, stability tests and development of the analytical devices using two geometrical configurations, i.e. bar adsorptive μ-extraction (BAμE) and multi-spheres adsorptive μ-extraction (MSAμE), are fully discussed. Of the several sorbent materials tested, activated carbons and polystyrene-divinylbenzene phases demonstrated the best stability and robustness and proved the most suitable for analytical purposes. Both BAμE and MSAμE devices showed remarkable performance for the determination of trace levels of polar solutes and metabolites (e.g. pesticides, disinfection by-products, drugs of abuse and pharmaceuticals) in water matrices and biological fluids. Compared with stir bar sorptive extraction based on a polydimethylsiloxane phase, the AμE techniques achieve much greater effectiveness, overcoming the limitations of that enrichment approach for the more polar solutes. Furthermore, good sensitivity and selectivity are attained, since the great advantage of this new analytical technology is the possibility of choosing the most suitable sorbent for each particular type of application. The proposed enrichment techniques are cost-effective, easy to prepare and work up, and robust, making them a remarkable analytical tool for trace analysis of priority solutes in areas of recognized importance such as the environment, forensics and other related life sciences.

  12. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    Science.gov (United States)

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  13. Solar Data and Tools: Resources for Researchers, Industry, and Developers

    Energy Technology Data Exchange (ETDEWEB)

    2016-04-01

    In partnership with the U.S. Department of Energy SunShot Initiative, the National Renewable Energy Laboratory (NREL) has created a suite of analytical tools and data that can inform decisions about implementing solar and that are increasingly forming the basis of private-sector tools and services to solar consumers. The following solar energy data sets and analytical tools are available free to the public.

  14. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working at the ACs presents greater challenges than did development at the ACMUs. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  15. From Mini to Micro Scale—Feasibility of Raman Spectroscopy as a Process Analytical Tool (PAT)

    Directory of Open Access Journals (Sweden)

    Peter Kleinebudde

    2011-10-01

    Full Text Available Background: Active coating is an important unit operation in the pharmaceutical industry. The quality, stability, safety and performance of the final product largely depend on the amount and uniformity of coating applied. Active coating is challenging regarding the total amount of coating and its uniformity. Consequently, there is a strong demand for tools that are able to monitor and determine the endpoint of a coating operation. In previous work, it was shown that Raman spectroscopy is an appropriate process analytical tool (PAT) to monitor an active spray coating process in a pan coater [1]. Using a multivariate model (Partial Least Squares, PLS), the Raman spectral data could be correlated with the coated amount of the API diprophylline. While the multivariate model was shown to be valid for the process in a mini scale pan coater (batch size: 3.5 kg cores), the aim of the present work was to prove the robustness of the model by transferring the results to tablets coated in a micro scale pan coater (0.5 kg). Method: Coating experiments were performed in both a mini scale and a micro scale pan coater. The model drug diprophylline was coated on placebo tablets. The multivariate model, established for the process in the mini scale pan coater, was applied to the Raman measurements of tablets coated in the micro scale coater for six different coating levels. Then, the amount of coating predicted by the model was compared with reference measurements using UV spectroscopy. Results: For all six coating levels the predicted coating amount was equal to the amount obtained by UV spectroscopy within the statistical error. Thus, it was possible to predict the total coating amount with an error smaller than 3.6%. The root mean square errors of calibration and prediction (RMSEC and RMSEP) were 0.335 mg and 0.392 mg, respectively, which means that the predictive power of the model
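
    The calibration-transfer idea can be sketched in miniature: fit a response-versus-coating-amount model on one scale, apply it to data from another, and report the RMSEP against reference values. The univariate model and all numbers below are illustrative stand-ins for the authors' multivariate PLS model on Raman spectra.

    ```python
    # Schematic calibration transfer (illustrative, not the authors' PLS model).
    def fit_line(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        slope = sum((a - mx) * (b - my) for a, b in zip(x, y)) / sum((a - mx) ** 2 for a in x)
        return slope, my - slope * mx

    def rmse(pred, ref):
        return (sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(pred)) ** 0.5

    # Calibration set ("mini scale"): coated amount (mg) vs. band intensity.
    cal_x = [0.10, 0.20, 0.30, 0.40, 0.50]
    cal_y = [2.0, 4.1, 5.9, 8.2, 9.9]
    slope, intercept = fit_line(cal_x, cal_y)

    # Prediction set ("micro scale"): intensities and UV reference values (mg).
    new_x = [0.15, 0.25, 0.35]
    ref_y = [3.1, 5.0, 7.1]
    pred = [slope * x + intercept for x in new_x]
    print(f"RMSEP = {rmse(pred, ref_y):.3f} mg")
    ```

    The design point is the same as in the paper: a model built on one scale is judged transferable when its RMSEP on the other scale stays close to its RMSEC.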

  16. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  17. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we suggest the use of system-level equivalent circuit theory as an adequate theory of the behavior of the system. A novel pressure source capable of operation in the desired frequency range is presented for this generic analysis. As a proof of concept, we study the fairly complex system of water
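
    The equivalent-circuit viewpoint can be illustrated with a minimal sketch: a microchannel contributes a Poiseuille (hydraulic) resistance, a soft element such as trapped air or elastic walls contributes a compliance, and their product sets the RC time scale that pulsatile driving would probe. All numbers below are hypothetical, not taken from the paper.

    ```python
    # Hedged equivalent-circuit sketch for a microfluidic RC time scale.
    import math

    def hydraulic_resistance(mu, L, r):
        """Poiseuille resistance of a circular channel: R = 8*mu*L / (pi*r^4)."""
        return 8.0 * mu * L / (math.pi * r**4)

    mu = 1.0e-3   # water viscosity, Pa*s
    L = 20e-3     # channel length, m
    r = 50e-6     # channel radius, m
    C = 1.0e-15   # hypothetical compliance, m^3/Pa

    R = hydraulic_resistance(mu, L, r)
    tau = R * C   # characteristic relaxation time, s
    print(f"R = {R:.2e} Pa*s/m^3, tau = {tau*1e3:.2f} ms")
    ```

    Driving the system harmonically around 1/tau and watching the amplitude and phase response is how such a time scale would be deduced experimentally.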

  18. Analytical Ultracentrifugation as a Tool to Study Nonspecific Protein–DNA Interactions

    Science.gov (United States)

    Yang, Teng-Chieh; Catalano, Carlos Enrique; Maluf, Nasib Karl

    2016-01-01

    Analytical ultracentrifugation (AUC) is a powerful tool that can provide thermodynamic information on associating systems. Here, we discuss how to use the two fundamental AUC applications, sedimentation velocity (SV), and sedimentation equilibrium (SE), to study nonspecific protein–nucleic acid interactions, with a special emphasis on how to analyze the experimental data to extract thermodynamic information. We discuss three specific applications of this approach: (i) determination of nonspecific binding stoichiometry of E. coli integration host factor protein to dsDNA, (ii) characterization of nonspecific binding properties of Adenoviral IVa2 protein to dsDNA using SE-AUC, and (iii) analysis of the competition between specific and nonspecific DNA-binding interactions observed for E. coli integration host factor protein assembly on dsDNA. These approaches provide powerful tools that allow thermodynamic interrogation and thus a mechanistic understanding of how proteins bind nucleic acids by both specific and nonspecific interactions. PMID:26412658
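
    The SE analysis mentioned above rests on the fact that at sedimentation equilibrium ln c is linear in r², with a slope proportional to the buoyant molar mass M_b = M(1 - v̄ρ). The sketch below generates a synthetic equilibrium profile and recovers that mass; rotor speed, radii and M_b are illustrative, not taken from the cited studies.

    ```python
    # Illustrative SE-AUC analysis: recover buoyant molar mass from ln c vs r^2.
    import math

    R_GAS = 8.314      # J/(mol K)
    T = 293.15         # K
    rpm = 20000.0
    omega = rpm * 2.0 * math.pi / 60.0   # angular velocity, rad/s

    # Synthetic equilibrium profile for a species with M_b = 10 kg/mol.
    M_b = 10.0
    radii = [6.90, 6.95, 7.00, 7.05, 7.10]   # cm
    sigma = M_b * omega**2 / (R_GAS * T)     # reduced molar mass, per m^2
    ln_c = [sigma / 2.0 * ((r / 100.0)**2 - 0.069**2) for r in radii]

    # Recover M_b from the slope of ln c versus r^2 (least squares).
    x = [(r / 100.0)**2 for r in radii]
    mx, my = sum(x) / len(x), sum(ln_c) / len(ln_c)
    slope = sum((a - mx) * (b - my) for a, b in zip(x, ln_c)) / sum((a - mx)**2 for a in x)
    print(f"recovered M_b = {2.0 * slope * R_GAS * T / omega**2:.2f} kg/mol")
    ```

    For a nonspecifically binding protein-DNA complex, the same fit reports the buoyant mass of the whole assembly, which is what links SE data to binding stoichiometry.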

  19. Analytical continuation in physical geodesy constructed by means of tools and formulas related to an ellipsoid of revolution

    Science.gov (United States)

    Holota, Petr; Nesvadba, Otakar

    2014-05-01

    In physical geodesy, mathematical tools applied for solving problems of potential theory are often essentially associated with the concept of the so-called spherical approximation (interpreted as a mapping). The same holds true for the method of analytical (harmonic) continuation, which is frequently considered a means suitable for converting the ground gravity anomalies or disturbances to corresponding values on a level surface that is close to the original boundary. In the development and implementation of this technique the key role is played by the representation of a harmonic function by means of the famous Poisson formula and the construction of a radial derivative operator on the basis of this formula. In this contribution an attempt is made to avoid the spherical approximation mentioned above and to develop mathematical tools that allow implementation of the concept of analytical continuation also in a more general case, in particular for converting the ground gravity anomalies or disturbances to corresponding values on the surface of an oblate ellipsoid of revolution. The respective integral kernels are constructed with the aid of series of ellipsoidal harmonics and their summation, and the mathematical nature of the boundary data is discussed in more detail.
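
    For orientation, the spherical-approximation starting point the abstract refers to is the exterior Poisson integral, quoted here in its standard form (the paper's contribution is the ellipsoidal generalization of this kernel):

    ```latex
    % Exterior Poisson integral for U harmonic outside the sphere r = R:
    U(r,\Omega) = \frac{R\,(r^{2}-R^{2})}{4\pi}
      \int_{\sigma} \frac{U(R,\Omega')}{\ell^{3}}\,\mathrm{d}\sigma',
    \qquad
    \ell = \sqrt{r^{2}+R^{2}-2\,rR\cos\psi}
    ```

    where ψ is the angular distance between the directions Ω and Ω′. The radial derivative operator used in analytical continuation follows by differentiating this kernel with respect to r.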

  20. Latest Developments in PVD Coatings for Tooling

    Directory of Open Access Journals (Sweden)

    Gabriela Strnad

    2010-06-01

    Full Text Available The paper presents recent developments in the field of PVD coatings for manufacturing tools. A review of monoblock, multilayer, nanocomposite, DLC and oxinitride coatings is given, with emphasis on coatings that enable manufacturers to implement high-productivity processes such as high-speed cutting and dry machining.

  1. Assessment and Development of Software Engineering Tools

    Science.gov (United States)

    1991-01-16

    Assessment (REA) tool would advise a potential software reuser on the tradeoffs between reusing a RSC versus developing a brand new software product...of memberships in the key RSC reusability attributes; e.g., size, structure, or documentation, etc., all of which would be weighted by reuser

  2. A Survey of Expert System Development Tools

    Science.gov (United States)

    1988-06-01

    [Garbled table extract: the survey's comparison matrix of expert system development tools, listing GESBT 4.0 (Generic Expert System Building Tool), GETREE, GLIB, GPSI, GUESS/I, GURU, Hearsay-3 and HPRL with feature codes, including knowledge-acquisition features such as conflict detection and explicit rule entry.]

  3. Developing Adaptive Elearning: An Authoring Tool Design

    Directory of Open Access Journals (Sweden)

    Said Talhi

    2011-09-01

    Full Text Available Adaptive hypermedia is the answer to the "lost in hyperspace" syndrome, where the user normally has too many links to choose from and little knowledge about how to proceed and select the most appropriate ones. Adaptive hypermedia thus offers a selection of links or content most appropriate to the user. Until very recently, little attention has been given to the complex task of authoring materials for Adaptive Educational Hypermedia. An author faces a multitude of problems when creating a personalized, rich learning experience for each user. The purpose of this paper is to present an authoring tool for adaptive hypermedia based courses. Designed to satisfy the W3C accessibility recommendations for authors and learners with disabilities, the authoring tool allows several geographically dispersed authors to produce such courses together. It consists of a shared workspace gathering all tools necessary to the cooperative development task.

  4. Kapteyn Package: Tools for developing astronomical applications

    Science.gov (United States)

    Terlouw, J. P.; Vogelaar, M. G. R.

    2016-11-01

    The Kapteyn Package provides tools for the development of astronomical applications with Python. It handles spatial and spectral coordinates, WCS projections and transformations between different sky systems; spectral translations (e.g., between frequencies and velocities) and mixed coordinates are also supported. Kapteyn offers versatile tools for writing small and dedicated applications for the inspection of FITS headers, the extraction and display of (FITS) data, interactive inspection of this data (color editing) and for the creation of plots with world coordinate information. It includes utilities for use with matplotlib such as obtaining coordinate information from plots, interactively modifiable colormaps and timer events (module mplutil); tools for parsing and interpreting coordinate information entered by the user (module positions); a function to search for Gaussian components in a profile (module profiles); and a class for non-linear least squares fitting (module kmpfit).
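
    A moments-based estimate is the usual starting guess for the kind of Gaussian-component search the `profiles` module performs before a least-squares refinement (as with `kmpfit`). The sketch below is our own illustration of that idea in plain Python, not Kapteyn's implementation.

    ```python
    # Moments-based estimate of Gaussian parameters in a 1-D profile (illustrative).
    import math

    def gauss_moments(x, y):
        """Estimate (amplitude, centre, sigma) of a single Gaussian profile."""
        total = sum(y)
        centre = sum(xi * yi for xi, yi in zip(x, y)) / total
        var = sum(yi * (xi - centre) ** 2 for xi, yi in zip(x, y)) / total
        return max(y), centre, math.sqrt(var)

    # Synthetic noise-free profile: amplitude 2.0, centre 5.0, sigma 1.5.
    xs = [i * 0.1 for i in range(101)]
    ys = [2.0 * math.exp(-0.5 * ((xi - 5.0) / 1.5) ** 2) for xi in xs]
    amp, mu, sig = gauss_moments(xs, ys)
    print(f"amp={amp:.2f} centre={mu:.2f} sigma={sig:.2f}")
    ```

    On noisy data these moment estimates drift, which is exactly why a non-linear least-squares fitter is used to refine them.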

  5. H1640 caster tool development report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, L.A.

    1997-12-01

    This report describes the development and certification of the H1640 caster tool. This tool is used to rotate swivel caster wheels 90 degrees on bomb hand trucks or shipping containers. The B83 is a heavy bomb system and weighs close to 5,600 pounds for a two-high stack configuration. High castering moments (handle length times the force exerted on handle) are required to caster a wheel for a two-high stack of B83s. The H1640 is available to the DoD (Air Force) through the Special Equipment List (SEL) for the B83 as a replacement for the H631 and H1216 caster tools.

  6. Recent developments in analytical toxicology : for better or for worse

    NARCIS (Netherlands)

    de Zeeuw, RA

    1998-01-01

    When considering the state of the art in toxicology from an analytical perspective, the key developments relate to three major areas. (1) Forensic horizon: Today forensic analysis has broadened its scope dramatically, to include workplace toxicology, drug abuse testing, drugs and driving, doping, en

  7. Developing an Evaluation Framework of Quality Indicators for Learning Analytics

    NARCIS (Netherlands)

    Scheffel, Maren; Drachsler, Hendrik; Specht, Marcus

    2017-01-01

    This paper presents results from the continuous process of developing an evaluation framework of quality indicators for learning analytics (LA). Building on a previous study, a group concept mapping approach that uses multidimensional scaling and hierarchical clustering, the study presented here app

  8. The role of big data and advanced analytics in drug discovery, development, and commercialization.

    Science.gov (United States)

    Szlezák, N; Evers, M; Wang, J; Pérez, L

    2014-05-01

    In recent years, few ideas have captured the imagination of health-care practitioners as much as the advent of "big data" and the advanced analytical methods and technologies used to interpret it, a trend seen as having the potential to revolutionize biology, medicine, and health care (1,2,3). As new types of data and tools become available, a unique opportunity is emerging for smarter and more effective discovery, development, and commercialization of innovative biopharmaceutical drugs.

  9. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang;

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers’ dynamic diagnostic decision

  10. An analytical tool-box for comprehensive biochemical, structural and transcriptome evaluation of oral biofilms mediated by mutans streptococci.

    Science.gov (United States)

    Klein, Marlise I; Xiao, Jin; Heydorn, Arne; Koo, Hyun

    2011-01-25

    Biofilms are highly dynamic, organized and structured communities of microbial cells enmeshed in an extracellular matrix of variable density and composition (1, 2). In general, biofilms develop from initial microbial attachment on a surface followed by formation of cell clusters (or microcolonies) and further development and stabilization of the microcolonies, which occur in a complex extracellular matrix. The majority of biofilm matrices harbor exopolysaccharides (EPS), and dental biofilms are no exception; especially those associated with caries disease, which are mostly mediated by mutans streptococci (3). The EPS are synthesized by microorganisms (S. mutans, a key contributor) by means of extracellular enzymes, such as glucosyltransferases using sucrose primarily as substrate (3). Studies of biofilms formed on tooth surfaces are particularly challenging owing to their constant exposure to environmental challenges associated with complex diet-host-microbial interactions occurring in the oral cavity. Better understanding of the dynamic changes of the structural organization and composition of the matrix, physiology and transcriptome/proteome profile of biofilm-cells in response to these complex interactions would further advance the current knowledge of how oral biofilms modulate pathogenicity. Therefore, we have developed an analytical tool-box to facilitate biofilm analysis at structural, biochemical and molecular levels by combining commonly available and novel techniques with custom-made software for data analysis. Standard analytical (colorimetric assays, RT-qPCR and microarrays) and novel fluorescence techniques (for simultaneous labeling of bacteria and EPS) were integrated with specific software for data analysis to address the complex nature of oral biofilm research. The tool-box is comprised of 4 distinct but interconnected steps (Figure 1): 1) Bioassays, 2) Raw Data Input, 3) Data Processing, and 4) Data Analysis. 
We used our in vitro biofilm model and

  11. Analytical tools for investigating strong-field QED processes in tightly focused laser fields

    CERN Document Server

    Di Piazza, A

    2015-01-01

    The present paper is the natural continuation of the letter [Phys. Rev. Lett. 113, 040402 (2014)], where the electron wave functions in the presence of a background electromagnetic field of general space-time structure have been constructed analytically, assuming that the initial energy of the electron is the largest dynamical energy scale in the problem and having in mind the case of a background tightly focused laser beam. Here, we determine the scalar and the spinor propagators under the same approximations, which are useful tools for calculating, e.g., total probabilities of processes occurring in such complex electromagnetic fields. In addition, we also present a simpler and more general expression of the electron wave functions found in [Phys. Rev. Lett. 113, 040402 (2014)] and we indicate a substitution rule to obtain them starting from the well-known Volkov wave functions in a plane-wave field.

  12. Development of Impurity Profiling Methods Using Modern Analytical Techniques.

    Science.gov (United States)

    Ramachandra, Bondigalla

    2017-01-02

    This review gives a brief introduction to process- and product-related impurities and emphasizes the development of novel analytical methods for their determination. It describes the application of modern analytical techniques, particularly ultra-performance liquid chromatography (UPLC), liquid chromatography-mass spectrometry (LC-MS), high-resolution mass spectrometry (HRMS), gas chromatography-mass spectrometry (GC-MS) and high-performance thin layer chromatography (HPTLC). In addition, the application of nuclear magnetic resonance (NMR) spectroscopy for the characterization of impurities and degradation products is also discussed. The significance of the quality, efficacy and safety of drug substances/products, including the source of impurities, kinds of impurities, adverse effects caused by the presence of impurities, quality control of impurities, the necessity for the development of impurity profiling methods, identification of impurities and regulatory aspects, is discussed. Other important aspects covered are forced degradation studies and the development of stability-indicating assay methods.

  13. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
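
    The surrogate idea can be sketched schematically: fit log10(SSC) against an acoustic surrogate so that real-time backscatter readings can be converted to sediment-concentration estimates. The simple log-linear form and all numbers below are our illustration, not SAID's algorithm or data.

    ```python
    # Schematic acoustic-surrogate regression for suspended-sediment concentration.
    import math

    def fit_line(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        b = sum((a - mx) * (c - my) for a, c in zip(x, y)) / sum((a - mx) ** 2 for a in x)
        return b, my - b * mx

    # Paired observations: backscatter (dB) and measured SSC (mg/L), hypothetical.
    backscatter = [60.0, 65.0, 70.0, 75.0, 80.0]
    ssc = [20.0, 45.0, 95.0, 210.0, 430.0]

    slope, intercept = fit_line(backscatter, [math.log10(c) for c in ssc])

    def predict_ssc(db):
        """Convert a backscatter reading to an SSC estimate via the fitted model."""
        return 10.0 ** (slope * db + intercept)

    print(f"SSC at 72 dB ~ {predict_ssc(72.0):.0f} mg/L")
    ```

    Fitting in log space reflects the common practice of modeling SSC multiplicatively; operational tools also report the regression diagnostics needed to judge when the surrogate model must be recalibrated.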

  14. SE Requirements Development Tool User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Faith Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory’s (LANL) SharePoint sites. Projects can fail if a clear definition of the final product requirements is not performed. For projects to be successful, requirements must be defined early in the project, and those requirements must be tracked during execution to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements, and once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.

  15. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced and dependent on model quality and modeling process integrity. The goal is to illustrate and expedite the modeling and downstream model-based technologies for available or conceptual methods and tools to achieve maximum economic advantage and advance process-based quality concepts.

  16. Development of quality-by-design analytical methods.

    Science.gov (United States)

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities.
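
    The robustness challenge described above is typically organized as a designed experiment: deliberately perturb method factors around their set points and quantify each factor's effect on a critical response. The two-level full-factorial screen below is a hedged sketch of that workflow; the factors, the response surface and its coefficients are invented for illustration, not taken from the review.

    ```python
    # Hypothetical two-level full-factorial robustness screen for an HPLC method.
    import itertools

    factors = {"flow_rate": (-1, +1), "column_temp": (-1, +1), "pH": (-1, +1)}

    def response(flow, temp, ph):
        # Invented response surface: resolution, most sensitive to pH.
        return 2.5 - 0.05 * flow + 0.02 * temp - 0.30 * ph

    runs = list(itertools.product(*factors.values()))
    results = [response(*r) for r in runs]

    # Main effect of each factor: mean(high level) - mean(low level).
    for i, name in enumerate(factors):
        hi = [y for r, y in zip(runs, results) if r[i] == +1]
        lo = [y for r, y in zip(runs, results) if r[i] == -1]
        effect = sum(hi) / len(hi) - sum(lo) / len(lo)
        print(f"{name}: effect = {effect:+.2f}")
    ```

    A large effect (here, pH) flags the factor that the control strategy must hold tightly; small effects indicate parameters the method tolerates.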

  17. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those  workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics.  Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This Work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  18. The use of meta-analytical tools in risk assessment for food safety.

    Science.gov (United States)

    Gonzales-Barron, Ursula; Butler, Francis

    2011-06-01

    This communication deals with the use of meta-analysis as a valuable tool for the synthesis of food safety research and in quantitative risk assessment modelling. A common methodology for the conduct of meta-analysis (i.e., systematic review and data extraction, parameterisation of effect size, estimation of overall effect size, assessment of heterogeneity, and presentation of results) is explained by reviewing two meta-analyses derived from separate sets of primary studies of Salmonella in pork. Integrating different primary studies, the first meta-analysis elucidated for the first time a relationship between the proportion of Salmonella-carrier slaughter pigs entering the slaughter lines and the resulting proportion of contaminated carcasses at the point of evisceration, a finding that the individual studies on their own could not reveal. On the other hand, the second application showed that meta-analysis can be used to estimate the overall effect of a critical process stage (chilling) on the incidence of the pathogen under study. The derivation of a relationship between variables and of a probabilistic distribution are illustrations of the valuable quantitative information synthesised by meta-analytical tools, which can be incorporated in risk assessment modelling. Strengths and weaknesses of meta-analysis within the context of food safety are also discussed.
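    The "estimation of overall effect size" step mentioned in this abstract is commonly done with a random-effects pooling such as the DerSimonian–Laird estimator. As a minimal sketch (the effect sizes and variances below are invented for illustration, not taken from the Salmonella studies):

```python
import math

def dersimonian_laird(effects, variances):
    """Pool per-study effect sizes with the DerSimonian-Laird random-effects model."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))  # Cochran's Q (heterogeneity)
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                          # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]         # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, se, tau2
```

    The heterogeneity statistic Q computed on the way to the pooled estimate is the same quantity used in the "assessment of heterogeneity" step.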

  19. Analytical Concept: Development of a Multinational Information Strategy

    Science.gov (United States)

    2008-10-31

    home of refugees because the UNHCR already has a resourced programme in place. THEME: "Acceptance of the Coalition" (Understanding of Coalition aims...gender or sex of a person. Page 61 of 106 UNCLASSIFIED FOR PUBLIC RELEASE – Development of a Multinational Information Strategy – MNIOE Analytical...constructive dialogue between all sides. − We are encouraging the local communities to accept refugees /IDPs on a temporary basis. We will support

  20. Recent developments in detection methods for microfabricated analytical devices.

    Science.gov (United States)

    Schwarz, M A; Hauser, P C

    2001-09-01

    Sensitive detection in microfluidic analytical devices is a challenge because of the extremely small detection volumes available. Considerable efforts have been made lately to further address this aspect and to investigate techniques other than fluorescence. Among the newly introduced techniques are the optical methods of chemiluminescence, refraction and thermooptics, as well as the electrochemical methods of amperometry, conductimetry and potentiometry. Developments are also in progress to create miniaturized plasma-emission spectrometers and sensitive detectors for gas-chromatographic separations.

  1. Developing a mapping tool for tablets

    Science.gov (United States)

    Vaughan, Alan; Collins, Nathan; Krus, Mike

    2014-05-01

    Digital field mapping offers significant benefits when compared with traditional paper mapping techniques in that it provides closer integration with downstream geological modelling and analysis. It also provides the mapper with the ability to rapidly integrate new data with existing databases without the potential degradation caused by repeated manual transcription of numeric, graphical and meta-data. In order to achieve these benefits, a number of PC-based digital mapping tools are available which have been developed for specific communities, e.g. the BGS•SIGMA project, Midland Valley's FieldMove®, and a range of solutions based on ArcGIS® software, which can be combined with either traditional or digital orientation and data collection tools. However, with the now widespread availability of inexpensive tablets and smart phones, a user-led demand for a fully integrated tablet mapping tool has arisen. This poster describes the development of a tablet-based mapping environment specifically designed for geologists. The challenge was to deliver a system that would feel sufficiently close to the flexibility of paper-based geological mapping while being implemented on a consumer communication and entertainment device. The first release of a tablet-based geological mapping system from this project is illustrated and will be shown as implemented on an iPad during the poster session. Midland Valley is pioneering tablet-based mapping and, along with its industrial and academic partners, will be using the application in field based projects throughout this year and will be integrating feedback in further developments of this technology.

  2. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
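    The script-command approach the abstract describes can be sketched as a small command dispatcher. The sketch below is in Python rather than LabVIEW, and the command names (VALVE, PUMP) and handlers are hypothetical illustrations, not the paper's actual command set:

```python
# Minimal sketch of a script-driven instrument controller: command names map
# to handler functions, so new analytical setups need new registrations,
# not new program code.
class ScriptController:
    def __init__(self):
        self.handlers = {}

    def register(self, name, handler):
        """Map a script command name to an instrument-control function."""
        self.handlers[name] = handler

    def run(self, script):
        """Execute a script line by line; each line is 'COMMAND arg1 arg2 ...'."""
        results = []
        for line in script.strip().splitlines():
            if not line or line.startswith("#"):
                continue  # skip blank lines and comments
            command, *args = line.split()
            results.append(self.handlers[command](*args))
        return results
```

    Coupling a new detector or pump then amounts to registering one more handler, which is the reuse argument the abstract makes.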

  3. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    Energy Technology Data Exchange (ETDEWEB)

    Hervas, Miriam; Lopez, Miguel Angel [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain); Escarpa, Alberto, E-mail: alberto.escarpa@uah.es [Departamento Quimica Analitica, Universidad de Alcala, Ctra. Madrid-Barcelona, Km. 33600, E-28871 Alcala de Henares, Madrid (Spain)

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme has been based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 µg L⁻¹ and an EC₅₀ of 0.079 µg L⁻¹ were obtained, allowing the detection of the zearalenone mycotoxin to be assessed. In addition, excellent accuracy, with high recovery yields ranging between 95 and 108%, has been obtained. The analytical features have shown the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.

  4. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  5. Addressing new analytical challenges in protein formulation development.

    Science.gov (United States)

    Mach, Henryk; Arvinte, Tudor

    2011-06-01

    As the share of therapeutic proteins in the arsenal of modern medicine continues to increase, relatively little progress has been made in the development of analytical methods that would address the specific needs encountered during the development of these new drugs. Consequently, researchers resort to the adaptation of existing instrumentation to meet the demands of rigorous bioprocess and formulation development. In this report, we present a number of such adaptations, as well as new instruments that allow efficient and precise measurement of critical parameters throughout the development stage. The techniques include the use of atomic force microscopy to visualize proteinaceous sub-visible particles, the use of extrinsic fluorescent dyes to visualize protein aggregates, particle tracking analysis, determination of the concentration of monoclonal antibodies by the analysis of second-derivative UV spectra, flow cytometry for the determination of subvisible particle counts, high-throughput fluorescence spectroscopy to study phase separation phenomena, an adaptation of a high-pressure liquid chromatography (HPLC) system for the measurement of solution viscosity, and a variable-speed streamlined analytical ultracentrifugation method. An ex vivo model for understanding the factors that affect bioavailability after subcutaneous injection is also described. Most of these approaches allow not only a more precise insight into the nature of the formulated proteins, but also offer increased throughput while minimizing sample requirements.

  6. Modelling the level of adoption of analytical tools; An implementation of multi-criteria evidential reasoning

    Directory of Open Access Journals (Sweden)

    Igor Barahona

    2014-08-01

    Full Text Available In the future, competitive advantages will be given to organisations that can extract valuable information from massive data and make better decisions. In most cases, this data comes from multiple sources. Therefore, the challenge is to aggregate them into a common framework in order to make them meaningful and useful. This paper will first review the most important multi-criteria decision analysis methods (MCDA existing in current literature. We will offer a novel, practical and consistent methodology based on a type of MCDA, to aggregate data from two different sources into a common framework. Two datasets that are different in nature but related to the same topic are aggregated to a common scale by implementing a set of transformation rules. This allows us to generate appropriate evidence for assessing and finally prioritising the level of adoption of analytical tools in four types of companies. A numerical example is provided to clarify the form for implementing this methodology. A six-step process is offered as a guideline to assist engineers, researchers or practitioners interested in replicating this methodology in any situation where there is a need to aggregate and transform multiple source data.

  7. Interactive entity resolution in relational data: a visual analytic tool and its evaluation.

    Science.gov (United States)

    Kang, Hyunmo; Getoor, Lise; Shneiderman, Ben; Bilgic, Mustafa; Licamele, Louis

    2008-01-01

    Databases often contain uncertain and imprecise references to real-world entities. Entity resolution, the process of reconciling multiple references to underlying real-world entities, is an important data cleaning process required before accurate visualization or analysis of the data is possible. In many cases, in addition to noisy data describing entities, there is data describing the relationships among the entities. This relational data is important during the entity resolution process; it is useful both for the algorithms which determine likely database references to be resolved and for visual analytic tools which support the entity resolution process. In this paper, we introduce a novel user interface, D-Dupe, for interactive entity resolution in relational data. D-Dupe effectively combines relational entity resolution algorithms with a novel network visualization that enables users to make use of an entity's relational context for making resolution decisions. Since resolution decisions often are interdependent, D-Dupe facilitates understanding this complex process through animations which highlight combined inferences and a history mechanism which allows users to inspect chains of resolution decisions. An empirical study with 12 users confirmed the benefits of the relational context visualization on the performance of entity resolution tasks in relational data in terms of time as well as users' confidence and satisfaction.

  8. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy) as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper firstly presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The deletion of the adverse effect of cathodoluminescence is solved by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In a second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  9. Sugar Maple Pigments Through the Fall and the Role of Anthocyanin as an Analytical Tool

    Science.gov (United States)

    Lindgren, E.; Rock, B.; Middleton, E.; Aber, J.

    2008-12-01

    Sugar maple habitat is projected to almost disappear in future climate scenarios. In fact, many institutions state that these trees are already in decline. Being able to detect sugar maple health could prove to be a useful analytical tool to monitor changes in phenology. Anthocyanin, a red pigment found in sugar maples, is thought to be a universal indicator of plant stress. It is very prominent in the spring during the first flush of leaves, as well as in the fall as leaves senesce. Determining an anthocyanin index that could be used with satellite systems will provide a greater understanding of tree phenology and the distribution of plant stress, both over large areas as well as changes over time. The utilization of anthocyanin for one of its functions, the prevention of oxidative stress, may fluctuate in response to changing climatic conditions that occur during senescence or vary from year to year. By monitoring changes in pigment levels and antioxidant capacity through the fall, one may be able to draw conclusions about the ability to detect anthocyanin remotely from space-based systems, and possibly determine a more specific function for anthocyanin during fall senescence. These results could then be applied to track changes in tree stress.

  10. Narrative health research: exploring big and small stories as analytical tools.

    Science.gov (United States)

    Sools, Anneke

    2013-01-01

    In qualitative health research many researchers use a narrative approach to study lay health concepts and experiences. In this article, I explore the theoretical linkages between the concepts narrative and health, which are used in a variety of ways. The article builds on previous work that conceptualizes health as a multidimensional, positive, dynamic and morally dilemmatic yet meaningful practice. I compare big and small stories as analytical tools to explore what narrative has to offer to address, nuance and complicate five challenges in narrative health research: (1) the interplay between health and other life issues; (2) the taken-for-granted yet rare character of the experience of good health; (3) coherence or incoherence as norms for good health; (4) temporal issues; (5) health as moral practice. In this article, I do not present research findings per se; rather, I use two interview excerpts for methodological and theoretical reflections. These interview excerpts are derived from a health promotion study in the Netherlands, which was partly based on peer-to-peer interviews. I conclude with a proposal to advance narrative health research by sensitizing researchers to different usages of both narrative and health, and the interrelationship(s) between the two.

  11. Development and first application of an operating events ranking tool

    Energy Technology Data Exchange (ETDEWEB)

    Šimić, Zdenko [European Commission Joint Research Centre – Institute for Energy and Transport, Postbus 2, 1755ZG Petten (Netherlands); University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb (Croatia); Zerger, Benoit, E-mail: benoit.zerger@ec.europa.eu [European Commission Joint Research Centre – Institute for Energy and Transport, Postbus 2, 1755ZG Petten (Netherlands); Banov, Reni [University of Zagreb, Faculty of Electrical Engineering and Computing, Zagreb (Croatia)

    2015-02-15

    Highlights: • A method using the analytical hierarchy process for ranking operating events is developed and tested. • The method is applied for 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Real events assessment shows potential of the method for operating experience feedback. - Abstract: The operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging since it requires excessive resources, especially in case of large event databases. This paper presents an event groups ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted events characterization scheme that allows different ways of events grouping and ranking. The ranking method itself consists of implementing the analytical hierarchy process (AHP) by means of a custom developed tool which allows events ranking based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plants operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events as well as to give quantitative input for the prioritization of further more detailed investigation of selected event groups.
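    The core AHP step used by such ranking tools — deriving priority weights from a pairwise-comparison matrix via its principal eigenvector — can be sketched with plain power iteration. The comparison values below are illustrative, not the paper's expert judgments:

```python
# Sketch of the AHP weighting step: given a reciprocal pairwise-comparison
# matrix (matrix[i][j] says how much more important criterion i is than j),
# power iteration converges to the principal eigenvector, whose normalized
# entries are the ranking weights.
def ahp_weights(matrix, iters=100):
    n = len(matrix)
    w = [1.0 / n] * n                       # start from uniform weights
    for _ in range(iters):
        v = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]              # renormalize each iteration
    return w
```

    In a full AHP implementation one would also compute the consistency ratio of the comparison matrix to check the coherence of the expert judgments before trusting the weights.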

  12. Development of the SOFIA Image Processing Tool

    Science.gov (United States)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere, anytime, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers is stored in archive files, as is housekeeping data, which contains information such as boresight and area of interest locations. A tool that could both extract and process data from the archive files was developed.

  13. Solar Array Verification Analysis Tool (SAVANT) Developed

    Science.gov (United States)

    Bailey, Sheila G.; Long, KIenwyn J.; Curtis, Henry B.; Gardner, Barbara; Davis, Victoria; Messenger, Scott; Walters, Robert

    1999-01-01

    Modeling solar cell performance for a specific radiation environment to obtain the end-of-life photovoltaic array performance has become both increasingly important and, with the rapid advent of new types of cell technology, more difficult. For large constellations of satellites, a few percent difference in the lifetime prediction can have an enormous economic impact. The tool described here automates the assessment of solar array on-orbit end-of-life performance and assists in the development and design of ground test protocols for different solar cell designs. Once established, these protocols can be used to calculate on-orbit end-of-life performance from ground test results. The Solar Array Verification Analysis Tool (SAVANT) utilizes the radiation environment from the Environment Work Bench (EWB) model developed by the NASA Lewis Research Center's Photovoltaic and Space Environmental Effects Branch in conjunction with Maxwell Technologies. It then modifies and combines this information with the displacement damage model proposed by Summers et al. (ref. 1) of the Naval Research Laboratory to determine solar cell performance during the course of a given mission. The resulting predictions can then be compared with flight data. The Environment WorkBench (ref. 2) uses the NASA AE8 (electron) and AP8 (proton) models of the radiation belts to calculate the trapped radiation flux. These fluxes are integrated over the defined spacecraft orbit for the duration of the mission to obtain the total omnidirectional fluence spectra. Components such as the solar cell coverglass, adhesive, and antireflective coatings can slow and attenuate the particle fluence reaching the solar cell. In SAVANT, a continuous slowing down approximation is used to model this effect.

  14. Multiple Forces Driving China's Economic Development: A New Analytic Framework

    Institute of Scientific and Technical Information of China (English)

    Yahua Wang; Angang Hu

    2007-01-01

    Based on economic growth theory and the World Bank's analytical framework relating to the quality of growth, the present paper constructs a framework that encompasses physical, international, human, natural and knowledge capital to synthetically interpret economic development. After defining the five types of capital and total capital, we analyze the dynamic changes of these types of capital in China and in other countries. The results show that since China's reform and opening up, knowledge, international, human and physical capital have grown rapidly, with speeds of growth higher than that of economic growth. As the five types of capital have all increased at varying paces, the savings level of total capital in China has quadrupled in 25 years and overtook that of the USA in the 1990s. The changes in the five types of capital and total capital reveal that there are progressively multiple driving forces behind China's rapid economic development. Implications for China's long-term economic development are thereby raised.

  15. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council (SAMRC) and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n = 1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  16. ANALYTICAL MODEL OF CALCULUS FOR INFLUENCE THE TRANSLATION GUIDE WEAR OVER THE MACHINING ACCURACY ON THE MACHINE TOOL

    Directory of Open Access Journals (Sweden)

    Ivona PETRE

    2010-10-01

    Full Text Available The wear of machine tool guides favors the occurrence of vibrations. As a result of guide wear, the initial trajectory of the cutting tool's motion will be modified, generating dimensional accuracy discrepancies and deviations in the geometrical shape of the workpieces. As is already known, the wear of mobile and rigid guides is determined by many parameters (pressure, velocity, friction length, lubrication, material). The choice of one or another analytic model and/or experimental model of the wear depends on the working conditions, assuming that the coupling material is known. The present work's goal is to establish an analytic model of calculus showing the influence of translation guide wear on the machining accuracy of machine tools.

  17. Artificial neural networks in biology and chemistry: the evolution of a new analytical tool.

    Science.gov (United States)

    Cartwright, Hugh M

    2008-01-01

    Once regarded as an eccentric and unpromising algorithm for the analysis of scientific data, the neural network has been developed in the last decade into a powerful computational tool. Its use now spans all areas of science, from the physical sciences and engineering to the life sciences and allied subjects. Applications range from the assessment of epidemiological data or the deconvolution of spectra to highly practical applications, such as the electronic nose. This introductory chapter considers briefly the growth in the use of neural networks and provides some general background in preparation for the more detailed chapters that follow.

  18. An analytic framework for developing inherently-manufacturable pop-up laminate devices

    Science.gov (United States)

    Aukes, Daniel M.; Goldberg, Benjamin; Cutkosky, Mark R.; Wood, Robert J.

    2014-09-01

    Spurred by advances in manufacturing technologies developed around layered manufacturing technologies such as PC-MEMS, SCM, and printable robotics, we propose a new analytic framework for capturing the geometry of folded composite laminate devices and the mechanical processes used to manufacture them. These processes can be represented by combining a small set of geometric operations which are general enough to encompass many different manufacturing paradigms. Furthermore, such a formulation permits one to construct a variety of geometric tools which can be used to analyze common manufacturability concepts, such as tool access, part removability, and device support. In order to increase the speed of development, reduce the occurrence of manufacturing problems inherent with current design methods, and reduce the level of expertise required to develop new devices, the framework has been implemented in a new design tool called popupCAD, which is suited for the design and development of complex folded laminate devices. We conclude with a demonstration of utility of the tools by creating a folded leg mechanism.

  19. USING ANALYTIC HIERARCHY PROCESS (AHP) METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Full Text Available Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to utilize the economic, social, cultural and technological blessings of city life in place, without migrating. As is understood from this description, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria should be evaluated by experts. The Analytic Hierarchy Process (AHP) method can be utilized at these stages of development efforts. The AHP method is one of the multi-criteria decision methods. After decomposing a problem into smaller pieces, the relative importance and level of importance of two compared elements are determined. It allows the evaluation of qualitative and quantitative factors. At the same time, it permits the ideas of many experts to be gathered and used in the decision process. Because of the mentioned features of the AHP method, it can be used in rural development work. In this article, cultural factors, one of the important components of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method could be helpful in rural development efforts.

  20. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, particularly matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  1. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing...

  2. A Survey on Big Data Analytics: Challenges, Open Research Issues and Tools

    Directory of Open Access Journals (Sweden)

    D. P. Acharjya

    2016-02-01

    Full Text Available A huge repository of terabytes of data is generated each day from modern information systems and digital technologies such as the Internet of Things and cloud computing. Analysis of these massive data requires considerable effort at multiple levels to extract knowledge for decision making. Therefore, big data analysis is a current area of research and development. The basic objective of this paper is to explore the potential impact of big data challenges, open research issues, and the various tools associated with it. As a result, this article provides a platform to explore big data at numerous stages. Additionally, it opens a new horizon for researchers to develop solutions based on the challenges and open research issues.

  3. Development of Integrated Protein Analysis Tool

    Directory of Open Access Journals (Sweden)

    Poorna Satyanarayana Boyidi,

    2010-05-01

    Full Text Available We present an "Integrated Protein Analysis Tool" (IPAT) that is able to perform the following tasks in segregating and annotating genomic data. Protein Editor: enables the entry of nucleotide/amino acid sequences. Utilities: IPAT converts a given nucleotide sequence to its equivalent amino acid sequence. Secondary Structure Prediction: possible using three algorithms (GOR-I, the Gibrat method, and DPM (Double Prediction Method)) with graphical display. Profiles and properties: allows calculation of eight physico-chemical profiles and properties, viz. hydrophobicity, hydrophilicity, antigenicity, transmembranous regions, solvent accessibility, molecular weight, absorption factor, and amino acid content. IPAT has a provision for viewing a helical-wheel projection of a selected region of a given protein sequence and a 2D representation of the alpha-carbon. IPAT was developed using UML (Unified Modeling Language) for modeling the project elements, coded in Java, and subjected to unit testing, path testing, and integration testing. This project mainly concentrates on butyrylcholinesterase, predicting its secondary structure and physico-chemical profiles and properties.

  4. Developing a Support Tool for Global Product Development Decisions

    DEFF Research Database (Denmark)

    Søndergaard, Erik Stefan; Ahmed-Kristensen, Saeema

    2016-01-01

    This paper investigates how global product development decisions are made through a multiple-case study in three Danish engineering companies. The paper identifies which information and methods are applied for making decisions and how decision-making can be supported based on previous experience. The paper...... presents results from 51 decisions made in the three companies, and based on the results of the studies, a framework for a decision-support tool is outlined and discussed. The paper rounds off with an identification of future research opportunities in the area of global product development and decision-making....

  5. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, though, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  6. Study of an ultrasound-based process analytical tool for homogenization of nanoparticulate pharmaceutical vehicles.

    Science.gov (United States)

    Cavegn, Martin; Douglas, Ryan; Akkermans, Guy; Kuentz, Martin

    2011-08-01

    There are currently no adequate process analyzers for nanoparticulate viscosity enhancers. This article aims to evaluate ultrasonic resonator technology as a monitoring tool for the homogenization of nanoparticulate gels. Aqueous dispersions of colloidal microcrystalline cellulose (MCC) and a mixture of clay particles with xanthan gum were compared with colloidal silicon dioxide in oil. The processing was conducted using a laboratory-scale homogenizing vessel. The study first investigated the homogenization kinetics of the different systems and then focused on process factors in the case of colloidal MCC. Moreover, rheological properties were analyzed offline to assess the structure of the resulting gels. Results showed the suitability of ultrasound velocimetry to monitor the homogenization process. The obtained data were fitted using a novel heuristic model. It was possible to identify characteristic homogenization times for each formulation. The subsequent study of the process factors demonstrated that ultrasonic process analysis was as sensitive as offline rheological measurements in detecting subtle manufacturing changes. It can be concluded that the ultrasonic method was able to successfully assess the homogenization of nanoparticulate viscosity enhancers. This novel technique can become a vital tool for the development and production of pharmaceutical suspensions in the future.

  7. A Tool for the Development of Robot Control Strategies

    Directory of Open Access Journals (Sweden)

    Luiz Carlos Figueiredo

    2007-12-01

    Full Text Available In this paper we report the development of a tool for creating and configuring robot control strategies in a fast and easy way. Additionally, a tricycle robot with two traction motors was built to test the strategies produced with the tool. Experimental tests have shown an advantage in the use of such a tool.

  8. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    Micro electrical discharge machining is one of the established techniques for manufacturing high-aspect-ratio features on electrically conductive materials. This paper presents the results and inferences of an analytical study estimating the effect of process conditions on tool electrode wear...... characteristics in the micro-EDM process. A new approach is proposed, with two novel factors anticipated to directly control the material removal mechanism from the tool electrode: the discharge energy factor (DEf) and the dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate...... (TWR) and the factors is poor. Thus, individual effects of each factor on TWR are analyzed. The factors selected for the study of individual effects are pulse on-time, discharge peak current, gap voltage, and gap flushing pressure. The tool wear rate decreases linearly with an increase in the pulse on...

  9. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology, and application of analytics and breaks its implementation down into repeatable and manageable steps, making widespread adoption possible across all functions of an organization. Implementing Analytics simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. A technology-agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions rather than the technology behind...

  10. Using fuzzy analytical hierarchy process (AHP) to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Full Text Available Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of the project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, etc., in short, the web development platform chosen. In addition, choosing the right platform is a multi criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem. We try to tap the real-life modeling potential of fuzzy logic and conjugate it with the commonly used, powerful AHP modeling method.
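    The priority-derivation step at the core of AHP can be sketched as follows. This is an illustrative crisp-AHP example using the row geometric-mean method on a hypothetical pairwise-comparison matrix (the criteria order and judgments are assumptions, not the paper's data); a full fuzzy AHP would replace the crisp entries with triangular fuzzy numbers and an extent-analysis step.

```python
import numpy as np

# Hypothetical pairwise comparisons over four platform criteria:
# [language, operating system, tools, frameworks]. Entry (i, j) says how
# strongly criterion i is preferred over criterion j on Saaty's 1-9 scale.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 3.0, 1/2],
    [1/5, 1/3, 1.0, 1/4],
    [1/2, 2.0, 4.0, 1.0],
])

def ahp_weights(M):
    """Priority vector via row geometric means, normalised to sum to 1."""
    g = np.prod(M, axis=1) ** (1.0 / M.shape[0])
    return g / g.sum()

w = ahp_weights(A)
print(w)  # one weight per criterion; the dominant row gets the largest weight
```

    The resulting vector ranks the criteria; the same step is then repeated for the alternatives under each criterion and the scores aggregated.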

  11. Analytical Microscopy

    Energy Technology Data Exchange (ETDEWEB)

    2006-06-01

    In the Analytical Microscopy group, within the National Center for Photovoltaics' Measurements and Characterization Division, we combine two complementary areas of analytical microscopy--electron microscopy and proximal-probe techniques--and use a variety of state-of-the-art imaging and analytical tools. We also design and build custom instrumentation and develop novel techniques that provide unique capabilities for studying materials and devices. In our work, we collaborate with you to solve materials- and device-related R&D problems. This sheet summarizes the uses and features of four major tools: transmission electron microscopy, scanning electron microscopy, the dual-beam focused-ion-beam workstation, and scanning probe microscopy.

  12. Application of quality by design to the development of analytical separation methods.

    Science.gov (United States)

    Orlandini, Serena; Pinzauti, Sergio; Furlanetto, Sandra

    2013-01-01

    Recent pharmaceutical regulatory documents have stressed the critical importance of applying quality by design (QbD) principles for in-depth process understanding to ensure that product quality is built in by design. This article outlines the application of QbD concepts to the development of analytical separation methods, for example chromatography and capillary electrophoresis. QbD tools, for example risk assessment and design of experiments, enable enhanced quality to be integrated into the analytical method, enabling earlier understanding and identification of variables affecting method performance. A QbD guide is described, from identification of quality target product profile to definition of control strategy, emphasizing the main differences from the traditional quality by testing (QbT) approach. The different ways several authors have treated single QbD steps of method development are reviewed and compared. In a final section on outlook, attention is focused on general issues which have arisen from the surveyed literature, and on the need to change the researcher's mindset from the QbT to QbD approach as an important analytical trend for the near future.

  13. Bio-electrosprays: from bio-analytics to a generic tool for the health sciences.

    Science.gov (United States)

    Jayasinghe, Suwan N

    2011-03-07

    Electrosprays or electrospraying is a process by which an aerosol is generated between two charged electrodes. This aerosol generation methodology has been known for well over a century, and has undergone exploration from aerosol and materials sciences to many other areas of research and development. In one such exploration, electrosprays were partnered with mass spectrometry for the accurate characterisation of molecules. This technology, now widely referred to as electrospray ionisation mass spectrometry (ESI MS), contributes significantly to molecular analysis and cancer biology, to name a few areas. In fact, these findings were recognised by the Chemistry Nobel Committee in 2002 and have catapulted electrosprays into many areas of research and development. In this review, the author wishes to introduce and discuss another such recent discovery, where electrosprays have been investigated for directly handling living cells and whole organisms. Over the past few years these electrosprays, now referred to as "bio-electrosprays", have undergone rigorous developmental studies, both in terms of understanding all the associated physical, chemical and biological sciences and in completely assessing their effects, if any, on the direct handling of living biological materials. Therefore, the review will bring together all the work that has contributed to fully understanding that bio-electrosprays are an inert technology for directly handling living biological materials, while elucidating some unique features they possess over competing technologies. Hence, this approach is demonstrated as a flexible methodology for a wide range of applications spanning bio-analytics and diagnostics to the possible creation of synthetic tissues for repairing and replacing damaged/ageing tissues, and the targeted and controlled delivery of personalised medicine through experimental and/or medical cells and/or genes, thereby elucidating the far-reaching ramifications bio-electrosprays have for our health sciences.

  14. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition; provides an overview of big data hardware and software architectures; presents a variety of techniques...

  15. Algorithms for optimized maximum entropy and diagnostic tools for analytic continuation

    Science.gov (United States)

    Bergeron, Dominic; Tremblay, A.-M. S.

    2016-08-01

    Analytic continuation of numerical data obtained in imaginary time or frequency has become an essential part of many branches of quantum computational physics. It is, however, an ill-conditioned procedure and thus a hard numerical problem. The maximum-entropy approach, based on Bayesian inference, is the most widely used method to tackle that problem. Although the approach is well established and among the most reliable and efficient ones, useful developments of the method and of its implementation are still possible. In addition, while a few free software implementations are available, a well-documented, optimized, general purpose, and user-friendly software dedicated to that specific task is still lacking. Here we analyze all aspects of the implementation that are critical for accuracy and speed and present a highly optimized approach to maximum entropy. Original algorithmic and conceptual contributions include (1) numerical approximations that yield a computational complexity that is almost independent of temperature and spectrum shape (including sharp Drude peaks in broad background, for example) while ensuring quantitative accuracy of the result whenever precision of the data is sufficient, (2) a robust method of choosing the entropy weight α that follows from a simple consistency condition of the approach and the observation that information- and noise-fitting regimes can be identified clearly from the behavior of χ2 with respect to α , and (3) several diagnostics to assess the reliability of the result. Benchmarks with test spectral functions of different complexity and an example with an actual physical simulation are presented. Our implementation, which covers most typical cases for fermions, bosons, and response functions, is available as an open source, user-friendly software.
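    The maximum-entropy trade-off described above, minimizing Q = χ²/2 − αS, can be illustrated on a toy inverse problem. This is a minimal sketch, not the authors' optimized implementation: the Laplace-type kernel, grids, noise level, and flat default model are all assumptions for illustration, and a plain quasi-Newton solver stands in for their specialized algorithm.

```python
import numpy as np
from scipy.optimize import minimize

# Toy continuation setup: G(tau) = sum_w K(tau, w) A(w) dw, with kernel
# K = exp(-tau * w) on a positive-frequency grid (illustrative assumption).
w = np.linspace(0.05, 10.0, 40)
dw = w[1] - w[0]
tau = np.linspace(0.05, 2.0, 15)
K = np.exp(-np.outer(tau, w))

A_true = np.exp(-((w - 3.0) ** 2) / 0.5)       # "true" spectrum: one peak
A_true /= A_true.sum() * dw
sigma = 1e-3                                    # assumed noise level
rng = np.random.default_rng(0)
G = K @ A_true * dw + rng.normal(0.0, sigma, tau.size)

D = np.full_like(w, 1.0 / (w[-1] - w[0]))       # flat default model

def chi2(A):
    return np.sum(((K @ A * dw - G) / sigma) ** 2)

def entropy(A):
    # Shannon-Jaynes entropy relative to the default model D
    return np.sum((A - D - A * np.log(A / D)) * dw)

def maxent_spectrum(alpha):
    # Positivity of the spectrum enforced by optimizing u = log(A)
    q = lambda u: 0.5 * chi2(np.exp(u)) - alpha * entropy(np.exp(u))
    res = minimize(q, np.log(D), method="L-BFGS-B")
    return np.exp(res.x)

# Small alpha: data fitting dominates (low chi^2, risk of noise fitting).
# Large alpha: entropy dominates and the result stays near the default model.
A_fit = maxent_spectrum(1e-2)
A_smooth = maxent_spectrum(1e3)
```

    Scanning α between these extremes and watching the behavior of χ²(α) is, in spirit, how the consistency-based choice of the entropy weight described in the abstract separates the information-fitting regime from the noise-fitting regime.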

  16. Interactive network analytical tool for instantaneous bespoke interrogation of food safety notifications.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The globalization of the food supply necessitates continued advances in regulatory control measures to ensure that citizens enjoy safe and adequate nutrition. The aim of this study was to extend previous reports on network analysis of food notifications by including an optional filter by type of notification and, in cases of contamination, by type of contaminant in the notified foodstuff. METHODOLOGY/PRINCIPAL FINDINGS: A filter function has been applied to enable processing of selected notifications by contaminant or type of notification to (i) capture complexity, (ii) analyze trends, and (iii) identify patterns of reporting activities between countries. The program rapidly assesses nations' roles as transgressor and/or detector for each category of contaminant and for the key class of border rejection. In the open access demonstration version, the majority of notifications in the Rapid Alert System for Food and Feed were categorized by contaminant type as mycotoxin (50.4%), heavy metals (10.9%), or bacteria (20.3%). Examples are given demonstrating how network analytical approaches complement, and in some cases supersede, descriptive statistics such as frequency counts, which may give limited or potentially misleading information. One key feature is that network analysis takes the relationship between transgressor and detector countries, the number of reports, and impact into consideration simultaneously. Furthermore, the indices that complement the network maps and reflect each country's transgressor and detector activities allow comparisons to be made between (transgressing vs. detecting) as well as within (e.g. transgressing) activities. CONCLUSIONS/SIGNIFICANCE: This further development of the network analysis approach to food safety contributes to a better understanding of the complexity of the effort of ensuring food is safe for consumption in the European Union. The unique patterns of the interplay between detector and transgressor countries...

  17. categoryCompare, a novel analytical tool based on feature annotations

    Directory of Open Access Journals (Sweden)

    Robert Maxwell Flight

    2014-04-01

    Full Text Available Assessment of high-throughput -omics data initially focuses on relative or raw levels of a particular feature, such as an expression value for a transcript, protein, or metabolite. At a second level, analyses of annotations, including known or predicted functions and associations of each individual feature, attempt to distill biological context. Most currently available comparative- and meta-analysis methods depend on the availability of identical features across data sets and concentrate on determining features that are differentially expressed across experiments, some of which may be considered biomarkers. The heterogeneity of measurement platforms and the inherent variability of biological systems confound the search for robust biomarkers indicative of a particular condition. In many instances, however, multiple data sets show involvement of common biological processes or signaling pathways, even though individual features are not commonly measured or differentially expressed between them. We developed a methodology, categoryCompare, for cross-platform and cross-sample comparison of high-throughput data at the annotation level. We assessed the utility of the approach using hypothetical data, as well as determining similarities and differences in the set of processes in two instances: (1) denervated skin vs. denervated muscle, and (2) colon from Crohn's disease vs. colon from ulcerative colitis. The hypothetical data showed that in many cases comparing annotations gave superior results to comparing only at the gene level. Improved analytical results depended as well on the number of genes included in the annotation term, the amount of noise in relation to the number of genes expressing in unenriched annotation categories, and the specific method by which samples are combined. categoryCompare is available from http://bioconductor.org/packages/release/bioc/html/categoryCompare.html

  18. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC)

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-06-01

    Full Text Available Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. Many enterprises choose to outsource their supply chain management (SCM) and leave it to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method for analyzing strategic performance within an outsourced supply chain. The aim of the paper is to present the integration of two methodologies: the Balanced Scorecard, a multiple-perspective framework for performance assessment, and the Analytic Hierarchy Process, a decision-making tool used to prioritize multiple performance perspectives and to generate a unified metric. The framework aims to provide a performance analysis to achieve better sustainability performance of the supply chain. A real case study concerning a typical value chain is presented.

  19. NASTRAN as an analytical research tool for composite mechanics and composite structures

    Science.gov (United States)

    Chamis, C. C.; Sinclair, J. H.; Sullivan, T. L.

    1976-01-01

    Selected examples are described in which NASTRAN is used as an analysis research tool for composite mechanics and for composite structural components. The examples were selected to illustrate the importance of using NASTRAN as an analysis tool in this rapidly advancing field.

  20. Tool Forces Developed During Friction Stir Welding

    Science.gov (United States)

    Melendez, M.; Tang, W.; Schmidt, C.; McClure, J. C.; Nunes, A. C.; Murr, L. E.

    2003-01-01

    This paper describes a technique for measuring the various forces and the torque that exist on the friction stir welding pin tool. Results for various plunge depths, weld speeds, rotational speeds, and tool configurations are presented. Welds made on 6061 aluminum under typical welding conditions require a downward force of 2800 lbs (12.5 kN), a longitudinal force in the direction of motion of 300 lbs (1.33 kN), and a transverse force in the omega x v direction of 30 lbs (135 N). Aluminum 2195 under typical weld conditions requires a downward force of 3100 lbs (13.8 kN), a longitudinal force of 920 lbs (4.1 kN), and a transverse force of 45 lbs (200 N) in the omega x v direction.
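    The quoted pound-force values can be cross-checked against their SI equivalents with the exact conversion 1 lbf = 4.4482216 N; the check confirms, for example, that 3100 lbf is about 13.8 kN (not 1.38 kN). A short sketch of that check:

```python
# Cross-check of the reported pound-force values against SI units,
# using the exact conversion 1 lbf = 4.4482216 N.
LBF_TO_N = 4.4482216

forces_lbf = {
    "6061 downward": 2800, "6061 longitudinal": 300, "6061 transverse": 30,
    "2195 downward": 3100, "2195 longitudinal": 920, "2195 transverse": 45,
}
forces_kN = {name: lbf * LBF_TO_N / 1000.0 for name, lbf in forces_lbf.items()}
for name, kn in forces_kN.items():
    print(f"{name}: {kn:.2f} kN")
```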

  1. TAQL: A Problem Space Tool for Expert System Development.

    Science.gov (United States)

    1992-05-01

    tools developed for use in Round 3. Prior to building the tools, the average fix time for errors that would have been catchable was nearly identical to the average fix time for errors that would not have been catchable. After building the tools, uncatchable errors took three times longer to fix than catchable errors. I conclude that the model tools are highly effective for catching space and data model errors and that this translated into reduced fix times.

  2. Evaluation And Selection Process of Suppliers Through Analytical Framework: An Empirical Evidence of Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings substantial savings. Therefore, this paper seeks to address the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which consisted of questions on whether these companies utilize supplier evaluation criteria and what criteria, if any, are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality, and price. Conclusions are made on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
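    The dimensionality-reduction step that PCA performs can be sketched on synthetic supplier-rating data. The criteria names, sample size, and correlation structure below are illustrative assumptions, not the study's survey data:

```python
import numpy as np

# Minimal PCA sketch: 80 hypothetical company ratings on four criteria
# (attitude, delivery time, quality, price), with the first two criteria
# deliberately correlated so that one principal component captures them.
rng = np.random.default_rng(42)
n = 80
shared = rng.normal(0.0, 1.0, n)
X = np.column_stack([
    shared + rng.normal(0.0, 0.3, n),   # attitude
    shared + rng.normal(0.0, 0.3, n),   # delivery time (correlated with attitude)
    rng.normal(0.0, 1.0, n),            # quality (independent)
    rng.normal(0.0, 1.0, n),            # price (independent)
])

Xc = X - X.mean(axis=0)                 # center each criterion
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
explained = s**2 / np.sum(s**2)         # variance ratio per principal component
scores = Xc @ Vt.T                      # company scores on the components
print(explained)                        # first component dominates here
```

    In a study like the one above, the loadings in `Vt` indicate which evaluation criteria move together and can be summarized by a single component.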

  3. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David; Sahinidis, N V; Cozad, A; Lee, A; Kim, H; Morinelly, J; Eslick, J; Yuan, Z

    2013-06-04

    This presentation reports the development of advanced computational tools to accelerate next generation technology development. These tools support development of an optimized process using rigorous models. They include: Process Models; Simulation-Based Optimization; Optimized Process; Uncertainty Quantification; Algebraic Surrogate Models; and Superstructure Optimization (Determine Configuration).

  4. Development of novel tools to measure food neophobia in children

    DEFF Research Database (Denmark)

    Damsbo-Svendsen, Marie; Frøst, Michael Bom; Olsen, Annemarie

    2017-01-01

    The main tool currently used to measure food neophobia (the Food Neophobia Scale, FNS, developed by Pliner & Hobden, 1992) may not remain optimal forever. It was developed around 25 years ago, and the perception and availability of “novel” and “ethnic” foods may have changed in the meantime....... Consequently, there is a need for developing updated tools for measuring food neophobia....

  5. PLS2 regression as a tool for selection of optimal analytical modality

    DEFF Research Database (Denmark)

    Madsen, Michael; Esbensen, Kim

    ...analytical modalities. We here present results from a feasibility study in which Fourier Transform Near InfraRed (FT-NIR), Fourier Transform Mid InfraRed (FT-MIR), and Raman laser spectroscopy were applied to the same set of samples obtained from a pilot-scale beer brewing process. Quantitative PLS1 models...

  6. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective balanced set of performance measures and key performance indicators (KPIs) is a main challenge to accomplish this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives and incorporating them into analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on ED internal processes perspective especially on measures related to timeliness and accessibility of care in ED. Some measures from financial, customer, and learning and growth perspectives were also selected as other top KPIs. Measures of care effectiveness and care safety were placed as the next priorities too. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for development of KPIs in various performance related areas based on a consistent and fair approach. Dashboards that are designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.
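    The analytic hierarchy process used to weight the balanced-scorecard perspectives also requires a consistency check on the respondents' pairwise judgments. A minimal sketch of that check, with a hypothetical perspective-comparison matrix (not the study's data):

```python
import numpy as np

# AHP consistency check: lambda_max of the pairwise matrix gives
# CI = (lambda_max - n) / (n - 1), and CR = CI / RI using Saaty's
# random-index values. Judgments are usually accepted when CR < 0.1.
RI = {3: 0.58, 4: 0.90, 5: 1.12}        # Saaty's random indices by matrix size

def consistency_ratio(M):
    n = M.shape[0]
    lam_max = np.max(np.real(np.linalg.eigvals(M)))
    ci = (lam_max - n) / (n - 1)
    return ci / RI[n]

# Hypothetical comparisons of the four BSC perspectives:
# [internal processes, financial, customer, learning & growth]
M = np.array([
    [1.0, 3.0, 3.0, 5.0],
    [1/3, 1.0, 1.0, 3.0],
    [1/3, 1.0, 1.0, 3.0],
    [1/5, 1/3, 1/3, 1.0],
])
print(f"CR = {consistency_ratio(M):.3f}")
```

    A perfectly consistent matrix (every entry a ratio of a single weight vector) yields lambda_max = n and hence CR = 0.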

  7. Some key issues in the development of ergonomic intervention tools

    DEFF Research Database (Denmark)

    Edwards, Kasper; Winkel, Jørgen

    2016-01-01

    Literature reviews suggest that tools facilitating the ergonomic intervention process should be integrated into rationalization tools, particularly if such tools are participative. Such a tool has recently been developed as an add-in module to the Lean tool "Value Stream Mapping" (VSM). However......, in the investigated context this module seems not to have any direct impact on the generation of proposals with ergonomic considerations. Contextual factors of importance seem to be e.g. allocation of sufficient resources and whether work environment issues are generally accepted as part of the VSM methodology...

  8. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and its relevance to distance education in undergraduate and graduate programs as it impacts students and teaching faculty, and also academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental process and program curriculum. Learning and academic analytics in higher education is used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: The Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offer suggestions for future research.

  9. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  10. Verified System Development with the AutoFocus Tool Chain

    OpenAIRE

    Maria Spichkova; Florian Hölzl; David Trachtenherz

    2012-01-01

    This work presents a model-based development methodology for verified software systems as well as a tool support for it: an applied AutoFocus tool chain and its basic principles emphasizing the verification of the system under development as well as the check mechanisms we used to raise the level of confidence in the correctness of the implementation of the automatic generators.

  11. Development of site-oriented Analytics for Grid computing centres

    Science.gov (United States)

    Washbrook, A.; Crooks, D.; Roy, G.; Skipsey, S.; Qin, G.; Stewart, G. P.; Britton, D.

    2015-12-01

    The field of analytics, the process of analysing data to visualise meaningful patterns and trends, has become increasingly important in scientific computing as the volume and variety of data available to process has significantly increased. There is now ongoing work in the High Energy Physics (HEP) community in this area, for example in the augmentation of systems management at WLCG computing sites. We report on work evaluating the feasibility of distributed site-oriented analytics using the Elasticsearch, Logstash and Kibana software stack and demonstrate functionality by the application of two workflows that give greater insight into site operations.

  12. Development of ecohydrological assessment tool and its application

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    The development of the Hydro-Informatic Modelling System (HIMS) provides an integrated platform for hydrological simulation. To extend the application of HIMS, an ecohydrological modeling system named the ecohydrological assessment tool (EcoHAT) has been developed. Integrating parameter-management tools, RS (remote sensing) inversion tools, module-design tools and GIS analysis tools, EcoHAT provides an integrated means to simulate ecohydrological processes at the regional scale, developing a new method for the sustainable use of water. EcoHAT has been applied to several case studies, such as the Yellow River Basin, the acid deposition area in Guizhou province and the riparian catchment of the Guanting reservoir in Beijing. Results prove that EcoHAT can efficiently simulate and analyse ecohydrological processes at the regional scale and provide technical support for integrated water resources management at the basin scale.

  14. Development of database and searching system for tool grinding

    Directory of Open Access Journals (Sweden)

    J.Y. Chen

    2008-02-01

    Full Text Available Purpose: To save time on tool grinding and design, an efficient method of developing a data management and searching system for standard cutting tools is proposed in this study. Design/methodology/approach: First, tool grinding software with an open architecture was employed to design and plan grinding processes for seven types of tools. According to the characteristics of the tools (e.g. type, diameter, radius, and so on), 4802 tool records were established in a relational database. Then, SQL syntax was used to write the searching algorithms, and the human-machine interfaces of the searching system for the tool database were developed in C++ Builder. Findings: For grinding a two-flute square end mill, half of the time spent on tool design and on changing the production line to grind other types of tools can be saved by means of our system. More specifically, the efficiency in terms of approach and retract time was improved by up to 40%, and an improvement of approximately 10.6% in overall machining time was achieved. Research limitations/implications: The tool database used in this study only includes some specific tools such as the square end mill. Step drills, taper tools, and special tools can also be taken into account in the database in future research. Practical implications: Most commercial tool grinding software is modular in design and uses tool shapes to construct the CAM interface, which imposes limitations on tool design that are undesirable for customers. By contrast, employing not only the grinding processes to construct the grinding paths of tools but also the searching system combined with the grinding software gives more flexibility in designing new tools. Originality/value: A novel tool database and searching system is presented for tool grinding. Using this system saves time and provides more convenience in designing and grinding tools.
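
The relational database and SQL-based searching described above can be sketched in miniature with SQLite; the table layout, tool attributes and sample rows here are illustrative assumptions, not the 4802-record schema from the paper:

```python
import sqlite3

# In-memory stand-in for the relational tool database
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE tools (
    id INTEGER PRIMARY KEY,
    type TEXT, diameter REAL, flutes INTEGER, radius REAL)""")
sample_tools = [
    ("square end mill", 6.0, 2, 0.0),
    ("square end mill", 10.0, 4, 0.0),
    ("ball end mill", 6.0, 2, 3.0),
]
conn.executemany(
    "INSERT INTO tools (type, diameter, flutes, radius) VALUES (?, ?, ?, ?)",
    sample_tools)

def search(tool_type, diameter):
    """Look up tools by type and diameter, as the searching system might."""
    cur = conn.execute(
        "SELECT type, diameter, flutes FROM tools WHERE type = ? AND diameter = ?",
        (tool_type, diameter))
    return cur.fetchall()

print(search("square end mill", 6.0))
```

Parameterized queries (the `?` placeholders) keep the search robust to arbitrary user input from the human-machine interface.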

  15. A Tool for Conceptualising in PSS development

    DEFF Research Database (Denmark)

    Matzen, Detlef; McAloone, Timothy Charles

    2006-01-01

    International Design Conference [2]. In this contribution, we take the step from a fundamental understanding of the phenomenon to creating a normative exploitation of this understanding for PSS concept development. The developed modelling technique is based on the Customer Activity Cycle (CAC) model...

  16. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  17. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  18. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool for supporting future EGS (enhanced geothermal systems) power production. This is the first of three tools planned; it is an analog tool designed for monitoring well pressure and temperature. There is discussion of three different circuit topologies and of the development of the supporting surface electronics and software, along with information on testing electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  19. Towards a Process for Developing Maintenance Tools in Academia

    CERN Document Server

    Kienle, Holger M

    2008-01-01

    Building tools--from simple prototypes to industrial-strength applications--is a pervasive activity in academic research. When proposing a new technique for software maintenance, effective tool support is typically required to demonstrate the feasibility and effectiveness of the approach. However, even though tool building is pervasive and requires significant time and effort, it is still pursued in an ad hoc manner. In this paper, we address these issues by proposing a dedicated development process for tool building that takes the unique characteristics of an academic research environment into account. We first identify process requirements based on a review of the literature and our extensive tool building experience in the domain of maintenance tools. We then outline a process framework based on work products that accommodates the requirements while providing the flexibility needed to tailor the process to specific tool building approaches and project constraints. The work products are...

  20. Analytical Method Development & Validation for Related Substances Method of Busulfan Injection by Ion Chromatography Method

    Directory of Open Access Journals (Sweden)

    Rewaria S

    2013-05-01

    Full Text Available A new simple, accurate, precise and reproducible ion chromatography method has been developed for the estimation of methane sulfonic acid in Busulfan injectable dosage. The developed method has also been validated in complete compliance with the current regulatory guidelines, using well-established analytical method validation techniques and tools covering parameters such as linearity, LOD and LOQ determination, accuracy, method precision, specificity, system suitability, robustness and ruggedness. With the current method the linearity obtained is near 0.999, showing that the method is capable of giving a good detector response, and the calculated recovery was within the range of 85% to 115% of the specification limits.
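
The validation figures named above (linearity near 0.999, recovery within 85-115% of specification) reduce to simple calculations; a hedged sketch with invented calibration data:

```python
import math
import statistics

def linearity_r(x, y):
    """Pearson correlation coefficient of a calibration curve."""
    mx, my = statistics.mean(x), statistics.mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

def recovery_pct(found, added):
    """Percent recovery: amount found relative to amount added."""
    return 100.0 * found / added

# Illustrative concentration (x) and detector response (y) values
conc = [0.5, 1.0, 2.0, 4.0, 8.0]
resp = [10.2, 19.8, 40.5, 79.9, 160.3]
print(round(linearity_r(conc, resp), 4))  # near-unity for a linear response
print(recovery_pct(0.98, 1.0))            # checked against the 85-115% window
```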

  1. DEVELOPMENT OF SOLUBILITY PRODUCT VISUALIZATION TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    T.F. Turner; A.T. Pauli; J.F. Schabron

    2004-05-01

    Western Research Institute (WRI) has developed software for the visualization of data acquired from solubility tests. The work was performed in conjunction with AB Nynas Petroleum, Nynashamn, Sweden who participated as the corporate cosponsor for this Jointly Sponsored Research (JSR) task. Efforts in this project were split between software development and solubility test development. The Microsoft Windows-compatible software developed inputs up to three solubility data sets, calculates the parameters for six solid body types to fit the data, and interactively displays the results in three dimensions. Several infrared spectroscopy techniques have been examined for potential use in determining bitumen solubility in various solvents. Reflectance, time-averaged absorbance, and transmittance techniques were applied to bitumen samples in single and binary solvent systems. None of the techniques were found to have wide applicability.

  2. Auditing: A Tool for Institutional Development.

    Science.gov (United States)

    Moreland, Neil; Horsburgh, Rod

    1992-01-01

    Presents a format for auditing educational programs based on the key elements of efficiency, effectiveness, and economy. Describes a review of the audit format developed by the Britain's Further Education Unit. (SK)

  3. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING

    Science.gov (United States)

    LCA-based guidance was developed by EPA under the Framework for Responsible Environmental Decision Making (FRED) effort to demonstrate how to conduct a relative comparison between product types to determine environmental preferability. It identifies data collection needs and iss...

  4. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Science.gov (United States)

    2013-11-14

    ... guidance to FDA staff, industry, healthcare providers, researchers, and patient and consumer groups on a... HUMAN SERVICES Food and Drug Administration Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug Administration Staff; Availability AGENCY: Food and...

  5. ICL-Based OF-CEAS: A Sensitive Tool for Analytical Chemistry.

    Science.gov (United States)

    Manfred, Katherine M; Hunter, Katharine M; Ciaffoni, Luca; Ritchie, Grant A D

    2017-01-03

    Optical-feedback cavity-enhanced absorption spectroscopy (OF-CEAS) using mid-infrared interband cascade lasers (ICLs) is a sensitive technique for trace gas sensing. The setup of a V-shaped optical cavity operating with a 3.29 μm cw ICL is detailed, and a quantitative characterization of the injection efficiency, locking stability, mode matching, and detection sensitivity is presented. The experimental data are supported by a model to show how optical feedback affects the laser frequency as it is scanned across several longitudinal modes of the optical cavity. The model predicts that feedback enhancement effects under strongly absorbing conditions can cause underestimations in the measured absorption, and these predictions are verified experimentally. The technique is then used in application to the detection of nitrous oxide as an exemplar of the utility of this technique for analytical gas phase spectroscopy. The analytical performance of the spectrometer, expressed as noise equivalent absorption coefficient, was estimated as 4.9 × 10⁻⁹ cm⁻¹ Hz⁻¹/², which compares well with recently reported values.
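
The sensitivity figure above is quoted per root-hertz; by the standard scaling (not taken from the paper), the minimum detectable absorption coefficient grows with the square root of the detection bandwidth:

```python
import math

NEA = 4.9e-9  # noise equivalent absorption, cm^-1 Hz^-1/2, from the abstract

def min_detectable_absorption(nea, bandwidth_hz):
    """Minimum detectable absorption coefficient (cm^-1) at a given bandwidth."""
    return nea * math.sqrt(bandwidth_hz)

# e.g. ~1 Hz bandwidth (about 1 s averaging) vs a faster 100 Hz measurement
print(min_detectable_absorption(NEA, 1.0))
print(min_detectable_absorption(NEA, 100.0))
```

Slower averaging (narrower bandwidth) thus buys proportionally lower detectable absorption, up to the limit set by drift.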

  6. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    By applying the GREMI theories of “innovative milieux” (Aydalot, 1986; Camagni, 1991) to the case study, we will suggest some reasons for the benefits achieved by the Dogme-network compared to other networks. This analysis will point to the existence of an “innovative milieu” on sustainability within..., strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety...

  7. Tools for Nanotechnology Education Development Program

    Energy Technology Data Exchange (ETDEWEB)

    Dorothy Moore

    2010-09-27

    The overall focus of this project was the development of reusable, cost-effective educational modules for use with the table top scanning electron microscope (TTSEM). The goal of this project's outreach component was to increase students' exposure to the science and technology of nanoscience.

  8. Developing Multilateral Surveillance Tools in the EU

    NARCIS (Netherlands)

    Ruiter, de Rik

    2008-01-01

    The development of the infrastructure of the Open Method of Coordination (OMC) is an unaddressed topic in scholarly debates. On the basis of secondary literature on the European Employment Strategy, it is hypothesised that a conflict between an incentive and reluctance to act on the EU level on the

  9. The development of tool manufacture in humans: what helps young children make innovative tools?

    Science.gov (United States)

    Chappell, Jackie; Cutting, Nicola; Apperly, Ian A; Beck, Sarah R

    2013-11-19

    We know that even young children are proficient tool users, but until recently, little was known about how they make tools. Here, we will explore the concepts underlying tool making, and the kinds of information and putative cognitive abilities required for children to manufacture novel tools. We will review the evidence for novel tool manufacture from the comparative literature and present a growing body of data from children suggesting that innovation of the solution to a problem by making a tool is a much more challenging task than previously thought. Children's difficulty with these kinds of tasks does not seem to be explained by perseveration with unmodified tools, difficulty with switching to alternative strategies, task pragmatics or issues with permission. Rather, making novel tools (without having seen an example of the required tool within the context of the task) appears to be hard, because it is an example of an 'ill-structured problem'. In this type of ill-structured problem, the starting conditions and end goal are known, but the transformations and/or actions required to get from one to the other are not specified. We will discuss the implications of these findings for understanding the development of problem-solving in humans and other animals.

  10. A Multi-facetted Visual Analytics Tool for Exploratory Analysis of Human Brain and Function Datasets.

    Science.gov (United States)

    Angulo, Diego A; Schneider, Cyril; Oliver, James H; Charpak, Nathalie; Hernandez, Jose T

    2016-01-01

    Brain research typically requires large amounts of data from different sources, and often of different nature. The use of different software tools adapted to the nature of each data source can make research work cumbersome and time consuming. It follows that data is not often used to its fullest potential thus limiting exploratory analysis. This paper presents an ancillary software tool called BRAVIZ that integrates interactive visualization with real-time statistical analyses, facilitating access to multi-facetted neuroscience data and automating many cumbersome and error-prone tasks required to explore such data. Rather than relying on abstract numerical indicators, BRAVIZ emphasizes brain images as the main object of the analysis process of individuals or groups. BRAVIZ facilitates exploration of trends or relationships to gain an integrated view of the phenomena studied, thus motivating discovery of new hypotheses. A case study is presented that incorporates brain structure and function outcomes together with different types of clinical data.

  11. Drugs on the internet, part IV: Google's Ngram viewer analytic tool applied to drug literature.

    Science.gov (United States)

    Montagne, Michael; Morgan, Melissa

    2013-04-01

    Google Inc.'s digitized book library can be searched based on key words and phrases over a five-century time frame. Application of the Ngram Viewer to drug literature was assessed for its utility as a research tool. The results appear promising as a method for noting changes in the popularity of specific drugs over time, historical epidemiology of drug use and misuse, and adoption and regulation of drug technologies.

  12. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    Directory of Open Access Journals (Sweden)

    NAN Anca-Petruţa

    2013-07-01

    Full Text Available Because specialty agricultural assistance is not always available when farmers need it, we identified expert systems as a strong instrument with extended potential in agriculture. Their use has recently started to grow in scale, covering all socio-economic activity fields, with the role of collecting data on different aspects from human experts in order to assist the user in the steps necessary for solving problems at the performance level of the expert, making the expert's acquired knowledge and experience available. We opted for a general presentation of expert systems, as well as of their necessity, because the solution for developing the agricultural system can come from artificial intelligence: implementing expert systems in the field of agricultural insurance, promoting existing insurance products, and helping farmers find options depending on their necessities and possibilities. The objective of this article consists of collecting data about different aspects of specific areas of interest in agricultural insurance, preparing the database, and presenting a conceptual pilot version which will grow constantly richer depending on the answers received from agricultural producers, with the clearest possible exposure of the knowledge base. Picking this theme is justified by the fact that even though agricultural insurance plays a very important role in agricultural development, the results registered from it are modest, which is why solutions need to be found for developing the agricultural sector. The importance of this work consists in proposing an immediately viable solution corresponding to the current necessities of agricultural producers, and in proposing an innovative solution, namely the implementation of expert systems in agricultural insurance as a way of promoting insurance products. Our research, even though it treats the subject at a conceptual level, wants to undertake an

  13. Development of Machine Learning Tools in ROOT

    Science.gov (United States)

    Gleyzer, S. V.; Moneta, L.; Zapata, Omar A.

    2016-10-01

    ROOT is a framework for large-scale data analysis that provides basic and advanced statistical methods used by the LHC experiments. These include machine learning algorithms from the ROOT-integrated Toolkit for Multivariate Analysis (TMVA). We present several recent developments in TMVA, including a new modular design, new algorithms for variable importance and cross-validation, interfaces to other machine-learning software packages and integration of TMVA with Jupyter, making it accessible with a browser.
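
The cross-validation support mentioned above rests on the usual k-fold idea; since TMVA itself is a C++/ROOT toolkit, the following plain-Python splitter is only a sketch of that idea, not TMVA's API:

```python
def k_fold_indices(n, k):
    """Yield (train, test) index lists for k-fold cross-validation
    over a dataset of n samples."""
    folds = [list(range(i, n, k)) for i in range(k)]  # k interleaved folds
    for i in range(k):
        test = folds[i]
        train = [j for fold in folds[:i] + folds[i + 1:] for j in fold]
        yield sorted(train), sorted(test)

# Each sample lands in exactly one test fold across the k splits
for train, test in k_fold_indices(6, 3):
    print(test)
```

In practice each split trains a classifier on `train` and evaluates it on `test`, and the k scores are averaged to estimate generalisation performance.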

  14. Selection of reference standard during method development using the analytical hierarchy process.

    Science.gov (United States)

    Sun, Wan-yang; Tong, Ling; Li, Dong-xiang; Huang, Jing-yi; Zhou, Shui-ping; Sun, Henry; Bi, Kai-shun

    2015-03-25

    Reference standards are critical for ensuring reliable and accurate method performance. One important issue is how to select the ideal one from the alternatives. Unlike the optimization of parameters, the criteria for a reference standard are always immeasurable. The aim of this paper is to recommend a quantitative approach for the selection of reference standards during method development based on the analytical hierarchy process (AHP) as a decision-making tool. Six alternative single reference standards were assessed in the quantitative analysis of six phenolic acids from Salvia Miltiorrhiza and its preparations using ultra-performance liquid chromatography. The AHP model simultaneously considered six criteria related to reference standard characteristics and method performance: feasibility to obtain, abundance in samples, chemical stability, accuracy, precision and robustness. The priority of each alternative was calculated using the standard AHP analysis method. The results showed that protocatechuic aldehyde is the ideal reference standard, with rosmarinic acid, at about 79.8% of its priority, the second choice. The determination results successfully verified the evaluation ability of this model. The AHP allowed us to consider comprehensively the benefits and risks of the alternatives. It is an effective and practical tool for the optimization of reference standards during method development.
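
The AHP priority calculation referred to above can be sketched as follows; this uses the common row geometric-mean approximation to the principal eigenvector, and the pairwise comparison values are invented for illustration (the paper's actual judgments are not reproduced here):

```python
import math

def ahp_priorities(matrix):
    """Approximate AHP priority vector via row geometric means,
    normalised so the priorities sum to 1."""
    gmeans = [math.prod(row) ** (1.0 / len(row)) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Three hypothetical candidates compared pairwise on Saaty's scale
# (1 = equally preferred, 3 = moderately preferred, 5 = strongly preferred);
# entry [i][j] is how much candidate i is preferred over candidate j.
pairwise = [
    [1.0,   3.0,   5.0],
    [1/3.0, 1.0,   3.0],
    [1/5.0, 1/3.0, 1.0],
]
print([round(p, 3) for p in ahp_priorities(pairwise)])
```

In the full method, one such matrix is built per criterion, and the per-criterion priorities are combined with the criterion weights to rank the alternatives.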

  15. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  16. Developing Tool Support for Problem Diagrams with CPN and VDM++

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe ongoing work on the development of tool support for formal description of domains found in Problem Diagrams. The purpose of the tool is to handle the generation of a CPN model based on a collection of Problem Diagrams. The Problem Diagrams are used for representing the ...

  17. Evaluating IMU communication skills training programme: assessment tool development.

    Science.gov (United States)

    Yeap, R; Beevi, Z; Lukman, H

    2008-08-01

    This article describes the development of four assessment tools designed to evaluate the communication skills training (CST) programme at the International Medical University (IMU). The tools measure pre-clinical students' 1) perceived competency in basic interpersonal skills, 2) attitude towards patient-centred communication, 3) conceptual knowledge on doctor-patient communication, and 4) acceptance of the CST programme.

  18. Single-cell analysis tools for drug discovery and development.

    Science.gov (United States)

    Heath, James R; Ribas, Antoni; Mischel, Paul S

    2016-03-01

    The genetic, functional or compositional heterogeneity of healthy and diseased tissues presents major challenges in drug discovery and development. Such heterogeneity hinders the design of accurate disease models and can confound the interpretation of biomarker levels and of patient responses to specific therapies. The complex nature of virtually all tissues has motivated the development of tools for single-cell genomic, transcriptomic and multiplex proteomic analyses. Here, we review these tools and assess their advantages and limitations. Emerging applications of single cell analysis tools in drug discovery and development, particularly in the field of oncology, are discussed.

  19. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    Science.gov (United States)

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and to quantify the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria such as the number of PLS factors, R², and the root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP) enabled the selection of three model candidates. These models were tested in the industrial pilot plant during three technical campaigns. Results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content. Absolute biases of 0.01 and 0.02% (w/w), respectively, were achieved at methanol content levels of 0.10 and 0.13% (w/w). The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor in real time both the API and residual methanol contents, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be
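
The RMSEC/RMSECV/RMSEP criteria above are all the same root-mean-square error computed over different data splits (calibration, cross-validation, held-out prediction); a minimal sketch with illustrative values:

```python
import math

def rmse(predicted, reference):
    """Root mean square error between model predictions and reference values."""
    n = len(reference)
    return math.sqrt(sum((p - r) ** 2 for p, r in zip(predicted, reference)) / n)

# Illustrative NIR-predicted API contents vs chromatographic reference (% w/w)
pred = [9.8, 10.5, 11.1, 11.9]
ref = [10.0, 10.4, 11.0, 12.1]
print(round(rmse(pred, ref), 3))  # an RMSEP when computed on a prediction set
```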

  20. Can the analyte-triggered asymmetric autocatalytic Soai reaction serve as a universal analytical tool for measuring enantiopurity and assigning absolute configuration?

    Science.gov (United States)

    Welch, Christopher J; Zawatzky, Kerstin; Makarov, Alexey A; Fujiwara, Satoshi; Matsumoto, Arimasa; Soai, Kenso

    2016-12-20

    An investigation is reported on the use of the autocatalytic enantioselective Soai reaction, known to be influenced by the presence of a wide variety of chiral materials, as a generic tool for measuring the enantiopurity and absolute configuration of any substance. Good generality for the reaction across a small group of test analytes was observed, consistent with literature reports suggesting a diversity of compound types that can influence the stereochemical outcome of this reaction. Some trends in the absolute sense of stereochemical enrichment were noted, suggesting the possible utility of the approach for assigning absolute configuration to unknown compounds, by analogy to closely related species with known outcomes. Considerable variation was observed in the triggering strength of different enantiopure materials, an undesirable characteristic when dealing with mixtures containing minor impurities with strong triggering strength in the presence of major components with weak triggering strength. A strong tendency of the reaction toward an 'all or none' type of behavior makes the reaction most sensitive for detecting enantioenrichment close to zero. Consequently, the ability to discern modest from excellent enantioselectivity was relatively poor. While these properties limit the ability to obtain precise enantiopurity measurements in a simple single addition experiment, prospects may exist for more complex experimental setups that may potentially offer improved performance.

  1. Knowledge base development for SAM training tools

    Energy Technology Data Exchange (ETDEWEB)

    Jae, M.S.; Yoo, W.S.; Park, S. S.; Choi, H.K. [Hansung Univ., Seoul (Korea)

    2001-03-01

    Severe accident management can be defined as the use of existing and alternative resources, systems, and actions to prevent or mitigate a core-melt accident in nuclear power plants. TRAIN (Training pRogram for AMP In NPP), developed for training control room staff and the technical group, is introduced in this report. TRAIN comprises a phenomenological knowledge base (KB), an accident sequence KB, and accident management procedures with AM strategy control diagrams and information needs. TRAIN can contribute to training by providing phenomenological knowledge of severe accidents, an understanding of plant vulnerabilities, and practice in solving problems under high stress. 24 refs., 76 figs., 102 tabs. (Author)

  2. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    Although software usability has long been emphasized, there is a lot of software with poor usability. In usability engineering, usability professionals prescribe a classical usability approach to improving software usability. It is essential to prototype and usability-test user interfaces before... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data. We developed VisTool – a user interface and visualization development system – to simplify user interface development. VisTool allows user interface development without real programming. With VisTool a designer assembles visual objects (e.g. textboxes, ellipses, etc.) to visualize database contents. In VisTool, visual properties (e.g. color, position, etc.) can be formulas...

  3. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  4. Nano-Scale Secondary Ion Mass Spectrometry - A new analytical tool in biogeochemistry and soil ecology

    Energy Technology Data Exchange (ETDEWEB)

    Herrmann, A M; Ritz, K; Nunan, N; Clode, P L; Pett-Ridge, J; Kilburn, M R; Murphy, D V; O' Donnell, A G; Stockdale, E A

    2006-10-18

    Soils are structurally heterogeneous across a wide range of spatio-temporal scales. Consequently, external environmental conditions do not have a uniform effect throughout the soil, resulting in a large diversity of micro-habitats. It has been suggested that soil function can be studied without explicit consideration of such fine detail, but recent research has indicated that the micro-scale distribution of organisms may be of importance for a mechanistic understanding of many soil functions. Due to a lack of techniques with adequate sensitivity for data collection at appropriate scales, the question 'How important are various soil processes acting at different scales for ecological function?' is challenging to answer. The nano-scale secondary ion mass spectrometer (NanoSIMS) represents the latest generation of ion microprobes, which link high-resolution microscopy with isotopic analysis. The main advantage of NanoSIMS over other secondary ion mass spectrometers is the ability to operate at high mass resolution whilst maintaining both excellent signal transmission and spatial resolution (~50 nm). NanoSIMS has been used previously in studies focusing on presolar materials from meteorites, in material science, biology, geology and mineralogy. Recently, the potential of NanoSIMS as a new tool in the study of biophysical interfaces in soils has been demonstrated. This paper describes the principles of NanoSIMS and discusses the potential of this tool to contribute to the fields of biogeochemistry and soil ecology. Practical considerations (sample size and preparation, simultaneous collection of isotopes, mass resolution, isobaric interference and quantification of the isotopes of interest) are discussed. Adequate sample preparation, which avoids biases in the interpretation of NanoSIMS data due to artifacts, and the identification of regions of interest are the main concerns in using NanoSIMS as a new tool in biogeochemistry and soil ecology. Finally, we review

  5. Newspaper Reading among College Students in Development of Their Analytical Ability

    Science.gov (United States)

    Kumar, Dinesh

    2009-01-01

    The study investigated the newspaper reading among college students in development of their analytical ability. Newspapers are one of the few sources of information that are comprehensive, interconnected and offered in one format. The main objective of the study was to find out the development of the analytical ability among college students by…

  6. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…
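The pairwise-comparison step of AHP mentioned above can be sketched briefly. This is a minimal illustration of the common row geometric-mean approximation for deriving criteria weights, not the authors' model; the three-criterion matrix is hypothetical.

```python
import math

def ahp_weights(pairwise):
    """Approximate AHP priority weights via the row geometric-mean method."""
    n = len(pairwise)
    gm = [math.prod(row) ** (1.0 / n) for row in pairwise]  # row geometric means
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical reciprocal comparison matrix for three criteria (Saaty 1-9 scale)
M = [
    [1,   3,   5],
    [1/3, 1,   2],
    [1/5, 1/2, 1],
]
weights = ahp_weights(M)  # normalized priorities, one per criterion
```

The weights sum to 1, and a criterion judged consistently more important than the others receives the largest weight; full AHP additionally checks the consistency ratio of the matrix.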

  7. Development of a Test to Evaluate Students' Analytical Thinking Based on Fact versus Opinion Differentiation

    Science.gov (United States)

    Thaneerananon, Taveep; Triampo, Wannapong; Nokkaew, Artorn

    2016-01-01

    Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of the students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in 2014…

  8. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Full Text Available Introduction: Leadership in curriculum development is increasingly important as we look for ways to improve our programmes and practices, yet in curriculum studies leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the postgraduate curriculum leadership process, this study aimed to design such a tool and determine its validity and reliability. Method: This is a methodological study. First, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 important criteria were developed. A panel of experts confirmed content and face validity. Reliability was determined in a descriptive study of 30 faculty members from two universities in Isfahan and was estimated by internal consistency. The data were analyzed with SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: Based on the definition of curriculum leadership, the domains and items of the tool were determined and a primary tool was developed. Faculty experts' views were used in the different stages of development and psychometric evaluation. The tool's internal consistency (Cronbach's alpha) was 0.965, and was also determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us respond better to the changing needs of our students, teachers and schools of tomorrow.

  9. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szőkefalvi-Nagy, Z. [Wigner Research Centre for Physics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded even over long distances. Elemental concentration ratios can help determine the geological provenance of obsidian and identify prehistoric long-range trade routes and possible population migrations, since each geological source has its own 'fingerprints'. In this work the external milli-PIXE technique was applied to determine elemental concentration ratios in some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and in a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west - Oradea (near the border with Hungary, Slovakia and Ukraine), centre - Cluj, and south-west - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, which replaced a Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the former Yugoslavia or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains - in the case of obsidian. As provenance 'fingerprints', we focused on Ti-to-Mn and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5 MV VdG accelerator of the Wigner RCP, using a proton energy of 3 MeV.
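The 'fingerprint' logic the authors describe, matching an artifact's trace-element ratios to the nearest candidate geological source, can be sketched in a few lines. The ratio signatures below are invented placeholders for illustration, not measured values from this study.

```python
import math

# Hypothetical (Ti/Mn, Rb/Sr) ratio signatures for candidate sources;
# real provenance work uses measured values and more elements (Y, Zr, ...).
SOURCES = {
    "Tokaj":   (1.8, 0.55),
    "Lipari":  (1.2, 0.40),
    "Armenia": (0.9, 0.70),
}

def closest_source(sample_ratios, sources=SOURCES):
    """Assign an artifact to the source with the nearest ratio signature
    (Euclidean distance in ratio space)."""
    return min(sources, key=lambda name: math.dist(sample_ratios, sources[name]))
```

With measured ratios for an artifact, `closest_source((1.75, 0.50))` would attribute it to the hypothetical Tokaj signature, since its distance in ratio space is smallest there.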

  10. Psychometric properties of a Mental Health Team Development Audit Tool.

    LENUS (Irish Health Repository)

    Roncalli, Silvia

    2013-02-01

    To assist in improving team working in Community Mental Health Teams (CMHTs), the Mental Health Commission formulated a user-friendly but yet-to-be validated 25-item Mental Health Team Development Audit Tool (MHDAT).

  11. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    Science.gov (United States)

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized regarding fiber optic and imaging read-out systems, and different referencing schemes were critically discussed on the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities were addressed, being enabled by the use of MOSePs. They can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity or can be applied to form fixed sensor spots inside microfluidic structures, which would be inaccessible to integration of sensor layers.

  12. THE ANALYTICAL TOOLS AS TERRITORY IN GEOGRAPHY EDUCATION: the control devices and the production of multi/transterritorialities

    Directory of Open Access Journals (Sweden)

    Marcos Mondardo

    2015-06-01

    Full Text Available Education uses control devices. In the student's training process, slogans and market-integration mechanisms attempt to control the student's revolutionary force, resistance and revolt, as a condition for commanding sociability. For Gilles Deleuze and Félix Guattari, education is an essentially political act. Inspired by these authors, our intention in this essay is to develop an exercise in conceptual displacement: to shift the conceptual locus of 'minor literature' in order to think a 'minor Geography' by means of an analytical tool, the concept of territory, as a form of subversion of the very tradition of this disciplinary field, considering the emergence of new movements of deterritorialization in education. With this we intend to discuss the emergence of lines of flight in the educational process and to recognize the movements of struggle and resistance politically committed to subversive values and practices through the production of multi/transterritorialities.

  13. Advanced organic analysis and analytical methods development: FY 1995 progress report. Waste Tank Organic Safety Program

    Energy Technology Data Exchange (ETDEWEB)

    Wahl, K.L.; Campbell, J.A.; Clauss, S.A. [and others]

    1995-09-01

    This report describes the work performed during FY 1995 by Pacific Northwest Laboratory in developing and optimizing analysis techniques for identifying organics present in Hanford waste tanks. The main focus was to provide a means for rapidly obtaining the most useful information concerning the organics present in tank waste, with minimal sample handling and with minimal waste generation. One major focus has been to optimize analytical methods for organic speciation. Select methods, such as atmospheric pressure chemical ionization mass spectrometry and matrix-assisted laser desorption/ionization mass spectrometry, were developed to increase the speciation capabilities, while minimizing sample handling. A capillary electrophoresis method was developed to improve separation capabilities while minimizing additional waste generation. In addition, considerable emphasis has been placed on developing a rapid screening tool, based on Raman and infrared spectroscopy, for determining organic functional group content when complete organic speciation is not required. This capability would allow for a cost-effective means to screen the waste tanks to identify tanks that require more specialized and complete organic speciation to determine tank safety.

  14. Analytical validation of accelerator mass spectrometry for pharmaceutical development.

    Science.gov (United States)

    Keck, Bradly D; Ognibene, Ted; Vogel, John S

    2010-03-01

    The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the (14)C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the (14)C label), stable across sample storage conditions for at least 1 year, and linear over four orders of magnitude, with an analytical range from 0.1 Modern to at least 2000 Modern (instrument specific). Furthermore, accuracy was excellent (between 1 and 3%), while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of (14)C, respectively (which can be expressed as compound equivalents); for a typical small molecule labeled with (14)C at 10% incorporation, this corresponds to 30 fg equivalents. Accelerator mass spectrometry provides a sensitive, accurate and precise method of measuring drug compounds in biological matrices.

  15. Electrochemical treatment of olive mill wastewater: Treatment extent and effluent phenolic compounds monitoring using some uncommon analytical tools

    Institute of Scientific and Technical Information of China (English)

    Chokri Belaid; Moncef Khadraoui; Salma Mseddi; Monem Kallel; Boubaker Elleuch; Jean Francois Fauvarque

    2013-01-01

    Problems related to industrial effluents can be divided into two parts: (1) their toxicity, associated with a chemical content that should be removed before discharging the wastewater into the receiving media; and (2) the difficulty of characterising and monitoring the pollution, caused by the complexity of these matrices. This investigation deals with both aspects: an electrochemical treatment method for olive mill wastewater (OMW) using platinized expanded titanium electrodes in a modified Grignard reactor for toxicity removal, as well as the exploration of some specific analytical tools to monitor the elimination of phenolic compounds from the effluent. The results showed that electrochemical oxidation is able to remove or mitigate OMW pollution. Indeed, 87% of the OMW colour was removed and all aromatic compounds disappeared from the solution by anodic oxidation. Moreover, the chemical oxygen demand (COD) and the total organic carbon (TOC) were reduced by 55%. In addition, UV-visible spectrophotometry, gas chromatography/mass spectrometry, cyclic voltammetry and 13C nuclear magnetic resonance (NMR) showed that the treatment efficiently eliminates phenolic compounds from OMW. It was concluded that electrochemical oxidation in a modified Grignard reactor is a promising process for the destruction of all phenolic compounds present in OMW. Among the monitoring analytical tools applied, cyclic voltammetry and 13C NMR are introduced here for the first time to follow the progress of OMW treatment, and gave close insight into polyphenol disappearance.

  16. Towards a green analytical laboratory: microextraction techniques as a useful tool for the monitoring of polluted soils

    Science.gov (United States)

    Lopez-Garcia, Ignacio; Viñas, Pilar; Campillo, Natalia; Hernandez Cordoba, Manuel; Perez Sirvent, Carmen

    2016-04-01

    Microextraction techniques are a valuable tool in the analytical laboratory, since they allow sensitive measurements of pollutants to be carried out with easily available instrumentation. A large number of such procedures involve miniaturized liquid-liquid or liquid-solid extractions, with the common denominator of using very low amounts (only a few microliters) of organic solvents, or even none. Since minimal amounts of reagents are involved, and the generation of residues is consequently minimized, the approach falls within the concept of Green Analytical Chemistry. This general methodology is useful for both inorganic and organic pollutants. Thus, low amounts of metallic ions can be measured without the need for ICP-MS, since this instrument can be replaced by a simple AAS spectrometer, which is commonly present in any laboratory and involves low acquisition and maintenance costs. When dealing with organic pollutants, the microextracts obtained can be introduced into liquid or gas chromatographs equipped with common detectors, and there is no need for the most sophisticated and expensive mass spectrometers. This communication reports an overview of the advantages of this methodology and gives examples of the determination of some particular contaminants in soil and water samples. The authors are grateful to the Comunidad Autónoma de la Región de Murcia, Spain (Fundación Séneca, 19888/GERM/15) for financial support

  17. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by a Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue, and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be practically applicable because of the relevance of the results.
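The RGB-channel step can be illustrated with a short sketch: average the colour channels inside a region of interest of the photographed strip and use the channel intensities as the analytical signal. The pixel layout and the synthetic 'indigo' patch below are assumptions for illustration, not the paper's actual processing pipeline.

```python
def mean_rgb(image, roi):
    """Mean (R, G, B) inside a rectangular region of interest.

    image: list of rows, each row a list of (r, g, b) tuples
           (e.g., pixels decoded from a smartphone photo)
    roi:   (row0, row1, col0, col1) slice bounds
    """
    r0, r1, c0, c1 = roi
    pixels = [px for row in image[r0:r1] for px in row[c0:c1]]
    n = len(pixels)
    return tuple(sum(px[ch] for px in pixels) / n for ch in range(3))

# Synthetic strip image: white background with an "indigo" reaction zone
WHITE, INDIGO = (255, 255, 255), (40, 40, 200)
image = [[INDIGO if 20 <= r < 80 and 20 <= c < 80 else WHITE
          for c in range(100)] for r in range(100)]
r_mean, g_mean, b_mean = mean_rgb(image, (30, 70, 30, 70))
```

In an assay of this kind, the blue-channel intensity in the reaction zone would then be calibrated against enzyme activity measured by a reference method.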

  18. Atomic force microscopy as analytical tool to study physico-mechanical properties of intestinal cells

    Directory of Open Access Journals (Sweden)

    Christa Schimpel

    2015-07-01

    Full Text Available The small intestine is a complex system that carries out various functions. The main function of enterocytes is absorption of nutrients, whereas membranous cells (M cells) are responsible for delivering antigens/foreign substances to the mucosal lymphoid tissues. However, to gain a fundamental understanding of how cellular structures contribute to physiological processes, precise knowledge of surface morphologies, cytoskeleton organization and biomechanical properties is necessary. Atomic force microscopy (AFM) was used here as a powerful tool to study the surface topographies of Caco-2 cells and M cells. Furthermore, cell elasticity (i.e., the mechanical response of a cell to tip indentation) was elucidated by force curve measurements. Besides elasticity, adhesion was evaluated by recording the attraction and repulsion forces between the tip and the cell surface. The organization of F-actin networks was investigated via phalloidin labeling, and visualization was performed with confocal laser scanning fluorescence microscopy (CLSM) and scanning electron microscopy (SEM). The results of these various experimental techniques revealed significant differences in the cytoskeleton/microvilli arrangements and F-actin organization. Caco-2 cells displayed densely packed F-actin bundles covering the entire cell surface, indicating the formation of a well-differentiated brush border. In contrast, in M cells actin was arranged as short and/or truncated thin villi, present only at the cell edge. The elasticity of M cells was 1.7-fold higher than that of Caco-2 cells and increased significantly from the cell periphery to the nuclear region. Since elasticity can be directly linked to cell adhesion, M cells showed higher adhesion forces than Caco-2 cells. The combination of distinct experimental techniques shows that morphological differences between Caco-2 cells and M cells correlate with mechanical cell properties and provide useful information to understand

  19. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  20. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  1. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    for sustainable buildings, as well as, an analysis of the relationship between the different approaches (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture etc.) to sustainable building design and these indicators. The paper furthermore discusses how sustainable...... architecture will gain more focus in the coming years, thus, establishing the need for the development of a new tool and methodology, The paper furthermore describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers...... the context that the building is located in, as well as, a tool which facilitates the discussion of which type of sustainability is achieved in specific projects....

  2. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper;

    2003-01-01

    for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific......Experimental modelling is an important tool for study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied...... hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results....

  3. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    Startups play a crucial role in exploiting the commercial advantages created by new, advanced technologies. Surprisingly, the processes by which the entrepreneur commercialises these technologies are largely undescribed - partly due to the absence of appropriate process data capture tools....... This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....

  4. Improved chiral SFC screening for analytical method development.

    Science.gov (United States)

    Schafer, Wes; Chandrasekaran, Tilak; Pirzada, Zainab; Zhang, Chaowei; Gong, Xiaoyi; Biba, Mirlinda; Regalado, Erik L; Welch, Christopher J

    2013-11-01

    In this study we describe the evaluation of a recently developed supercritical fluid chromatography (SFC) instrument for automated chiral SFC method development. The greatly improved gradient dwell volume and liquid flow control of the new instrument in combination with the use of shorter columns containing smaller stationary phase particles affords chiral SFC method development that is faster and more universal than previous systems.

  5. Tools for Large-Scale Data Analytic Examination of Relational and Epistemic Networks in Engineering Education

    Science.gov (United States)

    Madhavan, Krishna; Johri, Aditya; Xian, Hanjun; Wang, G. Alan; Liu, Xiaomo

    2014-01-01

    The proliferation of digital information technologies and related infrastructure has given rise to novel ways of capturing, storing and analyzing data. In this paper, we describe the research and development of an information system called Interactive Knowledge Networks for Engineering Education Research (iKNEER). This system utilizes a framework…

  6. Development of a sustainability assessment tool for office buildings

    OpenAIRE

    Barbosa, José Amarilio; Mateus, Ricardo; Bragança, L.

    2012-01-01

    The few available sustainability assessment tools applicable in Portugal are oriented for residential buildings. Nevertheless, the impacts of office buildings have been rising mainly due to an increase in the energy consumption for cooling and heating. This way, due to the growing environmental impact of office buildings, the development of Build-ing Sustainability Assessment (BSA) tools to assess the sustainability of this type of buildings is necessary and important to guide and to boost th...

  7. Extensions of the Johnson-Neyman Technique to Linear Models with Curvilinear Effects: Derivations and Analytical Tools

    Science.gov (United States)

    Miller, Jason W.; Stromeyer, William R.; Schwieterman, Matthew A.

    2013-01-01

    The past decade has witnessed renewed interest in the use of the Johnson-Neyman (J-N) technique for calculating the regions of significance for the simple slope of a focal predictor on an outcome variable across the range of a second, continuous independent variable. Although tools have been developed to apply this technique to probe 2- and 3-way…
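For the linear (non-curvilinear) case, the J-N boundaries have a closed form: the simple slope b1 + b3*z is significant where (b1 + b3*z)^2 exceeds the squared critical t times its variance, which reduces to a quadratic in z. The sketch below shows that textbook case only, with made-up coefficient values; the paper's extensions to curvilinear effects are not reproduced here.

```python
import math

def johnson_neyman(b1, b3, var_b1, var_b3, cov_b13, t_crit):
    """Boundaries of the region(s) where the simple slope b1 + b3*z of a
    focal predictor differs significantly from zero (roots of a quadratic)."""
    a = b3 ** 2 - t_crit ** 2 * var_b3
    b = 2 * (b1 * b3 - t_crit ** 2 * cov_b13)
    c = b1 ** 2 - t_crit ** 2 * var_b1
    disc = b ** 2 - 4 * a * c
    if disc < 0:
        return None  # significance of the slope never changes over z
    root = math.sqrt(disc)
    return sorted(((-b - root) / (2 * a), (-b + root) / (2 * a)))

# Made-up regression estimates, purely for illustration
bounds = johnson_neyman(b1=0.5, b3=0.3, var_b1=0.04, var_b3=0.01,
                        cov_b13=0.0, t_crit=2.0)
```

The two returned z-values bracket the range of the moderator where the slope's t-statistic crosses the critical value; in software practice t_crit comes from the t distribution with the model's residual degrees of freedom.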

  8. ISS Biotechnology Facility - Overview of Analytical Tools for Cellular Biotechnology Investigations

    Science.gov (United States)

    Jeevarajan, A. S.; Towe, B. C.; Anderson, M. M.; Gonda, S. R.; Pellis, N. R.

    2001-01-01

    The ISS Biotechnology Facility (BTF) platform provides scientists with a unique opportunity to carry out diverse experiments in a microgravity environment for an extended period of time. Although considerable progress has been made in preserving cells on the ISS for long periods for later return to Earth, future biotechnology experiments would desirably monitor, process, and analyze cells in a timely way on-orbit. One aspect of our work has been directed towards developing biochemical sensors for pH, glucose, oxygen, and carbon dioxide for a perfused bioreactor system developed at Johnson Space Center. Another aspect is the examination and identification of new and advanced commercial biotechnologies that may have applications to on-orbit experiments.

  9. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  10. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Dylan [Princeton Univ., NJ (United States); Miller, G. P. [Univ. of Tulsa, Tulsa, OK (United States)

    2016-10-03

    This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner-layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  11. TENTube: A video-based connection tool supporting competence development

    NARCIS (Netherlands)

    Angehrn, Albert; Maxwell, Katrina

    2008-01-01

    Angehrn, A. A., & Maxwell, K. (2008). TENTube: A video-based connection tool supporting competence development. In H. W. Sligte & R. Koper (Eds.), Proceedings of the 4th TENCompetence Open Workshop. Empowering Learners for Lifelong Competence Development: pedagogical, organisational and technological…

  12. Narrative Inquiry: Research Tool and Medium for Professional Development.

    Science.gov (United States)

    Conle, Carola

    2000-01-01

    Describes the development of narrative inquiry, highlighting one institutional setting, and discussing how narrative inquiry moved from being a research tool to a vehicle for curriculum within both graduate and preservice teacher development. After discussing theoretical resources for narrative inquiry, the paper examines criteria and terms…

  13. Computer-based phosphoric acid fuel cell analytical tools Descriptions and usages

    Science.gov (United States)

    Lu, C.; Presler, A. F.

    1987-01-01

    Simulation models have been developed for the prediction of phosphoric acid fuel cell (PAFC) powerplant system performance under both transient and steady operating conditions, as well as for the design of component configurations and for optimal systems synthesis. These computer-implemented models comprise an engineering model and a system model: the former is solved by the finite difference method to determine the balances and properties of the different sections, while the latter uses thermodynamic balances to set up algebraic equations that yield the physical and chemical properties of the stream for one operating condition.
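The finite-difference idea behind such an engineering model can be illustrated with a minimal sketch: a steady one-dimensional energy balance across a cell section with fixed end temperatures. The grid size, boundary temperatures and source term below are hypothetical placeholders, not values from the PAFC models.

```python
# Jacobi-style sweep of the discretized balance T'' = -q/k on a unit
# interval; interior nodes relax toward the steady-state profile.
def solve_profile(n=11, T_left=180.0, T_right=200.0, q_over_k=0.0,
                  iters=5000):
    dx = 1.0 / (n - 1)
    # Start from a linear guess between the boundary temperatures.
    T = [T_left + (T_right - T_left) * i / (n - 1) for i in range(n)]
    for _ in range(iters):
        for i in range(1, n - 1):
            # Central-difference update for node i.
            T[i] = 0.5 * (T[i - 1] + T[i + 1] + q_over_k * dx * dx)
    return T

profile = solve_profile()
```

With a zero source term the converged profile is simply linear between the two boundary temperatures; a nonzero q/k bows the profile, which is the kind of sectional balance a finite-difference engineering model resolves.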

  14. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    Full Text Available This paper presents the evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of the IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, an addition to this methodology is proposed in this work: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installation of an automatic lathe) and CF (installation of an automatic milling machine). The results are presented as a graph showing zones in which a particular alternative should be selected. The results show the potential of IDA to tackle technical decisions that are otherwise approached without due care.
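The role of the utility function in such a model can be sketched as follows. The outcome distributions and the exponential (risk-averse) utility below are invented for illustration and are not taken from the study.

```python
import math

# Exponential utility encodes a risk-averse attitude; rho is the
# (hypothetical) risk tolerance of the decision maker.
def utility(x, rho=50.0):
    return 1 - math.exp(-x / rho)

def expected_utility(outcomes):
    # outcomes: list of (probability, payoff) pairs.
    return sum(p * utility(x) for p, x in outcomes)

# Invented payoff lotteries for the three alternatives.
alternatives = {
    "Base": [(1.0, 20.0)],                # certain, modest payoff
    "CNC":  [(0.6, 60.0), (0.4, -10.0)],  # risky upside
    "CF":   [(0.5, 45.0), (0.5, 5.0)],
}
best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
```

With a risk-neutral decision maker (utility(x) = x) the highest expected value here is CNC, while the risk-averse utility prefers CF; that sensitivity to risk attitude is exactly why the utility function is modelled explicitly.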

  15. Modelling turbulent boundary layer flow over fractal-like multiscale terrain using large-eddy simulations and analytical tools.

    Science.gov (United States)

    Yang, X I A; Meneveau, C

    2017-04-13

    In recent years, there has been growing interest in large-eddy simulation (LES) modelling of atmospheric boundary layers interacting with arrays of wind turbines on complex terrain. However, such terrain typically contains geometric features and roughness elements reaching down to small scales that cannot be resolved numerically. Thus subgrid-scale models for the unresolved features of the bottom roughness are needed for LES. Such knowledge is also required to model the effects of the ground surface 'underneath' a wind farm. Here we adapt a dynamic approach to determine subgrid-scale roughness parametrizations and apply it to the case of rough surfaces composed of cuboidal elements with broad size distributions, containing many scales. We first investigate the flow response to ground roughness of a few scales. LES with the dynamic roughness model, which accounts for the drag of unresolved roughness, is shown to provide resolution-independent results for the mean velocity distribution. Moreover, we develop an analytical roughness model that accounts for the sheltering effects of large-scale roughness elements on small-scale ones. Taking into account this sheltering effect, constraints from fundamental conservation laws, and assumptions of geometric self-similarity, the analytical roughness model is shown to provide predictions that agree well with roughness parameters determined from LES. This article is part of the themed issue 'Wind energy in complex terrains'.
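The roughness parameters in question enter the mean flow through the standard rough-wall logarithmic law. A textbook sketch, with a made-up friction velocity and effective roughness length z0 (not values from the paper), is:

```python
import math

# Rough-wall log law: U(z) = (u*/kappa) * ln(z / z0), where z0 is the
# effective roughness length that a sheltering-type model would supply.
def mean_velocity(z, u_star, z0, kappa=0.4):
    return (u_star / kappa) * math.log(z / z0)

# Illustrative values: u* = 0.45 m/s, z0 = 0.1 m, heights in metres.
profile = [(z, mean_velocity(z, 0.45, 0.1)) for z in (1, 10, 100)]
```

Each decade in height adds the same velocity increment, which is the logarithmic behaviour that roughness parametrizations are fitted against.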

  16. Analytical tools for the study of cellular glycosylation in the immune system

    Directory of Open Access Journals (Sweden)

    Yvette van Kooyk

    2013-12-01

    Full Text Available It is becoming increasingly clear that glycosylation plays an important role in intercellular communication within the immune system. Glycosylation-dependent interactions are crucial for the innate and adaptive immune system and regulate immune cell trafficking, synapse formation, activation, and survival. These functions take place through the cis or trans interaction of lectins with glycans. Classical immunological and biochemical methods have been used for the study of lectin function; however, the investigation of their counterparts, glycans, requires very specialized methodologies that have been extensively developed over the past decade within the glycobiology community. This Mini-Review summarizes the available technology for the study of glycan biosynthesis, its regulation, and its characterization, as applied to the study of glycans in immunology.

  17. Implementing WAI Authoring Tool Accessibility Guidelines in Developing Adaptive Elearning

    Directory of Open Access Journals (Sweden)

    Mahieddine Djoudi

    2012-09-01

    Full Text Available Adaptive learning technology allows for the development of more personalized online learning experiences with materials that adapt to student performance and skill level. The term "adaptive" is also used to describe Assistive Technologies that make online courses usable by learners with disabilities and special needs. Authoring tools can enable, encourage, and assist authors in the creation of elearning content. Because most Web-based adaptive learning content is created using authoring tools, these tools should be accessible to authors regardless of disability, and they should support and encourage authors in creating accessible elearning content. This paper presents an authoring tool designed for developing accessible adaptive elearning. The authoring tool, dedicated to Algerian universities, is designed to satisfy the W3C/WAI Authoring Tool Accessibility Guidelines (ATAG), and to allow collaboration among teachers when building elearning courses. After presenting the W3C/WAI accessibility guidelines, the collaborative authoring tool is outlined.

  18. Filmes de metal-hexacianoferrato: uma ferramenta em química analítica Metal-hexacyanoferrate films: a tool in analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Ivanildo Luiz de Mattos

    2001-04-01

    Full Text Available Chemically modified electrodes based on hexacyanometalate films are presented as a tool in analytical chemistry. The use of amperometric sensors and/or biosensors based on metal-hexacyanoferrate films is a growing trend. This article reviews applications of these films for the analytical determination of both inorganic (e.g. As³⁺, S₂O₃²⁻) and organic (e.g. cysteine, hydrazine, ascorbic acid, glutathione, glucose) compounds.

  19. Development of a Safety Management Web Tool for Horse Stables

    Directory of Open Access Journals (Sweden)

    Jarkko Leppälä

    2015-11-01

    Full Text Available Managing a horse stable involves risks, which can have serious consequences for the stable, employees, clients, visitors and horses. Existing industrial or farm production risk management tools are not directly applicable to horse stables and they need to be adapted for use by managers of different types of stables. As a part of the InnoEquine project, an innovative web tool, InnoHorse, was developed to support horse stable managers in business, safety, pasture and manure management. A literature review, empirical horse stable case studies, expert panel workshops and stakeholder interviews were carried out to support the design. The InnoHorse web tool includes a safety section containing a horse stable safety map, stable safety checklists, and examples of good practices in stable safety, horse handling and rescue planning. This new horse stable safety management tool can also help in organizing work processes in horse stables in general.

  20. Development and psychometric testing of the nursing culture assessment tool.

    Science.gov (United States)

    Kennerly, Susan M; Yap, Tracey L; Hemmings, Annette; Beckett, Gulbahar; Schafer, John C; Borchers, Andrea

    2012-11-01

    A valid and reliable nursing culture assessment tool aimed at capturing general aspects of nursing culture is needed for use in health care settings to assess, and then reshape, troubled areas of the nursing culture. This article summarizes the Nursing Culture Assessment Tool's (NCAT) development and reports on a cross-sectional, exploratory investigation of its psychometric properties. The research aims were to test the tool's psychometric properties; discover its dimensionality; and refine the item structure to best represent the construct of nursing culture, an occupational subset of organizational culture. Empirical construct validity was tested using a sample of licensed nurses and nursing assistants (n = 340). Exploratory and confirmatory factor analysis (CFA) and logistic regression yielded a 6-factor, 19-item solution. Evidence supports the tool's validity for assessing nursing culture as a basis for shaping the culture into one that supports change, thereby accelerating, improving, and advancing nursing best practices and care outcomes.
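The abstract reports factor-analytic validation; a standard companion statistic for each extracted factor (not named in the abstract) is internal-consistency reliability, Cronbach's alpha. A minimal sketch with made-up item scores, since the NCAT data are not public:

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances)/variance(totals)).
def cronbach_alpha(items):
    # items: one score list per item, all rating the same respondents.
    k = len(items)
    n = len(items[0])
    def var(xs):  # sample variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    totals = [sum(item[j] for item in items) for j in range(n)]
    return k / (k - 1) * (1 - sum(var(i) for i in items) / var(totals))

# Invented 5-point ratings: 3 items, 6 respondents.
items = [
    [4, 5, 3, 4, 5, 2],
    [4, 4, 3, 5, 5, 2],
    [5, 5, 2, 4, 4, 3],
]
alpha = cronbach_alpha(items)
```

Values above roughly 0.7 are conventionally read as acceptable internal consistency for a factor's items.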

  1. Technology developments in biological tools for targeted genome surgery.

    Science.gov (United States)

    Teimourian, Shahram; Abdollahzadeh, Rasoul

    2015-01-01

    Different biological tools for targeted genome engineering have recently appeared, including meganucleases, zinc-finger nucleases, and newer technologies such as TALENs and CRISPR/Cas systems. Transcription activator-like effector nucleases (TALENs) have greatly improved genome-editing efficiency by making site-specific DNA double-strand breaks. Several studies have shown the prominence of TALENs in comparison with meganucleases and zinc-finger nucleases. The most important feature that makes TALENs suitable tools for targeted genome editing is the modularity of their central repeat domains, meaning that they can be designed to recognize any desired DNA sequence. In this review, we present a comprehensive and concise description of developments in TALEN technology for targeted genome surgery, with a to-the-point description and comparison of the other tools.

  2. FLOW CYTOMETRY AS A MODERN ANALYTICAL TOOL IN BIOLOGY AND MEDICINE

    Directory of Open Access Journals (Sweden)

    S. V. Khaidukov

    2007-01-01

    Full Text Available Abstract. Flow cytometry is a modern technology for fast measurement of the characteristics of cells, their organelles, and the processes occurring within them. It is regarded as an efficient solution in many important areas of cell biology, immunology and cellular engineering. The present article covers the main developments in flow cytometry and their applications in medical and biological practice. Modern fluorescent dyes, progress in laser and computer technologies, and powerful software have resulted in wide application of this technique in medical practice. Accordingly, the use of monoclonal antibodies conjugated to different fluorochromes has enabled multiparametric analysis and substantially simplified specialized work aimed at the diagnostics of various immune disorders. New directions in flow cytometry, e.g. flow cytoenzymology, provide wide opportunities for detailed identification of damaged or altered cells and for taking adequate decisions in the treatment of detected pathological changes. The authors suggest that this article could initiate a series of publications concerning the use of this technology and its modern applications in broad laboratory practice.

  3. Integration of Environmental Analytical Chemistry with Environmental Law: The Development of a Problem-Based Laboratory.

    Science.gov (United States)

    Cancilla, Devon A.

    2001-01-01

    Introduces an undergraduate level problem-based analytical chemistry laboratory course integrated with an environmental law course. Aims to develop an understanding among students on the use of environmental indicators for environmental evaluation. (Contains 30 references.) (YDS)

  4. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

    This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by freeze-drying. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of Middle Pomerania in the northern part of Poland can easily be observed on the resulting micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for the search for internal standard (IS) substances. Generally, the described methodology can be applied for fast fractionation or screening of the

  5. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    DEFF Research Database (Denmark)

    Kaspersen, Per Skougaard; Halsnæs, Kirsten; Gregg, Jay Sterling

    . The project is one of seven initiatives proposed by KFT for 2012. The methodology report includes definitions of major concepts, an outline of an analytical structure, a presentation of models and their applicability, and the results of case studies. The work presented in this report draws on intensive...... Council. The flood hazard maps presented in this report constitute the first preliminary results of on-going methodological and analysis development in mapping potential impacts in relation to flooding from extreme precipitation in the city of Aarhus. For all purposes the Aarhus flood maps presented...

  6. The South African dysphagia screening tool (SADS): A screening tool for a developing context

    Directory of Open Access Journals (Sweden)

    Calli Ostrofsky

    2016-02-01

    Full Text Available Background: Notwithstanding its value, there are challenges and limitations to implementing a dysphagia screening tool from a developed context in a developing context. The need for a reliable and valid screening tool for dysphagia that considers context, systemic rules and resources was identified to prevent further medical compromise, optimise dysphagia prognosis and ultimately hasten patients’ return to home or work. Methodology: To establish the validity and reliability of the South African dysphagia screening tool (SADS) for acute stroke patients accessing government hospital services. The study was a quantitative, non-experimental, correlational cross-sectional design with a retrospective component. Convenience sampling was used to recruit 18 speech-language therapists and 63 acute stroke patients from three South African government hospitals. The SADS consists of 20 test items and was administered by speech-language therapists. Screening was followed by a diagnostic dysphagia assessment. The administrator of the tool was not involved in completing the diagnostic assessment, to eliminate bias and prevent contamination of results from screener to diagnostic assessment. Sensitivity, validity and efficacy of the screening tool were evaluated against the results of the diagnostic dysphagia assessment. Cohen’s kappa measures determined inter-rater agreement between the results of the SADS and the diagnostic assessment. Results and conclusion: The SADS was proven to be valid and reliable. Cohen’s kappa indicated high inter-rater reliability and showed high sensitivity and adequate specificity in detecting dysphagia amongst acute stroke patients who were at risk for dysphagia. The SADS was characterised by concurrent, content and face validity. As a first step in establishing contextual appropriateness, the SADS is a valid and reliable screening tool that is sensitive in identifying stroke patients at risk for dysphagia within government
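Cohen's kappa, used here to measure agreement between the SADS and the diagnostic assessment, corrects raw agreement for agreement expected by chance. A minimal sketch with invented screening results, not the study's data:

```python
# kappa = (p_o - p_e) / (1 - p_e): observed agreement p_o corrected by
# chance agreement p_e from the two raters' marginal frequencies.
def cohens_kappa(a, b):
    assert len(a) == len(b)
    n = len(a)
    cats = set(a) | set(b)
    po = sum(x == y for x, y in zip(a, b)) / n
    pe = sum((a.count(c) / n) * (b.count(c) / n) for c in cats)
    return (po - pe) / (1 - pe)

# 1 = dysphagia flagged / confirmed, 0 = not (made-up patients).
screen = [1, 1, 1, 0, 0, 1, 0, 0, 1, 1]
diagn  = [1, 1, 0, 0, 0, 1, 0, 0, 1, 1]
kappa = cohens_kappa(screen, diagn)
```

For these made-up ratings, raw agreement is 0.9 but expected chance agreement is 0.5, giving kappa = 0.8, which would conventionally be read as strong agreement.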

  7. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    Science.gov (United States)

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of the addition of new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information, which formed an incomplete hierarchy structure, in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the outcomes of the project as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
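The core AHP computation described above, deriving priorities from pairwise comparisons and checking their consistency, can be sketched as follows. The comparison values are illustrative, not the project's data:

```python
# AHP sketch: power iteration extracts the principal eigenvector of a
# pairwise comparison matrix (the priority weights), and the
# consistency ratio CR flags incoherent judgments.
def ahp_priorities(A, iters=100):
    n = len(A)
    w = [1.0 / n] * n
    for _ in range(iters):
        v = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
        s = sum(v)
        w = [x / s for x in v]
    # Estimate the principal eigenvalue lambda_max from A·w / w.
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n
    CI = (lam - n) / (n - 1)
    RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random indices
    return w, CI / RI

# Hypothetical comparisons among four candidate waste items
# (Saaty's 1-9 scale; A[i][j] = how much item i outranks item j).
A = [[1, 3, 5, 7],
     [1/3, 1, 3, 5],
     [1/5, 1/3, 1, 3],
     [1/7, 1/5, 1/3, 1]]
weights, CR = ahp_priorities(A)
```

The highest weight marks the highest recycling priority; CR below 0.1 is conventionally taken as acceptable judgment consistency.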

  8. Work and Learner Identity -Developing an analytical framework

    DEFF Research Database (Denmark)

    Kondrup, Sissel

    The paper addresses the need to develop a theoretical framework able to grasp how engagement in work forms certain conditions for workers to meet the obligation to form a pro-active learner identity, position themselves as educable subjects and engage in lifelong learning. An obligation that has...... to comprehend work situations as crucial spaces for learning and for the continuing development, maintenance or transformation of identity. Therefore it is necessary to bring work to the forefront of analysis and focus on people's work-life experiences when trying to understand how they perceive themselves......, their life situation and how they formulate their life strategies, e.g. how they orient toward different learning activities and form certain learner identities. The paper outlines how the relation between work and identity can be conceptualised and provides a theoretical framework enabling researchers

  9. Development and testing of a portfolio evaluation scoring tool.

    Science.gov (United States)

    Karlowicz, Karen A

    2010-02-01

    This study focused on development of a portfolio evaluation tool to guide the assignment of valid and reliable scores. Tool development was facilitated by a literature review, guidance of a faculty committee, and validation by content experts. Testing involved a faculty team that evaluated 60 portfolios. Calculation of interrater reliability and a paired-samples t test were used to judge effectiveness. Interrater reliability was 0.78 for overall scores, 0.81 for the seven program outcomes criteria scores, and more than 0.65 for scores assigned by 11 of 13 pairs of raters. There were no significant differences between raters' scores in 10 of 13 pairs. The portfolio evaluation tool demonstrated high reliability and should be tested by other schools using portfolio evaluation.

  10. An assessment tool for developing healthcare managerial skills and roles.

    Science.gov (United States)

    Guo, Kristina L

    2003-01-01

    This article is based on a study to identify, and by doing so help develop, the skills and roles of senior-level healthcare managers related to the needs of the current healthcare environment. To classify these roles and skills, a qualitative study was conducted to examine the literature on forces in the healthcare environment and their impact on managers. Ten senior managers were interviewed, revealing six roles as the most crucial to their positions along with the skills necessary to perform those roles. A pilot study was conducted with these senior managers to produce a final assessment tool. This assessment tool helps managers to identify strengths and weaknesses, develop in deficient areas, and promote competence in all areas as demanded by the market and organization. This tool can be used by organizations in the recruitment process and in the training process.

  11. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Full Text Available Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method for developing dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regard to the educational philosophy underlying their use.

  12. Development of a green remediation tool in Japan.

    Science.gov (United States)

    Yasutaka, Tetsuo; Zhang, Hong; Murayama, Koki; Hama, Yoshihito; Tsukada, Yasuhisa; Furukawa, Yasuhide

    2016-09-01

    The green remediation assessment tool for Japan (GRATJ) presented in this study is a spreadsheet-based software package developed to facilitate comparisons of the environmental impacts associated with various countermeasures against contaminated soil in Japan. This tool uses a life-cycle assessment-based model to calculate inventory inputs/outputs throughout the activity life cycle during remediation. Processes of 14 remediation methods for heavy metal contamination and 12 for volatile organic compound contamination are built into the tool. This tool can evaluate 130 inventory inputs/outputs and easily integrate those inputs/outputs into 9 impact categories, 4 integrated endpoints, and 1 index. Comparative studies can be performed by entering basic data associated with a target site. The integrated results can be presented in a simpler and clearer manner than the results of an inventory analysis. As a case study, an arsenic-contaminated soil remediation site was examined using this tool. Results showed that the integrated environmental impacts were greater with onsite remediation methods than with offsite ones. Furthermore, the contributions of CO2 to global warming, SO2 to urban air pollution, and crude oil to resource consumption were greater than other inventory inputs/outputs. The GRATJ has the potential to improve green remediation and can serve as a valuable tool for decision makers and practitioners in selecting countermeasures in Japan.
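The aggregation step described above, mapping inventory inputs/outputs into impact categories, is essentially a weighted sum with characterization factors. A minimal sketch with invented inventories and unit factors (the GRATJ's actual 130 inventory items and 9 categories are far more numerous):

```python
# kg of emission / resource use per functional unit for two
# hypothetical remediation options.
inventories = {
    "onsite":  {"CO2": 1200.0, "SO2": 3.0, "crude_oil": 40.0},
    "offsite": {"CO2":  800.0, "SO2": 2.0, "crude_oil": 25.0},
}

# Characterization factors mapping inventory items into categories
# (unit factors here purely for illustration).
factors = {
    "global_warming":       {"CO2": 1.0},        # kg CO2-eq per kg
    "urban_air_pollution":  {"SO2": 1.0},        # kg SO2-eq per kg
    "resource_consumption": {"crude_oil": 1.0},  # kg per kg
}

def characterize(inv):
    # Weighted sum of inventory amounts per impact category.
    return {cat: sum(cf * inv.get(sub, 0.0) for sub, cf in subs.items())
            for cat, subs in factors.items()}

scores = {opt: characterize(inv) for opt, inv in inventories.items()}
```

Comparing the per-category scores of the options is the kind of integrated result the tool presents more simply than a raw inventory analysis.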

  13. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  14. Strategy for a Military Spiritual Self-Development Tool

    Science.gov (United States)

    2008-12-12

    military leaders can identify through use of a spirituality measurement tool. While the works of Jean Piaget, Lawrence Kohlberg, and Erik Erikson have...

  15. Development of a New Measurement Tool for Individualism and Collectivism

    Science.gov (United States)

    Shulruf, Boaz; Hattie, John; Dixon, Robyn

    2007-01-01

    A new measurement tool for individualism and collectivism has been developed to address critical methodological issues in this field of social psychology. This new measure, the Auckland Individualism and Collectivism Scale (AICS), defines three dimensions of individualism: (a) responsibility (acknowledging one's responsibility for one's actions),…

  16. ODX - diagnostic standard and development tools; ODX Diagnosestandard und Entwicklungswerkzeuge

    Energy Technology Data Exchange (ETDEWEB)

    Gomez, A.; Kricke, C.; Meyer, J. [ETAS GmbH, Stuttgart (Germany)

    2005-09-01

    ODX, the new standard for the description of diagnostic protocols and data for electronic control units was released a year ago. After a brief review of the standard, ETAS introduces development tools for measuring, calibration, and diagnostic applications, with an emphasis on ECU calibration tasks. (orig.)

  17. Monitoring Conceptual Development: Design Considerations of a Formative Feedback tool

    NARCIS (Netherlands)

    Berlanga, Adriana; Smithies, Alisdair; Braidman, Isobel; Wild, Fridolin

    2010-01-01

    Berlanga, A. J., Smithies, A., Braidman, I., & Wild, F. (2010, 15 September). Monitoring Conceptual Development: Design Considerations of a Formative Feedback Tool. Presentation at the Interactive Computer Aided Learning Conference (ICL 2010), Track on Computer-based Knowledge & Skill Assessment and

  18. The Development of a Literacy Diagnostic Tool for Maltese Children

    Science.gov (United States)

    Xuereb, Rachael; Grech, Helen; Dodd, Barbara

    2011-01-01

    This article focuses on the development of a Literacy Assessment Battery for the diagnosis of Maltese children with specific learning difficulties. It forms part of a wider research study involving testing of 549 children in Malta as well as standardisation of the tool. Results of the children's performance and psychometric validation go beyond…

  19. Development of a biogas planning tool for project owners

    DEFF Research Database (Denmark)

    Fredenslund, Anders Michael; Kjær, Tyge

    A spreadsheet model was developed, which can be used as a tool in the initial phases of planning a centralized biogas plant in Denmark. The model assesses energy production, total plant costs, operational costs and revenues and effect on greenhouse gas emissions. Two energy utilization alternatives...... to provide data for other documents needed such as economic and environmental assessments....

  20. Development of a Psychotropic PRN Medication Evaluative Tool

    Science.gov (United States)

    Silk, Larry; Watt, Jackie; Pilon, Nancy; Draper, Chad

    2013-01-01

    This article describes a psychotropic PRN Evaluative Tool developed by interprofessional clinicians to address inconsistent reporting and assessment of the effectiveness of PRN medications used for people who are developmentally disabled. Fifty-nine participants (37 males, 22 females), ages 16 to 60 years, were included in the review, all…

  1. Developing mobile educational apps: development strategies, tools and business models

    Directory of Open Access Journals (Sweden)

    Serena Pastore

    Full Text Available The mobile world is a growing and evolving market in all its aspects from hardware, networks, operating systems and applications. Mobile applications or apps are becoming the new frontier of software development, since actual digital users use mobile devi ...

  2. Development of an analytical methodology using Fourier transform mass spectrometry to discover new structural analogs of wine natural sweeteners.

    Science.gov (United States)

    Marchal, Axel; Génin, Eric; Waffo-Téguo, Pierre; Bibès, Alice; Da Costa, Grégory; Mérillon, Jean-Michel; Dubourdieu, Denis

    2015-01-01

    Volatile and non-volatile molecules are directly responsible for the thrill and excitement provided by wine-tasting. Their elucidation requires powerful analytical techniques and innovative methodologies. In a recent work, two novel sweet compounds called quercotriterpenosides (QTT) were identified in oak wood used for wine-ageing. The aim of the present study is to discover structural analogs of such natural sweeteners in oak wood. For this purpose, an analytical approach was developed as an alternative to chemical synthesis. Orbitrap mass spectrometry proved to be a crucial technique both to demonstrate the presence of QTT analogs in oak wood by targeted screening and to guide the purification pathway of these molecules using complementary chromatographic tools. Four compounds were isolated and identified for the first time: two isomers, one glucosyl derivative and one galloyl derivative of QTT. Their tasting showed that only the two new isomers were sweet, thus demonstrating both the pertinence of the strategy and the influence of functional groups on gustatory properties. Finally, this paper presents some developments involving multistage Fourier transform mass spectrometry (FTMS) to provide solid structural information on these functional groups prior to any purification of compounds. Such analytical developments could be particularly useful for research on taste-active or bio-active products.
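Targeted screening by exact mass, the first filter in such an Orbitrap workflow, reduces to matching measured m/z values against a theoretical exact mass within a ppm tolerance. The masses below are hypothetical placeholders, not values from the paper:

```python
# Mass error in parts per million between a measured and a
# theoretical m/z value.
def ppm_error(measured, theoretical):
    return (measured - theoretical) / theoretical * 1e6

def matches(measured, theoretical, tol_ppm=3.0):
    return abs(ppm_error(measured, theoretical)) <= tol_ppm

theoretical_mz = 681.3850  # hypothetical deprotonated QTT analog
peaks = [681.3852, 681.4100, 340.6900]  # invented spectrum peaks
hits = [mz for mz in peaks if matches(mz, theoretical_mz)]
```

Only candidates surviving this mass filter would then go on to multistage fragmentation for structural confirmation, as the paper describes.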

  3. Review of the Development of Learning Analytics Applied in College-Level Institutes

    Directory of Open Access Journals (Sweden)

    Ken-Zen Chen

    2014-07-01

    Full Text Available This article focuses on the recent development of Learning Analytics using higher education institutional big data. It addresses the current state of Learning Analytics, creates a shared understanding, and clarifies misconceptions about the field. This article also reviews prominent examples from peer institutions that are conducting analytics, identifies their data and methodological frameworks, and comments on market vendors and not-for-profit initiatives. Finally, it suggests an implementation agenda for potential institutions and their stakeholders by drafting necessary preparations and creating iterative implementation flows.

  4. Developing a Conceptual Design Engineering Toolbox and its Tools

    Directory of Open Access Journals (Sweden)

    R. W. Vroom

    2004-01-01

    Full Text Available In order to develop a successful product, a design engineer needs to pay attention to all relevant aspects of that product. Many tools are available: software, books, websites, and commercial services. To unlock these potentially useful sources of knowledge, we are developing C-DET, a toolbox for conceptual design engineering. The idea of C-DET is that designers are supported by a system that provides them with a knowledge portal on the one hand, and a system to store their current work on the other. The knowledge portal is to help the designer to find the most appropriate sites, experts, tools etc. at short notice. Such a toolbox offers opportunities to incorporate extra functionalities to support the design engineering work. One of these functionalities could be to help the designer to reach a balanced comprehension in his work. Furthermore, C-DET enables researchers in the area of design engineering and design engineers themselves to find each other or their work earlier and more easily. Newly developed design tools that can be used by design engineers but have not yet been developed up to a commercial level can be linked to by C-DET. In this way these tools can be evaluated at an early stage by design engineers who would like to use them. This paper describes the first prototypes of C-DET, an example of the development of a design tool that enables designers to forecast the use process, and an example of the future functionalities of C-DET such as balanced comprehension.

  5. Opportunities for Use and Development of Collaborative Tools in ATLAS

    CERN Document Server

    Goldfarb, S; McKee, S P; Neal, H A; Finholt, T A; Olson, G M; Birnholtz, J P; Hofer, E; Storr, M; Vitaglione, G; Hardin, J B; Severance, C

    2003-01-01

    This document presents an assessment of the current and expected needs of the ATLAS Collaboration in the development, deployment, usage, and maintenance of collaborative tools to facilitate its internal and external communications, member training, education, and public outreach. It is prepared in response to a request by the ATLAS management to investigate these needs, to survey the current status, and to propose solutions where needed. We conclude the document with a set of recommendations designed to address selected immediate needs and to position the Collaboration for the anticipated growing demands for collaborative tools in a Grid-enabled analysis environment.

  6. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We

  7. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and times scales

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid-state matter are of the utmost interest, both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution approaching 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for that purpose. This article reviews the recent achievements of magnetic soft X-ray microscopy through selected examples of spin-torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging fs spin dynamics at upcoming X-ray sources, is pointed out.

  8. Development of novel analytical methods to study the metabolism of coumarin

    OpenAIRE

    1996-01-01

    The research in this thesis revolves around developing analytical methods for the determination of coumarin and 7-hydroxycoumarin for various applications. The techniques used in this work were capillary electrophoresis, immunosensing and electrochemistry. Chapter 1 serves as a general review of the analysis of coumarin and 7-hydroxycoumarin, including the many different types of analytical technique which have been used to analyse this drug. Capillary electrophoresis was used as the basis of a me...

  9. Development of Interpretive Simulation Tool for the Proton Radiography Technique

    CERN Document Server

    Levy, M C; Wilks, S C; Ross, J S; Huntington, C M; Fiuza, F; Baring, M G; Park, H- S

    2014-01-01

    Proton radiography is a useful diagnostic of high energy density (HED) plasmas under active theoretical and experimental development. In this paper we describe a new simulation tool that models the interaction of realistic laser-driven point-like proton sources with three-dimensional electromagnetic fields of arbitrary strength and structure and synthesizes the associated high-resolution proton radiograph. The present tool's numerical approach captures all relevant physics effects, including those related to the formation of caustics. Electromagnetic fields can be imported from PIC or hydrodynamic codes in a streamlined fashion, and a library of electromagnetic field `primitives' is also provided. This latter capability allows users to add a primitive, modify the field strength, rotate a primitive, and so on, while quickly generating a high resolution radiograph at each step. In this way, our tool enables the user to deconstruct features in a radiograph and interpret them in connection to specific underlying electromagneti...

  10. Preliminary Development of an Object-Oriented Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.
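The three-bar truss mentioned in the abstract is a standard benchmark in structural synthesis. The O3 tool itself is FORTRAN-based and its exact problem statement is not given here, so the following is only a hedged sketch of the common textbook form of the problem, solved with an off-the-shelf optimizer; the load, allowable stress, and bar length values are illustrative:

```python
# Classic three-bar truss sizing benchmark (textbook formulation; not the
# O3 tool's own implementation). Minimize weight ~ L*(2*sqrt(2)*A1 + A2)
# subject to member stress limits, with cross-sectional areas A1, A2.
import numpy as np
from scipy.optimize import minimize

P, SIGMA, L = 2.0, 2.0, 100.0  # load, allowable stress, bar length (consistent units)

def weight(x):
    a1, a2 = x
    return L * (2.0 * np.sqrt(2.0) * a1 + a2)

def stress_constraints(x):
    a1, a2 = x
    denom = np.sqrt(2.0) * a1**2 + 2.0 * a1 * a2
    g1 = SIGMA - P * (np.sqrt(2.0) * a1 + a2) / denom  # outer bar in tension
    g2 = SIGMA - P * a2 / denom                        # middle bar
    g3 = SIGMA - P / (a1 + np.sqrt(2.0) * a2)          # outer bar in compression
    return np.array([g1, g2, g3])  # feasible when all >= 0

res = minimize(weight, x0=[0.8, 0.4],
               bounds=[(1e-6, 1.0)] * 2,
               constraints={"type": "ineq", "fun": stress_constraints},
               method="SLSQP")
print(res.x, weight(res.x))
```

With these parameter values the published optimum of this benchmark is near A1 ≈ 0.789, A2 ≈ 0.408, with only the first stress constraint active, which is part of what makes it a popular test of constrained optimizers.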

  11. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine.

  12. Direct analysis in real time mass spectrometry, a process analytical technology tool for real-time process monitoring in botanical drug manufacturing.

    Science.gov (United States)

    Wang, Lu; Zeng, Shanshan; Chen, Teng; Qu, Haibin

    2014-03-01

    A promising process analytical technology (PAT) tool has been introduced for batch process monitoring. Direct analysis in real time mass spectrometry (DART-MS), a means of rapid fingerprint analysis, was applied to a percolation process with multi-constituent substances for an anti-cancer botanical preparation. Fifteen batches were carried out, including ten normal operations and five abnormal batches with artificial variations. The obtained multivariate data were analyzed by a multi-way partial least squares (MPLS) model. Control trajectories were derived from eight normal batches, and the qualification was tested by R² and Q². Accuracy and diagnosis capability of the batch model were then validated by the remaining batches. Assisted with high performance liquid chromatography (HPLC) determination, process faults were explained by corresponding variable contributions. Furthermore, a batch level model was developed to compare and assess the model performance. The present study has demonstrated that DART-MS is very promising in process monitoring in botanical manufacturing. Compared with general PAT tools, DART-MS offers a particular account on effective compositions and can be potentially used to improve batch quality and process consistency of samples in complex matrices.
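The study's MPLS control model is built from its own fifteen batches, which are not reproduced here. A minimal sketch of the underlying monitoring idea, with ordinary PCA standing in for MPLS and synthetic fingerprints standing in for DART-MS spectra, might look like:

```python
# Batch-monitoring sketch in the spirit of the paper's latent-variable model:
# fit PCA on fingerprints of normal batches, then flag batches whose squared
# prediction error (Q / SPE statistic) exceeds an empirical control limit.
# All data and the 3-sigma limit are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)

# 8 normal training batches x 50 fingerprint variables (e.g. m/z intensities):
# a mean profile plus two systematic variation patterns plus instrument noise
profile = np.linspace(1.0, 2.0, 50)
patterns = np.vstack([np.sin(np.linspace(0.0, np.pi, 50)),
                      np.cos(np.linspace(0.0, np.pi, 50))])
train = (profile + rng.normal(0.0, 0.3, (8, 2)) @ patterns
         + rng.normal(0.0, 0.01, (8, 50)))

# PCA via SVD on mean-centered data; keep 2 principal components
mean = train.mean(axis=0)
U, s, Vt = np.linalg.svd(train - mean, full_matrices=False)
loadings = Vt[:2].T                               # 50 x 2 loading matrix

def spe(x):
    """Squared prediction error (Q statistic) of one batch fingerprint."""
    xc = x - mean
    residual = xc - loadings @ (loadings.T @ xc)
    return float(residual @ residual)

train_spe = np.array([spe(b) for b in train])
limit = train_spe.mean() + 3.0 * train_spe.std()  # simple empirical limit

normal_batch = (profile + rng.normal(0.0, 0.3, 2) @ patterns
                + rng.normal(0.0, 0.01, 50))
faulty_batch = normal_batch.copy()
faulty_batch[10:20] += 1.0                        # simulated process fault

print(f"limit={limit:.4f}  normal={spe(normal_batch):.4f}  "
      f"faulty={spe(faulty_batch):.4f}")
```

A real MPLS batch model would additionally unfold the time dimension of each batch trajectory; the fault-flagging logic, however, is the same.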

  13. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL; MacDonald, Michael [Sentech, Inc.; Accawi, Gina K [ORNL; New, Joshua Ryan [ORNL; Im, Piljae [ORNL

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  14. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

    Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of {sup 14}C labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS is constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of {sup 14}C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the {sup 14}C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents) and for a typical small molecule labeled at 10% incorporated with {sup 14}C corresponds to 30 fg
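The validation figures quoted above (accuracy, coefficient of variation, linearity over four orders of magnitude) reduce to simple arithmetic. A sketch with illustrative replicate values, not the paper's data:

```python
# Validation arithmetic of the kind described in the abstract: accuracy as
# percent bias, precision as coefficient of variation (CV), and linearity
# as a log-log slope across orders of magnitude. All values are invented
# for illustration; they are not measurements from the study.
import numpy as np

# Replicate 14C/C measurements at a nominal 1.00 Modern standard
replicates = np.array([0.99, 1.01, 1.02, 0.98, 1.00, 1.01])
nominal = 1.00

bias_pct = 100.0 * (replicates.mean() - nominal) / nominal    # accuracy
cv_pct = 100.0 * replicates.std(ddof=1) / replicates.mean()   # precision

# Linearity: measured vs nominal over 4+ orders of magnitude (0.1..2000 Modern)
nominal_levels = np.array([0.1, 1.0, 10.0, 100.0, 1000.0, 2000.0])
measured = nominal_levels * (1.0 + 0.01 * np.array([1, -1, 2, 0, -2, 1]))
slope, intercept = np.polyfit(np.log10(nominal_levels), np.log10(measured), 1)

print(f"bias={bias_pct:.2f}%  CV={cv_pct:.2f}%  log-log slope={slope:.3f}")
```

A slope near 1 on the log-log fit is the usual quick check of linearity across a wide dynamic range.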

  15. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool's interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  16. Development of a Test to Evaluate Students’ Analytical Thinking Based on Fact versus Opinion Differentiation

    Directory of Open Access Journals (Sweden)

    Taveep Thaneerananon

    2016-08-01

    Full Text Available Nowadays, one of the biggest challenges of education in Thailand is the development and promotion of students' thinking skills. The main purposes of this research were to develop an analytical thinking test for 6th grade students and to evaluate the students' analytical thinking. The sample was composed of 3,567 6th grade students in the 2014 academic year at schools in Samuthsakorn province, the largest sample size that has been reached so far for an analytical thinking test in Thailand. The instruments for collecting data were two analytical thinking skill tests: the Fact vs. Opinion test (F vs. O test) and the Ordinary National Educational Test-based test (O-NET-based test). The collected data were analysed with the TAP 6.65, SIA 1.0.1 and SPSS 22 statistical programs. The results revealed statistical consistency between the F vs. O test and the O-NET-based test. In addition, most 6th grade students were at an "Unsatisfactory" level of analytical thinking skills. Though improvements are much needed, we believe that the developed Fact vs. Opinion test suits the promotion and evaluation of students' analytical thinking skills.
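The abstract names TAP and SPSS for analysing the test. The core item statistics such programs report, item difficulty (proportion correct) and item discrimination (correlation with total score), can be sketched on a synthetic response matrix; the logistic response model and all numbers below are illustrative, not the study's data:

```python
# Illustrative item-analysis sketch of the kind performed by test-analysis
# software such as TAP: item difficulty index and point-biserial
# discrimination, computed on a synthetic 0/1 response matrix.
import numpy as np

rng = np.random.default_rng(1)
n_students, n_items = 200, 10

ability = rng.normal(0.0, 1.0, n_students)
difficulty = np.linspace(-1.5, 1.5, n_items)     # item 0 easiest, item 9 hardest
# Probability of a correct answer rises with ability (simple logistic model)
prob = 1.0 / (1.0 + np.exp(-(ability[:, None] - difficulty[None, :])))
responses = (rng.random((n_students, n_items)) < prob).astype(int)

total = responses.sum(axis=1)
p_values = responses.mean(axis=0)                # difficulty index per item
disc = np.array([np.corrcoef(responses[:, j], total)[0, 1]
                 for j in range(n_items)])       # point-biserial discrimination

print("difficulty:", np.round(p_values, 2))
print("discrimination:", np.round(disc, 2))
```

Items with very low discrimination (or difficulty near 0 or 1) are the ones a test developer would revise before fielding the instrument.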

  17. Methodological framework, analytical tool and database for the assessment of climate change impacts, adaptation and vulnerability in Denmark

    Energy Technology Data Exchange (ETDEWEB)

    Skougaard Kaspersen, P.; Halsnaes, K.; Gregg, J.; Drews, M.

    2012-12-15

    In this report we provide recommendations on how more consistent studies and data can be produced, based on available modelling tools and data, for integrated assessment of climate change risks and adaptation options. It is concluded that integrated assessments in this area require the use of a wide range of data and models in order to cover the full chain of elements, including climate modelling, impacts, risks, costs, social issues, and decision making. As an outcome of this activity, a comprehensive data and modelling tool named the Danish Integrated Assessment System (DIAS) has been developed, which may be used by researchers within the field. DIAS has been implemented and tested in a case study on urban flooding caused by extreme precipitation in Aarhus, and this study highlights the usefulness of integrating data, models, and methods from several disciplines into a common framework. DIAS is an attempt to describe such a framework with regard to integrated analysis of climate impacts and adaptation. The final product of the DTU KFT project "Tool for Vulnerability Analysis" is NOT a user-friendly climate adaptation tool, ready for various types of analysis, that may be used directly by decision makers and consultants on their own. Rather, the developed methodology and the collected/available data can serve as a starting point for case-specific analyses. For this reason alone, this work should very much be viewed as an attempt to coordinate research, data, and model outputs between different research institutes from various disciplines. It is unquestionable that there is a future need to integrate information for areas not yet included, and it is very likely that such efforts will depend on research projects conducted in different climate change adaptation areas and sectors in Denmark. (Author)

  18. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E. [and others

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  19. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms that are based on the CODESYS development tool into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software for automatic generation of the PLC code based on this library, called UAB. The integration aimed to give a solution that is shared by both PLC platforms and was based on the PLCopen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and by testing the behavior of the library code.
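Automatic PLC code generation of the UAB kind rests on emitting PLCopen-XML-style documents that both vendor tools can import. A heavily hedged sketch of building such a document follows; the element names are chosen to echo the spirit of the PLCopen TC6 scheme, but this minimal skeleton is illustrative only and is not validated against the actual schema:

```python
# Illustrative sketch of generating a PLCopen-XML-style function-block
# skeleton, the kind of exchange document used to share generated PLC code
# between environments such as SoMachine and TwinCAT. Element names and
# nesting are simplified assumptions, not a schema-conformant export.
import xml.etree.ElementTree as ET

def make_function_block(name: str, inputs: dict, st_body: str) -> ET.Element:
    """Build a minimal <pou> element for a function block with ST body."""
    pou = ET.Element("pou", name=name, pouType="functionBlock")
    interface = ET.SubElement(pou, "interface")
    input_vars = ET.SubElement(interface, "inputVars")
    for var_name, var_type in inputs.items():
        var = ET.SubElement(input_vars, "variable", name=var_name)
        ET.SubElement(ET.SubElement(var, "type"), var_type)  # e.g. <type><BOOL/></type>
    body = ET.SubElement(pou, "body")
    st = ET.SubElement(body, "ST")
    st.text = st_body
    return pou

# Hypothetical valve function block with a Structured Text body
fb = make_function_block("FB_Valve",
                         {"open_cmd": "BOOL", "position": "REAL"},
                         "IF open_cmd THEN position := 100.0; END_IF;")
print(ET.tostring(fb, encoding="unicode"))
```

In a real generator, the template for each function block would come from the object library, and the output would be wrapped in the full project/types/pous envelope the importing tool expects.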

  20. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide.

    Science.gov (United States)

    Tawakkol, Shereen M; Farouk, M; Elaziz, Omar Abd; Hemdan, A; Shehata, Mostafa A

    2014-12-10

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods: Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the ranges of 10-60 and 2-30 for MOX and HCTZ, respectively, in the EXRSM method, with a well-accepted mean correlation coefficient for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.

  1. Developing a tool for assessing public health law in countries.

    Science.gov (United States)

    Kim, So Yoon; Lee, Yuri; Sohn, Myongsei; Hahm, Ki-Hyun

    2012-09-01

    At present, the World Health Organization (WHO) is in the process of developing a tool designed to assess the status of public health legislation in a given country. An Expert Consultation on Public Health Law was convened in Manila, Philippines, in May 2011. The participants agreed that the tool could serve as a guide for a regional approach to assist Member States in assessing the scope, completeness, and adequacy of their public health law. Given the broad definition of "public health" and the laws that affect health, directly or indirectly, the participants further agreed to narrow the field to 4 areas based on significant WHO works/policies, each organized into an independent module: (1) International Digest on Health Law, (2) Primary Health Care, (3) International Health Regulations 2005, and (4) Framework Convention on Tobacco Control. The tool would be drafted in a questionnaire format that asks the respondent to determine whether primary and/or subsidiary legislation exists in the country on a specific topic and, if so, to cite the relevant law, describe the pertinent points, and attach and/or link to the full text where available. The participants agreed that the respondents should include government officials and/or academics with legal competency. Version 1 of the tool was piloted in the Philippines, the Republic of Korea, Samoa, and Vanuatu. At a 2nd Expert Consultation on Public Health Law, convened in Incheon, Republic of Korea, in October 2011, in conjunction with the 43rd Conference of the Asia-Pacific Academic Consortium on Public Health, the participants determined that the tool was generally usable, certain concerns notwithstanding, such as the risk of standardizing compliance with WHO policies. The agreed next step is to finalize the analysis tool by August 2012, marking the end of stage I in the development process. Stage II will consist of team building and networking of responsible officers and/or professionals in the countries. The tool

  2. Microsystem design framework based on tool adaptations and library developments

    Science.gov (United States)

    Karam, Jean Michel; Courtois, Bernard; Rencz, Marta; Poppe, Andras; Szekely, Vladimir

    1996-09-01

    Besides foundry facilities, Computer-Aided Design (CAD) tools are also required to move microsystems from research prototypes to an industrial market. This paper describes a Computer-Aided Design framework for microsystems, based on selected existing software packages adapted and extended for microsystem technology, assembled with libraries where models are available in the form of standard cells described at different levels (symbolic, system/behavioral, layout). In microelectronics, CAD has already attained a highly sophisticated and professional level, where complete fabrication sequences are simulated and the device and system operation is completely tested before manufacturing. In comparison, the art of microsystem design and modelling is still in its infancy. However, at least for the numerical simulation of the operation of single microsystem components, such as mechanical resonators, thermo-elements, and elastic diaphragms, reliable simulation tools are available. For the different engineering disciplines (e.g., electronics, mechanics, optics), many CAD tools for the design, simulation and verification of specific devices are available, but there is no CAD environment within which we could perform a (micro-)system simulation, due to the different nature of the devices. In general there are two different approaches to overcome this limitation: the first possibility would be to develop a new framework tailored for microsystem engineering. The second approach, much more realistic, would be to use the existing CAD tools which contain the most promising features, and to extend these tools so that they can be used for the simulation and verification of microsystems and of the devices involved. These tools are assembled with libraries in a microsystem design environment allowing a continuous design flow. The approach is driven by the wish to make microsystems accessible to a large community of people, including SMEs and non-specialized academic institutions.

  3. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have actually focused intensively on web-based listening and speaking; many more focus on reading, writing, vocabulary, and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle by striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  4. Innovation tools of economic development of the enterprise

    Directory of Open Access Journals (Sweden)

    Fedor Pavlovich Zotov

    2012-12-01

    Full Text Available This article considers ways to generate new economic and financial benefits from the practice of rationalization work in an industrial enterprise. An attempt is made to combine rationalization practice with the capabilities of the tools and techniques of modern management technologies. It is proposed that the members of four types of cross-functional teams learn these tools and techniques through tutorials, with the tutorials distributed across the four stages of the PDCA management cycle. It is shown that the creation of the teams and the development of the tutorials will create internal resources for innovation projects, achieving effective changes in the economic development of the enterprise.

  5. Development of an Analytical System for Determination of Free Acid via a Joint Method Combining Density and Conductivity Measurement

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    Determination of free acid plays an important role in spent nuclear fuel reprocessing. It is necessary to develop a rapid analytical device and method for measuring free acid. A novel analytical system and method were studied to monitor the acidity
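The record is truncated, but the joint density/conductivity idea amounts to inverting a two-measurement calibration for two unknowns: if both bulk properties respond (approximately linearly) to free acid and to the dissolved salt background, two simultaneous measurements determine both concentrations. A sketch with purely illustrative calibration coefficients:

```python
# Sketch of the joint-measurement idea behind the abstract: solve a 2x2
# linearized calibration for free-acid and salt concentration from one
# density and one conductivity reading. All coefficients below are
# illustrative assumptions, not values from the study.
import numpy as np

# Linearized model around a reference point:
#   density      = d0 + kd_acid * c_acid + kd_salt * c_salt
#   conductivity = s0 + ks_acid * c_acid + ks_salt * c_salt
d0, s0 = 1.000, 0.5                      # pure-matrix baselines (g/mL, S/cm)
K = np.array([[0.020, 0.035],            # density sensitivities
              [0.350, 0.060]])           # conductivity sensitivities

def free_acid(density, conductivity):
    """Solve the 2x2 linear system for (c_acid, c_salt), in mol/L."""
    rhs = np.array([density - d0, conductivity - s0])
    c_a, c_s = np.linalg.solve(K, rhs)
    return c_a, c_s

# Forward-simulate a sample with 1.2 mol/L acid and 0.4 mol/L salt ...
true_c = np.array([1.2, 0.4])
rho, kappa = d0 + K[0] @ true_c, s0 + K[1] @ true_c
# ... and recover both concentrations from the two measurements
print(free_acid(rho, kappa))
```

The method works only where the sensitivity matrix is well conditioned, i.e. where the two measurements respond to acid and salt in sufficiently different proportions.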

  6. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    Energy Technology Data Exchange (ETDEWEB)

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY).

  7. The development and application of advanced analytical methods to commercial ICF reactor chambers. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Cousseau, P.; Engelstad, R.; Henderson, D.L. [and others]

    1997-10-01

    Progress is summarized in this report for each of the following tasks: (1) multi-dimensional radiation hydrodynamics computer code development; (2) 2D radiation-hydrodynamic code development; (3) ALARA: analytic and Laplacian adaptive radioactivity analysis -- a complete package for analysis of induced activation; (4) structural dynamics modeling of ICF reactor chambers; and (5) analysis of self-consistent target chamber clearing.

  8. ANALYTICAL, CRITICAL AND CREATIVE THINKING DEVELOPMENT OF THE GIFTED CHILDREN IN THE USA SCHOOLS

    Directory of Open Access Journals (Sweden)

    Anna Yurievna Kuvarzina

    2013-11-01

    Full Text Available Teachers of gifted students should not only provide enrichment and acceleration programs but also attend to the development of analytical, critical and creative thinking skills. Despite great interest in this issue in recent years, analytical and creative thinking is poorly covered in textbooks for the gifted. This article describes methods, materials and programs for developing analytical, critical and creative thinking skills that are used in the USA. The author analyzes and systematizes these methods and suggests ways to use them in the Russian educational system. Purpose: to analyze and systematize the methods, materials and programs used in the USA for teaching gifted children analytical, critical and creative thinking and for developing their problem-solving and decision-making capacities. Methods and methodology of the research: analysis, comparison, and the principle of the unity of the historical and logical approaches. Results: positive results of employing methods for developing analytical, critical and creative thinking have been shown in the practical experience of teaching and educating gifted children in the USA educational system. Field of application of the results: the educational system of the Russian Federation: schools, special classes and courses for gifted children. DOI: http://dx.doi.org/10.12731/2218-7405-2013-7-42

  9. MOOCs as a Professional Development Tool for Librarians

    Directory of Open Access Journals (Sweden)

    Meghan Ecclestone

    2013-11-01

    Full Text Available This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  10. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  11. Development of an Observational Tool to Measure Nurses’ Information Needs

    OpenAIRE

    2012-01-01

    Nurses collect, communicate and store patient information needed for care through verbal, handwritten and electronic information sources. However, the specific categories of nurses’ information needs for the care of hospitalized patients remain unknown. The purpose of this study was to identify the categories of nurses’ information needs and develop an observational tool to measure the information needs through available information sources. We analyzed qualitative data from interview transcr...

  12. Development of environmental tools for anopheline larval control

    OpenAIRE

    Mweresa Collins K; Imbahale Susan S; Takken Willem; Mukabana Wolfgang R

    2011-01-01

    Abstract Background Malaria mosquitoes spend a considerable part of their life in the aquatic stage, rendering them vulnerable to interventions directed to aquatic habitats. Recent successes of mosquito larval control have been reported using environmental and biological tools. Here, we report the effects of shading by plants and biological control agents on the development and survival of anopheline and culicine mosquito larvae in man-made natural habitats in western Kenya. Trials consisted ...

  13. On transform coding tools under development for VP10

    Science.gov (United States)

    Parker, Sarah; Chen, Yue; Han, Jingning; Liu, Zoe; Mukherjee, Debargha; Su, Hui; Wang, Yongzhe; Bankoski, Jim; Li, Shunyao

    2016-09-01

    Google started the WebM Project in 2010 to develop open source, royalty-free video codecs designed specifically for media on the Web. The second generation codec released by the WebM project, VP9, is currently served by YouTube, and enjoys billions of views per day. Realizing the need for even greater compression efficiency to cope with the growing demand for video on the web, the WebM team embarked on an ambitious project to develop a next edition codec, VP10, that achieves at least a generational improvement in coding efficiency over VP9. Starting from VP9, a set of new experimental coding tools have already been added to VP10 to achieve significant coding gains. Subsequently, Google joined a consortium of major tech companies called the Alliance for Open Media to jointly develop a new codec, AV1. As a result, the VP10 effort is largely expected to merge with AV1. In this paper, we focus primarily on new tools in VP10 that improve coding of the prediction residue using transform coding techniques. Specifically, we describe tools that increase the flexibility of available transforms, allowing the codec to handle a more diverse range of residue structures. Results are presented on a standard test set.
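    The flexible-transform idea can be sketched in miniature: for each residue block, try several separable 2D transform combinations and keep the one that represents the block most compactly. The sketch below uses only the DCT and the identity transform (transform skip) and an L1 coefficient sum as a toy rate proxy; the actual VP10/AV1 transform set, search, and rate-distortion cost are far richer than this.

    ```python
    import numpy as np

    def dct_matrix(n):
        """Orthonormal DCT-II basis as an n-by-n matrix."""
        k = np.arange(n)[:, None]
        j = np.arange(n)[None, :]
        C = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * j + 1) * k / (2 * n))
        C[0, :] /= np.sqrt(2.0)
        return C

    def best_transform(block):
        """Try 2D combinations of DCT and identity on a residue block and
        return the pair with the smallest L1 coefficient sum (toy rate proxy)."""
        n = block.shape[0]
        bases = {"DCT": dct_matrix(n), "IDTX": np.eye(n)}
        best = None
        for vname, V in bases.items():        # vertical (column) transform
            for hname, H in bases.items():    # horizontal (row) transform
                cost = np.abs(V @ block @ H.T).sum()
                if best is None or cost < best[0]:
                    best = (cost, vname, hname)
        return best[1], best[2]

    # A smooth gradient residue compacts well under the DCT in both directions.
    block = np.tile(np.linspace(-4, 4, 8), (8, 1))
    vname, hname = best_transform(block)
    ```

    A sparse, spiky residue behaves the opposite way: for a block that is zero except for one large sample, the identity pair wins, which is exactly the kind of residue-structure diversity the added transforms target.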

  14. The development of a practical tool for risk assessment of manual work – the HAT-tool

    NARCIS (Netherlands)

    Kraker, H. de; Douwes, M.

    2008-01-01

    For the Dutch Ministry of Social Affairs and Employment we developed a tool to assess the risks of developing complaints of the arm, neck or shoulders during manual work. The tool was developed for every type of organization and is easy to use, does not require measurements other than time and can b

  15. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    Science.gov (United States)

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid-phase microextraction techniques find increasing applications in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid-phase extraction techniques is presented: solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid-phase extraction (MSPE), including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees.

  16. Development of tools, technologies, and methodologies for imaging sensor testing

    Science.gov (United States)

    Lowry, H.; Bynum, K.; Steely, S.; Nicholson, R.; Horne, H.

    2013-05-01

    Ground testing of space- and air-borne imaging sensor systems is supported by Vis-to-LWIR imaging sensor calibration and characterization, as well as hardware-in-the-loop (HWIL) simulation with high-fidelity complex scene projection to validate sensor mission performance. To accomplish this successfully, there must be the development of tools, technologies, and methodologies that are used in space simulation chambers for such testing. This paper provides an overview of such efforts being investigated and implemented at Arnold Engineering Development Complex (AEDC).

  17. Searching for Sentient Design Tools for Game Development

    DEFF Research Database (Denmark)

    Liapis, Antonios

    increasing demand by expanding their cadre, compressing development cycles and reusing code or assets. To limit development time and reduce the cost of content creation, commercial game engines and procedural content generation are popular shortcuts. Content creation tools are means to either generate...... to the user's current design in order to speed up the creation process and inspire the user to think outside the box. Several AI techniques are implemented, and others invented, for the purposes of creating meaningful suggestions for Sentient Sketchbook as well as for adapting these suggestions to the user...

  18. Assessment of in-hand manipulation: Tool development

    Directory of Open Access Journals (Sweden)

    Kavitha Raja

    2016-01-01

    Full Text Available Objective: The aim of this study is to develop an assessment tool for in-hand manipulation skills (IHMS) and establish its psychometric properties. Design: Items were pooled based on literature and expert opinion. Content validation was performed by ten rehabilitation professionals. The test was administered to 123 typically developing children and 15 children with hand dysfunction: cerebral palsy (3), developmental coordination disorder (5), and Down syndrome (7). The latter group received an intervention specific to the upper extremity for 15 days, and the test was readministered. Rasch analysis for rating-scale structure, fit statistics, and dimensionality was performed. Results: Content validation was analyzed qualitatively; suggestions were incorporated, consisting of instructions for scoring and test administration. The four-level ordinal rating scale was appropriate according to Rasch analysis. Of fifty items, three misfitting items from the translation subscale were removed based on fit statistics and clinical judgment. The final test has 47 items. The tool had excellent inter-tester reliability and test stability and was responsive to change. Conclusion: The assessment of in-hand manipulation is a robust tool for clinical use in the assessment of IHMS.
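    For a four-level ordinal scale like the one validated above, Rasch analysis typically uses the Andrich rating-scale model, in which the probability of each response category depends on the person's ability, the item's difficulty, and shared category thresholds. The sketch below shows the model's category probabilities with purely illustrative parameter values, not estimates from the IHMS data.

    ```python
    import math

    def rating_scale_probs(theta, item_difficulty, thresholds):
        """Andrich rating-scale model: probabilities of categories 0..m for a
        person of ability `theta` on an item of difficulty `item_difficulty`,
        with m shared category thresholds. Parameters here are illustrative."""
        # Cumulative logit for category k is the sum of (theta - d - tau_j), j<=k.
        logits = [0.0]
        for tau in thresholds:
            logits.append(logits[-1] + (theta - item_difficulty - tau))
        expl = [math.exp(l) for l in logits]
        total = sum(expl)
        return [e / total for e in expl]

    # Four-level scale (three thresholds), average-difficulty item:
    probs = rating_scale_probs(theta=1.0, item_difficulty=0.0,
                               thresholds=[-1.0, 0.0, 1.0])
    ```

    Higher ability shifts probability mass toward the top category, which is the property that lets the model check whether the observed rating-scale structure is ordered and well-behaved.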

  19. Development and Validation of the Tibetan Primary Care Assessment Tool

    Directory of Open Access Journals (Sweden)

    Wenhua Wang

    2014-01-01

    Full Text Available Objective. To develop a primary care assessment tool for the Tibetan area and assess primary care quality among different healthcare settings. Methods. A primary care assessment tool-Tibetan version (PCAT-T) was developed to measure seven primary care domains. Data from a cross-sectional survey of 1386 patients were used to conduct validity and reliability analysis of PCAT-T. Analysis of variance was used to compare primary care quality among different healthcare settings. Results. A 28-item PCAT-T was constructed which included seven multi-item scales and two single-item scales. All multi-item scales achieved good internal consistency and item-total correlations. Scaling assumption tests were well satisfied. The full range of possible scores was observed for all scales, except first contact and continuity. Compared with the prefecture hospital (77.42) and county hospital (82.01), the township health center achieved the highest primary care total score (86.64). Conclusions. PCAT-T is a valid and reliable tool to measure patients' experience of primary care in the Tibet Autonomous Region. The township health center has the best primary care performance compared with other healthcare settings and should play a key role in providing primary care in Tibet.
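    The internal-consistency checks reported for the PCAT-T scales are conventionally done with Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch, using made-up toy ratings rather than the PCAT-T survey data:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()   # sum of item variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of total score
        return k / (k - 1) * (1 - item_vars / total_var)

    # Toy data: five respondents rating four items on a 1-4 scale
    # (illustrative only, not PCAT-T responses).
    toy = [[4, 4, 3, 4],
           [2, 2, 2, 3],
           [3, 3, 3, 3],
           [1, 2, 1, 1],
           [4, 3, 4, 4]]
    alpha = cronbach_alpha(toy)   # close to 1 when items co-vary strongly
    ```

    Values around 0.7 or above are the usual threshold for "good internal consistency" of a multi-item scale.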

  20. Development of a burn prevention teaching tool for Amish children.

    Science.gov (United States)

    Rieman, Mary T; Kagan, Richard J

    2012-01-01

    Although there are inherent risks for burn injury associated with the Amish lifestyle, burn prevention is not taught in Amish schools. The purpose of this study was to develop a burn prevention teaching tool for Amish children. An anonymous parental survey was designed to explore the content and acceptability of a teaching tool within an Old Order Amish community. After institutional review board approval, the Amish teacher distributed surveys to 16 families of the 30 children attending the one-room school. Fourteen (88%) of the families responded to identify these burn risks in and around their homes, barns, and shops: lighters, wood and coal stoves, kerosene heaters, gasoline-powered engines, and hot liquids used for canning, butchering, mopping, washing clothes, and making lye soap. All respondents were in favor of teaching familiar safety precautions, fire escape plans, burn first aid, and emergency care to the children. There was some minor objection to more modern devices such as bath tub thermometers (25%), fire extinguishers (19%), and smoke detectors (6%). The teacher was interested in a magnetic teaching board depicting Amish children and typical objects in their home environment. Movable pieces could afford the opportunity to identify hazards and to rearrange them for a safer situation. This survey served to introduce burn prevention to one Amish community and to develop an appropriate teaching tool for the school. It is anticipated that community participation would support its acceptance and eventual utilization within this tenaciously traditional culture.

  1. Requirements Document for Development of a Livermore Tomography Tools Interface

    Energy Technology Data Exchange (ETDEWEB)

    Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-09

    In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of a LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for a LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT’s poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.

  2. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Full Text Available Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  3. CRAB: the CMS distributed analysis tool development and design

    Energy Technology Data Exchange (ETDEWEB)

    Spiga, D. [University and INFN Perugia (Italy); Lacaprara, S. [INFN Legnaro (Italy); Bacchi, W. [University and INFN Bologna (Italy); Cinquilli, M. [University and INFN Perugia (Italy); Codispoti, G. [University and INFN Bologna (Italy); Corvo, M. [CERN (Switzerland); Dorigo, A. [INFN Padova (Italy); Fanfani, A. [University and INFN Bologna (Italy); Fanzago, F. [CERN (Switzerland); Farina, F. [INFN Milano-Bicocca (Italy); Gutsche, O. [FNAL (United States); Kavka, C. [INFN Trieste (Italy); Merlo, M. [INFN Milano-Bicocca (Italy); Servoli, L. [University and INFN Perugia (Italy)

    2008-03-15

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  4. CRAB: the CMS distributed analysis tool development and design

    CERN Document Server

    Spiga, D; Bacchi, W; Cinquilli, M; Codispoti, G; Corvo, M; Dorigo, A; Fanfani, A; Fanzago, F; Farina, F; Gutsche, O; Kavka, C; Merlo, M; Servoli, L

    2008-01-01

    Starting from 2007 the CMS experiment will produce several Pbytes of data each year, to be distributed over many computing centers located in many different countries. The CMS computing model defines how the data are to be distributed such that CMS physicists can access them in an efficient manner in order to perform their physics analysis. CRAB (CMS Remote Analysis Builder) is a specific tool, designed and developed by the CMS collaboration, that facilitates access to the distributed data in a very transparent way. The tool's main feature is the possibility of distributing and parallelizing the local CMS batch data analysis processes over different Grid environments without any specific knowledge of the underlying computational infrastructures. More specifically CRAB allows the transparent usage of WLCG, gLite and OSG middleware. CRAB interacts with both the local user environment, with CMS Data Management services and with the Grid middleware.

  5. Development and testing of the codependency assessment tool.

    Science.gov (United States)

    Hughes-Hammer, C; Martsolf, D S; Zeller, R A

    1998-10-01

    Codependency constitutes a significant health risk, particularly for women, because codependent women are often involved in abusive and potentially harmful relationships. Individuals who are identified as codependent can engage in therapy and gain knowledge and freedom from such relationships. However, there is no reliable and valid measure of codependency that is consistently used to identify these individuals. This article describes the development and testing of the Codependency Assessment Tool, a multivariate tool that conceptualizes codependency as a construct comprising five factors: (1) Other Focus/Self-Neglect, (2) Low Self-Worth, (3) Hiding Self, (4) Medical Problems, and (5) Family of Origin Issues. The instrument has excellent reliability and validity. Its test-retest reliabilities = .78 to .94; Cronbach's alpha = .78 to .91. Criterion validity was determined to be established by using known groups; construct validity was established by comparing the codependency dimensions with depression.

  6. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. Most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function.
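    The desirability approach discussed in the review (in the style of Derringer and Suich) maps each response onto a 0-to-1 desirability and combines them with a geometric mean, so that any fully unacceptable response drives the overall desirability to zero. A minimal sketch with illustrative bounds, not values from the review:

    ```python
    import math

    def desirability_larger_is_better(y, low, target, weight=1.0):
        """Derringer-type desirability for a 'larger is better' response:
        0 below `low`, 1 above `target`, a power ramp in between."""
        if y <= low:
            return 0.0
        if y >= target:
            return 1.0
        return ((y - low) / (target - low)) ** weight

    def overall_desirability(ds):
        """Geometric mean of individual desirabilities; zero if any is zero."""
        if any(d == 0.0 for d in ds):
            return 0.0
        return math.exp(sum(math.log(d) for d in ds) / len(ds))

    # Toy two-response example (bounds are illustrative placeholders):
    d1 = desirability_larger_is_better(80.0, low=50.0, target=100.0)   # 0.6
    d2 = desirability_larger_is_better(95.0, low=50.0, target=100.0)   # 0.9
    D = overall_desirability([d1, d2])
    ```

    In an RSM workflow, D is evaluated over the fitted response surfaces and the factor settings maximizing D are taken as the compromise optimum.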

  7. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    Science.gov (United States)

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen to make ecosystem services be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework. 
Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and

  8. The development of a tool to predict team performance.

    Science.gov (United States)

    Sinclair, M A; Siemieniuch, C E; Haslam, R A; Henshaw, M J D C; Evans, L

    2012-01-01

    The paper describes the development of a tool to predict quantitatively the success of a team when executing a process. The tool was developed for the UK defence industry, though it may be useful in other domains. It is expected to be used by systems engineers in initial stages of systems design, when concepts are still fluid, including the structure of the team(s) which are expected to be operators within the system. It enables answers to be calculated for questions such as "What happens if I reduce team size?" and "Can I reduce the qualifications necessary to execute this process and still achieve the required level of success?". The tool has undergone verification and validation; it predicts fairly well and shows promise. An unexpected finding is that the tool creates a good a priori argument for significant attention to Human Factors Integration in systems projects. The simulations show that if a systems project takes full account of human factors integration (selection, training, process design, interaction design, culture, etc.) then the likelihood of team success will be in excess of 0.95. As the project derogates from this state, the likelihood of team success will drop as low as 0.05. If the team has good internal communications and good individuals in key roles, the likelihood of success rises towards 0.25. Even with a team comprising the best individuals, p(success) will not be greater than 0.35. It is hoped that these results will be useful for human factors professionals involved in systems design.

  9. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    Full Text Available This article describes PostGeoOlap, a free, open-source decision support tool that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective, low-cost decision support suited to small and medium-sized companies and small public offices.
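    The OLAP half of such a tool is, at its core, aggregation of a fact table along chosen dimensions (a "roll-up"). The sketch below illustrates that operation on an in-memory toy fact table; PostGeoOlap itself operates on database-backed cubes, and the table, dimensions, and measure here are invented for illustration.

    ```python
    from collections import defaultdict

    # Toy fact table: (region, year, product, sales). Illustrative data only.
    facts = [
        ("north", 2007, "A", 10.0),
        ("north", 2007, "B", 5.0),
        ("south", 2007, "A", 7.0),
        ("south", 2008, "B", 3.0),
    ]

    def roll_up(facts, dims):
        """OLAP-style roll-up: sum the sales measure over the kept dimensions.
        `dims` holds indices into the (region, year, product) key columns."""
        totals = defaultdict(float)
        for row in facts:
            key = tuple(row[i] for i in dims)
            totals[key] += row[3]
        return dict(totals)

    by_region = roll_up(facts, dims=(0,))
    by_region_year = roll_up(facts, dims=(0, 1))
    ```

    The GIS integration then amounts to joining such aggregates to spatial features (e.g. region polygons) so the results can be mapped rather than tabulated.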

  10. Measuring vaccine hesitancy: The development of a survey tool.

    Science.gov (United States)

    Larson, Heidi J; Jarrett, Caitlin; Schulz, William S; Chaudhuri, Mohuya; Zhou, Yuqing; Dube, Eve; Schuster, Melanie; MacDonald, Noni E; Wilson, Rose

    2015-08-14

    In March 2012, the SAGE Working Group on Vaccine Hesitancy was convened to define the term "vaccine hesitancy", as well as to map the determinants of vaccine hesitancy and develop tools to measure and address the nature and scale of hesitancy in settings where it is becoming more evident. The definition of vaccine hesitancy and a matrix of determinants guided the development of a survey tool to assess the nature and scale of hesitancy issues. Additionally, vaccine hesitancy questions were piloted in the annual WHO-UNICEF joint reporting form, completed by National Immunization Managers globally. The objective of characterizing the nature and scale of vaccine hesitancy issues is to better inform the development of appropriate strategies and policies to address the concerns expressed, and to sustain confidence in vaccination. The Working Group developed a matrix of the determinants of vaccine hesitancy informed by a systematic review of peer reviewed and grey literature, and by the expertise of the working group. The matrix mapped the key factors influencing the decision to accept, delay or reject some or all vaccines under three categories: contextual, individual and group, and vaccine-specific. These categories framed the menu of survey questions presented in this paper to help diagnose and address vaccine hesitancy.

  11. Demonstration of Decision Support Tools for Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, David Earl; Jacobson, Jacob Jordan; Berrett, Sharon; Cobb, D. A.; Worhach, P.

    2000-11-01

    The Demonstration of Decision Support Tools for Sustainable Development project integrated the Bechtel/Nexant Industrial Materials Exchange Planner and the Idaho National Engineering and Environmental Laboratory System Dynamics models, demonstrating their capabilities on alternative fuel applications in the Greater Yellowstone-Teton Park system. The combined model, called the Dynamic Industrial Material Exchange, was used on selected test cases in the Greater Yellowstone-Teton Parks region to evaluate the economic, environmental, and social implications of alternative fuel applications and to identify primary and secondary industries. The test cases included compressed natural gas applications in Teton National Park and Jackson, Wyoming, and ethanol use in Yellowstone National Park and gateway cities in Montana. With further development, the system could assist decision-makers (local government, planners, vehicle purchasers, and fuel suppliers) in selecting alternative fuels and vehicles and in developing alternative fuel (AF) infrastructures. The system could become a regional AF market assessment tool that helps decision-makers understand the behavior of the AF market and the conditions under which it would grow. Based on this high-level market assessment, investors and decision-makers would become more knowledgeable about the AF market opportunity before developing detailed plans and preparing financial analyses.

  12. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action - to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how can the teaching of climate change be done in such a way as to ascribe agency - a willingness to act - to students? Framing - as both a theory and an analytic method - has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse - global scale, scientific statistics and facts, and impact on the Earth's systems - are not likely to inspire action-taking. 

  13. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being both an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have proved to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires particular preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  14. COSTMODL: An automated software development cost estimation tool

    Science.gov (United States)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
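    The recalibration idea described above can be sketched as a minimal COCOMO-style model in Python. This is a generic illustration, not COSTMODL's actual algorithms; the coefficient values and the project history are hypothetical.

    ```python
    # Minimal COCOMO-style effort model: effort (person-months) = a * KLOC ** b.
    # Coefficients and the project history below are hypothetical illustrations,
    # not COSTMODL's actual calibration data.

    def estimate_effort(kloc, a=2.4, b=1.05):
        """Estimate development effort in person-months from size in KLOC."""
        return a * kloc ** b

    def recalibrate(projects, b=1.05):
        """Refit the productivity coefficient `a` from historical
        (kloc, actual_effort) pairs, mirroring the kind of environment-specific
        recalibration the tool supports."""
        return sum(effort / kloc ** b for kloc, effort in projects) / len(projects)

    # Hypothetical completed projects: (size in KLOC, actual person-months).
    history = [(10.0, 26.0), (50.0, 145.0), (100.0, 302.0)]
    a_local = recalibrate(history)
    forecast = estimate_effort(32.0, a=a_local)  # estimate for a 32 KLOC project
    ```

    The point of the sketch is the workflow, not the numbers: an organization refits the coefficients against its own history before trusting the estimates.
    
    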

  15. Development of an Analytical System for Rapid, Remote Determining Concentration and Valence of Uranium and Plutonium

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    The concentrations and valence states of U and Pu directly show whether the Purex process is operating under normal conditions, so it is necessary to monitor them in real time. The purpose of this work is to develop an analytical

  16. Combining Multiple Measures of Students' Opportunities to Develop Analytic, Text-Based Writing Skills

    Science.gov (United States)

    Correnti, Richard; Matsumura, Lindsay Clare; Hamilton, Laura S.; Wang, Elaine

    2012-01-01

    Guided by evidence that teachers contribute to student achievement outcomes, researchers have been reexamining how to study instruction and the classroom opportunities teachers create for students. We describe our experience measuring students' opportunities to develop analytic, text-based writing skills. Utilizing multiple methods of data…

  17. In situ protein secondary structure determination in ice: Raman spectroscopy-based process analytical tool for frozen storage of biopharmaceuticals.

    Science.gov (United States)

    Roessl, Ulrich; Leitgeb, Stefan; Pieters, Sigrid; De Beer, Thomas; Nidetzky, Bernd

    2014-08-01

    A Raman spectroscopy-based method for in situ monitoring of secondary structural composition of proteins during frozen and thawed storage was developed. A set of reference proteins with different α-helix and β-sheet compositions was used for calibration and validation in a chemometric approach. Reference secondary structures were quantified with circular dichroism spectroscopy in the liquid state. Partial least squares regression models were established that enable estimation of secondary structure content from Raman spectra. Quantitative secondary structure determination in ice was accomplished for the first time and correlation with existing (qualitative) protein structural data from the frozen state was achieved. The method can be used in the presence of common stabilizing agents and is applicable in an industrial freezer setup. Raman spectroscopy represents a powerful, noninvasive, and flexibly applicable tool for protein stability monitoring during frozen storage.
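    As a deliberately simplified stand-in for the chemometric approach above (the study itself builds partial least squares models from full Raman spectra), a univariate least-squares calibration from a single band ratio to helix content can be sketched. All numbers here are hypothetical illustrations, not data from the paper.

    ```python
    # Simplified stand-in for PLS calibration: fit a univariate linear model
    # mapping one Raman band intensity ratio to alpha-helix fraction, using
    # CD-derived reference values. All data below are hypothetical.

    def fit_line(x, y):
        """Least-squares slope and intercept for y ~ m*x + c."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
        return m, my - m * mx

    # Hypothetical calibration set: band ratio vs. CD-derived helix fraction.
    ratios = [0.20, 0.35, 0.50, 0.65, 0.80]
    helix  = [0.10, 0.25, 0.40, 0.55, 0.70]

    m, c = fit_line(ratios, helix)
    predicted = m * 0.45 + c  # estimated helix fraction for an unknown sample
    ```

    A real chemometric pipeline would regress against hundreds of wavenumber channels at once, which is exactly what PLS is for; the one-variable version only shows the calibrate-then-predict structure.
    
    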

  18. NEEMO 20: Science Training, Operations, and Tool Development

    Science.gov (United States)

    Graff, T.; Miller, M.; Rodriguez-Lanetty, M.; Chappell, S.; Naids, A.; Hood, A.; Coan, D.; Abell, P.; Reagan, M.; Janoiko, B.

    2016-01-01

    The 20th mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated evaluation of operational protocols and tools designed to enable future exploration beyond low-Earth orbit. NEEMO 20 was conducted from the Aquarius habitat off the coast of Key Largo, FL in July 2015. The habitat and its surroundings provide a convincing analog for space exploration. A crew of six (comprised of astronauts, engineers, and habitat technicians) lived and worked in and around the unique underwater laboratory over a 14-day mission. Incorporated into NEEMO 20 was a diverse Science Team (ST) comprised of geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division of the Johnson Space Center (JSC), as well as marine scientists from the Department of Biological Sciences at Florida International University (FIU). This team trained the crew on the science to be conducted, defined sampling techniques and operational procedures, and planned and coordinated the science-focused Extra Vehicular Activities (EVAs). The primary science objective of NEEMO 20 was to study planetary sampling techniques and tools in partial-gravity environments under realistic mission communication time delays and operational pressures. To facilitate this objective, two types of science sites were employed: 1) geoscience sites with available rocks and regolith for testing sampling procedures and tools, and 2) marine science sites dedicated to specific research focused on assessing the photosynthetic capability of corals and their genetic connectivity between deep and shallow reefs. These marine sites and associated research objectives included deployment of handheld instrumentation, context descriptions, imaging, and sampling, and thus acted as a suitable proxy for planetary surface exploration activities. This abstract briefly summarizes the scientific training, scientific operations, and tool

  19. The development of a neonatal communication intervention tool

    Directory of Open Access Journals (Sweden)

    Esedra Strasheim

    2011-11-01

    Full Text Available Neonatal communication intervention is important in South Africa, which has an increased prevalence of infants born with risks for disabilities and where the majority of infants live in poverty. Local literature showed a dearth of information on the current service delivery and roles of speech-language therapists (SLTs and audiologists in neonatal nurseries in the South African context. SLTs have the opportunity to provide the earliest intervention, provided that intervention is well-timed in the neonatal nursery context. The aim of the research was to compile a locally relevant neonatal communication intervention instrument/tool for use by SLTs in neonatal nurseries of public hospitals. The study entailed descriptive, exploratory research. During phase 1, survey responses were received from 39 SLTs and 2 audiologists in six provinces. The data revealed that participants performed different roles in neonatal nurseries, which depended on the environment, tools, materials and instrumentation available to them. Many participants were inexperienced, but resourceful in their attempts to adapt tools/materials. Participants expressed needs for culturally appropriate and user-friendly instruments for parent guidance and staff/team training on the topic of developmental care. During phase 2, a tool for parent guidance titled Neonatal communication intervention programme for parents was compiled in English and isiZulu. The programme was piloted by three participants. Suggestions for enhancements of the programme were made, such as providing a glossary of terms, adapting the programme’s language and terminology, and providing more illustrations. SLTs and audiologists must contribute to neonatal care of high-risk infants to facilitate development and to support families.

  20. Development and Evaluation of a Riparian Buffer Mapping Tool

    Science.gov (United States)

    Milheim, Lesley E.; Claggett, Peter R.

    2008-01-01

    Land use and land cover within riparian areas greatly affect the conditions of adjacent water features. In particular, riparian forests provide many environmental benefits, including nutrient uptake, bank stabilization, stream shading, sediment trapping, aquatic and terrestrial habitat, and stream organic matter. In contrast, residential and commercial development and associated transportation infrastructure increase pollutant and nutrient loading and change the hydrologic characteristics of the landscape, thereby affecting both water quality and habitat. Restoring riparian areas is a popular and cost-effective restoration technique to improve and protect water quality. Recognizing this, the Chesapeake Executive Council committed to restoring 10,000 miles of riparian forest buffers throughout the Chesapeake Bay watershed by the year 2010. In 2006, the Chesapeake Executive Council further committed to 'using the best available...tools to identify areas where retention and expansion of forests is most needed to protect water quality'. The Chesapeake Bay watershed encompasses 64,000 square miles, including portions of six States and Washington, D.C. Therefore, the interpretation of remotely sensed imagery provides the only effective technique for comprehensively evaluating riparian forest protection and restoration opportunities throughout the watershed. Although 30-meter-resolution land use and land cover data have proved useful on a regional scale, they have not been equally successful at providing the detail required for local-scale assessment of riparian area characteristics. Use of high-resolution imagery (HRI) provides sufficient detail for local-scale assessments, although at greater cost owing to the cost of the imagery and the skill and time required to process the data. To facilitate the use of HRI for monitoring the extent of riparian forest buffers, the U.S. Forest Service and the U.S. Geological Survey Eastern Geographic Science Center funded the

  1. Ex Vivo Metrics, a preclinical tool in new drug development.

    Science.gov (United States)

    Curtis, C Gerald; Bilyard, Kevin; Stephenson, Hugo

    2008-01-23

    Among the challenges facing translational medicine today is the need for greater productivity and safety during the drug development process. To meet this need, practitioners of translational medicine are developing new technologies that can facilitate decision making during the early stages of drug discovery and clinical development. Ex Vivo Metrics is an emerging technology that addresses this need by using intact human organs ethically donated for research. After hypothermic storage, the organs are reanimated by blood perfusion, providing physiologically and biochemically stable preparations. In terms of emulating human exposure to drugs, Ex Vivo Metrics is the closest biological system available for clinical trials. Early application of this tool for evaluating drug targeting, efficacy, and toxicity could result in better selection among promising drug candidates, greater drug productivity, and increased safety.

  2. Ex Vivo Metrics™, a preclinical tool in new drug development

    Directory of Open Access Journals (Sweden)

    Bilyard Kevin

    2008-01-01

    Full Text Available Abstract Among the challenges facing translational medicine today is the need for greater productivity and safety during the drug development process. To meet this need, practitioners of translational medicine are developing new technologies that can facilitate decision making during the early stages of drug discovery and clinical development. Ex Vivo Metrics™ is an emerging technology that addresses this need by using intact human organs ethically donated for research. After hypothermic storage, the organs are reanimated by blood perfusion, providing physiologically and biochemically stable preparations. In terms of emulating human exposure to drugs, Ex Vivo Metrics is the closest biological system available for clinical trials. Early application of this tool for evaluating drug targeting, efficacy, and toxicity could result in better selection among promising drug candidates, greater drug productivity, and increased safety.

  3. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    Directory of Open Access Journals (Sweden)

    Carmen – Elena DOBROTĂ

    2015-12-01

    Full Text Available Public administration in Romania and the administrative capacity of central and local government have undergone significant progress since 2007. The development of administrative capacity involves a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for the proper implementation of an operational programme dedicated to consolidating a performing public administration was a challenging task, taking into account the types of interventions within Operational Programme Administrative Capacity Development 2007 – 2013 and the continuous changes in the economic and social environment in Romania and Europe. The aim of this article is to provide a short description of the approach used by the Managing Authority for OPACD within the performance management of the structural funds in Romania between 2008 and 2014. The paper offers a broad image of the way in which evaluations (ad hoc, intermediate and performance) were used in different stages of OP implementation as a management tool.

  4. Development of IFC based fire safety assesment tools

    DEFF Research Database (Denmark)

    Taciuc, Anca; Karlshøj, Jan; Dederichs, Anne

    2016-01-01

    Building Information Models (BIM) to evaluate the safety level in the building during the conceptual design stage. The findings show that the developed tools can be useful in the AEC industry. Integrating BIM from the conceptual design stage for analyzing the fire safety level can ensure precision in further......Due to the impact that the fire safety design has on the building's layout and on other complementary systems, such as installations, it is important to continuously evaluate the safety level in the building during the conceptual design stage. If the task is carried out too late, additional

  5. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presen......

  6. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, either at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, flow pattern and flow behavior (pressure gradient and phase fractions) prediction modeling are separated. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate design that leads to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The

  7. Recent developments in analytical techniques for characterization of ultra pure materials—An overview

    Indian Academy of Sciences (India)

    V Balaram

    2005-07-01

    With the continual decrease of geometries used in modern IC devices, the trace metal impurities of process materials and chemicals used in their manufacture are moving to increasingly lower levels, i.e. ng/g and pg/g. An attempt is made to give a brief overview of the use of different analytical techniques in the analysis of trace metal impurities in ultrapure materials, such as high-purity tellurium (7N), high-purity quartz, high-purity copper (6N), and high-purity water and mineral acids. In recent times, mass spectrometric techniques such as ICP–MS, GD–MS and HR–ICP–MS, with their characteristically high sensitivity and limited interference effects, have proved extremely useful in this field. A few examples of application studies using these techniques are outlined. The usefulness of other analytical techniques such as F–AAS, GF–AAS, XRF, ICP–AES and INAA is also described. Specific advantages of ICP–MS and HR–ICP–MS, such as high sensitivity, limited interference effects, element coverage and speed, should make them powerful analytical tools for the characterization of ultrapure materials in the future.

  8. Examination of the capability of the laser-induced breakdown spectroscopy (LIBS) technique as the emerging laser-based analytical tool for analyzing trace elements in coal

    Science.gov (United States)

    Idris, N.; Ramli, M.; Mahidin, Hedwig, R.; Lie, Z. S.; Kurniawan, K. H.

    2014-09-01

    Owing to its advantages over conventional analytical tools, the laser-induced breakdown spectroscopy (LIBS) technique is emerging as an analytical tool with considerable future promise. The technique uses the optical emission from a laser-induced plasma to analyze spectrochemically the constituents and content of the sampled object. Its capability is examined here through the analysis of trace elements in coal. Coal is a difficult sample to analyze due to its complex chemical composition and physical properties. Coal inherently contains trace elements, including heavy metals, so its mining, beneficiation and utilization pose hazards to the environment and to human beings. The LIBS apparatus used was composed of a laser system (Nd-YAG: Quanta Ray; LAB SERIES; 1,064 nm; 500 mJ; 8 ns) and an optical detector (McPherson model 2061; 1,000 mm focal length; f/8.6 Czerny-Turner) equipped with an Andor I*Star intensified CCD of 1024×256 pixels. The emitted laser was focused onto the coal sample with a +250 mm focusing lens. The plasma emission was collected by a fiber optic and sent to the spectrograph. The coal samples were taken from the Province of Aceh. Several trace elements, including the heavy metals As, Mn and Pb, were clearly observed, demonstrating the potential of the LIBS technique for analyzing trace elements in coal.

  9. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool.

    Science.gov (United States)

    Newman, Dina L; Snyder, Christopher W; Fisk, J Nick; Wright, L Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple-select format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student language and review by experts. The ability of the CDCI to discriminate between levels of understanding of the central dogma is supported by field testing (N = 54) and large-scale beta testing (N = 1733). Performance on the assessment increased with experience in biology; scores covered a broad range and showed no ceiling effect, even with senior biology majors, and pre/posttesting of a single class focused on the central dogma showed significant improvement. The multiple-select format reduces the chance of correct answers by random guessing, allows students at different levels to exhibit the extent of their knowledge, and provides deeper insight into the complexity of student thinking on each theme. To date, the CDCI is the first tool dedicated to measuring student thinking about the central dogma of molecular biology, and version 5 is ready to use.

  10. Developer Tools for Evaluating Multi-Objective Algorithms

    Science.gov (United States)

    Giuliano, Mark E.; Johnston, Mark D.

    2011-01-01

    Multi-objective algorithms for scheduling offer many advantages over the more conventional single-objective approach. By keeping user objectives separate instead of combined, more information is available to the end user to make trade-offs between competing objectives. Unlike single-objective algorithms, which produce a single solution, multi-objective algorithms produce a set of solutions, called a Pareto surface, where no solution is strictly dominated by another solution for all objectives. From the end-user perspective a Pareto surface provides a tool for reasoning about trade-offs between competing objectives. From the perspective of a software developer, multi-objective algorithms provide an additional challenge: how can you tell if one multi-objective algorithm is better than another? This paper presents formal and visual tools for evaluating multi-objective algorithms and shows how the developer's process of selecting an algorithm parallels the end-user's process of selecting a solution for execution out of the Pareto surface.
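    The dominance relation underlying a Pareto surface can be sketched in a few lines of Python. This is a generic illustration of the concept (all objectives minimized), not the paper's evaluation tools; the candidate objective vectors are hypothetical.

    ```python
    # Minimal sketch of Pareto dominance and non-dominated filtering,
    # with all objectives to be minimized.

    def dominates(a, b):
        """True if solution `a` is at least as good as `b` in every objective
        and strictly better in at least one."""
        return all(x <= y for x, y in zip(a, b)) and \
               any(x < y for x, y in zip(a, b))

    def pareto_front(solutions):
        """Return the non-dominated subset of a list of objective vectors."""
        return [s for s in solutions
                if not any(dominates(other, s) for other in solutions)]

    # Hypothetical two-objective candidates, e.g. (schedule gaps, priority violations).
    candidates = [(1, 5), (2, 3), (3, 4), (4, 1), (5, 5)]
    front = pareto_front(candidates)  # the Pareto surface of the candidate set
    ```

    Here (3, 4) is dropped because (2, 3) beats it in both objectives, and (5, 5) because (1, 5) does; the survivors each represent a different trade-off the end user can reason about.
    
    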

  11. Developing Tools and Techniques to Increase Communication Effectiveness

    Science.gov (United States)

    Hayes, Linda A.; Peterson, Doug

    1997-01-01

    The Public Affairs Office (PAO) of the Johnson Space Center (JSC) is responsible for communicating current JSC Space Program activities as well as goals and objectives to the American Public. As part of the 1996 Strategic Communications Plan, a review of PAO's current communication procedures was conducted. The 1996 Summer Faculty Fellow performed research activities to support this effort by reviewing current research concerning NASA/JSC's customers' perceptions and interests, developing communications tools which enable PAO to more effectively inform JSC customers about the Space Program, and proposing a process for developing and using consistent messages throughout PAO. Note that this research does not attempt to change or influence customer perceptions or interests but, instead, incorporates current customer interests into PAO's communication process.

  12. TENTube: A Video-based Connection Tool Supporting Competence Development

    Directory of Open Access Journals (Sweden)

    Albert A Angehrn

    2008-07-01

    Full Text Available The vast majority of knowledge management initiatives fail because they do not take sufficiently into account the emotional, psychological and social needs of individuals. Only if users see real value for themselves will they actively use and contribute their own knowledge to the system, and engage with other users. Connection dynamics can make this easier, and even enjoyable, by connecting people and bringing them closer through shared experiences such as playing a game together. A higher connectedness of people to other people, and to relevant knowledge assets, will motivate them to participate more actively and increase system usage. In this paper, we describe the design of TENTube, a video-based connection tool we are developing to support competence development. TENTube integrates rich profiling and network visualization and navigation with agent-enhanced game-like connection dynamics.

  13. Development of a clinically applicable tool for bone density assessment

    Energy Technology Data Exchange (ETDEWEB)

    Sun, Y. [Katholieke Universiteit Leuven, Oral Imaging Center, Faculty of Medicine, Leuven (Belgium); St John's Hospital, Department of Oral and Maxillofacial Surgery, Genk (Belgium); Dobbelaer, B. de; Suetens, P. [Katholieke Universiteit Leuven, Medical Image Computing (PSI), Faculty of Engineering, Leuven (Belgium); Nackaerts, O.; Yan, B.; Jacobs, R. [Katholieke Universiteit Leuven, Oral Imaging Center, Faculty of Medicine, Leuven (Belgium); Loubele, M. [Katholieke Universiteit Leuven, Oral Imaging Center, Faculty of Medicine, Leuven (Belgium); Katholieke Universiteit Leuven, Medical Image Computing (PSI), Faculty of Engineering, Leuven (Belgium); Politis, C.; Vrielinck, L. [St John's Hospital, Department of Oral and Maxillofacial Surgery, Genk (Belgium); Schepers, S. [St John's Hospital, Department of Oral and Maxillofacial Surgery, Genk (Belgium); University of Gent, Department of Oral and Maxillofacial Surgery, Faculty of Medicine, Gent (Belgium); Lambrichts, I. [University of Hasselt, Department of Morphology, Diepenbeek (Belgium); Horner, K.; Devlin, H. [University of Manchester, School of Dentistry, Manchester (United Kingdom)

    2009-03-15

    To assess the accuracy and reliability of new software for radiodensitometric evaluations. A densitometric tool developed by MevisLab® was used in conjunction with intraoral radiographs of the premolar region in both in vivo and laboratory settings. An aluminum step wedge was utilized for comparison of grey values. After computer-aided segmentation, the interproximal bone between the premolars was assessed in order to determine the mean grey value intensity of this region and convert it to a thickness in aluminum. Evaluation of the tool was performed using bone mineral density (BMD) values derived from decalcified human bone specimens as a reference standard. In vivo BMD data were collected from 35 patients as determined with dual X-ray absorptiometry (DXA). The intra- and interobserver reliability of this method was assessed by Bland and Altman plots to determine the precision of the tool. In the laboratory study, the threshold value for detection of bone loss was 6.5%. The densitometric data (mm Al eq.) were highly correlated with the jaw bone BMD, as determined using dual X-ray absorptiometry (r=0.96). For the in vivo study, the correlations between the mm Al equivalent of the average upper and lower jaw and the lumbar spine BMD, total hip BMD and femoral neck BMD were 0.489, 0.537 and 0.467, respectively (P<0.05). For the intraobserver reliability, a Bland and Altman plot showed that the mean difference ±1.96 SD was within ±0.15 mm Al eq., with a mean difference smaller than 0.003 mm Al eq. For the interobserver reliability, the mean difference ±1.96 SD was within ±0.11 mm Al eq., with a mean difference of 0.008 mm Al eq. A densitometric software tool has been developed that is reliable for bone density assessment. It now requires further investigation to evaluate its accuracy and clinical applicability in large-scale studies. (orig.)
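    The step-wedge conversion described above, from a region's mean grey value to an aluminum-equivalent thickness, can be sketched as linear interpolation between calibrated wedge steps. The wedge calibration values below are hypothetical, not data from the study.

    ```python
    # Sketch of a step-wedge grey-value-to-thickness conversion: interpolate a
    # region's mean grey value between calibrated aluminum steps. The wedge
    # calibration pairs below are hypothetical illustrations.

    def grey_to_mm_al(grey, wedge):
        """Convert a mean grey value to mm aluminum equivalent.
        `wedge` is a list of (mean_grey, thickness_mm) calibration pairs."""
        pts = sorted(wedge)
        if grey <= pts[0][0]:    # clamp below the thinnest step
            return pts[0][1]
        if grey >= pts[-1][0]:   # clamp above the thickest step
            return pts[-1][1]
        for (g0, t0), (g1, t1) in zip(pts, pts[1:]):
            if g0 <= grey <= g1:
                return t0 + (t1 - t0) * (grey - g0) / (g1 - g0)

    # Hypothetical calibration: mean grey value measured on each aluminum step.
    wedge = [(40.0, 1.0), (80.0, 2.0), (120.0, 3.0), (160.0, 4.0)]
    thickness = grey_to_mm_al(100.0, wedge)  # interpolated mm Al equivalent
    ```

    In practice the calibration pairs would be re-measured from the wedge in each radiograph, so the conversion is robust to exposure differences between images.
    
    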

  14. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed and ...

  15. Drama: an appropriate tool in development support communication.

    Science.gov (United States)

    Moyo, F F

    1997-01-01

    Because it supports progress, drama embodies development and, thus, can be used to support development communication. In fact, drama is the most appropriate medium for effecting change for development because it 1) involves interpersonal communication; 2) broadens the meaning of development; 3) challenges assumptions, demands accountability, suggests remedies, and evaluates the totality of performance; 4) exploits the politics of possibility; 5) is inherently dialogue; 6) allows the target community to participate during the exposition, the conflict, and the resolution; 7) is easy to assimilate; and 8) provides access to any medium. Drama is especially appropriate in South Africa because children carry on a tradition of play-acting, South Africans exhibit politeness to strangers that makes them passively aggressive in resisting change, it allows the temporary removal of cultural barriers among members of households, and it resembles the oral tradition that was pervasive in the culture. Drama can impart greater legitimacy to topics originating from the community or fuse a totally new concept with local culture. It can be used as a tool to criticize political mismanagement, comment on social problems, question culture, debate religious matters, examine economic society, assist educational programs, assist health efforts, raise people's awareness about conservation, and help spread technological advancements. Drama allows more to be done than said and allows communities to be involved in development efforts.

  16. Crosscutting Development- EVA Tools and Geology Sample Acquisition

    Science.gov (United States)

    2011-01-01

    Exploration to all destinations has at one time or another involved the acquisition and return of samples and context data. Gathered at the summit of the highest mountain, the floor of the deepest sea, or the ice of a polar surface, samples and their value (both scientific and symbolic) have been a mainstay of Earthly exploration. In manned spaceflight exploration, the gathering of samples and their contextual information has continued. With the extension of collecting activities to spaceflight destinations comes the need for geology tools and equipment uniquely designed for use by suited crew members in radically different environments from conventional field geology. Beginning with the first Apollo Lunar Surface Extravehicular Activity (EVA), EVA Geology Tools were successfully used to enable the exploration and scientific sample gathering objectives of the lunar crew members. These early designs were a step in the evolution of Field Geology equipment, and the evolution continues today. Contemporary efforts seek to build upon and extend the knowledge gained in not only the Apollo program but a wealth of terrestrial field geology methods and hardware that have continued to evolve since the last lunar surface EVA. This paper is presented with intentional focus on documenting the continuing evolution and growing body of knowledge for both engineering and science team members seeking to further the development of EVA Geology. Recent engineering development and field testing efforts of EVA Geology equipment for surface EVA applications are presented, including the 2010 Desert Research and Technology Studies (Desert RATs) field trial. An executive summary of findings will also be presented, detailing efforts recommended for exotic sample acquisition and pre-return curation development regardless of planetary or microgravity destination.

  17. Developing Tools for Computation of Basin Topographic Parameters in GIS

    Science.gov (United States)

    Gökgöz, T.; Yayla, Y.; Yaman, M. B.; Güvenç, H.; Kaya, S.

    2016-10-01

    Although water use has been increasing day by day with rapid population growth, urbanization and industrialization around the world, the potential of usable water resources remains stable. At the same time, the expansion of agricultural activities, industrialization, urbanization, global warming and climate change put great pressure on current water resources. Therefore, the management of water resources is one of today's most significant problems to be solved, and 'Integrated Basin Management' has gained importance worldwide as a way of reducing environmental problems by using current water resources more efficiently. In order to achieve integrated basin management, basin boundaries must be determined with sufficient accuracy and precision and encoded systematically. In various analyses performed on a basin, topographic parameters are also needed, such as shape factor, bifurcation ratio, drainage frequency, drainage density, length of the main flow path, harmonic slope, average slope, time of concentration, hypsometric curve and maximum elevation difference. Nowadays, basin boundaries are obtained from digital elevation models in geographical information systems; however, tools for computing these topographic parameters are not available. In this study, programs were written in the Python programming language for the afore-mentioned topographic parameters, and each was turned into a geographical information system tool. A significant contribution has thus been made by filling this gap in geographical information systems for the topographic parameters needed in almost every hydrological analysis.
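    Most of the parameters listed become simple arithmetic once the basin geometry has been extracted from the DEM. As an illustration only (not the authors' code), a few of them in Python, using standard hydrology definitions and invented basin values:

    ```python
    def form_factor(area_km2, basin_length_km):
        """Horton's form (shape) factor: basin area divided by basin length squared."""
        return area_km2 / basin_length_km ** 2

    def drainage_density(total_stream_length_km, area_km2):
        """Total channel length per unit basin area (km per km^2)."""
        return total_stream_length_km / area_km2

    def bifurcation_ratio(stream_counts):
        """Mean ratio of stream counts between successive Strahler orders."""
        ratios = [n / m for n, m in zip(stream_counts, stream_counts[1:])]
        return sum(ratios) / len(ratios)

    # Invented basin: 120 km^2 area, 18 km long, 96 km of channels, orders 1-4
    print(round(form_factor(120.0, 18.0), 3))        # 0.37
    print(drainage_density(96.0, 120.0))             # 0.8
    print(round(bifurcation_ratio([28, 7, 2, 1]), 2))
    ```

    The remaining parameters (time of concentration, hypsometric curve, harmonic slope) additionally need the flow-path profile from the DEM, which is what the GIS tools described above extract.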

  18. Development of molecular tools to monitor conjugative transfer in rhizobia.

    Science.gov (United States)

    Tejerizo, Gonzalo Torres; Bañuelos, Luis Alfredo; Cervantes, Laura; Gaytán, Paul; Pistorio, Mariano; Romero, David; Brom, Susana

    2015-10-01

    Evolution of bacterial populations has been extensively driven by horizontal transfer events. Conjugative plasmid transfer is considered the principal contributor to gene exchange among bacteria. Several conjugative and mobilizable plasmids have been identified in rhizobia, and two major molecular mechanisms that regulate their transfer have been described under laboratory conditions. Knowledge of rhizobial plasmid transfer regulation in natural environments is very poor. In this work we developed molecular tools to easily monitor conjugative plasmid transfer in rhizobia by flow cytometry (FC) or microscopy. Twenty-four cassettes were constructed by combining a variety of promoters, fluorescent proteins and antibiotic resistance genes, and used to tag the plasmids and chromosome of donor strains. We were able to detect plasmid transfer after conversion of non-fluorescent recipients into fluorescent transconjugants. FC was optimized to count donor, recipient and transconjugant strains in order to determine conjugative transfer frequencies. Results were similar whether determined by FC or by viable counts. Our constructions also allowed the visualization of transconjugants in crosses performed on bean roots. The tools presented here may also be used for other purposes, such as analysis of transcriptional fusions or single-cell tagging. Application of the system will allow the survey of how different environmental conditions or other regulators modulate plasmid transfer in rhizobia.
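    The transfer frequency derived from the FC counts is just a ratio of event counts; a minimal sketch with invented numbers (frequencies are also commonly reported per donor rather than per recipient):

    ```python
    def transfer_frequency(transconjugants, recipients):
        """Conjugative transfer frequency, expressed per recipient cell."""
        return transconjugants / recipients

    # Invented flow-cytometry event counts from a mating experiment:
    donors = 48_210          # e.g. cells tagged on the chromosome
    recipients = 51_440      # non-fluorescent events
    transconjugants = 257    # recipients that acquired the tagged plasmid

    print(f"{transfer_frequency(transconjugants, recipients):.1e} per recipient")
    ```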

  19. Development and application of camelid molecular cytogenetic tools.

    Science.gov (United States)

    Avila, Felipe; Das, Pranab J; Kutzler, Michelle; Owens, Elaine; Perelman, Polina; Rubes, Jiri; Hornak, Miroslav; Johnson, Warren E; Raudsepp, Terje

    2014-01-01

    Cytogenetic chromosome maps offer molecular tools for genome analysis and clinical cytogenetics and are of particular importance for species with difficult karyotypes, such as camelids (2n = 74). Building on the available human-camel zoo-fluorescence in situ hybridization (FISH) data, we developed the first cytogenetic map for the alpaca (Lama pacos, LPA) genome by isolating and identifying 151 alpaca bacterial artificial chromosome (BAC) clones corresponding to 44 specific genes. The genes were mapped by FISH to 31 alpaca autosomes and the sex chromosomes; 11 chromosomes had 2 markers, which were ordered by dual-color FISH. The STS gene mapped to Xpter/Ypter, demarcating the pseudoautosomal region, whereas no markers were assigned to chromosomes 14, 21, 22, 28, and 36. The chromosome-specific markers were applied in clinical cytogenetics to identify LPA20, the major histocompatibility complex (MHC)-carrying chromosome, as a part of an autosomal translocation in a sterile male llama (Lama glama, LGL; 2n = 73,XY). FISH with LPAX BACs and LPA36 paints, as well as comparative genomic hybridization, were also used to investigate the origin of the minute chromosome, an abnormally small LPA36 in infertile female alpacas. This collection of cytogenetically mapped markers represents a new tool for camelid clinical cytogenetics and has applications for the improvement of the alpaca genome map and sequence assembly.

  20. Development of environmental tools for anopheline larval control

    Directory of Open Access Journals (Sweden)

    Mweresa Collins K

    2011-07-01

    Full Text Available Abstract Background Malaria mosquitoes spend a considerable part of their life in the aquatic stage, rendering them vulnerable to interventions directed at aquatic habitats. Recent successes of mosquito larval control have been reported using environmental and biological tools. Here, we report the effects of shading by plants and of biological control agents on the development and survival of anopheline and culicine mosquito larvae in man-made natural habitats in western Kenya. Trials consisted of environmental manipulation using locally available plants, the introduction of predatory fish, and/or the use of Bacillus thuringiensis var. israelensis (Bti) in various combinations. Results Man-made habitats provided with shade from different crop species produced significantly fewer larvae than those without shade, especially for the malaria vector Anopheles gambiae. Larval control of the African malaria mosquito An. gambiae and other mosquito species was more effective in habitats where both predatory fish and Bti were applied than where the two biological control agents were administered independently. Conclusion We conclude that the integration of environmental management techniques using shade-providing plants with predatory fish and/or Bti is an effective and sustainable tool for the control of malaria and other mosquito-borne disease vectors.

  1. Developing tools and strategies for communicating climate change

    Science.gov (United States)

    Bader, D.; Yam, E. M.; Perkins, L.

    2011-12-01

    Research indicates that the public views zoos and aquariums as reliable and trusted sources for information on conservation. Additionally, visiting zoos and aquariums helps people reconsider their connections to conservation issues and solutions. The Aquarium of the Pacific, an AZA-accredited institution that serves the most ethnically diverse population of all aquariums in the nation, is using exhibit space, technology, public programming, and staff professional development to present a model for how aquariums can promote climate literacy. Our newest galleries and programs are designed to immerse our visitors in experiences that connect our live animal collection to larger themes on ocean change. The Aquarium is supporting our new programming with a multifaceted staff professional development that exposes our interpretive staff to current climate science and researchers as well as current social science on public perception of climate science. Our staff also leads workshops for scientists; these sessions allow us to examine learning theory and develop tools to communicate science and controversial subjects effectively. Through our partnerships in the science, social science, and informal science education communities, we are working to innovate and develop best practices in climate communication.

  2. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    Science.gov (United States)

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra-red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs which can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect a different arrangement of hydrogen bonding in the CBZ-NIC co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a process analytical technology (PAT) tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals.
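    The calibration/validation workflow with RMSEC and RMSEP can be sketched with scikit-learn's `PLSRegression`; the synthetic "spectra" below (two invented pure-component spectra mixed linearly plus noise) stand in for real NIR measurements:

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    rng = np.random.default_rng(0)
    pure = rng.random((2, 200))                        # invented pure-component spectra
    conc = rng.random((60, 2))                         # mixture concentrations
    X = conc @ pure + rng.normal(0, 0.01, (60, 200))   # noisy mixture "spectra"
    y = conc[:, 0]                                     # target: co-crystal concentration

    X_cal, X_val = X[:40], X[40:]                      # calibration / validation split
    y_cal, y_val = y[:40], y[40:]

    pls = PLSRegression(n_components=2).fit(X_cal, y_cal)
    rmsec = float(np.sqrt(np.mean((pls.predict(X_cal).ravel() - y_cal) ** 2)))
    rmsep = float(np.sqrt(np.mean((pls.predict(X_val).ravel() - y_val) ** 2)))
    print(f"RMSEC={rmsec:.4f}  RMSEP={rmsep:.4f}")
    ```

    With only two latent variables the model recovers this linear mixing almost exactly; real NIR spectra would additionally need preprocessing (e.g. SNV or derivatives) and a cross-validated choice of component count.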

  3. Developing a Grid-based search and categorization tool

    CERN Document Server

    Haya, Glenn; Vigen, Jens

    2003-01-01

    Grid technology has the potential to improve the accessibility of digital libraries. The participants in Project GRACE (Grid Search And Categorization Engine) are in the process of developing a search engine that will allow users to search through heterogeneous resources stored in geographically distributed digital collections. What differentiates this project from current search tools is that GRACE will be run on the European Data Grid, a large distributed network, and will not have a single centralized index as current web search engines do. In some cases, the distributed approach offers advantages over the centralized approach since it is more scalable, can be used on otherwise inaccessible material, and can provide advanced search options customized for each data source.

  4. Development of a simple estimation tool for LMFBR construction cost

    Energy Technology Data Exchange (ETDEWEB)

    Yoshida, Kazuo; Kinoshita, Izumi [Central Research Inst. of Electric Power Industry, Komae, Tokyo (Japan). Komae Research Lab

    1999-05-01

    A simple tool for estimating the construction costs of liquid-metal-cooled fast breeder reactors (LMFBRs), 'Simple Cost', was developed in this study. Simple Cost is based on a new estimation formula that reduces the amount of design data required to estimate construction costs. Consequently, Simple Cost can be used to estimate the construction costs of innovative LMFBR concepts for which detailed design has not been carried out. The results of test calculations show that Simple Cost provides cost estimates equivalent to those obtained with conventional methods within the range of plant power from 325 to 1500 MWe. Sensitivity analyses for typical design parameters were conducted using Simple Cost. The effects of four major parameters - reactor vessel diameter, core outlet temperature, sodium handling area and number of secondary loops - on the construction costs of LMFBRs were evaluated quantitatively. The results show that reducing the sodium handling area is particularly effective in reducing construction costs. (author)

  5. Development and Validation of a Hypersonic Vehicle Design Tool Based On Waverider Design Technique

    Science.gov (United States)

    Dasque, Nastassja

    Methodologies for a tool capable of assisting design initiatives for practical waverider-based hypersonic vehicles were developed and validated. The design space for vehicle surfaces was formed using an algorithm that coupled directional derivatives with the conservation laws to determine a flow field defined by a set of post-shock streamlines. The design space is used to construct an ideal waverider with a sharp leading edge. A blunting method was developed to modify the ideal shapes to a more practical geometry for real-world application. Empirical and analytical relations were then systematically applied to the resulting geometries to determine local pressure, skin friction and heat flux. For the ideal portion of the geometry, flat-plate relations for compressible flow were applied. For the blunted portion of the geometry, modified Newtonian theory, Fay-Riddell theory and the modified Reynolds analogy were applied. The design and analysis methods were validated using analytical solutions as well as empirical and numerical data. The streamline solution for the flow-field generation technique was compared with a Taylor-Maccoll solution and showed very good agreement. The relationship between the local Stanton number and skin friction coefficient with local Reynolds number along the ideal portion of the body showed good agreement with experimental data. In addition, an automated grid generation routine was formulated to construct a structured mesh around resulting geometries in preparation for computational fluid dynamics (CFD) analysis. The overall analysis of the waverider body using the tool was then compared to CFD studies. The CFD flow field showed very good agreement with the design space, and the distribution of the surface properties was close to the CFD results, although the agreement there was not as strong.
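    For the sharp (ideal) portion, the flat-plate and Reynolds-analogy relations referenced above reduce to one-liners. A sketch of the classical laminar incompressible forms (the tool itself applies compressible relations not reproduced here; Pr = 0.71 for air is an assumed value):

    ```python
    import math

    def cf_laminar(re_x):
        """Blasius local skin-friction coefficient for a laminar flat plate."""
        return 0.664 / math.sqrt(re_x)

    def stanton_modified_reynolds(cf, pr=0.71):
        """Modified Reynolds analogy: St = Cf / (2 Pr^(2/3))."""
        return cf / (2.0 * pr ** (2.0 / 3.0))

    re_x = 1.0e6                       # local Reynolds number along the body
    cf = cf_laminar(re_x)
    st = stanton_modified_reynolds(cf)
    print(f"Cf={cf:.2e}  St={st:.2e}")
    ```

    The St-Cf-Re relationship this encodes is the one the authors compared against experimental data along the ideal portion of the body.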

  6. Development and psychometric testing of the clinical networks engagement tool

    Science.gov (United States)

    Hecker, Kent G.; Rabatach, Leora; Noseworthy, Tom W.; White, Deborah E.

    2017-01-01

    Background Clinical networks are being used widely to facilitate large system transformation in healthcare by engaging stakeholders throughout the health system. However, there are no available instruments that measure engagement in these networks. Methods The study purpose was to develop and assess the measurement properties of a multiprofessional tool to measure engagement in clinical network initiatives. Based on components of the International Association for Public Participation Spectrum and expert panel review, we developed 40 items for testing. The draft instrument was distributed to 1,668 network stakeholders across different governance levels (leaders, members, support, frontline stakeholders) in 9 strategic clinical networks in Alberta (January to July 2014). With data from 424 completed surveys (25.4% response rate), descriptive statistics, exploratory and confirmatory factor analysis, Pearson correlations, linear regression, multivariate analysis, and Cronbach's alpha were used to assess the reliability and validity of the scores. Results Sixteen items were retained in the instrument. Exploratory factor analysis indicated a four-factor solution that accounted for 85.7% of the total variance in engagement with clinical network initiatives: global engagement, inform (provided with information), involve (worked together to address concerns), and empower (given final decision-making authority). All subscales demonstrated acceptable reliability (Cronbach's alpha 0.87 to 0.99). Both the confirmatory factor analysis and the regression analysis confirmed that inform, involve, and empower were all significant predictors of global engagement, with involve as the strongest predictor. Leaders had higher mean scores than frontline stakeholders, while members and support staff did not differ in mean scores. Conclusions This study provided foundational evidence for the use of this tool for assessing engagement in clinical networks.
Further work is necessary to evaluate
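    Cronbach's alpha, used above for subscale reliability, has a closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). A minimal sketch on an invented score matrix:

    ```python
    import numpy as np

    def cronbach_alpha(scores):
        """Cronbach's alpha for an (n_respondents, k_items) score matrix."""
        scores = np.asarray(scores, dtype=float)
        k = scores.shape[1]
        item_vars = scores.var(axis=0, ddof=1).sum()   # sum of per-item sample variances
        total_var = scores.sum(axis=1).var(ddof=1)     # variance of each respondent's total
        return (k / (k - 1)) * (1.0 - item_vars / total_var)

    # Invented 5-point responses from 6 respondents to a 4-item subscale:
    survey = [[4, 5, 4, 5],
              [3, 3, 4, 3],
              [5, 5, 5, 4],
              [2, 2, 3, 2],
              [4, 4, 4, 5],
              [3, 2, 3, 3]]
    print(round(cronbach_alpha(survey), 3))
    ```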

  7. Developing an integration tool for soil contamination assessment

    Science.gov (United States)

    Anaya-Romero, Maria; Zingg, Felix; Pérez-Álvarez, José Miguel; Madejón, Paula; Kotb Abd-Elmabod, Sameh

    2015-04-01

    In recent decades, huge soil areas have been negatively influenced or altered in multiple ways. Soils and, consequently, underground water have been contaminated by the accumulation of contaminants from agricultural activities (fertilizers and pesticides), industrial activities (harmful material dumping, sludge, fly ash) and urban activities (hydrocarbons, metals from vehicle traffic, urban waste dumping). In the framework of the RECARE project, local partners across Europe are focusing on a wide range of soil threats, such as soil contamination, and aiming to develop effective prevention, remediation and restoration measures by designing and applying targeted land management strategies (van Lynden et al., 2013). In this context, the Guadiamar Green Corridor (Southern Spain) was used as a case study, with the aim of obtaining soil data and new information in order to assess soil contamination. The main threat in the Guadiamar valley is soil contamination after a mine spill that occurred in April 1998. About four hm3 of acid water and two hm3 of mud, rich in heavy metals, were released into the Agrio and Guadiamar rivers, affecting more than 4,600 ha of agricultural and pasture land. The main trace elements contaminating soil and water were As, Cd, Cu, Pb, Tl and Zn. The objective of the present research is to develop informatics tools that integrate soil databases, models and interactive platforms for soil contamination assessment. Preliminary results were obtained concerning the compilation of harmonized databases including geographical, hydro-meteorological, soil and socio-economic variables, based on spatial analysis and stakeholder consultation. Further research will involve modelling and upscaling to the European level, in order to obtain a scientific-technical predictive tool for the assessment of soil contamination.

  8. The Web Interface Template System (WITS), a software developer's tool

    Energy Technology Data Exchange (ETDEWEB)

    Lauer, L.J.; Lynam, M.; Muniz, T. [Sandia National Labs., Albuquerque, NM (United States). Financial Systems Dept.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  9. Development of Wet-Etching Tools for Precision Optical Figuring

    Energy Technology Data Exchange (ETDEWEB)

    Rushford, M C; Dixit, S N; Hyde, R; Britten, J A; Nissen, J; Aasen, M; Toeppen, J; Hoaglan, C; Nelson, C; Summers, L; Thomas, I

    2004-01-27

    This FY03 final report on wet-etch figuring describes a 2D thermal tool whose purpose is to flatten sheets of glass (0.3 to 1 mm thick) faster, and thus more cheaply, than conventional sub-aperture tools. An array of resistors on a circuit board was used to heat acid over thick spots in the glass optical path difference (OPD), and at times this heating extended over most of the glass aperture. Where the acid is heated on the glass, it dissolves the glass faster. A self-referencing interferometer measured the glass thickness, its design taking advantage of the parallel nature and thinness of these glass sheets. This measurement is used in closed-loop control of the heating patterns of the circuit board, and hence of the glass and acid. Only the glass and acid are moved, to keep the tool logistically simple to use in mass production. A set of four circuit boards covering an 80 x 80-cm aperture was ordered, but only one 40 x 40-cm board was assembled and tested for this report. The interferometer measurement of glass OPD was slower than needed on some glass profiles. Sometimes the interference fringes were too fine to resolve, which aliased the sign of the glass thickness profile. This also caused the phase-unwrapping code (FLYNN) to struggle and run slowly, at times taking hours for a 10-inch-square area. We did extensive work to improve the speed of this code, tried many different phase-unwrapping codes, and eventually ran FLYNN on a farm of networked computers. Most of the work reported here is therefore limited to a 10-inch-square aperture. We also researched fabricating a better interferometer lens from Plexiglas to reduce the scattered-light issues caused by the near-field scattering patterns of Fresnel lens grooves, which set the Nyquist limit. There was also a problem with the initial concept of wetting the 1737 glass on its bottom side with acid: the wetted 1737 glass developed an achromatic AR coating, spoiling the reflection needed to see glass-thickness interference fringes.
In response

  10. A Thermoelastic Hydraulic Fracture Design Tool for Geothermal Reservoir Development

    Energy Technology Data Exchange (ETDEWEB)

    Ahmad Ghassemi

    2003-06-30

    Geothermal energy is recovered by circulating water through heat exchange areas within a hot rock mass. Geothermal reservoir rock masses generally consist of igneous and metamorphic rocks that have low matrix permeability. Therefore, cracks and fractures play a significant role in extraction of geothermal energy by providing the major pathways for fluid flow and heat exchange. Thus, knowledge of conditions leading to formation of fractures and fracture networks is of paramount importance. Furthermore, in the absence of natural fractures or adequate connectivity, artificial fractures are created in the reservoir using hydraulic fracturing. At times, the practice aims to create a number of parallel fractures connecting a pair of wells. Multiple fractures are preferred because of the large size necessary when using only a single fracture. Although the basic idea is rather simple, hydraulic fracturing is a complex process involving interactions of high pressure fluid injections with a stressed hot rock mass, mechanical interaction of induced fractures with existing natural fractures, and the spatial and temporal variations of in-situ stress. As a result, it is necessary to develop tools that can be used to study these interactions as an integral part of a comprehensive approach to geothermal reservoir development, particularly enhanced geothermal systems. In response to this need we have set out to develop advanced thermo-mechanical models for design of artificial fractures and rock fracture research in geothermal reservoirs. These models consider the significant hydraulic and thermo-mechanical processes and their interaction with the in-situ stress state. Wellbore failure and fracture initiation is studied using a model that fully couples poro-mechanical and thermo-mechanical effects. The fracture propagation model is based on complex variable and regular displacement discontinuity formulations. In the complex variable approach the displacement discontinuities are

  11. The recent developments in dispersive liquid–liquid microextraction for preconcentration and determination of inorganic analytes

    OpenAIRE

    H.M. Al-Saidi; Adel A.A. Emara

    2014-01-01

    Recently, increasing interest in the use of dispersive liquid–liquid microextraction (DLLME), developed in 2006 by Rezaee, has been found in the field of separation science. DLLME is a miniaturized format of liquid–liquid extraction in which the acceptor-to-donor phase ratio is greatly reduced compared with other methods. In the present review, the combination of DLLME with different analytical techniques such as atomic absorption spectrometry (AAS), inductively coupled plasma-optical emission spectr...

  12. Development of an All-Purpose Free Photogrammetric Tool

    Science.gov (United States)

    González-Aguilera, D.; López-Fernández, L.; Rodriguez-Gonzalvez, P.; Guerrero, D.; Hernandez-Lopez, D.; Remondino, F.; Menna, F.; Nocerino, E.; Toschi, I.; Ballabeni, A.; Gaiani, M.

    2016-06-01

    Photogrammetry is currently facing some challenges and changes mainly related to automation, ubiquitous processing and variety of applications. Within an ISPRS Scientific Initiative, a team of researchers from USAL, UCLM, FBK and UNIBO has developed an open photogrammetric tool called GRAPHOS (inteGRAted PHOtogrammetric Suite). GRAPHOS allows users to obtain dense and metric 3D point clouds from terrestrial and UAV images. It combines robust photogrammetric and computer vision algorithms with the following aims: (i) increase automation, allowing users to get dense 3D point clouds through a friendly and easy-to-use interface; (ii) increase flexibility, working with any type of images, scenarios and cameras; (iii) improve quality, guaranteeing high accuracy and resolution; (iv) preserve photogrammetric reliability and repeatability. Last but not least, GRAPHOS also has an educational component, reinforced with didactic explanations of the algorithms and their performance. The developments were carried out at different levels: GUI realization, image pre-processing, photogrammetric processing with weight parameters, dataset creation and system evaluation. The paper will present in detail the developments of GRAPHOS with all its photogrammetric components and the evaluation analyses based on various image datasets. GRAPHOS is distributed for free for research and educational needs.

  13. Facilities as teaching tools: A transformative participatory professional development experience

    Science.gov (United States)

    Wilson, Eric A.

    Resource consumption continues to increase as the population grows. In order to secure a sustainable future, society must educate the next generation to become "sustainability natives." Schools play a pivotal role in educating a sustainability-literate society. However, a disconnect exists between the hidden curriculum of the built environment and the enacted curriculum. This study employs a transformative participatory professional development model to instruct teachers on how to use their school grounds as teaching tools for the purpose of helping students make explicit choices in energy consumption, materials use, and sustainable living. Incorporating a phenomenological perspective, this study considers the lived experience of two sustainability coordinators. Grounded theory provides an interpretational context for the participants' interactions with each other and the professional development process. Through a year long professional development experience - commencing with an intense, participatory two-day workshop -the participants discussed challenges they faced with integrating facilities into school curriculum and institutionalizing a culture of sustainability. Two major needs were identified in this study. For successful sustainability initiatives, a hybrid model that melds top-down and bottom-up approaches offers the requisite mix of administrative support, ground level buy-in, and excitement vis-a-vis sustainability. Second, related to this hybrid approach, K-12 sustainability coordinators ideally need administrative capabilities with access to decision making, while remaining connected to students in a meaningful way, either directly in the classroom, as a mentor, or through work with student groups and projects.

  14. The development of a two-component force dynamometer and tool control system for dynamic machine tool research

    Science.gov (United States)

    Sutherland, I. A.

    1973-01-01

    The development is presented of a tooling system that makes a controlled sinusoidal oscillation simulating a dynamic chip removal condition. It also measures the machining forces in two mutually perpendicular directions without any cross sensitivity.

  15. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  16. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    Science.gov (United States)

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion. Methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices where the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first evaluated matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction. The salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% and acceptable RSDs. The methodology proved that A. biennis is able to dissipate 94% of endosulfan and 87% of chlorpyrifos after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110% with RSDs <18%.
Linearity, recovery, precision, matrix
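    The recovery and RSD figures used throughout this validation are computed from replicate analyses of spiked samples; a sketch with invented replicate data for a 1 mg/kg spike:

    ```python
    import numpy as np

    def recovery_percent(measured, spike_level):
        """Mean measured concentration as a percentage of the spiked level."""
        return float(np.mean(measured) / spike_level * 100.0)

    def rsd_percent(measured):
        """Relative standard deviation (precision) of the replicates, in percent."""
        m = np.asarray(measured, dtype=float)
        return float(m.std(ddof=1) / m.mean() * 100.0)

    # Invented replicate results (mg/kg) for a 1 mg/kg spike:
    reps = [0.92, 0.88, 0.95, 0.90, 0.93]
    print(f"recovery={recovery_percent(reps, 1.0):.1f}%  RSD={rsd_percent(reps):.1f}%")
    ```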

  17. Iterative Development of an Online Dietary Recall Tool: INTAKE24

    Directory of Open Access Journals (Sweden)

    Emma Simpson

    2017-02-01

    Full Text Available Collecting large-scale population data on dietary intake is challenging, particularly when resources and funding are constrained. Technology offers the potential to develop novel ways of collecting large amounts of dietary information while making it easier, more convenient, intuitive, and engaging for users. INTAKE24 is an online multiple-pass 24 h dietary recall tool developed for use in national food and nutrition surveys. The development of INTAKE24 was a four-stage iterative process of user interaction and evaluation with the intended end users, aged 11–24 years. A total of 80 people aged 11–24 years took part in the evaluation, 20 at each stage. Several methods were used to elicit feedback from the users, including ‘think aloud’, ‘eye tracking’, semi-structured interviews, and a system usability scale. Each participant completed an interviewer-led recall after completing the system. Key system developments generated from the user feedback included a ‘flat’ interface, which uses only a single interface screen shared between all of the various activities (e.g., free text entry, looking up foods in the database, portion size estimation). Improvements to the text entry, search functionality, and navigation around the system were also influenced by feedback from users at each stage. The time to complete a recall using INTAKE24 almost halved from the initial prototype to the end system, while the agreement with an interviewer-led recall improved. Further developments include testing the use of INTAKE24 with older adults and translation into other languages for international use. Our future aim is to validate the system with recovery biomarkers.

  18. Iterative Development of an Online Dietary Recall Tool: INTAKE24

    Science.gov (United States)

    Simpson, Emma; Bradley, Jennifer; Poliakov, Ivan; Jackson, Dan; Olivier, Patrick; Adamson, Ashley J.; Foster, Emma

    2017-01-01

    Collecting large-scale population data on dietary intake is challenging, particularly when resources and funding are constrained. Technology offers the potential to develop novel ways of collecting large amounts of dietary information while making it easier, more convenient, intuitive, and engaging for users. INTAKE24 is an online multiple-pass 24 h dietary recall tool developed for use in national food and nutrition surveys. The development of INTAKE24 was a four-stage iterative process of user interaction and evaluation with the intended end users, 11–24 year olds. A total of 80 11–24 year olds took part in the evaluation, 20 at each stage. Several methods were used to elicit feedback from the users, including 'think aloud', 'eye tracking', semi-structured interviews, and a system usability scale. Each participant completed an interviewer-led recall after completing the system. Key system developments generated from the user feedback included a 'flat' interface, which uses only a single interface screen shared between all of the various activities (e.g., free text entry, looking up foods in the database, portion size estimation). Improvements to the text entry, search functionality, and navigation around the system were also influenced through feedback from users at each stage. The time to complete a recall using INTAKE24 almost halved from the initial prototype to the end system, while the agreement with an interviewer-led recall improved. Further developments include testing the use of INTAKE24 with older adults and translation into other languages for international use. Our future aim is to validate the system with recovery biomarkers. PMID:28208763

  19. A review on recent developments for biomolecule separation at analytical scale using microfluidic devices.

    Science.gov (United States)

    Tetala, Kishore K R; Vijayalakshmi, M A

    2016-02-04

    Microfluidic devices, with their inherent advantages such as the ability to handle 10(-9) to 10(-18) L volumes, multiplexing of microchannels, rapid analysis, and on-chip detection, are proving to be efficient systems in various fields of the life sciences. This review highlights articles published since 2010 that report the use of microfluidic devices to separate biomolecules (DNA, RNA and proteins) using chromatography principles (size, charge, hydrophobicity and affinity) along with microchip capillary electrophoresis, isotachophoresis, etc. A detailed overview of stationary phase materials and the approaches to incorporate them within the microchannels of microchips is provided, as well as a brief overview of chemical methods to immobilize ligand(s). Furthermore, we review research articles that deal with microfluidic devices as analytical tools for biomolecule (DNA, RNA and protein) separation.

  20. Analytical solution of electromagnetic field generated by induction logging tool in a fan-ring slot of drill collar while drilling

    Institute of Scientific and Technical Information of China (English)

    2009-01-01

    Based on the structural characteristics of the metal drill collar for induction logging while drilling, we give analytical formulae for the lengthwise fields Ez and Hz when the tool is located in a fan-ring shaped slot of the drill collar, using the boundary conditions of the electromagnetic field, and derive the other components of the electromagnetic field inside and outside the fan-ring slot from Ez and Hz. In the other intervals of the formation, where the drill collar is a solid cylinder, the analytical formulae of the field are derived through the method of variable coefficients. The complete analytical solutions of the field in the whole space have thus been obtained. With the help of the analytical formulae, we also give numerical examples and analyze the distribution of the electromagnetic field. From the computational results we find that the secondary scattering field Hz varies linearly with the conductivity of the stratum. This characteristic of the field is very useful for induction logging while drilling, as it can be used to measure and analyze the logging responses of the stratum conductivity. This paper establishes a theoretical foundation for studying the field distributions and for guiding the design of logging instruments.

  1. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Tulsa Fluid Flow

    2008-08-31

    The development of fields in deep waters (5000 ft and more) is a common occurrence. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil, and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). For any multiphase separation technique employed at topside, seabed or bottom-hole, it is crucial to know the inlet conditions, such as the flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. The overall objective was to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict the flow characteristics, such as flow patterns, phase distributions, and pressure gradient, encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). The project was conducted in two periods. In Period 1 (four years), gas-oil-water flow in pipes was investigated to understand the fundamental physical mechanisms describing the interaction between the gas, oil and water phases under flowing conditions, and a unified model was developed utilizing a novel modeling approach. A gas-oil-water pipe flow database including field and laboratory data was formed in Period 2 (one year). The database was utilized in model performance demonstration. Period 1 primarily consisted of the development of a unified model and software to predict the gas-oil-water flow, and experimental studies of the gas-oil-water project, including flow behavior description and

  2. Developing Anticipatory Life Cycle Assessment Tools to Support Responsible Innovation

    Science.gov (United States)

    Wender, Benjamin

    Several prominent research strategy organizations recommend applying life cycle assessment (LCA) early in the development of emerging technologies. For example, the US Environmental Protection Agency, the National Research Council, the Department of Energy, and the National Nanotechnology Initiative identify the potential for LCA to inform research and development (R&D) of photovoltaics and products containing engineered nanomaterials (ENMs). In this capacity, application of LCA to emerging technologies may contribute to the growing movement for responsible research and innovation (RRI). However, existing LCA practices are largely retrospective and ill-suited to support the objectives of RRI. For example, barriers related to data availability, rapid technology change, and isolation of environmental from technical research inhibit application of LCA to developing technologies. This dissertation focuses on development of anticipatory LCA tools that incorporate elements of technology forecasting, provide robust explorations of uncertainty, and engage diverse innovation actors in overcoming retrospective approaches to environmental assessment and improvement of emerging technologies. Chapter one contextualizes current LCA practices within the growing literature articulating RRI and identifies the optimal place in the stage gate innovation model to apply LCA. Chapter one concludes with a call to develop anticipatory LCA, building on the theory of anticipatory governance, as a series of methodological improvements that seek to align LCA practices with the objectives of RRI. Chapter two provides a framework for anticipatory LCA, identifies where research from multiple disciplines informs LCA practice, and builds off the recommendations presented in the preceding chapter.
Chapter two focuses on crystalline and thin film photovoltaics (PV) to illustrate the novel framework, in part because PV is an environmentally motivated technology undergoing extensive R&D efforts and

  3. Development and assessment of the Alberta Context Tool

    Directory of Open Access Journals (Sweden)

    Birdsell Judy M

    2009-12-01

    Full Text Available Abstract Background The context of healthcare organizations such as hospitals is increasingly accepted as having the potential to influence the use of new knowledge. However, the mechanisms by which the organizational context influences evidence-based practices are not well understood. Current measures of organizational context lack a theory-informed approach, lack construct clarity, and generally have modest psychometric properties. This paper presents the development and initial psychometric validation of the Alberta Context Tool (ACT), an eight-dimension measure of organizational context for healthcare settings. Methods Three principles guided the development of the ACT: substantive theory, brevity, and modifiability. The Promoting Action on Research Implementation in Health Services (PARiHS) framework and related literature were used to guide selection of items in the ACT. The ACT was required to be brief enough to be tolerated in busy and resource-stretched work settings and to assess concepts of organizational context that were potentially modifiable. The English version of the ACT was completed by 764 nurses (752 valid responses) working in seven Canadian pediatric care hospitals as part of its initial validation. Cronbach's alpha, exploratory factor analysis, analysis of variance, and tests of association were used to assess instrument reliability and validity. Results Factor analysis indicated a 13-factor solution (accounting for 59.26% of the variance in 'organizational context'). The composition of the factors was similar to those originally conceptualized. Cronbach's alpha for the 13 factors ranged from .54 to .91, with 4 factors performing below the commonly accepted alpha cut-off of .70. Bivariate associations between instrumental research utilization levels (which the ACT was developed to predict) and the ACT's 13 factors were statistically significant at the 5% level for 12 of the 13 factors. Each factor also showed a trend of
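The reliability statistic reported above, Cronbach's alpha, has a simple closed form: alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A minimal sketch with hypothetical 5-point item responses (not ACT data):

```python
# Cronbach's alpha for one factor. Rows = respondents, columns = items.
# The scores below are invented for illustration.
scores = [
    [4, 4, 5],
    [3, 3, 4],
    [5, 4, 5],
    [2, 3, 2],
    [4, 5, 4],
]

def variance(xs):
    """Sample variance (n - 1 denominator)."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

k = len(scores[0])                                   # number of items
item_vars = [variance([row[i] for row in scores]) for i in range(k)]
totals = [sum(row) for row in scores]                # per-respondent totals
alpha = (k / (k - 1)) * (1 - sum(item_vars) / variance(totals))
print(round(alpha, 2))
```

Values above the conventional .70 cut-off mentioned in the abstract indicate acceptable internal consistency for a factor.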

  4. Analytical Method Development and Validation of Related Substance Method for Bortezomib for Injection 3.5 mg/Vial by RP-HPLC Method

    Directory of Open Access Journals (Sweden)

    Utage M

    2013-04-01

    Full Text Available An accurate, precise, simple and economical high performance liquid chromatographic method for the related substance determination of bortezomib in its lyophilized dosage form has been developed. The method developed is a reverse phase high performance liquid chromatographic method using a Hypersil BDS C18 column (length: 150 mm, diameter: 4.6 mm, particle size: 5 μm) with a gradient programme, with acetonitrile, water and formic acid in the ratio 30:70:0.1 (v/v/v) as mobile phase A and acetonitrile, water and formic acid in the ratio 80:20:0.1 (v/v/v) as mobile phase B. The method so developed was validated in compliance with the regulatory guidelines using a well-developed analytical method validation tool comprising the analytical method validation parameters linearity, accuracy, method precision, specificity with forced degradation, system suitability, robustness, LOD, LOQ and ruggedness. The results obtained were well within the acceptance criteria.

  5. Developing a green building assessment tool for developing countries - Case of Jordan

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Hikmat H. [Department of Architecture, Jordan University of Science and Technology, PO Box 3030, Irbid 22110 (Jordan); Al Nsairat, Saba F. [Department of Research and Design, Greater Amman Municipality, Amman (Jordan)

    2009-05-15

    The purpose of this research is to contribute to a better understanding of the concept of a green building assessment tool and its role in achieving sustainable development, through developing an effective green building rating system for residential units in Jordan, in terms of the dimensions through which sustainable development tools are produced and according to the local context. Developing such a system is becoming necessary in the developing world because of considerable environmental, social and economic problems. Jordan, as one of these countries, is in need of this system, especially given its poor resources and their inefficient use. Therefore, this research studied international green building assessment tools such as LEED, CASBEE, BREEAM, GBTool, and others. It then defined new assessment items respecting the local conditions of Jordan and discussed them with 60 various stakeholders, 50% of whom were experts in sustainable development. After selection, the assessment items were weighted using the AHP method. The outcome of the research was a suggested green building assessment tool (SABA Green Building Rating System), a computer-based program that suits the Jordanian context in terms of environmental, social and economic perspectives. (author)
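The AHP weighting step mentioned above derives criterion weights from a pairwise comparison matrix. A minimal sketch using the common geometric-mean approximation of the principal eigenvector; the 3x3 matrix (environmental vs. social vs. economic) is purely illustrative and not the weighting actually used for SABA:

```python
# AHP criterion weights via the geometric-mean (row) approximation.
# A[i][j] states how much more important criterion i is than j on
# Saaty's 1-9 scale; A[j][i] is its reciprocal. Hypothetical values.
import math

A = [
    [1.0, 3.0, 5.0],   # environmental vs. (env, soc, eco)
    [1/3, 1.0, 2.0],   # social
    [1/5, 1/2, 1.0],   # economic
]

# Geometric mean of each row, then normalize so the weights sum to 1.
gms = [math.prod(row) ** (1 / len(row)) for row in A]
weights = [g / sum(gms) for g in gms]
print([round(w, 3) for w in weights])
```

In a full AHP workflow one would also compute the consistency ratio of the matrix before accepting the weights.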

  6. Analytical tools for solitons and periodic waves corresponding to phonons on Lennard-Jones lattices in helical proteins

    DEFF Research Database (Denmark)

    D'ovidio, Francesco; Bohr, Henrik; Lindgård, Per-Anker

    2005-01-01

    ...-Jones potential, the solitons can be characterized analytically, with good quantitative agreement, using formulas for a Toda potential with parameters fitted to the Lennard-Jones potential. We also discuss and show the robustness of the family of periodic solutions called cnoidal waves, corresponding to phonons... The soliton phenomena described in the simulations of alpha helices may help to explain recent x-ray experiments on long alpha helices in Rhodopsin, where a long lifetime of the vibrational modes has been observed.

  7. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    Science.gov (United States)

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  8. A Participatory Approach to Develop the Power Mobility Screening Tool and the Power Mobility Clinical Driving Assessment Tool

    Directory of Open Access Journals (Sweden)

    Deepan C. Kamaraj

    2014-01-01

    Full Text Available The electric powered wheelchair (EPW) is an indispensable assistive device that increases participation among individuals with disabilities. However, due to the lack of standardized assessment tools, developing evidence-based training protocols to improve the driving skills of EPW users has been a challenge. In this study, we adopt the principles of participatory research and employ qualitative methods to develop the Power Mobility Screening Tool (PMST) and the Power Mobility Clinical Driving Assessment (PMCDA). Qualitative data from professional experts and expert EPW users who participated in a focus group and a discussion forum were used to establish the content validity of the PMCDA and the PMST. These tools collectively could assess a user's current level of bodily function and their current EPW driving capacity. Further multicenter studies are necessary to evaluate the psychometric properties of these tests and to develop EPW driving training protocols based on these assessment tools.

  9. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing the Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As, by nature, the variability of the sampling method and the reference method is included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and tedious off-line analyses.
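The PLS calibration at the heart of such an NIR method regresses a property of interest (here, API content) on full spectra. A minimal PLS1 sketch using the classical NIPALS algorithm on synthetic spectra; the band shapes, noise level, and concentration ranges are invented for illustration and the real models were optimized and validated quite differently:

```python
# Minimal PLS1 (NIPALS) calibration on synthetic "NIR-like" spectra.
# Two overlapping Gaussian bands stand in for API and solvent signals.
import numpy as np

rng = np.random.default_rng(0)
n, p = 30, 60
api = rng.uniform(9.0, 12.0, n)        # % w/w, property to predict
meoh = rng.uniform(0.18, 1.50, n)      # interfering residual solvent
band_api = np.exp(-0.5 * ((np.arange(p) - 20) / 4.0) ** 2)
band_sol = np.exp(-0.5 * ((np.arange(p) - 45) / 6.0) ** 2)
X = (np.outer(api, band_api) + np.outer(meoh, band_sol)
     + 0.01 * rng.standard_normal((n, p)))

def pls1_fit(X, y, ncomp):
    """Fit a centered PLS1 model; returns means plus weights/loadings."""
    Xc, yc = X - X.mean(0), y - y.mean()
    W, P, Q = [], [], []
    for _ in range(ncomp):
        w = Xc.T @ yc
        w /= np.linalg.norm(w)
        t = Xc @ w                      # scores
        pl = Xc.T @ t / (t @ t)         # X loadings
        q = yc @ t / (t @ t)            # y loading
        Xc = Xc - np.outer(t, pl)       # deflate
        yc = yc - q * t
        W.append(w); P.append(pl); Q.append(q)
    return X.mean(0), y.mean(), W, P, Q

def pls1_predict(model, Xnew):
    xmean, ymean, W, P, Q = model
    Xc = Xnew - xmean
    yhat = np.full(len(Xnew), ymean)
    for w, pl, q in zip(W, P, Q):
        t = Xc @ w
        Xc = Xc - np.outer(t, pl)
        yhat = yhat + q * t
    return yhat

model = pls1_fit(X, api, ncomp=2)
rmse = float(np.sqrt(np.mean((pls1_predict(model, X) - api) ** 2)))
print(round(rmse, 3))
```

Two latent variables suffice here because the synthetic data contain exactly two sources of variation; real spectra typically need more components, chosen by cross-validation.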

  10. Towards the Development of Web-based Business intelligence Tools

    DEFF Research Database (Denmark)

    Georgiev, Lachezar; Tanev, Stoyan

    2011-01-01

    This paper focuses on using web search techniques to examine the co-creation strategies of technology-driven firms. It does not focus on the co-creation results but describes the implementation of a software tool using data mining techniques to analyze the content of firms' websites. The tool

  11. Development of precision spray forming for rapid tooling

    Energy Technology Data Exchange (ETDEWEB)

    Yang Yunfeng [VTT Technical Research Centre of Finland, POB 1000, FI-02044 VTT (Finland); Hannula, Simo-Pekka [VTT Technical Research Centre of Finland, POB 1000, FI-02044 VTT (Finland); Laboratory of Materials Science, Helsinki University of Technology, POB 6200, FI-02015 TKK (Finland)], E-mail: simo-pekka.hannula@tkk.fi

    2008-03-25

    The aim of the work is to improve the capability of the precision spray forming (PSF) rapid tooling process so that it can be extended to various applications. This work comprises the upgrading of the current spray-forming machine from single atomizer to twin atomizers, so that the capability is much improved in terms of insert size and complexity. As a result, the insert size is increased from about 200 mm to 400 mm in diameter, and the process is more reliable to make complex structures. Know-how is accumulated for making large and/or complex inserts with controllable surface and internal soundness. A process of spray forming conformal cooling channels in die inserts or other components used at elevated temperatures is also developed and various mould inserts are spray formed. In this paper the plant modification is described. It is shown that the twin atomizers are more reliable in spray forming small inserts of about 200 mm in diameter and of high complexity than the single atomizer system. Spray forming of disc type inserts up to 400 mm diameter is demonstrated. Influence of deposition temperature and substrate moving speed, as well as the treatment of the ceramic mould surface is determined and the technical measures to prevent surface defects related to large insert spray forming are specified.

  12. The interaction of syntax, semantics & pragmatics in grammars: the development of analytic tools in modern linguistics

    Directory of Open Access Journals (Sweden)

    Robert D. Van Valin Junior

    2001-02-01

    Full Text Available

    One of the primary tasks facing a grammatical theory is to capture the interaction of syntax, semantics and pragmatics in linguistic systems. This is essential if linguistic theory is to explain the communicative functions of grammatical structures in particular languages and across languages. The questions which must be answered include: what is the appropriate universally valid representation for syntactic structure?, what would be an adequate representation of crucial aspects of the semantics of propositions?, how can discourse-pragmatic information be represented in a grammatically relevant way, and, most important, how do these different representations interact with each other? In this paper answers to these questions will be given in terms of Role and Reference Grammar (Van Valin, 1993; Van Valin & La Polla, 1997).

  13. Developing the role of big data and analytics in health professional education.

    Science.gov (United States)

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action, such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets, while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  14. Analytical development and optimization of a graphene–solution interface capacitance model

    Directory of Open Access Journals (Sweden)

    Hediyeh Karimi

    2014-05-01

    Full Text Available Graphene, a new carbon material showing great potential for a range of applications because of its exceptional electronic and mechanical properties, has attracted much attention in recent years. The use of graphene in nanoscale devices plays an important role in achieving more accurate and faster devices. Although there are many experimental studies in this area, there is a lack of analytical models. Quantum capacitance, one of the important properties of field effect transistors (FETs), is our focus. The quantum capacitance of electrolyte-gated transistors (EGFETs), along with a relevant equivalent circuit, is expressed in terms of Fermi velocity, carrier density, and fundamental physical quantities. The analytical model is compared with the experimental data, and the mean absolute percentage error (MAPE) is calculated to be 11.82. In order to decrease the error, a new function of E composed of α and β parameters is suggested. In another attempt, the ant colony optimization (ACO) algorithm is implemented for the optimization and development of the analytical model to obtain a more accurate capacitance model. Based on the given results, the accuracy of the optimized model is more than 97%, which is within an acceptable range of accuracy.
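The figure of merit quoted above, the mean absolute percentage error, is simply the average of |measured - predicted| / |measured|, expressed as a percentage. A minimal sketch with invented measured/predicted pairs (not the capacitance data from the paper):

```python
# Mean absolute percentage error (MAPE). Values are illustrative only,
# standing in for measured vs. model-predicted quantum capacitance.
measured  = [2.10, 2.45, 2.80, 3.20]
predicted = [2.30, 2.30, 2.95, 3.60]

mape = 100.0 / len(measured) * sum(
    abs((m, p)[0] - p) / abs(m) for m, p in zip(measured, predicted)
)
print(round(mape, 2))
```

A lower MAPE means the analytical model tracks the experimental data more closely, which is why the paper reports the drop from 11.82 after optimization.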

  15. CRMS vegetation analytical team framework: Methods for collection, development, and use of vegetation response variables

    Science.gov (United States)

    Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.

    2011-01-01

    This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity) from published literature and from the CRMS and CWPPRA monitoring dataset is used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project. Example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once the indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The vegetation analytical team plans to summarize its results in the form of written reports and/or graphics and present these items to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.

  16. 41 CFR 102-80.120 - What analytical and empirical tools should be used to support the life safety equivalency...

    Science.gov (United States)

    2010-07-01

    ... Safety, NFPA 101A) should be used to support the life safety equivalency evaluation. If fire modeling is... empirical tools should be used to support the life safety equivalency evaluation? 102-80.120 Section 102-80...) FEDERAL MANAGEMENT REGULATION REAL PROPERTY 80-SAFETY AND ENVIRONMENTAL MANAGEMENT Accident and...

  17. Development of Tools and Techniques for Processing STORRM Flight Data

    Science.gov (United States)

    Robinson, Shane; D'Souza, Christopher

    2011-01-01

    While at JSC for the summer of 2011, I was assigned to work on the sensor test for Orion relative-navigation risk mitigation (STORRM) development test objective (DTO). The STORRM DTO was flown on board Endeavour during STS-134. The objective of the STORRM DTO is to test the visual navigation system (VNS), which will be used as the primary relative navigation sensor for the Orion spacecraft. The VNS is a flash lidar system intended to provide both line-of-sight and range information during rendezvous and proximity operations. The STORRM DTO also serves as a testbed for the high-resolution docking camera. This docking camera will be used to provide piloting cues for the crew during proximity operations. These instruments were mounted next to the trajectory control sensor (TCS) in Endeavour's payload bay. My principal objective for the summer was to generate a best estimated trajectory (BET) for Endeavour using the flight data collected by the VNS during rendezvous and the unprecedented re-rendezvous with the ISS. I processed the raw images from the VNS to produce range and bearing measurements. I then aggregated these measurements and extracted the measurements corresponding to individual reflectors. I combined the information contained in these measurements with data from Endeavour's inertial sensors using Kalman smoothing techniques to ultimately produce a BET. This work culminated in a final presentation of the results to division management. Development of this tool required that traditional linear smoothing techniques be modified in a novel fashion to permit the inclusion of non-linear measurements. This internship has greatly helped me further my career by providing exposure to real engineering projects. I have also benefited immensely from the mentorship of the engineers working on these projects. Many of the lessons I learned and experiences I had are of particular value because they can only be found in a place like JSC.
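The Kalman smoothing mentioned above combines a forward filter pass with a backward refinement pass. A minimal 1-D Rauch-Tung-Striebel (RTS) smoother sketch on a toy constant-velocity closing scenario; the dynamics, noise levels, and range data are all invented and far simpler than the nonlinear VNS/IMU fusion actually performed:

```python
# Toy RTS smoother: forward Kalman filter over noisy range measurements
# of a target closing at roughly 1 m per step, then a backward pass
# that refines each estimate with information from later measurements.
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # constant-velocity model, dt = 1
H = np.array([[1.0, 0.0]])               # we observe range only
Q = 1e-4 * np.eye(2)                     # process noise
R = np.array([[0.25]])                   # measurement noise

zs = [10.0, 9.1, 7.9, 7.1, 6.0, 4.9]     # hypothetical noisy ranges
x, P = np.array([10.0, 0.0]), np.eye(2)
filt = []                                # (x, P, x_pred, P_pred) per step
for z in zs:
    xp, Pp = F @ x, F @ P @ F.T + Q      # predict
    K = Pp @ H.T @ np.linalg.inv(H @ Pp @ H.T + R)
    x = xp + K @ (np.array([z]) - H @ xp)   # update with measurement
    P = (np.eye(2) - K @ H) @ Pp
    filt.append((x, P, xp, Pp))

# Backward (RTS) pass, from the last filtered state toward the first.
xs, Ps = filt[-1][0], filt[-1][1]
smoothed = [xs]
for (x, P, _, _), (_, _, xp1, Pp1) in zip(filt[-2::-1], filt[:0:-1]):
    C = P @ F.T @ np.linalg.inv(Pp1)     # smoother gain
    xs = x + C @ (xs - xp1)
    Ps = P + C @ (Ps - Pp1) @ C.T
    smoothed.insert(0, xs)

print([round(float(s[0]), 2) for s in smoothed])
```

The backward pass is what makes a best estimated trajectory better than real-time filtering: each state estimate uses all measurements, past and future.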

  18. Learn Xcode Tools for Mac OS X and iPhone Development

    CERN Document Server

    Piper, I

    2010-01-01

    This book will give you a thorough grounding in the principal and supporting tools and technologies that make up the Xcode Developer Tools suite. Apple has provided a comprehensive collection of developer tools, and this is the first book to examine the complete Apple programming environment for both Mac OS X and iPhone. * Comprehensive coverage of all the Xcode developer tools * Additional coverage of useful third-party development tools* Not just a survey of features, but a serious examination of the complete development process for Mac OS X and iPhone applications What you'll learn* The boo

  19. Analytic Materials

    CERN Document Server

    Milton, Graeme W

    2016-01-01

    The theory of inhomogeneous analytic materials is developed. These are materials where the coefficients entering the equations involve analytic functions. Three types of analytic materials are identified. The first two types involve an integer $p$. If $p$ takes its maximum value then we have a complete analytic material; otherwise it is an incomplete analytic material of rank $p$. For two-dimensional materials further progress can be made in the identification of analytic materials by using the well-known fact that a $90^\circ$ rotation applied to a divergence-free field in a simply connected domain yields a curl-free field, and this can then be expressed as the gradient of a potential. Other exact results for the fields in inhomogeneous media are reviewed. Also reviewed is the subject of metamaterials, as these materials provide a way of realizing desirable coefficients in the equations.

  20. Novel Applications of Lanthanoides as Analytical or Diagnostic Tools in the Life Sciences by ICP-MS-based Techniques

    Science.gov (United States)

    Müller, Larissa; Traub, Heike; Jakubowski, Norbert

    2016-11-01

    Inductively coupled plasma mass spectrometry (ICP-MS) is a well-established analytical method for multi-elemental analysis in particular for elements at trace and ultra-trace levels. It has found acceptance in various application areas during the last decade. ICP-MS is also more and more applied for detection in the life sciences. For these applications, ICP-MS excels by a high sensitivity, which is independent of the molecular structure of the analyte, a wide linear dynamic range and by excellent multi-element capabilities. Furthermore, methods based on ICP-MS offer simple quantification concepts, for which usually (liquid) standards are applied, low matrix effects compared to other conventional bioanalytical techniques, and relative limits of detection (LODs) in the low pg g-1 range and absolute LODs down to the attomol range. In this chapter, we focus on new applications where the multi-element capability of ICP-MS is used for detection of lanthanoides or rare earth elements, which are applied as elemental stains or tags of biomolecules and in particular of antibodies.

  1. Determination of thin noble metal layers using laser ablation ICP-MS: An analytical tool for NobleChem technology

    Energy Technology Data Exchange (ETDEWEB)

    Guenther-Leopold, Ines; Hellwig, Christian [Paul Scherrer Institut, PSI, CH-5232 Villigen (Switzerland); Guillong, Marcel [ETH Zurich HG, Raemistrasse 101, 8092 Zurich (Switzerland)

    2006-07-01

    understand the transport, (re-)distribution and deposition behaviour of the noble metals in the reactor coolant circuit and to control the SCC mitigation effectiveness of NobleChem, analytical methods determining the local Pt and Rh concentration on highly radioactive deposition and crack/crevice monitors or component/fuel surfaces are required. Laser Ablation Inductively Coupled Plasma Mass Spectrometry (LA-ICP-MS) is a promising method for this purpose. LA-ICP-MS has gained increasing popularity over the last decade for the direct multi-element determination of major, minor, and trace elements in a variety of solid materials in geology, chemistry, metallurgy and biology. From the early experiments with IR lasers, development moved quickly towards the use of UV lasers; shorter wavelengths improved the laser-sample interaction, primarily for transparent samples. Several types of lasers are in use, but the most widespread LA systems are based on Nd:YAG lasers operating at the fourth harmonic at 266 nm. The technique offers the advantages of high spatial resolution, low sample-preparation needs, low limits of detection and good quantification capabilities. Much effort has been made in recent years to improve the sensitivity of the technique and to simplify the quantification. Most of the work carried out focused on the sampling, in terms of the laser wavelength, pulse duration, carrier gas and ablation cell design as significant parameters influencing the aerosol generation, its transport to the ICP and its ionisation therein. Laser ablation ICP-MS has previously been used for thin-layer and depth-profile analyses. The detection and quantification capabilities for the determination of local noble metal concentrations using LA-ICP-MS were evaluated by the analysis of austenitic stainless steel samples homogeneously coated with platinum. The paper has the following structure: Introduction; Experimental; Sample preparation; Instrumentation; Results; Conclusion. To summarize, in a

  2. The Role of Dafachronic Acid Signaling in Development and Longevity in Caenorhabditis elegans: Digging Deeper Using Cutting-Edge Analytical Chemistry

    Directory of Open Access Journals (Sweden)

    Hugo Aguilaniu

    2016-02-01

    Steroid hormones regulate physiological processes in species ranging from plants to humans. A wide range of steroid hormones exist, and their contributions to processes such as growth, reproduction, development, and aging are almost always complex. Understanding the biosynthetic pathways that generate steroid hormones and the signaling pathways that mediate their effects is thus of fundamental importance. In this work, we review recent advances in (i) the biological role of steroid hormones in the roundworm Caenorhabditis elegans and (ii) the development of novel methods to facilitate the detection and identification of these molecules. Our current understanding of steroid signaling in this simple organism serves to illustrate the challenges we face moving forward. First, it seems clear that we have not yet identified all of the enzymes responsible for steroid biosynthesis and/or degradation. Second, perturbation of steroid signaling affects a wide range of phenotypes, and subtly different steroid molecules can have distinct effects. Finally, steroid hormone levels are critically important, and minute variations in quantity can profoundly impact a phenotype. Thus, it is imperative that we develop innovative analytical tools and combine them with cutting-edge approaches such as comprehensive and highly selective liquid chromatography coupled to mass spectrometry (LC-MS)-based methods or newer methods such as supercritical fluid chromatography coupled to mass spectrometry (SFC-MS) if we are to obtain a better understanding of the biological functions of steroid signaling.

  3. Doing social media analytics

    Directory of Open Access Journals (Sweden)

    Phillip Brooker

    2016-07-01

    In the few years since the advent of 'Big Data' research, social media analytics has begun to accumulate studies drawing on social media as a resource and tool for research work. Yet, there has been relatively little attention paid to the development of methodologies for handling this kind of data. The few works that exist in this area often reflect upon the implications of 'grand' social science methodological concepts for new social media research (i.e. they focus on general issues such as sampling, data validity, ethics, etc.). By contrast, we advance an abductively oriented methodological suite designed to explore the construction of phenomena played out through social media. To do this, we use a software tool, Chorus, to illustrate a visual analytic approach to data. Informed by visual analytic principles, we posit a two-by-two methodological model of social media analytics, combining two data collection strategies with two analytic modes. We go on to demonstrate each of these four approaches 'in action', to help clarify how and why they might be used to address various research questions.

  4. The development of a standard for a power plant analytical chemistry quality management system

    Energy Technology Data Exchange (ETDEWEB)

    Meils, D.E. [Scientech, LLC, Dunedin, FL (United States); Mastroianni, J.A. [Scientech Information Services, Oshawa, ON (Canada)

    2008-04-15

    This paper reports on the changes that have taken place since 2004 in the development of a Standard that defines those objectives that must be met in order for a power plant laboratory to demonstrate it operates a technically competent quality management system and is capable of producing technically competent results. The Standard for a Power Plant Analytical Chemistry Quality Management System was produced by the Power Plant Chemistry QA/QC Advisory Group and includes those practices required to meet the stated objectives. (orig.)

  5. WP3 Prototype development for operational planning tool

    DEFF Research Database (Denmark)

    Kristoffersen, Trine; Meibom, Peter; Apfelbeck, J.

    of electricity load and wind power production, and to cover forced outages of power plants and transmission lines. Work has been carried out to include load uncertainty and forced outages in the two main components of the Wilmar Planning tool namely the Scenario Tree Tool and the Joint Market Model. This work...... is documented in chapter 1 and 2. The inclusion of load uncertainty and forced outages in the Scenario Tree Tool enables calculation of the demand for reserve power depending on the forecast horizon. The algorithm is given in Section 3.1. The design of a modified version of the Joint Market Model enabling....... Further, the methodology to identify extreme events on the basis of the existing tools is described. Within the SUPWIND consortium there has been an interest in using the Joint Market Model to model smaller parts of a power system but with more detailed representation of the transmission and distribution...

  6. Implementation of a LOG File Analysis Tool for ATM Cash Dispenser Mechanism Actions

    Institute of Scientific and Technical Information of China (English)

    何惠英; 李纪红; 俞妍; 沈虹

    2013-01-01

    To reduce the skill demands on general maintenance personnel for bank self-service teller machines, a Chinese-language parsing tool for ATM dispenser-mechanism log files was written in Python. By analyzing roughly 500 GB of production data from nearly 20,000 machines with this tool, it can be concluded that the method described here reports a machine's operating performance quickly, accurately, and in detail, and that machine failures can be diagnosed and resolved on that basis.

  7. The Development and Demonstration of The Metric Assessment Tool

    Science.gov (United States)

    1993-09-01

    motivate continuous improvement and likewise quality. Attributes of Meaningful Metrics Section Overview. The importance of metrics cannot be overstated...some of the attributes of meaningful measures discussed earlier in this chapter. The Metrics Handbook. This guide is utilized by a variety of Air...Metric Assessment Tool. 3-8 Metric Selection. The metric assessment tool was designed to apply to any type of metric. Two criteria were established for

  8. Developing the Next Generation of Tools for Simulating Galaxy Outflows

    Science.gov (United States)

    Scannapieco, Evan

    Outflows are observed in starbursting galaxies of all masses and at all cosmological epochs. They play a key role throughout the history of the Universe: shaping the galaxy mass-metallicity relation, drastically affecting the content and number density of dwarf galaxies, and transforming the chemical composition of the intergalactic medium. Yet a complete model of galaxy outflows has proven elusive, as it requires both a better understanding of the evolution of the turbulent, multiphase gas in and around starbursting galaxies, and better tools to reproduce this evolution in galaxy-scale simulations. Here we propose to conduct a detailed series of numerical simulations designed to help develop such next-generation tools for the simulation of galaxy outflows. The program will consist of three types of direct numerical simulations, each of which will be targeted to allow galaxy-scale simulations to more accurately model key microphysical processes and their observational consequences. Our first set of simulations will be targeted at better modeling the starbursting interstellar medium (ISM) from which galaxy outflows are driven. The surface densities in starbursting galaxies are much larger than those in the Milky Way, resulting in larger gravitational accelerations and random velocities exceeding 30 or even 100 km/s. Under these conditions, the thermal stability of the ISM is changed dramatically, due to the sharp peak in gas cooling efficiency at approximately 200,000 K. Our simulations will carefully quantify the key ways in which this medium differs from the local ISM, and the consequences of these differences for when, where, and how outflows are driven. A second set of simulations will be targeted at better modeling the observed properties of rapidly cooling, highly turbulent gas. Because gas cooling in and around starbursts is extremely efficient, turbulent motions are often supersonic, which leads to a distribution of ionization states that is vastly different than

  9. Developing Tools to Assess European Trace Gas Trends

    Science.gov (United States)

    Wilson, Rebecca; Fleming, Zoe; Henne, Stephan; Monks, Paul

    2010-05-01

    The GEOmon (Global Earth Observation and MONitoring) project has produced a harmonised data set of trace gases from thirty ground-based measurement stations belonging to a number of regional, national and European air quality networks (e.g. EMEP, GAW). A variety of tools have been developed in R to evaluate European trace gas trends as a method to assess data quality and the effectiveness of European emission legislation. Long-term O3, NO2 and CO have been characterised at all sites using lowess regression. Additionally, O3 was deseasonalised and linear trends were fitted to and quantified for monthly means and 5th and 95th percentiles (to illustrate changes in mean, background and peak concentrations respectively). Twenty-four of these sites have data between 1996 and 2005 (inclusive). Analysis of these sites for that time period provides an easily comparable characterisation of continental-scale O3 trends. However, few sites have statistically significant trends during this limited analysis period. The RETRO monthly NOx emission fluxes at the GEOmon harmonised data sites were plotted from 1985-2000. The introduction of catalytic converters in Europe in 1985 and subsequent EU legislation in 1993 (requiring catalytic converters in all new petrol cars sold) corresponds to a decrease in NOx emissions throughout the 1990s for the majority of sites. It is noted that the rate of reduction in NOx emissions decreases from the mid-1990s to 2000 for fifteen locations. This may account for the less pronounced, and reduced statistical significance of, O3 trends during the 1996-2005 period. Although the spatial distribution of European O3 trends for 1996-2005 is inconclusive for the present GEOmon harmonised dataset, the expansion to more European sites may lead to a more detailed characterisation.
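    The trend-fitting step above (linear fits to monthly means and to the 5th and 95th percentiles) can be sketched generically. The tools in the paper were written in R; the function below is an illustrative Python reconstruction, and the grouping of observations into per-month arrays is an assumption of the example:

```python
import numpy as np

def percentile_trends(monthly, months_per_year=12):
    """Fit linear trends (slope per year) to the monthly mean, 5th and
    95th percentile series of a trace-gas record.

    `monthly` is a list of 1-D arrays, one array of observations per month."""
    t = np.arange(len(monthly)) / months_per_year     # time axis in years
    series = {
        "mean": np.array([np.mean(m) for m in monthly]),
        "p05":  np.array([np.percentile(m, 5) for m in monthly]),
        "p95":  np.array([np.percentile(m, 95) for m in monthly]),
    }
    # polyfit degree 1 returns [slope, intercept]; keep the slope
    return {name: np.polyfit(t, y, 1)[0] for name, y in series.items()}
```

    Fitting the 5th and 95th percentiles separately is what lets background and peak concentrations be trended independently of the mean.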

  10. Development of a multi-residue analytical method for TBBP-A and PBDEs in various biological matrices using unique reduced size sample

    Energy Technology Data Exchange (ETDEWEB)

    Andre, F.; Cariou, R.; Antignac, J.P.; Le Bizec, B. [Ecole Nationale Veterinaire de Nantes (FR). Laboratoire d' Etudes des Residus et Contaminants dans les Aliments (LABERCA); Debrauwer, L.; Zalko, D. [Institut National de Recherches Agronomiques (INRA), 31-Toulouse (France). UMR 1089 Xenobiotiques

    2004-09-15

    The impact of brominated flame retardants on the environment and their potential risk to animal and human health is a present-day concern for the scientific community. Numerous studies related to the detection of tetrabromobisphenol A (TBBP-A) and polybrominated diphenyl ethers (PBDEs) have been developed over the last few years; they were mainly based on GC-ECD, GC-NCI-MS or GC-EI-HRMS, and recently GC-EI-MS/MS. The sample treatment is usually derived from the analytical methods used for dioxins, but recently some authors have proposed the use of solid-phase extraction (SPE) cartridges. In this study, a new analytical strategy is presented for the multi-residue analysis of TBBP-A and PBDEs from a single reduced-size sample. The main objective of this analytical development is its application to background exposure assessment of French population groups to brominated flame retardants, for which, to our knowledge, no data exist. A second objective is to provide an efficient analytical tool to study the transfer of these contaminants through the environment to living organisms, including degradation reactions and metabolic biotransformations.

  11. Emergy Evaluation: A Tool for the Assessment of Sustainability in Project Development

    Directory of Open Access Journals (Sweden)

    Natalia Andrea Cano Londoño

    2014-02-01

    This contribution describes the basic principles of the emergy accounting method as a technique for quantitative assessment of sustainability in developing projects. Emergy determines the amount of energy used directly and indirectly to generate resources, services and products. In addition, this analytical method integrates economics, ecology and thermodynamics, which allows a comprehensive assessment of economic, social and environmental impacts in a theoretical way. Currently, emergy evaluation is considered one of the life-cycle sustainability assessment methods in areas such as ecological economics and environmental engineering. Furthermore, the entire methodological path for the application and development of the emergy accounting method is described, and several case studies in which emergy provided the key data for sustainability decision-making are analyzed and discussed. This work further provides the practical connection between theoretical definitions of emergy, data needs, and mathematical definitions of indicators for emergy sustainability assessment. With this contribution, emergy assessment can be proposed as a tool for the development of sustainable projects.
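    The indicator arithmetic behind an emergy assessment can be made concrete with the textbook ratios from the emergy literature (after Odum, and Brown and Ulgiati). The function below is a generic sketch, not the paper's own accounting, and the three-way split of inputs is an assumption of the example:

```python
def emergy_indicators(R, N, F):
    """Common emergy sustainability indicators.

    R: renewable local emergy input, N: non-renewable local input,
    F: purchased (feedback) input, all in solar emjoules (seJ)."""
    Y = R + N + F            # total emergy yield
    eyr = Y / F              # Emergy Yield Ratio: yield per unit purchased
    elr = (N + F) / R        # Environmental Loading Ratio: pressure on local environment
    esi = eyr / elr          # Emergy Sustainability Index
    return {"EYR": eyr, "ELR": elr, "ESI": esi}
```

    A higher EYR and a lower ELR, and hence a higher ESI, indicate a process that delivers more yield with less environmental loading.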

  12. Conceptual air sparging decision tool in support of the development of an air sparging optimization decision tool

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-09-01

    The enclosed document describes a conceptual decision tool (hereinafter, Tool) for determining the applicability of, and for optimizing, air sparging systems. The Tool was developed by a multi-disciplinary team of internationally recognized experts in air sparging technology, led by a group of project and task managers at Parsons Engineering Science, Inc. (Parsons ES). The team included Mr. Douglas Downey and Dr. Robert Hinchee of Parsons ES, Dr. Paul Johnson of Arizona State University, Dr. Richard Johnson of Oregon Graduate Institute, and Mr. Michael Marley of Envirogen, Inc. User Community Panel Review was coordinated by Dr. Robert Siegrist of Colorado School of Mines (also of Oak Ridge National Laboratory) and Dr. Thomas Brouns of Battelle/Pacific Northwest Laboratory. The Tool is intended to provide guidance to field practitioners and environmental managers for evaluating the applicability and optimization of air sparging as a remedial action technique.

  13. Nanopeptamers for the development of small-analyte lateral flow tests with a positive readout.

    Science.gov (United States)

    Vanrell, Lucía; Gonzalez-Techera, Andrés; Hammock, Bruce D; Gonzalez-Sapienza, Gualberto

    2013-01-15

    There is a great demand for rapid tests that can be used on-site for the detection of small analytes, such as pesticides, persistent organic pollutants, explosives, toxins, medicinal and abused drugs, hormones, etc. Dipsticks and lateral flow devices, which are simple and provide a visual readout, may be the answer, but the available technology for these compounds requires a competitive format that loses sensitivity and produces readings inversely proportional to the analyte concentration, which is counterintuitive and may lead to potential misinterpretation of the result. In this work, protein-multipeptide constructs composed of anti-immunocomplex peptides selected from phage libraries and streptavidin/avidin as core protein were used for direct detection of small compounds in a noncompetitive two-site immunoassay format that performs with increased sensitivity and positive readout. These constructs that we termed "nanopeptamers" allow the development of rapid point-of-use tests with a positive visual end point of easy interpretation. As proof of concept, lateral flow assays for the herbicides molinate and clomazone were developed and their performance was characterized with field samples.

  14. The recent developments in dispersive liquid–liquid microextraction for preconcentration and determination of inorganic analytes

    Directory of Open Access Journals (Sweden)

    H.M. Al-Saidi

    2014-12-01

    Recently, increasing interest in the use of dispersive liquid–liquid microextraction (DLLME), developed in 2006 by Rezaee, has arisen in the field of separation science. DLLME is a miniaturized format of liquid–liquid extraction in which the acceptor-to-donor phase ratio is greatly reduced compared with other methods. In the present review, the combination of DLLME with different analytical techniques such as atomic absorption spectrometry (AAS), inductively coupled plasma-optical emission spectrometry (ICP-OES), gas chromatography (GC), and high-performance liquid chromatography (HPLC) for preconcentration and determination of inorganic analytes in different types of samples is discussed. Recent developments in DLLME, e.g., displacement-DLLME, the use of an auxiliary solvent for adjusting the density of the extraction mixture, and the application of ionic liquid-based DLLME to the determination of inorganic species even in the presence of a high salt content, are presented. Finally, a comparison of DLLME with the other liquid-phase microextraction approaches and the limitations of this technique are provided.
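    The figures of merit usually reported for a preconcentration technique like DLLME, the enrichment factor and the extraction recovery, follow from simple mass-balance arithmetic. A hedged sketch, with variable names and units chosen purely for illustration:

```python
def dllme_figures(c0_aq, v_aq_mL, c_sed, v_sed_uL):
    """Enrichment factor and extraction recovery for a DLLME experiment.

    c0_aq: initial analyte concentration in the aqueous sample,
    c_sed: concentration found in the sedimented extractant phase
    (same units as c0_aq); volumes in mL and microlitres."""
    ef = c_sed / c0_aq                                          # enrichment factor
    # recovery (%): analyte mass in the sedimented phase over mass in the sample
    recovery = 100.0 * (c_sed * v_sed_uL * 1e-3) / (c0_aq * v_aq_mL)
    return ef, recovery
```

    The large aqueous-to-extractant volume ratio is what makes high enrichment factors attainable even at quantitative recovery.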

  15. A Methodology to Develop Design Support Tools for Stand-alone Photovoltaic Systems in Developing Countries

    Directory of Open Access Journals (Sweden)

    Stefano Mandelli

    2014-08-01

    As pointed out in several analyses, Stand-Alone Photovoltaic systems may be a relevant option for rural electrification in Developing Countries. In this context, Micro and Small Enterprises which supply customized Stand-Alone Photovoltaic systems play a pivotal role in the last-mile distribution of this technology. Nevertheless, a number of issues limit the development of these enterprises, also curbing potential spin-off benefits. A common business bottleneck is the lack of technical skills, since usually few people have the expertise to design systems and formulate estimates for customers. The long-term solution to this issue implies the implementation of a capacity-building process, but this solution rarely matches the time-to-market urgency of local enterprises. We therefore propose in this study a simple but general methodology which can be used to set up Design Support Tools for Micro and Small Enterprises that supply Stand-Alone Photovoltaic systems in rural areas of Developing Countries. After a brief review of the techniques and commercial software available to design the targeted technology, we describe the methodology, highlighting the structure, the sizing equations and the main features that should be considered in developing a Design Support Tool. We then apply the methodology to set up a tool for use in Uganda and compare the results with two commercial codes (NSolVx and HOMER). The results show that the implemented Design Support Tool produces correct system designs and presents some advantages for dissemination in rural areas. Indeed, it supports the user in providing the input data, selecting the main system components and delivering estimates to customers.
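    The kind of sizing equations such a Design Support Tool rests on can be illustrated with common first-cut rules of thumb. The function below is a generic sketch under assumed system-efficiency and depth-of-discharge figures, not the tool developed in the paper:

```python
def size_sapv(daily_load_Wh, psh, days_autonomy, v_sys=24.0,
              eta_system=0.75, dod=0.5):
    """First-cut sizing of a stand-alone PV system.

    daily_load_Wh: daily energy demand; psh: peak sun hours at the site;
    days_autonomy: days the battery must carry the load without sun;
    eta_system and dod are assumed overall efficiency and allowed
    battery depth of discharge."""
    pv_Wp = daily_load_Wh / (psh * eta_system)               # array peak power (Wp)
    batt_Ah = daily_load_Wh * days_autonomy / (dod * v_sys)  # battery capacity (Ah)
    return pv_Wp, batt_Ah
```

    For example, a 1500 Wh/day load at 5 peak sun hours with two days of autonomy sizes to a 400 Wp array and a 250 Ah bank at 24 V under these assumptions; a full tool would add component selection and cost estimates on top of such equations.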

  16. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1998-03-01

    Lilith is a general purpose framework, written in Java, that provides a highly scalable distribution of user code across a heterogeneous computing platform. By creation of suitable user code, the Lilith framework can be used for tool development. The scalable performance provided by Lilith is crucial to the development of effective tools for large distributed systems. Furthermore, since Lilith handles the details of code distribution and communication, the user code need focus primarily on the tool functionality, thus, greatly decreasing the time required for tool development. In this paper, the authors concentrate on the use of the Lilith framework to develop scalable tools. The authors review the functionality of Lilith and introduce a typical tool capitalizing on the features of the framework. They present new Objects directly involved with tool creation. They explain details of development and illustrate with an example. They present timing results demonstrating scalability.

  17. Towards a Reference Architecture to Provision Tools as a Service for Global Software Development

    DEFF Research Database (Denmark)

    Chauhan, Aufeef; Babar, Muhammad Ali

    2014-01-01

    Organizations involved in Global Software Development (GSD) face challenges in terms of having access to an appropriate set of tools for performing distributed engineering and development activities, integration between heterogeneous desktop and web-based tools, management of artifacts developed...... and maintained over distant locations using different kinds of tools, traceability among artifacts, and access to artifacts and data of a sensitive nature. These challenges pose additional constraints on specific projects and reduce the possibility to carry out their engineering and development in globally...

  18. Development of analytical methods for multiplex bio-assay with inductively coupled plasma mass spectrometry.

    Science.gov (United States)

    Ornatsky, Olga I; Kinach, Robert; Bandura, Dmitry R; Lou, Xudong; Tanner, Scott D; Baranov, Vladimir I; Nitz, Mark; Winnik, Mitchell A

    2008-01-01

    Advances in the development of highly multiplexed bio-analytical assays with inductively coupled plasma mass spectrometry (ICP-MS) detection are discussed. Use of novel reagents specifically designed for immunological methods utilizing elemental analysis is presented. The major steps of method development, including selection of elements for tags, validation of tagged reagents, and examples of multiplexed assays, are considered in detail. The paper further describes experimental protocols for elemental tagging of antibodies, immunostaining of live and fixed human leukemia cells, and preparation of samples for ICP-MS analysis. Quantitative analysis of surface antigens on model cell lines using a cocktail of seven lanthanide labeled antibodies demonstrated high specificity and concordance with conventional immunophenotyping.

  19. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    Energy Technology Data Exchange (ETDEWEB)

    Canto, Fabrice; Couston, Laurent; Magnaldo, Alastair [CEA-Valrho DEN/DRCP/SCPS/LCAM BP17171 30207 Bagnols/Ceze cedex (France); Broquin, Jean-Emmanuel [IMEP/ENSERG 23 rue des Martyrs BP257 38016 Grenoble (France); Signoret, Philippe [UM2/IES UMR 5214. Place Eugene Bataillon 34095 Montpellier cedex5 (France)

    2008-07-01

    Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. The CEA already worked on TLS twenty years ago, but technological limitations impeded progress. Today, the need for sensitive analytical methods that use very low sample volumes (for example, for traces of Np in the COEX{sup TM} process), together with the drive to reduce nuclear waste, encourages us to revisit this method thanks to the improvement of optoelectronic technologies. One can also imagine coupling TLS with micro-fluidic technologies, significantly decreasing the cost of experiments. Generally, two laser beams are used for TLS: one for the selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be coupled in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many measurement schemes have also been studied to detect the thermal lens signal: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurements have been theoretically evaluated. For single-photodiode detection (z-scan), the limit of detection is calculated to be near 5*10{sup -6} mol*L{sup -1} for Np(IV) in dodecane. (authors)

  20. e-Health readiness assessment tools for healthcare institutions in developing countries.

    Science.gov (United States)

    Khoja, Shariq; Scott, Richard E; Casebeer, Ann L; Mohsin, M; Ishaq, A F M; Gilani, Salman

    2007-08-01

    e-Health Readiness refers to the preparedness of healthcare institutions or communities for the anticipated change brought by programs related to Information and Communications Technology (ICT). This paper presents e-Health Readiness assessment tools developed for healthcare institutions in developing countries. The objectives of the overall study were to develop e-health readiness assessment tools for public and private healthcare institutions in developing countries, and to test these tools in Pakistan. Tools were developed using participatory action research to capture partners' opinions, reviewing existing tools, and developing a conceptual framework based on available literature on the determinants of access to e-health. Separate tools were developed for managers and for healthcare providers to assess e-health readiness within their institutions. The tools for managers and healthcare providers contained 54 and 50 items, respectively. Each tool contained four categories of readiness. The items in each category were distributed into sections, which either represented a determinant of access to e-health, or an important aspect of planning. The conceptual framework, and the validity and reliability testing of these tools are presented in separate papers. e-Health readiness assessment tools for healthcare providers and managers have been developed for healthcare institutions in developing countries.
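    The aggregation of a multi-category readiness questionnaire like the one described above can be sketched in a few lines. The categories and scores below are illustrative placeholders, not the published 54- and 50-item instruments:

```python
def readiness_scores(responses):
    """Aggregate a readiness questionnaire.

    `responses` maps a category name to a list of item scores
    (e.g. a 1-5 Likert scale). Returns the per-category mean and
    an overall mean across categories."""
    per_cat = {cat: sum(items) / len(items)
               for cat, items in responses.items()}
    overall = sum(per_cat.values()) / len(per_cat)
    return per_cat, overall
```

    Reporting per-category means alongside the overall score lets an institution see which determinant of e-health access (e.g. infrastructure vs. attitudes) is holding its readiness back.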

  1. CyberCIEGE Scenario Development Tool User’s Guide

    Science.gov (United States)

    2010-04-01

    Tutorial .............. 12 Scenario...basic structure of the tool. Then follow the tutorial within this guide to learn some of the mechanics of using the SDT. SDT Layout. Reusable...housing the computer and walk off with the entire computer. Or, if the asset's attacker motive is based on integrity, the attacker might hack into

  2. INL Review of Fueling Machine Inspection Tool Development Proposal

    Energy Technology Data Exchange (ETDEWEB)

    Griffith, George [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-03-01

    A review of a technical proposal for James Fischer Nuclear. The document describes an inspection tool to examine the graphite moderator in an AGR reactor: an optical system to look at the graphite blocks for cracks. INL reviewed the document for its technical value.

  3. Developing a Quantitative Tool for Sustainability Assessment of HEIs

    Science.gov (United States)

    Waheed, Bushra; Khan, Faisal I.; Veitch, Brian

    2011-01-01

    Purpose: Implementation of a sustainability paradigm demands new choices and innovative ways of thinking. The main objective of this paper is to provide a meaningful sustainability assessment tool for making informed decisions, applied here to higher education institutions (HEIs). Design/methodology/approach: The objective is achieved by…

  4. Development of environmental tools for anopheline larval control

    NARCIS (Netherlands)

    Imbahale, S.S.; Mweresa, C.K.; Takken, W.; Mukabana, W.R.

    2011-01-01

    Background: Malaria mosquitoes spend a considerable part of their life in the aquatic stage, rendering them vulnerable to interventions directed to aquatic habitats. Recent successes of mosquito larval control have been reported using environmental and biological tools. Here, we report the effects o

  5. Cognitive and Social Constructivism: Developing Tools for an Effective Classroom

    Science.gov (United States)

    Powell, Katherine C.; Kalina, Cody J.

    2009-01-01

    An effective classroom, where teachers and students are communicating optimally, is dependent on using constructivist strategies, tools and practices. There are two major types of constructivism in the classroom: (1) Cognitive or individual constructivism depending on Piaget's theory, and (2) Social constructivism depending on Vygotsky's theory.…

  6. The development of a partnering assessment tool for projects

    NARCIS (Netherlands)

    Holkers, A.; Voordijk, J.T.; Greenwood, D.

    2008-01-01

    Many firms in the construction industry claim to be working in a ‘partnering’ or even in an ‘integrated’ way. It is, however, very difficult to verify these claims with the tools currently available. The purpose of this study was to collect and refine existing work on integrative and collaborative w

  7. Model Analytical Development for Physical, Chemical, and Biological Characterization of Momordica charantia Vegetable Drug

    Directory of Open Access Journals (Sweden)

    Deysiane Oliveira Brandão

    2016-01-01

    Full Text Available Momordica charantia is a species cultivated throughout the world and widely used in folk medicine; its medicinal benefits are well documented, especially its pharmacological properties, including antimicrobial activities. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper developed a methodological model to evaluate the integrity of the vegetable drug M. charantia in different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits in different particle sizes was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance (1H NMR), in addition to the determination of antimicrobial activity. The techniques differentiated the samples by their different particle surface areas. DTA and TG were used to assess thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia presented antimicrobial activity.

  8. Development of Analytical Approach to Evaluate (DiffServ-MIPv6 Scheme

    Directory of Open Access Journals (Sweden)

    Loay F. Hussien

    2014-03-01

    Full Text Available The aspiration of Mobile IPv6 is to provide uninterrupted network connectivity while the mobile node moves between different access points or domains. Nonetheless, like the traditional Internet Protocol (IP), it does not provide QoS guarantees to its users; it merely provides Best-Effort (BE) service to all applications, regardless of their requirements. Future wireless networks will be based on IPv6 to provide services to mobile Internet users. Hence, one of the main requirements of next-generation IP-based networks is providing QoS for real-time traffic transported through MIPv6 networks. This study presents an analytical analysis of the previously proposed scheme (DiffServ-MIPv6), which applies the DiffServ platform to a Mobile IPv6 network in order to meet the needs of both guaranteed QoS and mobility in communication. The analytical evaluation is developed to assess the performance of the proposed scheme (DiffServ-MIPv6) against the standard MIPv6 protocol in terms of signaling cost. The signaling cost is measured against two factors: session-to-mobility ratio and packet arrival rate.
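A toy model of this kind of signaling-cost evaluation can be sketched in a few lines of Python. The cost constants and the functional form below are hypothetical assumptions for illustration, not the equations from the DiffServ-MIPv6 paper:

```python
# Illustrative sketch only: a generic signaling-cost model of the kind used
# in such analytical evaluations. All constants are hypothetical placeholders.

def signaling_cost(smr, packet_rate, mobility_rate=1.0,
                   binding_update_cost=10.0, per_packet_cost=0.2):
    """Total signaling cost per unit time.

    smr: session-to-mobility ratio (session arrival rate / mobility rate)
    packet_rate: packet arrival rate within a session
    """
    session_rate = smr * mobility_rate
    # Mobility signaling: binding updates triggered by each handover.
    mobility_cost = mobility_rate * binding_update_cost
    # Data-plane overhead: per-packet tunneling/marking cost.
    delivery_cost = session_rate * packet_rate * per_packet_cost
    return mobility_cost + delivery_cost

# In this toy model, cost grows linearly with both factors the paper varies:
# session-to-mobility ratio and packet arrival rate.
for smr in (0.5, 1.0, 2.0):
    print(smr, signaling_cost(smr, packet_rate=5.0))
```

Plotting such a cost function for two schemes against SMR and packet arrival rate is the shape of comparison the abstract describes.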

  9. Model Analytical Development for Physical, Chemical, and Biological Characterization of Momordica charantia Vegetable Drug

    Science.gov (United States)

    Guimarães, Geovani Pereira; Santos, Ravely Lucena; Júnior, Fernando José de Lima Ramos; da Silva, Karla Monik Alves; de Souza, Fabio Santos

    2016-01-01

    Momordica charantia is a species cultivated throughout the world and widely used in folk medicine; its medicinal benefits are well documented, especially its pharmacological properties, including antimicrobial activities. Analytical methods have been used to aid in the characterization of compounds derived from plant drug extracts and their products. This paper developed a methodological model to evaluate the integrity of the vegetable drug M. charantia in different particle sizes, using different analytical methods. M. charantia was collected in the semiarid region of Paraíba, Brazil. The herbal medicine raw material derived from the leaves and fruits in different particle sizes was analyzed using thermoanalytical techniques such as thermogravimetry (TG) and differential thermal analysis (DTA), pyrolysis coupled to gas chromatography/mass spectrometry (PYR-GC/MS), and nuclear magnetic resonance (1H NMR), in addition to the determination of antimicrobial activity. The techniques differentiated the samples by their different particle surface areas. DTA and TG were used to assess thermal and kinetic parameters, and PYR-GC/MS was used for chromatographic identification of degradation products through the pyrograms. The infusions obtained from the fruit and leaves of Momordica charantia presented antimicrobial activity. PMID:27579215

  10. Developing a MATLAB(registered)-Based Tool for Visualization and Transformation

    Science.gov (United States)

    Anderton, Blake J.

    2003-01-01

    An important step in the structural design and development of spacecraft is the experimental identification of a structure's modal characteristics, such as its natural frequencies and modes of vibration. These characteristics are vital to developing a representative model of any given structure or analyzing the range of input frequencies that can be handled by a particular structure. When setting up such a representative model of a structure, careful measurements using precision equipment (such as accelerometers and instrumented hammers) must be made on many individual points of the structure in question. The coordinate location of each data point is used to construct a wireframe geometric model of the structure. Response measurements obtained from the accelerometers are used to generate the mode shapes of the particular structure. Graphically, this is displayed as a combination of the ways a structure will ideally respond to a specified force input. Two types of models of the tested structure are often used in modal analysis: an analytic model showing expected behavior of the structure, and an experimental model showing measured results due to observed phenomena. To evaluate the results from the experimental model, a comparison of analytic and experimental results must be made between the two models. However, comparisons between these two models become difficult when the two coordinate orientations differ in a manner such that results are displayed in an unclear fashion. Such a problem proposes the need for a tool that not only communicates a graphical image of a structure's wireframe geometry based on various measurement locations (called nodes), but also allows for a transformation of the image's coordinate geometry so that one model's coordinate orientation is made to match the orientation of another model.
    Such a tool should also be designed so that it is able to construct coordinate geometry based on many different listings of node locations and is able
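The coordinate-matching step described above amounts to applying a rigid rotation to each node's coordinates. A minimal sketch (in Python rather than the tool's MATLAB, with a hypothetical node list) might look like:

```python
import math

# Hypothetical node list: (node_id, x, y, z) measurement locations on the
# wireframe model. These values are illustrative, not real test data.
nodes = [(1, 1.0, 0.0, 0.0), (2, 0.0, 1.0, 0.0), (3, 0.0, 0.0, 1.0)]

def rotate_z(points, angle_deg):
    """Rotate node coordinates about the z-axis so one model's
    orientation can be made to match another's."""
    a = math.radians(angle_deg)
    c, s = math.cos(a), math.sin(a)
    out = []
    for nid, x, y, z in points:
        # Standard 2D rotation applied in the x-y plane; z is unchanged.
        out.append((nid, c * x - s * y, s * x + c * y, z))
    return out

# Rotating by 90 degrees maps the +x axis onto +y.
aligned = rotate_z(nodes, 90.0)
```

A full tool would compose rotations about all three axes (plus a translation), but each is the same matrix-vector operation applied node by node.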

  11. The Profile Envision and Splicing Tool (PRESTO): Developing an Atmospheric Wind Analysis Tool for Space Launch Vehicles Using Python

    Science.gov (United States)

    Orcutt, John M.; Barbre, Robert E., Jr.; Brenton, James C.; Decker, Ryan K.

    2017-01-01

    Launch vehicle programs require vertically complete atmospheric profiles. Many systems at the ER make the necessary measurements, but all have different EVR, vertical coverage, and temporal coverage. The MSFC Natural Environments Branch developed a tool in Python to create a vertically complete profile from multiple inputs. Forward work: finish formal testing, acceptance testing, and end-to-end testing; formal release.
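The core splicing idea (combine sources with different vertical coverage into one complete profile) can be sketched in Python. The instrument names, coverage bands, and priority rule below are hypothetical assumptions, not PRESTO's actual algorithm:

```python
# Sketch: splice wind profiles from two instruments into one vertically
# complete profile, preferring the first source where both cover an altitude.
# Source names and sample values are illustrative placeholders.

def splice_profiles(primary, secondary):
    """primary/secondary: dicts mapping altitude (m) -> wind speed (m/s).
    Returns a single profile using primary wherever it has coverage."""
    spliced = dict(secondary)   # start with the broader-coverage source
    spliced.update(primary)     # overwrite with the preferred measurements
    return dict(sorted(spliced.items()))

balloon = {0: 3.1, 1000: 5.2, 2000: 7.8}      # low-altitude coverage
lidar = {2000: 7.5, 3000: 9.0, 4000: 11.2}    # upper-altitude coverage

profile = splice_profiles(balloon, lidar)
# Altitudes 0-4000 m are now covered; at 2000 m, where the sources overlap,
# the preferred balloon value (7.8) wins.
```

A production tool would also interpolate gaps and reconcile differing vertical resolutions, but the merge-with-priority step is the heart of the splice.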

  12. Development of the Assessment of Burden of COPD tool : an integrated tool to measure the burden of COPD

    NARCIS (Netherlands)

    Slok, Annerika H. M.; in 't Veen, Johannes C. C. M.; Chavannes, Niels H.; van der Molen, Thys; Rutten-van Molken, Maureen P. M. H.; Kerstjens, Huib A. B.; Salome, Philippe L.; Holverda, Sebastiaan; Dekhuijzen, P. N. Richard; Schuiten, Denise; Asijee, Guus M.; van Schayck, Onno C. P.

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communicatio

  13. Development of the assessment of burden of COPD tool: An integrated tool to measure the burden of COPD

    NARCIS (Netherlands)

    A.H.M. Slok (Annerika); J.C.C.M. in 't Veen (Johannes); N.H. Chavannes (Nicolas); T. van der Molen (Thys); M.P.M.H. Rutten-van Mölken (Maureen); H.A.M. Kerstjens (Huib); J. Salomé; S. Holverda (Sebastiaan); P.N.R. Dekhuijzen (Richard); D. Schuiten (Denise); G.M. Asijee (Guus); O.C.P. Schayck (Onno)

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate

  14. Development of the Assessment of Burden of COPD tool: an integrated tool to measure the burden of COPD

    NARCIS (Netherlands)

    Slok, A.H.; Veen, J.C. In 't; Chavannes, N.H.; Molen, T. van der; Molken, M.P. Rutten-van; Kerstjens, H.A.; Salome, P.L.; Holverda, S.; Dekhuijzen, P.N.R.; Schuiten, D.; Asijee, G.M.; Schayck, O.C.P. van

    2014-01-01

    In deciding on the treatment plan for patients with chronic obstructive pulmonary disease (COPD), the burden of COPD as experienced by patients should be the core focus. It is therefore important for daily practice to develop a tool that can both assess the burden of COPD and facilitate communicatio

  15. Lilith: A software framework for the rapid development of scalable tools for distributed computing

    Energy Technology Data Exchange (ETDEWEB)

    Gentile, A.C.; Evensky, D.A.; Armstrong, R.C.

    1997-12-31

    Lilith is a general purpose tool that provides a highly scalable, easy distribution of user code across a heterogeneous computing platform. By handling the details of code distribution and communication, such a framework allows for the rapid development of tools for the use and management of large distributed systems. This speed-up in development not only enables the easy creation of tools as needed but also facilitates the ultimate development of more refined, hard-coded tools as well. Lilith is written in Java, providing platform independence and further facilitating rapid tool development through object reuse and ease of development. The authors present the user-involved objects in the Lilith Distributed Object System and the Lilith User API. They present an example of tool development, illustrating the user calls, and present results demonstrating Lilith's scalability.

  16. Improving Students' Understanding of Quantum Measurement Part 2: Development of Research-based Learning Tools

    CERN Document Server

    Zhu, Guangtian

    2016-01-01

    We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials (QuILTs) and peer instruction tools to reduce students' common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students' understanding of concepts related to quantum measurement.

  17. "Development Radar": The Co-Configuration of a Tool in a Learning Network

    Science.gov (United States)

    Toiviainen, Hanna; Kerosuo, Hannele; Syrjala, Tuula

    2009-01-01

    Purpose: The paper aims to argue that new tools are needed for operating, developing and learning in work-life networks where academic and practice knowledge are intertwined in multiple levels of and in boundary-crossing across activities. At best, tools for learning are designed in a process of co-configuration, as the analysis of one tool,…

  18. Disability Rights, Gender, and Development: A Resource Tool for Action. Full Report

    Science.gov (United States)

    de Silva de Alwis, Rangita

    2008-01-01

    This resource tool builds a normative framework to examine the intersections of disability rights and gender in the human rights based approach to development. Through case studies, good practices and analyses the research tool makes recommendations and illustrates effective tools for the implementation of gender and disability sensitive laws,…

  19. Improving students’ understanding of quantum measurement. II. Development of research-based learning tools

    Directory of Open Access Journals (Sweden)

    Guangtian Zhu

    2012-04-01

    Full Text Available We describe the development and implementation of research-based learning tools such as the Quantum Interactive Learning Tutorials and peer-instruction tools to reduce students’ common difficulties with issues related to measurement in quantum mechanics. A preliminary evaluation shows that these learning tools are effective in improving students’ understanding of concepts related to quantum measurement.

  20. Conception and Development Tools for SCOrWare - Version 2.0

    OpenAIRE

    Drapeau, Stéphane; Blondelle, Gaël; Fournier, Damien; Merle, Philippe; Pantel, Marc; Belaid, Djamel; Tata, Samir; Juliot, Etienne; Dutoo, Marc

    2009-01-01

    The SCOrWare project objectives are to develop: ● A runtime platform for deploying, executing, and managing SCA-based applications. ● A set of development tools for modeling, designing, and implementing SCA-based applications. ● A set of demonstrators. This document specifies the set of tools used to design, develop, test, and deploy elements to build a distributed architecture composed of components, services, and business services. These tools are based on standards like MDA (Model Driven A...

  1. The development of the RISK tool for fall prevention.

    Science.gov (United States)

    Brians, L K; Alexander, K; Grota, P; Chen, R W; Dumas, V

    1991-01-01

    The authors tailored a 26-item risk assessment tool (RAT) for falls based on a literature review and an analysis of causative factors of falls that had occurred over a 3-month period at the Olin E. Teague VA Medical Center, an 1,100-bed acute medical-surgical, psychiatric, and extended care facility in Temple, TX. The RAT was completed by nursing staff on 10 patient units (four medical, four surgical, and two nursing home units) for all admissions during the period. A 25% sample of the completed RATs was randomly selected (n = 208). Pearson's correlation coefficient was used to identify factors that would most likely predict falls from the RATs of the randomly selected group and of the patients who fell (n = 78). Only 4 of the 26 items were statistically related to falls. Based on findings from this study, the RAT was shortened to the four items and called the RISK (Reassessment Is Safe "Kare") tool.
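The item-selection step the authors describe (correlating each assessment item with observed falls and keeping the strongly related items) can be sketched as follows. The data below are illustrative placeholders, not the study's patient records:

```python
# Sketch of the item-selection idea: correlate each binary assessment item
# with the fall outcome and retain items whose |r| clears a threshold.
import math

def pearson_r(xs, ys):
    """Pearson's correlation coefficient for two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

falls  = [1, 0, 1, 1, 0, 0, 1, 0]   # 1 = patient fell (illustrative)
item_a = [1, 0, 1, 1, 0, 0, 1, 0]   # item perfectly aligned with falls
item_b = [1, 1, 0, 0, 1, 0, 1, 0]   # item unrelated to falls

# item_a would be retained in the shortened tool; item_b would be dropped.
print(pearson_r(falls, item_a), pearson_r(falls, item_b))
```

Repeating this over all 26 items and keeping only those statistically related to falls mirrors how the RAT was reduced to the four-item RISK tool.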

  2. Reflection on the development process of a sustainability assessment tool: learning from a Flemish case

    Directory of Open Access Journals (Sweden)

    Laure Triste

    2014-09-01

    Full Text Available Adoption of sustainability assessment tools in agricultural practice is often disappointing. One of the critical success factors for adoption is the tool development process. Because scientific attention to these development processes and insights about them are rather limited, we aimed to foster the scientific debate on this topic. This was done by reflecting on the development process of a Flemish sustainability assessment tool, MOTIFS. MOTIFS was developed with the aim of becoming widely adopted by farmers and farm advisors, but this result was not achieved. Our reflection process showed success factors favoring and barriers hindering tool adoption. These were grouped into three clusters of lessons learned for sound tool development: (1) institutional embeddedness, (2) ownership, and (3) tool functions. This clustering allowed us to formulate actions for researchers on the following aspects: (1) learning from stakeholders and end users, (2) providing coaching for appropriate tool use, and (3) structuring development of different tool types and exploring spin-offs from existing tools. We hope these normative results encourage other researchers to join the debate on understanding tool development.

  3. Developing and implementing an oral care policy and assessment tool.

    LENUS (Irish Health Repository)

    Stout, Michelle

    2012-01-09

    Oral hygiene is an essential aspect of nursing care. Poor oral care results in patients experiencing pain and discomfort, puts individuals at risk of nutritional deficiency and infection, and has an adverse effect on quality of life. This article describes how an oral care policy and assessment tool were updated to ensure the implementation of evidence-based practice at one hospital in the Republic of Ireland.

  4. Development of MWL-AUC / CCD-C-AUC / SLS-AUC detectors for the analytical ultracentrifuge

    OpenAIRE

    2009-01-01

    Analytical ultracentrifugation (AUC) has made an important contribution to polymer and particle characterization since its invention by Svedberg (Svedberg and Nichols 1923; Svedberg and Pederson 1940) in 1923. In 1926, Svedberg won the Nobel Prize for his scientific work on disperse systems, including work with AUC. The first important discovery made with AUC was demonstrating the existence of macromolecules. Since that time, AUC has become an important tool for studying polymers in biophysics and b...

  5. Residue-specific radioimmunoanalysis: a novel analytical tool. Application to the C-terminus of CCK/gastrin peptides

    Energy Technology Data Exchange (ETDEWEB)

    Rehfeld, J.F. (Rigshospitalet, Copenhagen (Denmark)); Morley, J.S. (Imperial Chemical Industries Ltd., Alderley Park (UK). Pharmaceutical Div.)

    1983-02-01

    Five antisera directed against the common bioactive C-terminal tetrapeptide sequence of cholecystokinin (CCK) and gastrin were examined with respect to the significance of each residue for antibody binding. Systematic substitutions and/or derivatizations of each of the four residues showed a unique pattern for each antiserum, although they were raised against the same antigen and have the same sequence specificity. The pattern of reactivity towards the related cardioexcitatory FMRF amide peptide and its analogues confirmed the residue specificity of the antisera. While it is well known that even small covalent modifications of the antigen can influence antibody binding profoundly, the great variations in the significance of each residue among randomly selected antisera raised against the same antigen and specific for the same sequence have not been known so far. Hence, by appropriate combination of antisera, their different residue specificities can be used for detection of amino acid substitutions or modifications. Such immunochemical sequence analysis requires only femto- or picomolar amounts of peptides, which need not necessarily be purified. Thus, residue-specific immunoanalysis may be a versatile tool in studies of species differences, phylogenesis and synthesis of peptides.

  6. Information and Communication Technologies: A Tool Empowering and Developing the Horizon of the Learner

    Science.gov (United States)

    Debande, Olivier; Ottersten, Eugenia Kazamaki

    2004-01-01

    In this article, we focus on the implementation and development of ICT in the education sector, challenging and developing the traditional learning environment whilst introducing new educational tools, including e-learning. The paper investigates ICT as a tool empowering and developing learners' lifelong learning opportunities. It defines a model of…

  7. Neurophysiological analytics for all! Free open-source software tools for documenting, analyzing, visualizing, and sharing using electronic notebooks.

    Science.gov (United States)

    Rosenberg, David M; Horn, Charles C

    2016-08-01

    Neurophysiology requires an extensive workflow of information analysis routines, which often includes incompatible proprietary software, introducing limitations based on financial costs, transfer of data between platforms, and the ability to share. An ecosystem of free open-source software exists to fill these gaps, including thousands of analysis and plotting packages written in Python and R, which can be implemented in a sharable and reproducible format, such as the Jupyter electronic notebook. This tool chain can largely replace current routines by importing data, producing analyses, and generating publication-quality graphics. An electronic notebook like Jupyter allows these analyses, along with documentation of procedures, to display locally or remotely in an internet browser, which can be saved as an HTML, PDF, or other file format for sharing with team members and the scientific community. The present report illustrates these methods using data from electrophysiological recordings of the musk shrew vagus, a model system to investigate gut-brain communication, for example, in cancer chemotherapy-induced emesis. We show methods for spike sorting (including statistical validation), spike train analysis, and analysis of compound action potentials in notebooks. Raw data and code are available from notebooks in data supplements or from an executable online version, which replicates all analyses without installing software, an implementation of reproducible research. This demonstrates the promise of combining disparate analyses into one platform, along with the ease of sharing this work. In an age of diverse, high-throughput computational workflows, this methodology can increase efficiency, transparency, and the collaborative potential of neurophysiological research.
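The spike-train analysis such notebooks perform can be sketched in a few lines of plain Python. The spike times below are illustrative placeholders, not data from the vagus recordings:

```python
# Sketch: basic spike-train statistics of the sort computed in such
# notebooks. Spike times (in seconds) are illustrative placeholders.

spike_times = [0.10, 0.35, 0.52, 0.90, 1.10, 1.45, 1.80]

def firing_rate(spikes, duration_s):
    """Mean firing rate in Hz over the recording window."""
    return len(spikes) / duration_s

def interspike_intervals(spikes):
    """ISIs: differences between consecutive spike times, a standard
    first look at spike-train regularity."""
    return [b - a for a, b in zip(spikes, spikes[1:])]

rate = firing_rate(spike_times, duration_s=2.0)
isis = interspike_intervals(spike_times)
mean_isi = sum(isis) / len(isis)
```

In a notebook, the same few lines would sit next to the raw-data import and a histogram of the ISIs, so the procedure and its documentation travel together.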

  8. If I had a rich picture…: Insights into the use of "soft" methodological tools to support the development of interprofessional education.

    Science.gov (United States)

    Fougner, Marit; Habib, Laurence

    2008-10-01

    This paper describes a methodological experiment that aimed to test a small number of tools borrowed from Soft Systems Methodology. Those tools were intended to support action research for a project in interprofessional educational development. The intention with using those tools was two-fold: first, they were expected to help structure the analysis of the problem situation that the project was to address; second, they were to facilitate and document the project management process itself, by allowing for the different voices within the interprofessional project team to be heard. The paper relates how the tools functioned relatively successfully as analytical devices for the action researcher, but did not significantly contribute to further interprofessional collaboration or enhance dialogue between the action researcher and the project members. Issues of how to use the tools to support more effectively the existing dialogue across professional cultures and traditions are discussed.

  9. Using a Practical Instructional Development Process to Show That Integrating Lab and Active Learning Benefits Undergraduate Analytical Chemistry

    Science.gov (United States)

    Goacher, Robyn E.; Kline, Cynthia M.; Targus, Alexis; Vermette, Paul J.

    2017-01-01

    We describe how a practical instructional development process helped a first-year assistant professor rapidly develop, implement, and assess the impact on her Analytical Chemistry course caused by three changes: (a) moving the lab into the same semester as the lecture, (b) developing a more collaborative classroom environment, and (c) increasing…

  10. Educator Study Groups: A Professional Development Tool to Enhance Inclusion

    Science.gov (United States)

    Herner-Patnode, Leah

    2009-01-01

    Professional development can take many forms. The most effective development includes individual educators in the formation and planning process. Educator study groups are one form of professional development that allows major stakeholders in the education process the autonomy to develop individual and group goals. This often translates into an…

  11. Development of a Numerical Weather Analysis Tool for Assessing the Precooling Potential at Any Location

    Directory of Open Access Journals (Sweden)

    Dimitris Lazos

    2016-12-01

    Full Text Available Precooling a building overnight during the summer is a low-cost practice that may provide significant help in decreasing energy demand and shaving peak loads in buildings. The effectiveness of precooling depends on the weather patterns at the location; however, research in this field is predominantly focused on the building's thermal response alone. This paper proposes an analytical tool for assessing the precooling potential through simulations from real data in a numerical weather prediction platform. Three dimensionless ratios are developed, based on the meteorological analysis and the concept of degree hours, that provide an understanding of the precooling potential, utilization, and theoretical value. Simulations were carried out for five sites within the Sydney (Australia) metro area, and it was found that they have different responses to precooling, depending on their proximity to the ocean, vegetation coverage, and urban density. These effects cannot be detected when typical meteorological year data or data from weather stations at a distance from the building are used. Results from simulations in other Australian capitals suggest that buildings in continental and temperate climates have the potential to cover substantial parts of their cooling loads with precooling, assuming appropriate infrastructure is in place.
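The degree-hours concept underlying those dimensionless ratios can be sketched simply. The base temperature and the hourly readings below are illustrative assumptions, not values from the paper:

```python
# Sketch: overnight cooling degree-hours, the building block of such
# precooling-potential ratios. Base temperature and readings are
# illustrative placeholders.

def cooling_degree_hours(hourly_temps_c, base_c=18.0):
    """Sum of (base - T) over hours when the outdoor air is below the
    base temperature, i.e. hours usable for precooling."""
    return sum(max(0.0, base_c - t) for t in hourly_temps_c)

overnight = [21.0, 19.0, 17.0, 16.0, 15.5, 16.5]   # hourly readings, deg C
potential = cooling_degree_hours(overnight)
```

Dividing a quantity like this by the next day's cooling demand (also in degree hours) yields the kind of dimensionless ratio the tool uses to compare sites.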

  12. Development and pilot testing of a vitiligo screening tool.

    Science.gov (United States)

    Sheth, Vaneeta M; Gunasekera, Nicole S; Silwal, Sujeeta; Qureshi, Abrar A

    2015-01-01

    Studies aimed at understanding the pathology, genetics, and therapeutic response of vitiligo rely on asking a single question about 'physician-diagnosed' vitiligo on surveys to identify subjects for research. However, this type of self-reporting is not sufficient. Our objective was to determine if the patient-administered Vitiligo Screening Tool (VISTO) is a sensitive and specific instrument for the detection of vitiligo in an adult population. The VISTO consists of eight closed-ended questions to assess whether the survey participant has ever been diagnosed with vitiligo by a healthcare worker and uses characteristic pictures and descriptions to inquire about the subtype and extent of any skin lesions. 159 patients at the Brigham and Women's Hospital dermatology clinic with or without a diagnosis of vitiligo were recruited. A board-certified dermatologist confirmed or excluded the diagnosis of vitiligo in each subject. 147 completed questionnaires were analyzed, 47 cases and 100 controls. The pictorial question showed 97.9% sensitivity and 98% specificity for diagnosis of vitiligo. Answering "yes" to being diagnosed with vitiligo by a dermatologist and choosing one photographic representation of vitiligo showed 95.2% sensitivity and 100% specificity for diagnosis of vitiligo. We conclude that VISTO is a highly sensitive and specific, low-burden, self-administered tool for identifying vitiligo among adult English speakers. We believe this tool will provide a simple, cost-effective way to confirm vitiligo prior to enrollment in clinical trials as well as for gathering large-scale epidemiologic data in remote populations. Future work to refine the VISTO is needed prior to use in genotype-phenotype correlation studies.
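The reported figures follow from the standard confusion-matrix definitions. The counts below are chosen only to reproduce the arithmetic from the stated 47 cases and 100 controls, not taken from the study's tables:

```python
# Sketch: how sensitivity and specificity are computed from screening
# results. Counts are illustrative reconstructions, not the study's data.

def sensitivity(true_pos, false_neg):
    """Fraction of true cases the tool detects."""
    return true_pos / (true_pos + false_neg)

def specificity(true_neg, false_pos):
    """Fraction of non-cases the tool correctly rules out."""
    return true_neg / (true_neg + false_pos)

# With 47 cases, detecting 46 gives roughly the reported 97.9% sensitivity;
# with 100 controls, 98 correct negatives gives the reported 98% specificity.
sens = sensitivity(46, 1)
spec = specificity(98, 2)
```

High values on both measures are what justify using the VISTO to confirm vitiligo before enrollment without a dermatologist's exam.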

  13. Analytics for Education

    Science.gov (United States)

    MacNeill, Sheila; Campbell, Lorna M.; Hawksey, Martin

    2014-01-01

    This article presents an overview of the development and use of analytics in the context of education. Using Buckingham Shum's three levels of analytics, the authors present a critical analysis of current developments in the domain of learning analytics, and contrast the potential value of analytics research and development with real world…

  14. Digital design and communication tools for sustainable development

    Energy Technology Data Exchange (ETDEWEB)

    Totten, M.

    1995-12-31

    Within the computer and communications industry there is a strong sentiment that the speed and power of mainframe computers will be available at personal computer sizes and prices in the next few years. Coinciding with this is the expectation that large data/information/knowledge resource pools will be available online for download. This paper summarizes what is available now and what is coming in the future in computer technologies. The author then discusses the opportunities in 'green' building design for energy efficiency and conservation and the types of design tools that will be coming in the future.

  15. Vaccinia Virus: A Tool for Research and Vaccine Development

    Science.gov (United States)

    Moss, Bernard

    1991-06-01

    Vaccinia virus is no longer needed for smallpox immunization, but now serves as a useful vector for expressing genes within the cytoplasm of eukaryotic cells. As a research tool, recombinant vaccinia viruses are used to synthesize biologically active proteins and analyze structure-function relations, determine the targets of humoral- and cell-mediated immunity, and investigate the immune responses needed for protection against specific infectious diseases. When more data on safety and efficacy are available, recombinant vaccinia and related poxviruses may be candidates for live vaccines and for cancer immunotherapy.

  16. AFM, SECM and QCM as useful analytical tools in the characterization of enzyme-based bioanalytical platforms.

    Science.gov (United States)

    Casero, Elena; Vázquez, Luis; Parra-Alfambra, Ana María; Lorenzo, Encarnación

    2010-08-01

    One of the key issues to develop biosensing platforms concerns the processes involved in enzyme immobilization on surfaces. The understanding of their fundamentals is crucial to obtain stable and catalytically active protein layers for developing successful biosensing devices. In this respect, the advent and development of new characterization techniques, in particular at the submicron level, has allowed the study of these processes with high resolution, which has opened new routes to improve, and eventually control, enzyme immobilization on electrode surfaces. This review focuses on the application of Atomic Force Microscopy (AFM), Scanning Electrochemical Microscopy (SECM) and Quartz Crystal Microbalance (QCM) techniques in the characterization of the successive immobilization steps involved in the development of bioanalytical platforms. A common advantage of these techniques is their ability to provide important information without damaging the immobilized biological sample due to the possibility of performing measurements under physiological conditions close to the native environment of the specimens. A particular emphasis is placed on the application of these techniques to the characterization of the immobilization of enzymes on different modified and unmodified surfaces as well as on the study of protein interactions, which is a more recent and less current application.

  17. Phase II Fort Ord Landfill Demonstration Task 8 - Refinement of In-line Instrumental Analytical Tools to Evaluate their Operational Utility and Regulatory Acceptance

    Energy Technology Data Exchange (ETDEWEB)

    Daley, P F

    2006-04-03

The overall objective of this project is the continued development, installation, and testing of continuous water sampling and analysis technologies for application to on-site monitoring of groundwater treatment systems and remediation sites. In a previous project, an on-line analytical system (OLAS) for multistream water sampling was installed at the Fort Ord Operable Unit 2 Groundwater Treatment System, with the objective of developing a simplified analytical method for detection of Compounds of Concern at that plant, and continuous sampling of up to twelve locations in the treatment system, from raw influent waters to treated effluent. Earlier implementations of the water sampling and processing system (Analytical Sampling and Analysis Platform, ASAP; A+RT, Milpitas, CA) depended on off-line integrators that produced paper plots of chromatograms, and sent summary tables to a host computer for archiving. We developed a basic LabVIEW (National Instruments, Inc., Austin, TX) based gas chromatography control and data acquisition system that was the foundation for further development and integration with the ASAP system. Advantages of this integration include electronic archiving of all raw chromatographic data, and a flexible programming environment to support development of improved ASAP operation and automated reporting. The initial goals of integrating the preexisting LabVIEW chromatography control system with the ASAP, and demonstration of a simplified, site-specific analytical method were successfully achieved. However, although the principal objective of this system was assembly of an analytical system that would allow plant operators an up-to-the-minute view of the plant's performance, several obstacles remained. Data reduction with the base LabVIEW system was limited to peak detection and simple tabular output, patterned after commercial chromatography integrators, with compound retention times and peak areas. Preparation of calibration curves, method
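The data-reduction step mentioned above, peak detection with compound retention times and peak areas, can be pictured with a short sketch. This is an illustrative stand-in, not PFSAT, OLAS, or ASAP code; the detection threshold and the 5%-of-apex baseline cutoff are assumed values.

```python
import math

def detect_peaks(trace, times, threshold=1.0):
    """Return (retention_time, area) for each peak rising above `threshold`.

    A peak is a local maximum; its window extends left and right until the
    trace falls below 5% of the apex, and the area is integrated by the
    trapezoidal rule over that window.
    """
    peaks = []
    i = 1
    while i < len(trace) - 1:
        if trace[i] > threshold and trace[i] >= trace[i - 1] and trace[i] > trace[i + 1]:
            left = i
            while left > 0 and trace[left - 1] > 0.05 * trace[i]:
                left -= 1
            right = i
            while right < len(trace) - 1 and trace[right + 1] > 0.05 * trace[i]:
                right += 1
            # Trapezoidal integration across the peak window.
            area = sum(0.5 * (trace[k] + trace[k + 1]) * (times[k + 1] - times[k])
                       for k in range(left, right))
            peaks.append((times[i], area))
            i = right + 1
        else:
            i += 1
    return peaks

# Synthetic chromatogram: a single Gaussian peak at a 5.0 min retention time.
times = [k * 0.1 for k in range(100)]
trace = [10.0 * math.exp(-((t - 5.0) ** 2) / (2 * 0.25)) for t in times]
peaks = detect_peaks(trace, times, threshold=1.0)
```

A real integrator adds baseline drift correction and overlapping-peak deconvolution on top of this; the sketch only shows the core retention-time/area reduction.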

  18. Development of an Automated Security Risk Assessment Methodology Tool for Critical Infrastructures.

    Energy Technology Data Exchange (ETDEWEB)

    Jaeger, Calvin Dell; Roehrig, Nathaniel S.; Torres, Teresa M.

    2008-12-01

This document presents the security automated Risk Assessment Methodology (RAM) prototype tool developed by Sandia National Laboratories (SNL). This work leverages SNL's capabilities and skills in security risk analysis and the development of vulnerability assessment/risk assessment methodologies to develop an automated prototype security RAM tool for critical infrastructures (RAM-CI™). The prototype automated RAM tool provides a user-friendly, systematic, and comprehensive risk-based tool to assist CI sector and security professionals in assessing and managing security risk from malevolent threats. The current tool is structured on the basic RAM framework developed by SNL. It is envisioned that this prototype tool will be adapted to meet the requirements of different CI sectors and thereby provide additional capabilities.

  19. Developing theological tools for a strategic engagement with Human Enhancement.

    Science.gov (United States)

    Tomkins, Justin

    2014-01-01

The literature on Human Enhancement may indeed have reached a critical mass, yet theological engagement with the subject is still thin. Human Enhancement has already been established as a key topic within research, and captivating visions of the future have been allied with a depth of philosophical analysis. Some Transhumanists have pointed to a theological dimension to their position, and some who have warned against enhancement might be seen as having done so from a perspective shaped by a Judeo-Christian worldview. Nonetheless, in neither of these cases has theology been central to engagement with the enhancement quest. Christian theologians who have begun to open up such an engagement with Human Enhancement include Brent Waters, Robert Song and Celia Deane-Drummond. The work they have already carried out is insightful and important, yet due to the scale of the possible engagement, the wealth of Christian theology which might be applied to Human Enhancement remains largely untapped. This paper explores how three key aspects of Christian theology, eschatology, love of God and love of neighbour, provide valuable tools for a theological engagement with Human Enhancement. It is proposed that such theological tools need to be applied to Human Enhancement if the debate is to be resourced with the Christian theological perspective of what it means to be human in our contemporary technological context, and if society is to have the choice of maintaining its Christian foundations.

  20. Magnetic Resonance Imaging: A Tool for Pork Pie Development.

    Science.gov (United States)

    Gaunt, Adam P; Morris, Robert H; Newton, Michael I

    2013-08-28

    The traditional British pork pie consists of roughly chopped pork cooked in a hot water pastry crust. Due to shrinkage of the meat during cooking, the gap formed around the meat is usually sealed using a gelatin based jelly to exclude air and thus help to preserve the pie. The properties of the jelly are such that it will ingress into the pastry crust causing undesirable softening. The jelly is traditionally produced by simmering pig trotters with seasoning for several hours. In this work we demonstrate the potential of magnetic resonance imaging (MRI) as a tool for investigating the conditions required for producing jellies with different properties and present two examples of this use. Firstly we demonstrate that MRI can determine the ability of water to diffuse through the jelly which is critical in minimizing the amount of moisture moving from the jelly to the crust. Secondly, the impact of jelly temperature on the penetration length into the crust is investigated. These examples highlight the power of MRI as a tool for food assessment.

  1. Analyzing the urban development of Isfahan districts in the housing sector using analytic network process (ANP)

    Directory of Open Access Journals (Sweden)

    S. Hadizadeh Zargar

    2013-01-01

    Extended abstract. 1. Introduction. Development is a multidimensional and complex process that involves making changes in social attitudes and national institutions as well as accelerating economic growth, reducing inequalities and eradicating poverty. Housing is considered an integral part of development in a society. With its large economic, social, cultural, environmental, and physical dimensions, this sector plays a pivotal role in presenting the characteristics and improving the appearance of the society in general. Identifying and assessing the housing condition in a country depends on the detection and analysis of the factors affecting housing, which can serve as guidelines for resource allocation in future planning and for promoting justice and sustainable development. Isfahan city, with a population of over one million people, is the third most populous city in the country and has fourteen districts. This research can help policymakers and planners alleviate poverty, promote social justice and formulate appropriate policies by examining the indicators of the housing sector in the fourteen districts of Isfahan and ranking them using the analytic network process. 2. Theoretical Bases. Housing is an extremely complex and extensive issue with different spatial, architectural, physical, economic, social, financial, psychological and medical aspects. As such, various definitions have been proposed that, for example, deal with housing as a physical location and as a shelter, considering it one of the basic needs of households. In addition to the physical location, housing includes the entire residential environment, which has various dimensions and goes beyond the physical shelter. Housing is the first unit of society and the most important unit of human settlements, representing the smallest unit of planning. 
Most governments, in response to the importance of housing, incorporate the housing

  2. Development and Testing of an Optimised Combined Analytical Instrument for Planetary Applications

    Science.gov (United States)

    Lerman, Hannah; Hutchinson, Ian

    2016-10-01

Miniaturised analytical instruments that can simultaneously obtain complementary (molecular and elemental) information about the composition of a sample are likely to be a key feature of the next generation of planetary exploration missions. Certain spectroscopic techniques, such as Raman spectroscopy, can provide information on the molecular composition of an unknown sample, whereas others, such as Laser-Induced Breakdown Spectroscopy (LIBS) and X-Ray Fluorescence (XRF), enable the determination of the elemental composition of a material. Combining two or more of these techniques into one instrument package enables a broader range of the scientific goals of a particular mission to be met (i.e. full composition analysis and structural information about the sample, and therefore geological history). In order to determine the most appropriate design for such an instrument, we have developed radiometric models to assess the overall scientific capability of various analytical technique combinations. We have then used these models to perform a number of trade-offs to evaluate the optimum instrument design for a particular set of science requirements (such as acquiring composition information with suitable sensitivity and uncertainty). The performance of one of these designs was then thoroughly investigated by building a prototype instrument. The construction of our instrument focuses on the optimum design for combining the multiple instrument sub-systems so that the overall mass, power and cost budgets can be minimised, whilst achieving a wider and more comprehensive range of scientific goals. Here we report on measurements obtained from field test campaigns that have been performed in order to verify model predictions and overall scientific performance. These tests include operation in extreme environments such as dry deserts and under water.

  3. Development of analytically capable time-of-flight mass spectrometer with continuous ion introduction.

    Science.gov (United States)

    Hárs, György; Dobos, Gábor

    2010-03-01

The present article describes the results and findings explored in the course of the development of the analytically capable prototype of continuous time-of-flight (CTOF) mass spectrometer. Currently marketed pulsed TOF (PTOF) instruments use ion introduction with a 10 ns or so pulse width, followed by a waiting period of roughly 100 μs. Accordingly, the sample is under excitation in 10⁻⁴ part of the total measuring time. This very low duty cycle severely limits the sensitivity of the PTOF method. A possible approach to deal with this problem is to use linear sinusoidal dual modulation technique (CTOF) as described in this article. This way the sensitivity of the method is increased, due to the 50% duty cycle of the excitation. All other types of TOF spectrometer use secondary electron multiplier (SEM) for detection, which unfortunately discriminates in amplification in favor of the lighter ions. This discrimination effect is especially undesirable in a mass spectrometric method, which targets high mass range. In CTOF method, SEM is replaced with Faraday cup detector, thus eliminating the mass discrimination effect. Omitting SEM is made possible by the high ion intensity and the very slow ion detection with some hundred hertz detection bandwidth. The electrometer electronics of the Faraday cup detector operates with amplification 10¹⁰ V/A. The primary ion beam is highly monoenergetic due to the construction of the ion gun, which made possible to omit any electrostatic mirror configuration for bunching the ions. The measurement is controlled by a personal computer and the intelligent signal generator Type Tabor WW 2571, which uses the direct digital synthesis technique for making arbitrary wave forms. The data are collected by a Labjack interface board, and the fast Fourier transformation is performed by the software. Noble gas mixture has been used to test the analytical capabilities of the prototype setup. Measurement presented proves the results of the
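The dual-modulation principle can be pictured with a short sketch, which is illustrative and not the authors' code: a sinusoidally modulated ion beam reaches the detector delayed by the flight time, so the detected current lags the modulation by a phase φ = 2π·f_mod·t_flight, and a single-bin Fourier transform at the modulation frequency recovers that phase (and hence the flight time, modulo one modulation period). All frequencies and times below are assumed example values.

```python
import cmath
import math

def flight_time_from_phase(signal, dt, f_mod):
    """Flight time (s, modulo 1/f_mod) recovered from the phase of the
    detected signal at the modulation frequency f_mod."""
    # Single-bin discrete Fourier transform at f_mod.
    bin_at_fmod = sum(s * cmath.exp(-2j * math.pi * f_mod * k * dt)
                      for k, s in enumerate(signal))
    # For a cosine-modulated beam, phase(bin) = -2*pi*f_mod*t_flight.
    return (-cmath.phase(bin_at_fmod) / (2 * math.pi * f_mod)) % (1.0 / f_mod)

# Simulated detector current: 1 kHz modulation, 200 us flight time,
# sampled at 100 kHz over exactly ten modulation cycles.
f_mod, dt, t_true = 1.0e3, 1.0e-5, 2.0e-4
detected = [math.cos(2 * math.pi * f_mod * (k * dt - t_true))
            for k in range(1000)]
t_est = flight_time_from_phase(detected, dt, f_mod)
```

In the real instrument the spectrum of masses produces a superposition of such phase-shifted components, which is why the full FFT, rather than a single bin, is performed in software.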

  5. Geo-Sandbox: An Interactive Geoscience Training Tool with Analytics to Better Understand Student Problem Solving Approaches

    Science.gov (United States)

    Butt, N.; Pidlisecky, A.; Ganshorn, H.; Cockett, R.

    2015-12-01

The software company 3 Point Science has developed three interactive learning programs designed to teach, test and practice visualization skills and geoscience concepts. A study was conducted with 21 geoscience students at the University of Calgary who participated in 2-hour sessions of software interaction and written pre- and post-tests. Computer and SMART touch table interfaces were used to analyze user interaction, problem solving methods and visualization skills. By understanding and pinpointing user problem solving methods it is possible to reconstruct viewpoints and thought processes. This could allow us to give personalized feedback in real time, informing the user of problem solving tips and possible misconceptions.

  6. TOOLS TO MINIMIZE RISK UNDER DEVELOPMENT OF HIGH-TECH PRODUCTS

    OpenAIRE

    AVDONIN BORIS N.; BATKOVSKY ALEXANDR M.; BATKOVSKY MIKHAIL A.

    2014-01-01

The article describes the methodological bases and the economic and mathematical tools for minimizing technological risks in developing high-tech products. A new, more efficient organization of the process of developing high-tech products is offered, as well as tools for assessing the technical and economic efficiency of new technologies introduced by manufacturers of these products.

  7. ARBOOK: Development and Assessment of a Tool Based on Augmented Reality for Anatomy

    Science.gov (United States)

    Ferrer-Torregrosa, J.; Torralba, J.; Jimenez, M. A.; García, S.; Barcia, J. M.

    2015-01-01

Technologies and new tools for educational purposes are evolving rapidly. This work presents the experience of a new tool based on augmented reality (AR) focusing on the anatomy of the lower limb. ARBOOK was constructed and developed based on CT and MRI images, dissections and drawings. For ARBOOK evaluation, a…

  8. Positioning Mentoring as a Coach Development Tool: Recommendations for Future Practice and Research

    Science.gov (United States)

    McQuade, Sarah; Davis, Louise; Nash, Christine

    2015-01-01

    Current thinking in coach education advocates mentoring as a development tool to connect theory and practice. However, little empirical evidence exists to evaluate the effectiveness of mentoring as a coach development tool. Business, education, and nursing precede the coaching industry in their mentoring practice, and research findings offered in…

  9. Development of analytical methods for polycyclic aromatic hydrocarbons (PAHs) in airborne particulates:A review

    Institute of Scientific and Technical Information of China (English)

    LIU Li-bin; LIU Yan; LIN Jin-ming; TANG Ning; HAYAKAWA Kazuichi; MAEDA Tsuneaki

    2007-01-01

In the present work, the different sample collection, pretreatment and analytical methods for polycyclic aromatic hydrocarbons (PAHs) in airborne particulates are systematically reviewed, and the applications of these pretreatment and analytical methods for PAHs are compared in detail. Some comments on future expectations are also presented.

  10. The influence of the sample matrix on LC-MS/MS method development and analytical performance

    NARCIS (Netherlands)

    Koster, Remco Arjan

    2015-01-01

    In order to provide personalized patient treatment, a large number of analytical procedures is needed to measure a large variety of drugs in various human matrices. The analytical technique used for this research is Liquid Chromatography coupled with triple quadrupole mass spectrometry (LC-MS/MS). E

  11. Recent Developments in the Speciation and Determination of Mercury Using Various Analytical Techniques

    Directory of Open Access Journals (Sweden)

    Lakshmi Narayana Suvarapu

    2015-01-01

    This paper reviews the speciation and determination of mercury by various analytical techniques such as atomic absorption spectrometry, voltammetry, inductively coupled plasma techniques, spectrophotometry, spectrofluorometry, high performance liquid chromatography, and gas chromatography. Approximately 126 research papers on the speciation and determination of mercury by various analytical techniques published in international journals since 2013 are reviewed.

  12. Using Multilingual Analytics to Explore the Usage of a Learning Portal in Developing Countries

    Science.gov (United States)

    Protonotarios, Vassilis; Stoitsis, Giannis; Kastrantas, Kostas; Sanchez-Alonso, Salvador

    2013-01-01

    Learning analytics is a domain that has been constantly evolving throughout recent years due to the acknowledgement of its importance by those using intelligent data, learner-produced data, and analysis models to discover information and social connections for predicting and advising people's learning [1]. Learning analytics may be applied in a…

  13. Development of a Multi-Site and Multi-Device Webgis-Based Tool for Tidal Current Energy Development

    Science.gov (United States)

    Ang, M. R. C. O.; Panganiban, I. K.; Mamador, C. C.; De Luna, O. D. G.; Bausas, M. D.; Cruz, J. P.

    2016-06-01

    A multi-site, multi-device and multi-criteria decision support tool designed to support the development of tidal current energy in the Philippines was developed. Its platform is based on Geographic Information Systems (GIS) which allows for the collection, storage, processing, analyses and display of geospatial data. Combining GIS tools with open source web development applications, it becomes a webGIS-based marine spatial planning tool. To date, the webGIS-based tool displays output maps and graphs of power and energy density, site suitability and site-device analysis. It enables stakeholders and the public easy access to the results of tidal current energy resource assessments and site suitability analyses. Results of the initial development showed that it is a promising decision support tool for ocean renewable energy project developments.
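The power-density layer that such a webGIS tool maps can be illustrated with the standard kinetic power density of a moving fluid, P/A = ½ρv³. The seawater density and the example speed below are assumed, illustrative values, not figures from the assessment.

```python
# Kinetic power density of a tidal stream: P/A = 0.5 * rho * v**3.
RHO_SEAWATER = 1025.0  # kg/m^3, a typical seawater density (assumed)

def power_density_w_per_m2(speed_m_s):
    """Kinetic power density (W/m^2) of a current at the given speed."""
    return 0.5 * RHO_SEAWATER * speed_m_s ** 3

# A 2 m/s tidal stream carries about 4.1 kW per square metre of
# flow cross-section, before any device efficiency is applied.
p = power_density_w_per_m2(2.0)
```

The cubic dependence on speed is what makes accurate current-speed mapping the dominant factor in site suitability analysis.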

  14. Tools influencing on business development and population's social protection

    Directory of Open Access Journals (Sweden)

    Susanna Alieva

    2009-10-01

    The paper raises the issue of how to develop public policy instruments that could efficiently target both the development of entrepreneurship and the strengthening of social safety for workers.

  15. Development of analytical competencies and professional identities through school-based learning in Denmark

    Science.gov (United States)

    Andresen, Bent B.

    2015-12-01

    This article presents the main results of a case study on teachers' professional development in terms of competence and identity. The teachers involved in the study are allocated time by their schools to participate in professional "affinity group" meetings. During these meetings, the teachers gather and analyse school-based data about factors which persistently create and sustain challenges in effective student education (grade K-10). This process improves their understanding and undertaking of job-related tasks. The affinity group meetings also influence the teachers' professional identity. The research findings thus illustrate the fact that the analytical approach of affinity groups, based on the analysis of the difficulties in their daily job, provides good results in terms of competencies and identity perception. In general, as a result of meeting in affinity groups, adult learners develop professional competencies and identities which are considered crucial in rapidly changing schools characterised by an increased focus on, among other things, lifelong learning, social inclusion, school digitalisation, and information literacy. The research findings are thus relevant for ministries and school owners, teacher-trainers and supervisors, schools and other educational institutions, as well as teachers and their organisations worldwide.

  16. A Hybrid Fuzzy Analytic Network Process Approach to the New Product Development Selection Problem

    Directory of Open Access Journals (Sweden)

    Chiuh-Cheng Chyu

    2014-01-01

    New product development selection is a complex decision-making process. To uphold their competence in competitive business environments, enterprises are required to continuously introduce novel products into markets. This paper presents a fuzzy analytic network process (FANP) for solving the product development selection problem. The fuzzy set theory is adopted to represent ambiguities and vagueness involved in each expert’s judgment. In the proposed model, the fuzzy Kano method and fuzzy DEMATEL are employed to filter criteria and establish interactions among the criteria, whereas the SAM is applied to aggregate experts’ opinions. Unlike the commonly used top-down relation-structuring approach, the proposed FANP first identifies the interdependence among the criteria and then the identified relationships are mapped to the clusters. This approach is more realistic, since the inner and outer relationships between criteria are simultaneously considered to establish the relationships among clusters. The proposed model is illustrated through a real life example, with a comparative analysis using modified TOPSIS and gray relation analysis in the synthesizing phase. The concluded results were approved by the case company. The proposed methodology not only is useful in the case study, but also can be generally applied in other similar decision situations.

  17. Development of the smartphone-based colorimetry for multi-analyte sensing arrays.

    Science.gov (United States)

    Hong, Jong Il; Chang, Byoung-Yong

    2014-05-21

    Here we report development of a smartphone app (application) that digitizes the colours of a colorimetric sensor array. A conventional colorimetric sensor array consists of multiple paper-based sensors, and reports the detection results in terms of colour change. Evaluation of the colour changes is normally done by the naked eye, which may cause uncertainties due to personal subjectivity and the surrounding conditions. Solutions have been particularly sought in smartphones as they are capable of spectrometric functions. Our report specifically focuses on development of a practical app for immediate point-of-care (POC) multi-analyte sensing without additional devices. First, the individual positions of the sensors are automatically identified by the smartphone; second, the colours measured at each sensor are digitized based on a correction algorithm; and third, the corrected colours are converted to concentration values by pre-loaded calibration curves. All through these sequential processes, the sensor array taken in a smartphone snapshot undergoes laboratory-level spectrometry. The advantages of inexpensive and convenient paper-based colorimetry and the ubiquitous smartphone are tied to achieve a ready-to-go POC diagnosis.
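The three sequential processes described above can be sketched for a single sensor spot. This is a hypothetical illustration, not the paper's algorithm: the linear calibration form, the green-channel choice, and all numbers are assumed.

```python
def correct_colour(rgb, white_rgb):
    """Normalise a measured RGB triple by a white reference patch in the
    same snapshot, cancelling the effect of ambient lighting."""
    return tuple(c / w for c, w in zip(rgb, white_rgb))

def concentration(rgb, white_rgb, slope, intercept):
    """Map the corrected green-channel signal to a concentration via an
    assumed pre-loaded linear calibration: colour loss ~ analyte level."""
    green = correct_colour(rgb, white_rgb)[1]
    return slope * (1.0 - green) + intercept

# Example: a spot that retains half of its reference green intensity
# under the current lighting, with an assumed calibration curve.
value = concentration((100, 120, 80), (200, 240, 160),
                      slope=10.0, intercept=0.0)
```

The white-patch normalisation is the step that lets a snapshot taken under uncontrolled lighting stand in for laboratory-level spectrometry.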

  18. Experimental, numerical and analytical modelling of a newly developed rockfall protective cable-net structure

    Directory of Open Access Journals (Sweden)

    S. Dhakal

    2011-12-01

    An innovative configuration of pocket-type rockfall protective cable-net structure, known as the Long-span Pocket-type Rock-net (LPR), has been developed in Japan. The global performance of the proposed system was initially checked by experimental (full-scale) modelling. Given the various limitations of physical experiments, particularly for the parametric study needed for a detailed understanding of the newly developed system, a reliable and simplified method of numerical modelling is necessary. Again, given the sophistication involved in numerical simulation, a further simplified modelling approach may prove more effective. Against this background, this paper presents a three-tier modelling of a design of LPR. After the physical modelling, which revealed that the displacement response may be taken as the more vital measure of LPR performance, Finite Element based numerical modelling is presented. The programme LS-DYNA is used and the models are calibrated and verified with the element- and structure-level experiments. Finally, a simple analytical model, consisting of an equivalently linear and elastic, lumped-mass, single-degree-of-freedom system capable of predicting the global displacement response, is proposed based on the basic principles of conservation of linear momentum and energy. The model is back-calculated and modified from the analyses of the verified numerical model.
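The equivalent-linear SDOF idea can be sketched as follows (all parameter values are assumed for illustration, not taken from the paper): conservation of linear momentum gives the common velocity of block and net just after impact, and conservation of energy in an equivalent elastic spring gives the peak global displacement.

```python
import math

def peak_displacement(m_block, v_block, m_net_eff, k_eff):
    """Peak displacement (m) of the lumped-mass SDOF rock-net model."""
    # Momentum: m * v = (m + M) * v_common
    v_common = m_block * v_block / (m_block + m_net_eff)
    # Energy: 0.5 * (m + M) * v_common**2 = 0.5 * k * x**2
    return v_common * math.sqrt((m_block + m_net_eff) / k_eff)

# Assumed example: 1000 kg block at 20 m/s into a net with a 1000 kg
# effective mass and an equivalent linear stiffness of 20 kN/m.
x_peak = peak_displacement(1000.0, 20.0, 1000.0, 2.0e4)
```

The back-calculation step in the paper amounts to choosing the effective mass and stiffness so that this simple model reproduces the displacement response of the verified numerical model.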

  19. PTaaS: Platform for Providing Software Developing Applications and Tools as a Service

    DEFF Research Database (Denmark)

    Chauhan, Muhammad Aufeef; Babar, Muhammad Ali

    2014-01-01

    Global software development (GSD) involves a large number of complex activities that include not only technological but also social aspects. A large number of applications and tools have been devised to provide solutions to the challenges of GSD that emerge as a result of distributed development teams. However… technological support for it that is not limited to one specific tool and a particular phase of the software development life cycle. In this thesis, we have explored the possibility of offering software development applications and tools as services that can be acquired on demand according to the software… development process in a globally distributed environment. We have performed a structured review of the literature on GSD tools to identify attributes of the software development tools that have been introduced for addressing GSD challenges, and we have discussed the significance of technology alignment…

  20. Nanopore analytics: sensing of single molecules.

    Science.gov (United States)

    Howorka, Stefan; Siwy, Zuzanna

    2009-08-01

    In nanopore analytics, individual molecules pass through a single nanopore giving rise to detectable temporary blockades in ionic pore current. Reflecting its simplicity, nanopore analytics has gained popularity and can be conducted with natural protein as well as man-made polymeric and inorganic pores. The spectrum of detectable analytes ranges from nucleic acids, peptides, proteins, and biomolecular complexes to organic polymers and small molecules. Apart from being an analytical tool, nanopores have developed into a general platform technology to investigate the biophysics, physicochemistry, and chemistry of individual molecules (critical review, 310 references).
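The measurement principle described above reduces to detecting temporary current blockades in a sampled trace. The sketch below is a minimal illustration, not a published nanopore pipeline; the current levels, threshold, and sampling interval are assumed values.

```python
def detect_blockades(current_pa, dt_s, open_level_pa, threshold_pa):
    """Return one (dwell_s, mean_depth_pa) pair per excursion of the
    ionic-current trace below `threshold_pa`."""
    events, start = [], None
    for i, c in enumerate(current_pa):
        if c < threshold_pa and start is None:
            start = i                       # blockade begins
        elif c >= threshold_pa and start is not None:
            blocked = current_pa[start:i]   # blockade just ended
            events.append((len(blocked) * dt_s,
                           open_level_pa - sum(blocked) / len(blocked)))
            start = None
    return events

# Open-pore current of 100 pA interrupted by two translocation events.
trace = [100.0] * 5 + [60.0] * 4 + [100.0] * 5 + [70.0] * 2 + [100.0] * 3
events = detect_blockades(trace, dt_s=1e-3,
                          open_level_pa=100.0, threshold_pa=80.0)
```

Dwell time and blockade depth are the two observables from which analyte identity and size are typically inferred in nanopore experiments.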