WorldWideScience

Sample records for analytical tool development

  1. Developing a Learning Analytics tool

    DEFF Research Database (Denmark)

    Wahl, Christian; Belle, Gianna; Clemmensen, Anita Lykke

This poster describes how learning analytics and collective intelligence can be combined to develop a tool for providing support and feedback to learners and teachers regarding students' self-initiated learning activities.

  3. Cryogenic Propellant Feed System Analytical Tool Development

    Science.gov (United States)

    Lusby, Brian S.; Miranda, Bruno M.; Collins, Jacob A.

    2011-01-01

The Propulsion Systems Branch at NASA's Lyndon B. Johnson Space Center (JSC) has developed a parametric analytical tool to address the need to rapidly predict heat leak into propellant distribution lines based on insulation type, installation technique, line supports, penetrations, and instrumentation. The Propellant Feed System Analytical Tool (PFSAT) will also determine the optimum orifice diameter for an optional thermodynamic vent system (TVS) to counteract heat leak into the feed line and ensure that temperature constraints at the end of the feed line are met. PFSAT was developed primarily in Fortran 90 because of its numerical performance and its ability to directly access real-fluid property subroutines in the Reference Fluid Thermodynamic and Transport Properties (REFPROP) database developed by NIST. A Microsoft Excel front end was implemented to provide convenient portability of PFSAT among a wide variety of potential users, along with a user-friendly graphical user interface (GUI) developed in Visual Basic for Applications (VBA). The focus of PFSAT is on-orbit reaction control and orbital maneuvering systems, but it may also be used to predict heat leak into ground-based transfer lines. PFSAT is expected to be used for rapid initial design of cryogenic propellant distribution lines and thermodynamic vent systems. Once validated, PFSAT will support concept trades for a variety of cryogenic fluid transfer systems on spacecraft, including planetary landers, transfer vehicles, and propellant depots, as well as surface-based transfer systems. The details of the development of PFSAT, its user interface, and the program structure will be presented.
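    The abstract does not give PFSAT's equations. As a rough illustration of the kind of parametric heat-leak estimate such a tool automates, here is a minimal sketch assuming simple steady-state radial conduction through a single cylindrical insulation layer; the function name and all numbers are hypothetical:

```python
import math

def line_heat_leak_w(k_ins, length_m, r_inner_m, r_outer_m, t_hot_k, t_cold_k):
    """Steady-state radial conduction through a cylindrical insulation layer
    on a feed line: Q = 2*pi*k*L*(T_hot - T_cold) / ln(r_out / r_in)."""
    return (2.0 * math.pi * k_ins * length_m * (t_hot_k - t_cold_k)
            / math.log(r_outer_m / r_inner_m))

# Example: 5 m line, 1 cm of foam (k ~ 0.02 W/m-K), LH2 at 20 K in 300 K air
q = line_heat_leak_w(0.02, 5.0, 0.025, 0.035, 300.0, 20.0)  # ~523 W
```

    A real tool would replace the constant conductivity with temperature-dependent properties (e.g. from REFPROP) and add terms for supports, penetrations, and instrumentation.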

  4. Developing Healthcare Data Analytics APPs with Open Data Science Tools.

    Science.gov (United States)

    Hao, Bibo; Sun, Wen; Yu, Yiqin; Xie, Guotong

    2017-01-01

Recent advances in big data analytics provide more flexible, efficient, and open tools for researchers to gain insight from healthcare data. However, many of these tools require researchers to write programs in languages such as Python or R, a skill set that many researchers in the healthcare data analytics area have not acquired. To make data science more approachable, we explored existing tools and developed a practice that helps data scientists convert existing analytics pipelines into user-friendly analytics APPs with rich interactions and real-time analysis features. With this practice, data scientists can develop customized analytics pipelines as APPs in Jupyter Notebook and disseminate them to other researchers easily, and researchers can benefit from the shared notebooks to perform analysis tasks or reproduce research results much more easily.
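    The abstract does not show the practice itself. One common pattern for turning a notebook pipeline into an interactive APP is to refactor the analysis into a single parameterized function that a widget library can bind to; the pipeline below and its record layout are hypothetical:

```python
# A toy analytics pipeline refactored into one parameterized function.
# In Jupyter, ipywidgets.interact(run_pipeline, threshold=(0.0, 1.0)) would
# attach a slider to it with no extra UI code (ipywidgets usage assumed).

def run_pipeline(records, threshold=0.5):
    """Filter hypothetical patient records by a risk score and summarize."""
    cohort = [r for r in records if r["risk"] >= threshold]
    return {
        "n": len(cohort),
        "mean_age": sum(r["age"] for r in cohort) / len(cohort) if cohort else None,
    }

sample = [
    {"age": 64, "risk": 0.8},
    {"age": 51, "risk": 0.3},
    {"age": 72, "risk": 0.9},
]
summary = run_pipeline(sample, threshold=0.5)  # {'n': 2, 'mean_age': 68.0}
```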

  5. Using complaints to enhance quality improvement: developing an analytical tool.

    Science.gov (United States)

    Hsieh, Sophie Yahui

    2012-01-01

This study aims to construct an instrument for identifying certain attributes or capabilities that might enable healthcare staff to use complaints to improve service quality. PubMed and ProQuest were searched, which in turn expanded access to other literature. Three paramount dimensions emerged for healthcare quality management systems: managerial, operational, and technical (MOT). The paper reveals that the managerial dimension relates to quality improvement program infrastructure; it contains strategy, structure, leadership, people, and culture. The operational dimension relates to implementation processes: organizational changes and barriers when using complaints to enhance quality. The technical dimension emphasizes the skills, techniques, and information systems required to successfully achieve continuous quality improvement. The MOT model was developed by drawing on the relevant literature. However, individuals have different training, interests, and experiences, so there will be variance between researchers when generating the MOT model. The MOT components can serve as guidelines for examining whether patient complaints are used to improve service quality. However, the model needs testing and validation through further research before becoming a theory. Empirical studies on patient complaints did not identify any analytical tool that could be used to explore how complaints can drive quality improvement. This study developed an instrument for identifying certain attributes or capabilities that might enable healthcare professionals to use complaints to improve service quality.

  6. Social Data Analytics Tool

    DEFF Research Database (Denmark)

    Hussain, Abid; Vatrapu, Ravi

    2014-01-01

    This paper presents the design, development and demonstrative case studies of the Social Data Analytics Tool, SODATO. Adopting Action Design Framework [1], the objective of SODATO [2] is to collect, store, analyze, and report big social data emanating from the social media engagement of and socia...

  7. Harnessing scientific literature reports for pharmacovigilance. Prototype software analytical tool development and usability testing.

    Science.gov (United States)

    Sorbello, Alfred; Ripple, Anna; Tonning, Joseph; Munoz, Monica; Hasan, Rashedul; Ly, Thomas; Francis, Henry; Bodenreider, Olivier

    2017-03-22

We seek to develop a prototype software analytical tool to augment FDA regulatory reviewers' capacity to harness scientific literature reports in PubMed/MEDLINE for pharmacovigilance and adverse drug event (ADE) safety signal detection. We also aim to gather feedback through usability testing to assess design, performance, and user satisfaction with the tool. A prototype, open-source, web-based software analytical tool generated statistical disproportionality data mining signal scores and dynamic visual analytics for ADE safety signal detection and management. We leveraged Medical Subject Heading (MeSH) indexing terms assigned to published citations in PubMed/MEDLINE to generate candidate drug-adverse event pairs for quantitative data mining. Six FDA regulatory reviewers participated in usability testing by employing the tool as part of their ongoing real-life pharmacovigilance activities to provide subjective feedback on its practical impact, added value, and fitness for use. All usability test participants cited the tool's ease of learning, ease of use, and generation of quantitative ADE safety signals, some of which corresponded to known established adverse drug reactions. Potential concerns included the comparability of the tool's automated literature search relative to a manual 'all fields' PubMed search, missing drug and adverse event terms, interpretation of signal scores, and integration with existing computer-based analytical tools. Usability testing demonstrated that this novel tool can automate the detection of ADE safety signals from published literature reports. Various mitigation strategies are described to foster improvements in design, productivity, and end user satisfaction.
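    The abstract does not name the specific disproportionality statistic used. The proportional reporting ratio (PRR) is one widely used signal score; a minimal sketch on a hypothetical 2x2 contingency table of citation counts:

```python
def prr(a, b, c, d):
    """Proportional reporting ratio for a drug-event pair, from a 2x2 table:
    a = reports with drug and event, b = drug without event,
    c = event without drug,          d = neither."""
    return (a / (a + b)) / (c / (c + d))

# Hypothetical counts: the event appears in 20% of the drug's citations
# but only ~1% of other citations, giving a strong signal.
score = prr(a=20, b=80, c=100, d=9800)  # 19.8
```

    A score near 1.0 indicates no disproportionality; thresholds such as PRR ≥ 2 with a minimum report count are commonly applied before a pair is flagged for review.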

  8. Development of Multi-slice Analytical Tool to Support BIM-based Design Process

    Science.gov (United States)

    Atmodiwirjo, P.; Johanes, M.; Yatmo, Y. A.

    2017-03-01

    This paper describes the on-going development of computational tool to analyse architecture and interior space based on multi-slice representation approach that is integrated with Building Information Modelling (BIM). Architecture and interior space is experienced as a dynamic entity, which have the spatial properties that might be variable from one part of space to another, therefore the representation of space through standard architectural drawings is sometimes not sufficient. The representation of space as a series of slices with certain properties in each slice becomes important, so that the different characteristics in each part of space could inform the design process. The analytical tool is developed for use as a stand-alone application that utilises the data exported from generic BIM modelling tool. The tool would be useful to assist design development process that applies BIM, particularly for the design of architecture and interior spaces that are experienced as continuous spaces. The tool allows the identification of how the spatial properties change dynamically throughout the space and allows the prediction of the potential design problems. Integrating the multi-slice analytical tool in BIM-based design process thereby could assist the architects to generate better design and to avoid unnecessary costs that are often caused by failure to identify problems during design development stages.
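    The multi-slice idea described above can be sketched as binning sampled spatial properties along one axis of the space; the data layout (position, value pairs exported from a BIM model) and function name below are assumptions for illustration:

```python
def slice_properties(points, axis_min, axis_max, n_slices):
    """Aggregate a spatial property into slices along one axis.
    points: list of (position, value) samples, e.g. daylight level sampled
    along the length of a corridor exported from a BIM model."""
    width = (axis_max - axis_min) / n_slices
    slices = [[] for _ in range(n_slices)]
    for pos, val in points:
        i = min(int((pos - axis_min) / width), n_slices - 1)  # clamp end point
        slices[i].append(val)
    # Mean per slice; None where a slice received no samples.
    return [sum(s) / len(s) if s else None for s in slices]

profile = slice_properties([(0.5, 10), (1.5, 20)], 0, 2, 2)  # [10.0, 20.0]
```

    Comparing adjacent slice values in such a profile is one way to detect where a spatial property changes abruptly through a continuous space.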

  9. Development of a Suite of Analytical Tools for Energy and Water Infrastructure Knowledge Discovery

    Science.gov (United States)

    Morton, A.; Piburn, J.; Stewart, R.; Chandola, V.

    2017-12-01

    Energy and water generation and delivery systems are inherently interconnected. With demand for energy growing, the energy sector is experiencing increasing competition for water. With increasing population and changing environmental, socioeconomic, and demographic scenarios, new technology and investment decisions must be made for optimized and sustainable energy-water resource management. This also requires novel scientific insights into the complex interdependencies of energy-water infrastructures across multiple space and time scales. To address this need, we've developed a suite of analytical tools to support an integrated data driven modeling, analysis, and visualization capability for understanding, designing, and developing efficient local and regional practices related to the energy-water nexus. This work reviews the analytical capabilities available along with a series of case studies designed to demonstrate the potential of these tools for illuminating energy-water nexus solutions and supporting strategic (federal) policy decisions.

  10. On the Design, Development and Use of the Social Data Analytics Tool (SODATO)

    DEFF Research Database (Denmark)

    Hussain, Abid

This PhD is about the design, development and evaluation of the Social Data Analytics Tool (SODATO) to collect, store, analyze, and report big social data emanating from the social media engagement of and social media conversations about organizations. Situated within the academic domains of Data Science, Computational Social Science and Information Systems, the PhD project addressed two general research questions about the technological architectures and design principles for big social data analytics in an organisational context. The PhD project is grounded in the theory of socio-technical interactions for better understanding perception of, and action on, the screen when individuals use social media platforms such as Facebook. Based on the theory of socio-technical interactions, a conceptual model of social data was generated; this conceptual model of social data consists of two...

  11. Development of a research ethics knowledge and analytical skills assessment tool.

    Science.gov (United States)

    Taylor, Holly A; Kass, Nancy E; Ali, Joseph; Sisson, Stephen; Bertram, Amanda; Bhan, Anant

    2012-04-01

The goal of this project was to develop and validate a new tool to evaluate learners' knowledge and skills related to research ethics. A core set of 50 questions from existing computer-based online teaching modules was identified, refined and supplemented to create a set of 74 multiple-choice, true/false and short-answer questions. The questions were pilot-tested and item discrimination was calculated for each question. Poorly performing items were eliminated or refined. Two comparable assessment tools were created. These assessment tools were administered as a pre-test and post-test to a cohort of 58 Indian junior health research investigators before and after exposure to a new course on research ethics. Half of the investigators were exposed to the course online, the other half in person. Item discrimination was calculated for each question and Cronbach's α for each assessment tool. A final version of the assessment tool that incorporated the best questions from the pre-/post-test phase was used to assess retention of research ethics knowledge and skills 3 months after course delivery. The final version of the research ethics knowledge and analytical skills assessment tool (REKASA) includes 41 items and had a Cronbach's α of 0.837. The results illustrate, in one sample of learners, the successful, systematic development and use of a knowledge and skills assessment tool in research ethics. The tool is capable not only of measuring basic knowledge of research ethics and oversight but also of assessing learners' ability to apply ethics knowledge to the analytical task of reasoning through research ethics cases, without reliance on essay or discussion-based examination. These promising preliminary findings should be confirmed with additional groups of learners.
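    The Cronbach's α reported above can be computed directly from a respondents-by-items score matrix. A minimal sketch with toy data (real tools typically use a statistics package):

```python
def cronbach_alpha(scores):
    """Cronbach's alpha for internal consistency.
    scores: list of rows, one per respondent, each a list of item scores."""
    k = len(scores[0])          # number of items
    def var(xs):                # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Perfectly consistent toy responses yield alpha = 1.0:
alpha = cronbach_alpha([[1, 1], [2, 2], [3, 3]])
```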

  12. FIGURED WORLDS AS AN ANALYTIC AND METHODOLOGICAL TOOL IN PROFESSIONAL TEACHER DEVELOPMENT

    DEFF Research Database (Denmark)

    Møller, Hanne; Brok, Lene Storgaard

    ,“(Holland et al., 1998, p. 52) and gives a framework for understanding meaning-making in particular pedagogical settings. We exemplify our use of the term Figured Worlds, both as an analytic and methodological tool for empirical studies in kindergarten and school. Based on data sources, such as field notes...

  13. Analytical Hierarchy Process for Developing a Building Performance-Risk Rating Tool

    Directory of Open Access Journals (Sweden)

    Khalil Natasha

    2016-01-01

Full Text Available The need to optimize the performance of buildings has increased due to the expanding supply of facilities in higher education buildings (HEB). Proper performance assessment as a proactive measure may help university buildings achieve performance optimization. However, current maintenance programs and performance evaluation in HEB follow a systemic and cyclic process in which maintenance is treated as an operational issue rather than a strategic one. Hence, this paper proposes a Building Performance Risk Rating Tool (BPRT) as an improved measure for building performance evaluation that addresses users' risk in health and safety aspects. The BPRT is developed from the result of a rating index using the Analytical Hierarchy Process (AHP) method. Twelve facilities management (FM) experts and practitioners were involved in the rating process. The subjective weightings were analysed using AHP computer software, Expert Choice 11. The BPRT was introduced as an aid to improving the current performance assessment of HEB by merging the concepts of building performance and risk into a numerical strategic approach.
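    The core of AHP is deriving priority weights from a pairwise comparison matrix. Software such as Expert Choice uses the principal eigenvector; the row geometric-mean method sketched below is a common approximation and is not necessarily what this paper used:

```python
import math

def ahp_weights(matrix):
    """Approximate AHP priority weights from a pairwise comparison matrix
    using the row geometric-mean method (an eigenvector approximation)."""
    n = len(matrix)
    gms = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gms)
    return [g / total for g in gms]

# Criterion A judged 3x as important as criterion B:
w = ahp_weights([[1, 3], [1/3, 1]])  # [0.75, 0.25]
```

    For matrices larger than 2x2, a consistency ratio is normally checked before the weights are accepted.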

  14. Analytical tool for risk assessment of landscape and urban planning: Spatial development impact assessment

    Science.gov (United States)

    Rehak, David; Senovsky, Michail; Balog, Karol; Dvorak, Jiri

    2011-06-01

    This article covers the issue of preventive protection of population, technical infrastructure, and the environment against adverse impacts of careless spatial development. In the first section, we describe the relationship between sustainable development and spatial development. This discussion is followed by a review of the current state of spatial development security, primarily at a national level in the Czech Republic. The remainder of the paper features our original contribution which is a tool for risk assessment in landscape and urban planning, the Spatial Development Impact Assessment (SDIA) tool. We briefly review the most significant semi-quantitative methods of risk analysis that were used as a starting point in implementing the tool, and we discuss several of SDIA's salient features, namely, the assessment process algorithm, the catalogue of hazard and asset groups, and the spatial development impact matrix.

  15. Cereals for developing gluten-free products and analytical tools for gluten detection

    OpenAIRE

    Rosell, Cristina M.; Barro Losada, Francisco; Sousa Martín, Carolina; Mena, M. Carmen

    2014-01-01

Recently, gluten-free foods have attracted much research interest, motivated by the increasing market. Beyond this motivation, it is necessary to have a scientific basis for developing gluten-free foods, together with tools for detecting the peptide sequences that could be immunotoxic to some people. This review focuses primarily on the cereal-based commodities available for developing gluten-free blends, considering naturally gluten-free cereals in addition t...

  16. Development of rocket electrophoresis technique as an analytical tool in preformulation study of tetanus vaccine formulation.

    Science.gov (United States)

    Ahire, V J; Sawant, K K

    2006-08-01

Rocket electrophoresis (RE) relies on the difference in charge between the antigen and antibodies at the selected pH. The present study involves optimization of RE run conditions for Tetanus Toxoid (TT). Agarose gel (1% w/v, 20 ml, pH 8.6), anti-TT IgG at 1 IU/ml, a temperature of 4-8 degrees C, and a run duration of 18 h were found to be optimum. The height of the rocket-shaped precipitate was proportional to TT concentration. The RE method was found to be linear in the concentration range of 2.5 to 30 Lf/mL. The method was validated and found to be accurate, precise, and reproducible when analyzed statistically using Student's t-test. RE was used as an analytical method for analyzing TT content in plain and marketed formulations, as well as in preformulation studies of vaccine formulations in which formulation additives were tested for compatibility with TT. The optimized RE method has several advantages: it uses safe materials, is inexpensive, and is easy to perform. RE results are less prone to operator bias than the flocculation test, can be documented by taking photographs, and can be scanned by densitometer; RE can be easily standardized for the required antigen concentration by changing the antitoxin concentration. It can be used as a very effective tool for qualitative and quantitative analysis and in preformulation studies of antigens.
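    Because rocket height is proportional to antigen concentration, quantitation rests on an ordinary least-squares calibration curve over the linear range. A minimal sketch; the calibration points below are hypothetical, not the paper's data:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept for a calibration curve,
    e.g. rocket height (mm) vs. toxoid concentration (Lf/mL)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical points spanning the reported 2.5-30 Lf/mL linear range:
slope, intercept = linear_fit([2.5, 5, 10, 20, 30], [5, 10, 20, 40, 60])
# Unknown samples are then read off as (height - intercept) / slope.
```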

  17. Analytic tools for information warfare

    Energy Technology Data Exchange (ETDEWEB)

    Vandewart, R.L.; Craft, R.L.

    1996-05-01

Information warfare and system surety (tradeoffs between system functionality, security, safety, reliability, cost, usability) have many mechanisms in common. Sandia's experience has shown that an information system must be assessed from a system perspective in order to adequately identify and mitigate the risks present in the system. While some tools are available to help in this work, the process is largely manual. An integrated, extensible set of assessment tools would help the surety analyst. This paper describes one approach to surety assessment used at Sandia, identifies the difficulties in this process, and proposes a set of features desirable in an automated environment to support this process.

  18. Tool Capability in Visual EAM Analytics

    Directory of Open Access Journals (Sweden)

    Dierk Jugel

    2015-04-01

Full Text Available Enterprise Architectures (EA) consist of a multitude of architecture elements, which relate to each other in manifold ways. As the change of a single element impacts various other elements, mechanisms for architecture analysis are important to stakeholders. The high number of relationships aggravates architecture analysis and makes it a complex yet important task. In practice, EAs are often analyzed using visualizations. This article contributes to the field of visual analytics in enterprise architecture management (EAM) by reviewing how state-of-the-art software platforms in EAM support stakeholders in providing and visualizing the “right” information for decision-making tasks. We investigate the collaborative decision-making process in an experiment in which master students used professional EAM tools. We evaluate the students’ findings by comparing them with the experience of an enterprise architect.

  19. Analytical tools in accelerator physics

    Energy Technology Data Exchange (ETDEWEB)

    Litvinenko, V.N.

    2010-09-01

This paper is a subset of my lectures presented in the Accelerator Physics course (USPAS, Santa Rosa, California, January 14-25, 2008). It is based on notes I wrote during the period from 1976 to 1979 in Novosibirsk. Only a few copies (in Russian) were distributed to my colleagues at the Novosibirsk Institute of Nuclear Physics. The goal of these notes is a complete description, starting from an arbitrary reference orbit, through explicit expressions for the 4-potential and the accelerator Hamiltonian, and finishing with parameterization in action and angle variables. To a large degree they follow the logic developed in Theory of Cyclic Particle Accelerators by A. A. Kolomensky and A. N. Lebedev [Kolomensky], but go beyond the book in a number of directions. One unusual feature of these notes is the use of matrix functions and the Sylvester formula for calculating the matrices of arbitrary elements. Teaching the USPAS course motivated me to translate a significant part of my notes into English. I also included some introductory material following Classical Theory of Fields by L. D. Landau and E. M. Lifshitz [Landau]. A large number of short notes covering various techniques are placed in the Appendices.

  20. Public Interest Energy Research (PIER) Program Development of a Computer-based Benchmarking and Analytical Tool. Benchmarking and Energy & Water Savings Tool in Dairy Plants (BEST-Dairy)

    Energy Technology Data Exchange (ETDEWEB)

    Xu, Tengfang [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Flapper, Joris [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Kramer, Klaas [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Sathaye, Jayant [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-02-01

The overall goal of the project is to develop a computer-based benchmarking and energy and water savings tool (BEST-Dairy) for use in the California dairy industry, covering four dairy processes: cheese, fluid milk, butter, and milk powder. The BEST-Dairy tool developed in this project provides three options for the user to benchmark each of the dairy products included in the tool, with each option differentiated by the level of detail of the process or plant: 1) plant level; 2) process-group level; and 3) process-step level. For each detail level, the tool accounts for differences in production and other variables affecting energy use in dairy processes. The BEST-Dairy tool can be applied to a wide range of dairy facilities to provide energy and water savings estimates, which are based upon comparisons with the best available reference cases established by reviewing information from international and national samples. We have performed and completed alpha- and beta-testing (field testing) of the BEST-Dairy tool, through which feedback from voluntary users in the U.S. dairy industry was gathered to validate and improve the tool's functionality. BEST-Dairy v1.2 was formally published in May 2011 and has been made available for free download from the internet (i.e., http://best-dairy.lbl.gov). A user's manual has been developed and published as companion documentation for the BEST-Dairy tool. In addition, we carried out technology transfer activities by engaging the dairy industry in the process of tool development and testing, including field testing, technical presentations, and technical assistance throughout the project. To date, users from more than ten countries in addition to the U.S. have downloaded BEST-Dairy from the LBNL website. It is expected that the use of the BEST-Dairy tool will advance understanding of energy and
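    The core benchmarking comparison such a tool performs can be reduced to actual consumption versus what a best-practice reference plant would use for the same production volume. A minimal sketch with hypothetical numbers, not BEST-Dairy's actual model:

```python
def benchmark_savings(actual_use, reference_intensity, production):
    """Estimated savings relative to a best-practice reference case:
    actual consumption minus reference intensity times production volume.
    Units are the caller's choice (e.g. kWh and kg, or m3 and kg)."""
    return actual_use - reference_intensity * production

# Hypothetical plant: 1.2e6 kWh used to produce 5e6 kg of fluid milk,
# against a reference intensity of 0.20 kWh/kg:
excess = benchmark_savings(1.2e6, 0.20, 5e6)  # 2.0e5 kWh potential savings
```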

  2. Guidance for the Design and Adoption of Analytic Tools.

    Energy Technology Data Exchange (ETDEWEB)

    Bandlow, Alisa [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-12-01

The goal is to make software developers aware of common issues that can impede the adoption of analytic tools. This paper provides a summary of guidelines, lessons learned, and existing research to explain what is currently known about what analysts want and how to better understand which tools they do and don't need.

  3. Development of simulation tools for improvement of measurement accuracy and efficiency in ultrasonic testing. Part 2. Development of fast simulator based on analytical approach

    International Nuclear Information System (INIS)

    Yamada, Hisao; Fukutomi, Hiroyuki; Lin, Shan; Ogata, Takashi

    2008-01-01

CRIEPI developed a high-speed simulation method to predict B-scope images for crack-like defects under ultrasonic testing. The method is based on the geometrical theory of diffraction (GTD) to follow ultrasonic waves transmitted from the angle probe, and uses reciprocity relations to derive analytical equations for the echoes received by the probe. The tip and mirror echoes from a slit of arbitrary angle in the through-thickness direction of the test article and of arbitrary depth can be calculated by this method. The main objective of the study was to develop a high-speed simulation tool to generate B-scope displays for crack-like defects. This was achieved for simple slits in geometry-change regions by prototype software based on the method. Fairly complete B-scope images of slits could be obtained in about a minute on a current personal computer. The numerical predictions for surface-opening slits were in excellent agreement with the corresponding experimental measurements. (author)
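    The arrival time of a tip-diffracted echo, the basic quantity behind each column of a B-scope image, follows from simple ray geometry. A minimal sketch assuming a direct straight-ray path and a nominal shear velocity; this is an illustration of the geometry only, not CRIEPI's GTD formulation:

```python
import math

def tip_echo_time_us(standoff_mm, depth_mm, velocity_mm_per_us=3.23):
    """Two-way time of flight for a tip-diffracted echo from a slit tip.
    standoff_mm: horizontal probe-to-tip distance; depth_mm: tip depth.
    Default velocity ~3.23 mm/us approximates shear waves in steel."""
    path = math.sqrt(standoff_mm ** 2 + depth_mm ** 2)
    return 2.0 * path / velocity_mm_per_us

t = tip_echo_time_us(30.0, 40.0)  # ~31 us for a 50 mm one-way path
```

    A GTD-based simulator additionally attaches an amplitude (diffraction coefficient) to each such ray, which is what makes rapid full B-scope prediction possible.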

  4. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM 2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2006-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-second month of development activities.

  5. DATABASE AND ANALYTICAL TOOL DEVELOPMENT FOR THE MANAGEMENT OF DATA DERIVED FROM US DOE (NETL) FUNDED FINE PARTICULATE (PM2.5) RESEARCH

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Charles G. Crawford

    2003-03-13

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase 1, which is currently in progress and will take twelve months to complete, will include the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. In Phase 2, which will be completed in the second year of the project, a platform for on-line data analysis will be developed. Phase 2 will include the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its sixth month of Phase

  6. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson P. Khosah; Frank T. Alex

    2007-02-11

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project is being conducted in two phases. Phase One includes the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two, which is currently underway, involves the development of a platform for on-line data analysis. Phase Two includes the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now into its forty-eighth month of development activities.

  7. Database and Analytical Tool Development for the Management of Data Derived from US DOE (NETL) Funded Fine Particulate (PM2.5) Research

    Energy Technology Data Exchange (ETDEWEB)

    Robinson Khosah

    2007-07-31

    Advanced Technology Systems, Inc. (ATS) was contracted by the U. S. Department of Energy's National Energy Technology Laboratory (DOE-NETL) to develop a state-of-the-art, scalable and robust web-accessible database application to manage the extensive data sets resulting from the DOE-NETL-sponsored ambient air monitoring programs in the upper Ohio River valley region. The data management system was designed to include a web-based user interface that will allow easy access to the data by the scientific community, policy- and decision-makers, and other interested stakeholders, while providing detailed information on sampling, analytical and quality control parameters. In addition, the system will provide graphical analytical tools for displaying, analyzing and interpreting the air quality data. The system will also provide multiple report generation capabilities and easy-to-understand visualization formats that can be utilized by the media and public outreach/educational institutions. The project was conducted in two phases. Phase One included the following tasks: (1) data inventory/benchmarking, including the establishment of an external stakeholder group; (2) development of a data management system; (3) population of the database; (4) development of a web-based data retrieval system, and (5) establishment of an internal quality assurance/quality control system on data management. Phase Two involved the development of a platform for on-line data analysis. Phase Two included the following tasks: (1) development of a sponsor and stakeholder/user website with extensive online analytical tools; (2) development of a public website; (3) incorporation of an extensive online help system into each website; and (4) incorporation of a graphical representation (mapping) system into each website. The project is now technically completed.

  8. A new tool for the evaluation of the analytical procedure: Green Analytical Procedure Index.

    Science.gov (United States)

    Płotka-Wasylka, J

    2018-05-01

    A new means for assessing analytical protocols with respect to green analytical chemistry attributes has been developed. The new tool, called GAPI (Green Analytical Procedure Index), evaluates the green character of an entire analytical methodology, from sample collection to final determination. It was created using tools such as the National Environmental Methods Index (NEMI) and the Analytical Eco-Scale to provide not only general but also qualitative information. In GAPI, a specific symbol with five pentagrams is used to evaluate and quantify the environmental impact of each step of an analytical methodology, coloured from green through yellow to red to depict low, medium and high impact, respectively. The proposed tool was used to evaluate analytical procedures applied in the determination of biogenic amines in wine samples and of polycyclic aromatic hydrocarbons by EPA methods. The GAPI tool not only provides an immediately perceptible perspective to the user/reader but also offers exhaustive information on the evaluated procedures. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates

    Directory of Open Access Journals (Sweden)

    Sund Björn

    2013-02-01

    Full Text Available Abstract Background Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. Methods and results This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. Conclusion We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.

  10. Developing an analytical tool for evaluating EMS system design changes and their impact on cardiac arrest outcomes: combining geographic information systems with register data on survival rates.

    Science.gov (United States)

    Sund, Björn

    2013-02-15

    Out-of-hospital cardiac arrest (OHCA) is a frequent and acute medical condition that requires immediate care. We estimate survival rates from OHCA in the area of Stockholm by developing an analytical tool for evaluating Emergency Medical Services (EMS) system design changes. The study is also an attempt to validate the proposed model used to generate the outcome measures for the study. This was done by combining a geographic information systems (GIS) simulation of driving times with register data on survival rates. The emergency resources comprised ambulance alone and ambulance plus fire services. The simulation model predicted a baseline survival rate of 3.9 per cent, and reducing the ambulance response time by one minute increased survival to 4.6 per cent. Adding the fire services as first responders (dual dispatch) increased survival to 6.2 per cent from the baseline level. The model predictions were validated using empirical data. We have presented an analytical tool that can easily be generalized to other regions or countries. The model can be used to predict outcomes of cardiac arrest prior to investment in EMS design changes that affect the alarm process, e.g. (1) static changes such as trimming the emergency call handling time or (2) dynamic changes such as the location of emergency resources or which resources should carry a defibrillator.
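The response-time effect reported above can be captured in a compact odds-based model. The sketch below is illustrative only: the baseline rate and the per-minute odds ratio are hypothetical parameters chosen to reproduce the 3.9 and 4.6 per cent figures quoted in the abstract, not values taken from the study's model.

```python
import math

def survival(delta_min, p0=0.039, odds_ratio_per_min=1.19):
    """Toy survival model for a shift of delta_min minutes in mean
    ambulance response time (negative = faster response).

    p0 and odds_ratio_per_min are hypothetical, chosen so that a one
    minute reduction moves survival from 3.9% to about 4.6%.
    """
    odds = p0 / (1.0 - p0)                      # baseline survival odds
    odds *= odds_ratio_per_min ** (-delta_min)  # faster response => higher odds
    return odds / (1.0 + odds)
```

With these parameters, `survival(0)` returns the 3.9 per cent baseline and `survival(-1)` (one minute faster) lands near 4.6 per cent, matching the figures above.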

  11. Chemometrics tools used in analytical chemistry: an overview.

    Science.gov (United States)

    Kumar, Naveen; Bansal, Ankit; Sarma, G S; Rawal, Ravindra K

    2014-06-01

    This article presents various important chemometric tools used to evaluate the data generated by various hyphenated analytical techniques, covering their applications from the advent of the field to today. The work is divided into sections covering multivariate regression methods and multivariate resolution methods; the final section deals with the applicability of chemometric tools in analytical chemistry. The main objective of this article is to review the chemometric methods used in analytical chemistry (qualitative/quantitative) to determine the elution sequence, classify various data sets, assess peak purity and estimate the number of chemical components. The reviewed methods can further be used for treating n-way data obtained by hyphenation of LC with multi-channel detectors. We prefer to provide a detailed view of the various important methods, together with their algorithms, so that they can be employed and understood by researchers not very familiar with chemometrics. Copyright © 2014 Elsevier B.V. All rights reserved.
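Principal component analysis underlies many of the multivariate methods reviewed in articles like this one. As a from-scratch illustration (real chemometric work would use a library routine over many spectral channels), the sketch below finds the eigenvalues of the 2x2 covariance matrix of two variables, i.e. the variances along the two principal axes:

```python
import math

def pca_2d(xs, ys):
    """Eigenvalues of the 2x2 covariance matrix of two variables,
    largest first.  For perfectly correlated data the second
    eigenvalue is zero: one component explains all the variance."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs) / (n - 1)
    syy = sum((y - my) ** 2 for y in ys) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / (n - 1)
    # eigenvalues of [[sxx, sxy], [sxy, syy]] via the quadratic formula
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    root = math.sqrt(max(tr * tr / 4 - det, 0.0))
    return tr / 2 + root, tr / 2 - root
```

For example, with `ys` exactly twice `xs` the second eigenvalue vanishes, which is how PCA reveals that a two-channel data set really carries one chemical component.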

  12. Artist - analytical RT inspection simulation tool

    International Nuclear Information System (INIS)

    Bellon, C.; Jaenisch, G.R.

    2007-01-01

    The computer simulation of radiography is applicable for different purposes in NDT, such as the qualification of NDT systems, the prediction of their reliability, the optimization of system parameters, feasibility analysis, model-based data interpretation, the education and training of NDT/NDE personnel, and others. Within the framework of the integrated project FilmFree, the radiographic testing (RT) simulation software developed by BAM is being further developed to meet practical requirements for inspection planning in digital industrial radiology. It combines analytical modelling of the RT inspection process with a CAD-orientated object description applicable to various industrial sectors such as power generation, railways and others. (authors)

  13. Analytical framework and tool kit for SEA follow-up

    International Nuclear Information System (INIS)

    Nilsson, Mans; Wiklund, Hans; Finnveden, Goeran; Jonsson, Daniel K.; Lundberg, Kristina; Tyskeng, Sara; Wallgren, Oskar

    2009-01-01

    Most Strategic Environmental Assessment (SEA) research and applications have so far neglected the ex post stages of the process, also called SEA follow-up. Tool kits and methodological frameworks for engaging effectively with SEA follow-up have been conspicuously missing. In particular, little has so far been learned from the much more mature evaluation literature, although many aspects are similar. This paper provides an analytical framework and tool kit for SEA follow-up. It is based on insights and tools developed within programme evaluation and environmental systems analysis. It is also grounded in empirical studies of real planning and programming practices at the regional level, but should have relevance for SEA processes at all levels. The purpose of the framework is to promote a learning-oriented and integrated use of SEA follow-up in strategic decision making. It helps to identify appropriate tools and their use in the process, and to systematise the use of available data and knowledge across the planning organization and process. It distinguishes three stages in follow-up: scoping, analysis and learning; identifies the key functions; and demonstrates the informational linkages to the strategic decision-making process. The associated tool kit includes specific analytical and deliberative tools. Many of these are applicable ex ante as well, but are then used in a predictive mode rather than on the basis of real data. The analytical element of the framework is organized on the basis of programme theory and 'DPSIR' tools. The paper discusses three issues in the application of the framework: understanding the integration of organizations and knowledge; understanding planners' questions and analytical requirements; and understanding interests, incentives and reluctance to evaluate.

  14. Commentary on "Theory-Led Design of Instruments and Representations in Learning Analytics: Developing a Novel Tool for Orchestration of Online Collaborative Learning"

    Science.gov (United States)

    Teplovs, Chris

    2015-01-01

    This commentary reflects on the contributions to learning analytics and theory by a paper that describes how multiple theoretical frameworks were woven together to inform the creation of a new, automated discourse analysis tool. The commentary highlights the contributions of the original paper, provides some alternative approaches, and touches on…

  15. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays.

    Science.gov (United States)

    Hsieh, Helen V; Dantzler, Jeffrey L; Weigl, Bernhard H

    2017-05-28

    Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor's office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.
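The mechanistic models mentioned above typically start from first-order binding kinetics at the test line. The sketch below integrates dB/dt = kon·A·(Bmax − B) − koff·B with explicit Euler steps; the rate constants are illustrative placeholders, not measured LFA values:

```python
def bound_signal(analyte_molar, bmax=1.0, kon=1e5, koff=1e-3,
                 dt=0.1, t_end=5000.0):
    """Euler integration of first-order binding at the test line:
        dB/dt = kon * A * (Bmax - B) - koff * B
    Returns the bound fraction B after t_end seconds.  Rate constants
    are illustrative, not fitted to any real assay."""
    b = 0.0
    for _ in range(int(t_end / dt)):
        b += dt * (kon * analyte_molar * (bmax - b) - koff * b)
    return b
```

At equilibrium this reproduces the Langmuir isotherm B = Bmax·A/(A + Kd) with Kd = koff/kon, so at an analyte concentration equal to Kd the test line reaches half its maximum signal — the kind of relationship such models exploit when optimizing sensitivity.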

  16. Evaluation of Linked Data tools for Learning Analytics

    NARCIS (Netherlands)

    Hendrik, Drachsler; Herder, Eelco; d'Aquin, Mathieu; Dietze, Stefan

    2013-01-01

    Drachsler, H., Herder, E., d'Aquin, M., & Dietze, S. (2013, 8-12 April). Evaluation of Linked Data tools for Learning Analytics. Presentation given in the tutorial on 'Using Linked Data for Learning Analytics' at LAK2013, the Third Conference on Learning Analytics and Knowledge, Leuven, Belgium.

  17. PIXE as an analytical/educational tool

    International Nuclear Information System (INIS)

    Williams, E.T.

    1991-01-01

    The advantages and disadvantages of PIXE as an analytical method are summarized. The authors also discuss the advantages of PIXE as a means of providing interesting and feasible research projects for undergraduate science students

  18. Coupling databases and advanced analytic tools (R)

    OpenAIRE

    Seakomo, Saviour Sedem Kofi

    2014-01-01

    Today, many contemporary organizations collect various kinds of data, creating large data repositories. But the capacity to perform advanced analytics over these large amounts of data stored in databases remains a significant challenge for statistical software (R, S, SAS, SPSS, etc.) and data management systems (DBMSs). This is because, while statistical software provides comprehensive analytics and modelling functionality, it can only handle limited amounts of data. The data management sys...
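The pattern investigated here — pushing computation into the database so that only small summaries cross into the statistical environment — can be sketched with Python's built-in sqlite3 module standing in for a production DBMS; the table and column names are invented for the example:

```python
import sqlite3

# In-memory demo: aggregate inside the database engine instead of
# pulling every raw row into the analytics environment.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE readings (site TEXT, pm25 REAL)")
conn.executemany("INSERT INTO readings VALUES (?, ?)",
                 [("A", 10.0), ("A", 14.0), ("B", 30.0), ("B", 34.0)])

# Only the per-site summary crosses the DB/analytics boundary.
summary = conn.execute(
    "SELECT site, AVG(pm25), COUNT(*) FROM readings GROUP BY site ORDER BY site"
).fetchall()
```

The same division of labour — `GROUP BY` in the engine, modelling in R or Python — is what lets statistical tools work on data sets far larger than they could hold in memory.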

  19. Analytical and numerical tools for vacuum systems

    CERN Document Server

    Kersevan, R

    2007-01-01

    Modern particle accelerators have reached a level of sophistication which requires a thorough analysis of all their sub-systems. Among the latter, the vacuum system is often a major contributor to the operating performance of a particle accelerator. The vacuum engineer nowadays has a large choice of computational schemes and tools for the correct analysis, design, and engineering of the vacuum system. This paper is a review of the different types of algorithms and methodologies which have been developed and employed in the field since the birth of vacuum technology. The difference in level of detail between simple back-of-the-envelope calculations and more complex numerical analysis is discussed by means of comparisons. The domain of applicability of each method is discussed, together with its pros and cons.
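The back-of-the-envelope end of that spectrum can be illustrated with the classic molecular-flow formula for a long round tube carrying room-temperature air, C ≈ 12.1·d³/L (d and L in cm, C in l/s), together with the reciprocal-sum rule for conductances in series:

```python
def tube_conductance_air(d_cm, l_cm):
    """Molecular-flow conductance of a long round tube for air at ~20 C,
    using the textbook approximation C [l/s] ~= 12.1 * d**3 / L."""
    return 12.1 * d_cm ** 3 / l_cm

def series_conductance(*cs):
    """Conductances in series combine reciprocally, like parallel
    resistors: 1/C_total = sum(1/C_i)."""
    return 1.0 / sum(1.0 / c for c in cs)
```

A 10 cm bore, 1 m long beam pipe thus conducts about 121 l/s, and two such pipes in series only 60.5 l/s — the kind of quick estimate that precedes any detailed Monte Carlo analysis.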

  20. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    OpenAIRE

    Peraman, Ramalingam; Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for the quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the...

  1. Factors Influencing Beliefs for Adoption of a Learning Analytics Tool: An Empirical Study

    Science.gov (United States)

    Ali, Liaqat; Asadi, Mohsen; Gasevic, Dragan; Jovanovic, Jelena; Hatala, Marek

    2013-01-01

    Present research and development offer various learning analytics tools providing insights into different aspects of learning processes. Adoption of a specific tool for practice is based on how its learning analytics are perceived by educators to support their pedagogical and organizational goals. In this paper, we propose and empirically validate…

  2. Analytical Aerodynamic Simulation Tools for Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Deglaire, Paul

    2010-01-01

    Wind power is a renewable energy source that is today the fastest-growing means of reducing CO2 emissions in the electric energy mix. The upwind horizontal axis wind turbine with three blades has been the preferred technical choice for more than two decades, and this horizontal axis concept today leads the market. The current PhD thesis covers an alternative type of wind turbine, with straight blades rotating about the vertical axis. A brief overview of the main differences between the horizontal and vertical axis concepts is given. However, the main focus of this thesis is the aerodynamics of the wind turbine blades. Making aerodynamically efficient turbines starts with efficient blades, and making efficient blades requires a good understanding of the physical phenomena and effective simulation tools to model them. The specific aerodynamics of straight-bladed vertical axis turbine flow are reviewed, together with the standard aerodynamic simulation tools that have been used in the past by blade and rotor designers. A reasonably fast (regarding computer power) and accurate (regarding comparison with experimental results) simulation method was still lacking in the field prior to the current work; this thesis aims at designing such a method. Analytical methods can be used to model complex flow if the geometry is simple. Therefore, a conformal mapping method is derived to transform any set of sections into a set of standard circles. Analytical procedures are then generalized to simulate moving multibody sections in the complex vertical flows and the forces experienced by the blades. Finally, the fast semi-analytical aerodynamic algorithm, boosted by fast multipole methods to handle high numbers of vortices, is coupled with a simple structural model of the rotor to investigate potential aeroelastic instabilities.
    Together with these advanced simulation tools, a standard double multiple streamtube model has been developed and used to design several straight bladed
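The conformal mapping idea at the core of such blade-flow methods can be illustrated with its textbook special case, the Joukowski transform (the thesis derives a more general mapping for arbitrary sections):

```python
import cmath

def joukowski(z, c=1.0):
    """Joukowski conformal map w = z + c**2 / z.  Circles enclosing
    z = -c and passing near z = +c map to airfoil-like profiles;
    the unit circle itself (c = 1) collapses onto the real slit [-2, 2]."""
    return z + c * c / z
```

Because the map is conformal, the known analytical flow around a circle can be transported to the flow around the mapped airfoil shape, which is why reducing arbitrary blade sections to standard circles makes fast analytical simulation possible.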

  3. New authentication mechanism using certificates for big data analytic tools

    OpenAIRE

    Velthuis, Paul

    2017-01-01

    Companies analyse large amounts of sensitive data on clusters of machines, using a framework such as Apache Hadoop to handle inter-process communication, and big data analytic tools such as Apache Spark and Apache Flink to analyse the growing amounts of data. Big data analytic tools are mainly tested for performance and reliability; security and authentication have not been sufficiently considered and lag behind. The goal of this research is to improve the authentication and security for data ...
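Certificate-based authentication of the kind proposed can be sketched with Python's standard ssl module. The configuration below shows only the mutual-TLS idea — each cluster node presents a certificate and demands one from its peer — and the certificate paths in the comments are hypothetical:

```python
import ssl

# Mutual TLS between cluster nodes: each side must present a certificate
# signed by the cluster CA before any data is exchanged.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
ctx.verify_mode = ssl.CERT_REQUIRED           # reject peers without a valid cert
ctx.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocol versions
# A real deployment would then load the node's own credentials and the
# cluster CA, e.g. (hypothetical paths):
#   ctx.load_cert_chain("node.pem", "node.key")
#   ctx.load_verify_locations("cluster-ca.pem")
```

Wrapping the cluster's sockets in such a context replaces weaker shared-secret schemes with per-node identities that can be issued and revoked centrally.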

  4. Web analytics tools and web metrics tools: An overview and comparative analysis

    OpenAIRE

    Bekavac, Ivan; Garbin Praničević, Daniela

    2015-01-01

    The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. First, a qualitative focus is placed on reviewing web analytic...

  5. Game development tool essentials

    CERN Document Server

    Berinstein, Paula; Ardolino, Alessandro; Franco, Simon; Herubel, Adrien; McCutchan, John; Nedelcu, Nicusor; Nitschke, Benjamin; Olmstead, Don; Robinet, Fabrice; Ronchi, Christian; Turkowski, Rita; Walter, Robert; Samour, Gustavo

    2014-01-01

    Offers game developers new techniques for streamlining the critical game tools pipeline. Inspires game developers to share their secrets and improve the productivity of the entire industry. Helps game industry practitioners compete in a hyper-competitive environment.

  6. Analytical Tools to Improve Optimization Procedures for Lateral Flow Assays

    Directory of Open Access Journals (Sweden)

    Helen V. Hsieh

    2017-05-01

    Full Text Available Immunochromatographic or lateral flow assays (LFAs) are inexpensive, easy to use, point-of-care medical diagnostic tests that are found in arenas ranging from a doctor’s office in Manhattan to a rural medical clinic in low resource settings. The simplicity in the LFA itself belies the complex task of optimization required to make the test sensitive, rapid and easy to use. Currently, the manufacturers develop LFAs by empirical optimization of material components (e.g., analytical membranes, conjugate pads and sample pads), biological reagents (e.g., antibodies, blocking reagents and buffers) and the design of delivery geometry. In this paper, we will review conventional optimization and then focus on the latter and outline analytical tools, such as dynamic light scattering and optical biosensors, as well as methods, such as microfluidic flow design and mechanistic models. We are applying these tools to find non-obvious optima of lateral flow assays for improved sensitivity, specificity and manufacturing robustness.

  7. Electronic tongue: An analytical gustatory tool

    Directory of Open Access Journals (Sweden)

    Rewanthwar Swathi Latha

    2012-01-01

    Full Text Available Taste is an important organoleptic property governing acceptance of products administered through the mouth. The majority of drugs available, however, are bitter in taste. For patient acceptability and compliance, bitter-tasting drugs are masked by adding several flavoring agents; taste assessment is therefore an important quality control parameter for evaluating taste-masked formulations. The primary method for taste measurement of drug substances and formulations is human panelists. The use of sensory panelists is very difficult and problematic in industry due to the potential toxicity of drugs and the subjectivity of taste panelists; recruiting panelists is difficult, and motivation and panel maintenance are significant problems when working with unpleasant products. Furthermore, Food and Drug Administration (FDA)-unapproved molecules cannot be tested. Therefore, an analytical taste-sensing multichannel sensory system, called the electronic tongue (e-tongue or artificial tongue), which can assess taste, has been replacing sensory panelists. The e-tongue thus offers benefits such as reduced reliance on human panels. The present review focuses on the electrochemical concepts in instrumentation, performance qualification of the e-tongue, and applications in various fields.

  8. Analytical Tools for the Routine Use of the UMAE

    International Nuclear Information System (INIS)

    D'Auria, F.; Eramo, A.; Gianotti, W.

    1998-01-01

    UMAE (Uncertainty Methodology based on Accuracy Extrapolation) is a methodology developed to calculate the uncertainties in thermal-hydraulic code results for nuclear power plant transient analysis. Use of the methodology has shown the need for specific analytical tools to simplify some steps of its application and to make individual procedures adopted in its development clearer. Three of these tools have recently been completed and are illustrated in this paper. The first makes it possible to attribute ''weight factors'' to the experimental Integral Test Facilities; results are also shown. The second deals with the calculation of the accuracy of a code result: the computer program concerned compares experimental and calculated trends of any quantity and gives as output the accuracy of the calculation. The third is a computer program suitable for deriving continuous uncertainty bands from single-valued points. (author)
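The experimental-versus-calculated comparison step can be illustrated with a simple pointwise metric. Note this is an illustrative stand-in — a mean fractional deviation — not the accuracy definition actually used in UMAE:

```python
def average_fractional_error(measured, calculated):
    """Average fractional deviation between an experimental trend and a
    code-calculated trend over a transient (illustrative metric only,
    not the licensed UMAE accuracy definition)."""
    pairs = list(zip(measured, calculated))
    return sum(abs(c - m) / abs(m) for m, c in pairs) / len(pairs)
```

Feeding it two sampled time histories of the same quantity yields a single number that can then be compared or extrapolated across facilities, which is the role accuracy quantification plays in the methodology.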

  9. Enabling analytical and Modeling Tools for Enhanced Disease Surveillance

    Energy Technology Data Exchange (ETDEWEB)

    Dawn K. Manley

    2003-04-01

    Early detection, identification, and warning are essential to minimize casualties from a biological attack. For covert attacks, sick people are likely to provide the first indication of an attack. An enhanced medical surveillance system that synthesizes distributed health indicator information and rapidly analyzes the information can dramatically increase the number of lives saved. Current surveillance methods to detect both biological attacks and natural outbreaks are hindered by factors such as distributed ownership of information, incompatible data storage and analysis programs, and patient privacy concerns. Moreover, because data are not widely shared, few data mining algorithms have been tested on and applied to diverse health indicator data. This project addressed both the integration of multiple data sources and the development and integration of analytical tools for rapid detection of disease outbreaks. As a first prototype, we developed an application to query and display distributed patient records. This application incorporated need-to-know access control and drew data from standard commercial databases. We developed and tested two different algorithms for outbreak recognition. The first is a pattern recognition technique that searches for space-time data clusters that may signal a disease outbreak. The second is a genetic algorithm to design and train neural networks (GANN) that we applied toward disease forecasting. We tested these algorithms against influenza, respiratory illness, and Dengue Fever data. Through this LDRD in combination with other internal funding, we delivered a distributed simulation capability to synthesize disparate information and models for earlier recognition and improved decision-making in the event of a biological attack. The architecture incorporates user feedback and control so that a user's decision inputs can impact the scenario outcome as well as integrated security and role-based access-control for communicating
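The project's outbreak-recognition algorithms (space-time clustering and GANN) are more elaborate, but the underlying idea — flagging sustained deviations of daily case counts from a baseline — can be sketched with a one-sided CUSUM chart; the allowance and threshold values below are illustrative, not tuned to any real surveillance stream:

```python
def cusum_alerts(counts, baseline, k=0.5, h=4.0):
    """One-sided CUSUM over daily case counts.

    baseline is (mean, std) of the expected count; k is the allowance
    and h the decision threshold, both in standard-deviation units.
    Returns the day indices on which the alarm is raised."""
    mean, std = baseline
    s, alerts = 0.0, []
    for i, c in enumerate(counts):
        z = (c - mean) / std          # standardise the day's count
        s = max(0.0, s + z - k)       # accumulate only upward deviations
        if s > h:
            alerts.append(i)
    return alerts
```

Because the statistic accumulates evidence across days, CUSUM reacts faster to a modest sustained rise than a per-day threshold would, which is the property outbreak-detection systems exploit.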

  10. FUMAC-84. A hybrid PCI analytical tool

    International Nuclear Information System (INIS)

    Matheson, J.E.; Walton, L.A.

    1984-01-01

    ''FUMAC-84'', a new computer code currently under development at Babcock and Wilcox, will be used to analyze PCMI in light water reactor fuel rods. This is a hybrid code in the sense that pellet behaviour is predicted from deterministic models which incorporate the large data base being generated by the international fuel performance programs (OVERRAMP, SUPER-RAMP, NFIR, etc.), while the cladding is modelled using finite elements. The fuel cracking and relocation model developed for FUMAC is semi-empirical and includes data up to 35 GWd/mtU and linear heat rates ranging from 100 to 700 W/cm. With this model the onset of cladding ridging has been accurately predicted for steady-state operation. Transient behaviour of the pellet is still under investigation and the model is being enhanced to include these effects. The cladding model integrates the mechanical damage over a power history by solving the finite element assumed-displacement problem in a quasistatic manner. Early work on FUMAC-84 has been directed at the development and benchmarking of the interim code, whose purpose is to provide a vehicle to prove out the deterministic pellet models which have been developed. To date the cracking model and the relocation model have been benchmarked. The thermal model for the pellet was developed by fitting data from several Halden experiments. The ability to accurately predict cladding ridging behaviour has been used to test how well the pellet swelling, densification and compliance models work in conjunction with fuel cladding material models. Reasonable results have been achieved for the steady-state cases, while difficulty has been encountered in trying to reproduce transient results. Current work includes an effort to improve the ability of the models to handle transients. (author)

  11. The use of generalised audit software by internal audit functions in a developing country: The purpose of the use of generalised audit software as a data analytics tool

    Directory of Open Access Journals (Sweden)

    D.P. van der Nest

    2017-11-01

    Full Text Available This article explores the purpose of the use of generalised audit software as a data analytics tool by internal audit functions in the locally controlled banking industry of South Africa. The evolution of the traditional internal audit methodology of collecting audit evidence through the conduct of interviews, the completion of questionnaires, and the testing of controls on a sample basis is long overdue, and such practice in the present technological, data-driven era will soon render an internal audit function obsolete. The research results indicate that respondents are utilising GAS for a variety of purposes, but that its frequency of use is not yet optimal and that there is still much room for improvement for tests-of-controls purposes. The top five purposes for which the respondents make use of GAS often to always during separate internal audit engagements are: (1) to identify transactions with specific characteristics or control criteria for tests of control purposes; (2) to conduct full population analysis; (3) to identify account balances over a certain amount; (4) to identify and report on the frequency of occurrence of risks or of specific events; and (5) to obtain audit evidence about control effectiveness.
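A full-population test of the kind listed under purposes (1) and (2) reduces, in code, to filtering every record against the control criterion rather than inspecting a sample. The field names and the authorisation limit below are invented for the example:

```python
# Hypothetical transaction records; field names are illustrative only.
transactions = [
    {"id": 1, "amount": 4_900,   "approved_by": "clerk"},
    {"id": 2, "amount": 125_000, "approved_by": "clerk"},
    {"id": 3, "amount": 88_000,  "approved_by": "manager"},
]

# Control: every transaction above the authorisation limit must carry
# manager approval.  The test runs over the full population -- no sampling.
LIMIT = 50_000
exceptions = [t["id"] for t in transactions
              if t["amount"] > LIMIT and t["approved_by"] != "manager"]
```

The auditor then investigates only the exception list — here transaction 2 — instead of extrapolating from a sample, which is precisely the advantage GAS offers over the traditional methodology.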

  12. The Bio-Analytic Resource: Data visualization and analytic tools for multiple levels of plant biology

    Directory of Open Access Journals (Sweden)

    Jamie Waese

    2016-11-01

    Full Text Available The Bio-Analytic Resource for Plant Biology (BAR) is a portal for accessing large data sets from approximately 15 different plant species, with a focus on transcriptomic, protein-protein interaction, and promoter data. It consists of numerous databases for which its curators have added useful metadata, data visualization tools to display the query results from these databases, and visual analytic tools to identify, e.g., gene expression patterns of interest based on publicly available data. We briefly cover some of these tools and scenarios in which they might be useful for plant researchers.

  13. Landscape History and Theory: from Subject Matter to Analytical Tool

    Directory of Open Access Journals (Sweden)

    Jan Birksted

    2003-10-01

    Full Text Available This essay explores how landscape history can engage methodologically with the adjacent disciplines of art history and visual/cultural studies. Central to the methodological problem is the mapping of the beholder - spatially, temporally and phenomenologically. In this mapping process, landscape history is transformed from subject matter to analytical tool. As a result, landscape history no longer simply imports and applies ideas from other disciplines but develops its own methodologies to engage and influence them. Landscape history, like art history, thereby takes on a creative cultural presence. Through that process, landscape architecture and garden design regain the cultural power now carried by the arts and museum studies, and have an effect on the innovative capabilities of contemporary landscape design.

  14. Metal-hexacyanoferrate films: a tool in analytical Chemistry

    OpenAIRE

    Mattos, Ivanildo Luiz de; Gorton, Lo

    2001-01-01

    Chemically modified electrodes based on hexacyanometalate films are presented as a tool in analytical chemistry. The use of amperometric sensors and/or biosensors based on metal-hexacyanoferrate films is a growing trend. This article reviews some applications of these films for the analytical determination of both inorganic (e.g. As3+, S2O3(2-)) and organic (e.g. cysteine, hydrazine, ascorbic acid, glutathione, glucose) compounds.

  15. Web analytics tools and web metrics tools: An overview and comparative analysis

    Directory of Open Access Journals (Sweden)

    Ivan Bekavac

    2015-10-01

    Full Text Available The aim of the paper is to compare and analyze the impact of web analytics tools for measuring the performance of a business model. Accordingly, an overview of web analytics and web metrics tools is given, including their characteristics, main functionalities and available types. The data acquisition approaches and the proper choice of web tools for particular business models are also reviewed. The research is divided into two sections. The first places a qualitative focus on reviewing web analytics tools, exploring their functionalities and their ability to be integrated into the respective business model. Web analytics tools support the business analyst’s efforts in obtaining useful and relevant insights into market dynamics. Thus, generally speaking, selecting a web analytics and web metrics tool should be based on an investigative approach, not a random decision. The second section shifts from theory to an empirical approach and presents output data from a study of perceived user satisfaction with web analytics tools. The empirical study was carried out on employees from 200 Croatian firms, from either the IT or the marketing branch. The paper contributes to highlighting the support for management that the web analytics and web metrics tools available on the market have to offer, based on the growing need to understand and predict global market trends.

  16. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    OpenAIRE

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted a high interest by the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characteri...

  17. Network analytical tool for monitoring global food safety highlights China.

    Directory of Open Access Journals (Sweden)

    Tamás Nepusz

    Full Text Available BACKGROUND: The Beijing Declaration on food safety and security was signed by over fifty countries with the aim of developing comprehensive programs for monitoring food safety and security on behalf of their citizens. Currently, comprehensive systems for food safety and security are absent in many countries, and the systems that are in place have been developed on different principles, allowing poor opportunities for integration. METHODOLOGY/PRINCIPAL FINDINGS: We have developed a user-friendly analytical tool based on network approaches for instant customized analysis of food alert patterns in the European dataset from the Rapid Alert System for Food and Feed. Data taken from alert logs between January 2003 and August 2008 were processed using network analysis to (i) capture complexity, (ii) analyze trends, and (iii) predict possible effects of interventions by identifying patterns of reporting activities between countries. The detector and transgressor relationships are readily identifiable between countries, which are ranked using (i) Google's PageRank algorithm and (ii) the HITS algorithm of Kleinberg. The program identifies Iran, China and Turkey as the transgressors with the largest number of alerts. However, when characterized by impact, counting the transgressor index and the number of countries involved, China predominates as a transgressor country. CONCLUSIONS/SIGNIFICANCE: This study reports the first development of a network analysis approach to inform countries on their transgressor and detector profiles as a user-friendly aid for the adoption of the Beijing Declaration. The ability to instantly access the country-specific components of the several thousand annual reports will enable each country to identify the major transgressors and detectors within its trading network. Moreover, the tool can be used to monitor trading countries for improved detector/transgressor ratios.
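
    The ranking step can be illustrated with a toy version of the alert network: each alert is an edge from the detecting country to the country of origin, and Kleinberg's HITS scheme (a plain power iteration, sketched here without the PageRank variant) scores transgressors by authority weight and detectors by hub weight. The country codes and edges are invented for illustration, not taken from the RASFF data.

```python
# Toy alert network: an edge points from the detecting country to the country of
# origin of the reported product. HITS power iteration then ranks countries:
# high authority = heavily reported transgressor, high hub = active detector.
alerts = [
    ("DE", "CN"), ("FR", "CN"), ("IT", "CN"), ("UK", "CN"),
    ("DE", "TR"), ("FR", "TR"),
    ("IT", "IR"),
]
nodes = {n for edge in alerts for n in edge}
hub = {n: 1.0 for n in nodes}
auth = {n: 1.0 for n in nodes}

for _ in range(50):  # power iteration until (approximate) convergence
    auth = {n: sum(hub[src] for src, dst in alerts if dst == n) for n in nodes}
    hub = {n: sum(auth[dst] for src, dst in alerts if src == n) for n in nodes}
    norm_a = sum(auth.values()) or 1.0
    norm_h = sum(hub.values()) or 1.0
    auth = {n: v / norm_a for n, v in auth.items()}
    hub = {n: v / norm_h for n, v in hub.items()}

top_transgressor = max(auth, key=auth.get)  # highest authority score
print(top_transgressor)
```

    In this invented graph "CN" receives the most alert edges and therefore dominates the authority ranking, mirroring the way the real tool singles out the most-reported transgressor.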

  18. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    Directory of Open Access Journals (Sweden)

    Ramalingam Peraman

    2015-01-01

    Full Text Available Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT).

  19. Analytical Quality by Design: A Tool for Regulatory Flexibility and Robust Analytics

    Science.gov (United States)

    Bhadraya, Kalva; Padmanabha Reddy, Yiragamreddy

    2015-01-01

    Very recently, the Food and Drug Administration (FDA) has approved a few new drug applications (NDAs) with regulatory flexibility for a quality by design (QbD)-based analytical approach. The concept of QbD applied to analytical method development is now known as AQbD (analytical quality by design). It allows the analytical method to move within the method operable design region (MODR). Unlike current methods, an analytical method developed using the analytical quality by design (AQbD) approach reduces the number of out-of-trend (OOT) results and out-of-specification (OOS) results due to the robustness of the method within the region. It is a current trend in the pharmaceutical industry to implement analytical quality by design (AQbD) in the method development process as a part of risk management, pharmaceutical development, and the pharmaceutical quality system (ICH Q10). Owing to the lack of explanatory reviews, this paper discusses different views of analytical scientists about the implementation of AQbD in the pharmaceutical quality system and also correlates it with product quality by design and pharmaceutical analytical technology (PAT). PMID:25722723

  20. Learning Analytics: drivers, developments and challenges

    Directory of Open Access Journals (Sweden)

    Rebecca Ferguson

    2014-12-01

    Full Text Available Learning analytics is a significant area of Technology-Enhanced Learning (TEL) that has emerged during the last decade. This review of the field begins with an examination of the technological, educational and political factors that have driven the development of analytics in educational settings. It goes on to chart the emergence of learning analytics, including their origins in the 20th century, the development of data-driven analytics, the rise of learning-focused perspectives and the influence of national economic concerns. It next focuses on the relationships between learning analytics, educational data mining and academic analytics. Finally, it examines developing areas of learning analytics research, and identifies a series of future challenges.

  1. An analytical framework and tool ('InteRa') for integrating the informal recycling sector in waste and resource management systems in developing countries.

    Science.gov (United States)

    Velis, Costas A; Wilson, David C; Rocca, Ondina; Smith, Stephen R; Mavropoulos, Antonis; Cheeseman, Chris R

    2012-09-01

    In low- and middle-income developing countries, the informal (collection and) recycling sector (here abbreviated IRS) is an important, but often unrecognised, part of a city's solid waste and resources management system. Recent evidence shows recycling rates of 20-30% achieved by IRS systems, reducing collection and disposal costs. They play a vital role in the value chain by reprocessing waste into secondary raw materials, providing a livelihood to around 0.5% of urban populations. However, persisting factual and perceived problems are associated with IRS (waste-picking): occupational and public health and safety (H&S), child labour, uncontrolled pollution, untaxed activities, crime and political collusion. Increasingly, incorporating IRS as a legitimate stakeholder and functional part of solid waste management (SWM) is attempted, further raising recycling rates in an affordable way while also addressing the negatives. Based on a literature review and a practitioner's workshop, here we develop a systematic framework--or typology--for classifying and analysing possible interventions to promote the integration of IRS in a city's SWM system. Three primary interfaces are identified: between the IRS and the SWM system, the materials and value chain, and society as a whole; underlain by a fourth, which is focused on organisation and empowerment. To maximise the potential for success, IRS integration/inclusion/formalisation initiatives should consider all four categories in a balanced way and pay increased attention to their interdependencies, which are central to success, including specific actions, such as the IRS having access to source-separated waste. A novel rapid evaluation and visualisation tool is presented--integration radar (diagram), or InteRa--aimed at illustrating the degree to which a planned or existing intervention considers each of the four categories. The tool is further demonstrated by application to 10 cases around the world, including a step
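
    The radar-style evaluation can be reduced to a minimal numeric sketch: score a planned intervention on each of the four categories and flag the weakest interface. The 0-1 scale and the scores below are invented conventions for illustration; InteRa itself is a visual tool, not this calculation.

```python
# Minimal InteRa-style evaluation sketch: score an intervention on the four
# integration categories and flag the weakest one. Category names follow the
# abstract; the 0-1 scale and the scores themselves are made up for illustration.
categories = {
    "IRS <-> solid waste management system": 0.8,
    "IRS <-> materials and value chain": 0.6,
    "IRS <-> society": 0.3,
    "organisation and empowerment": 0.5,
}

mean_score = sum(categories.values()) / len(categories)
weakest = min(categories, key=categories.get)
print(f"mean integration {mean_score:.2f}; weakest interface: {weakest}")
```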

  2. Big Data Analytics Tools as Applied to ATLAS Event Data

    CERN Document Server

    Vukotic, Ilija; The ATLAS collaboration

    2016-01-01

    Big Data technologies have proven to be very useful for the storage, processing and visualization of derived metrics associated with ATLAS distributed computing (ADC) services. Log file data, database records, and metadata from a diversity of systems have been aggregated and indexed to create an analytics platform for ATLAS ADC operations analysis. Dashboards, wide area data access cost metrics, user analysis patterns, and resource utilization efficiency charts are produced flexibly through queries against a powerful analytics cluster. Here we explore whether these techniques and this analytics ecosystem can be applied to add new modes of open, quick, and pervasive access to ATLAS event data so as to simplify access and broaden the reach of ATLAS public data to new communities of users. An ability to efficiently store, filter, search and deliver ATLAS data at the event and/or sub-event level in a widely supported format would enable or significantly simplify usage of big data, statistical and machine learning tools...

  3. Analytical Modelling Of Milling For Tool Design And Selection

    Science.gov (United States)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-05-01

    This paper presents an efficient analytical model which makes it possible to simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach of oblique cutting is applied to predict forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.

  4. Analytical Modelling Of Milling For Tool Design And Selection

    International Nuclear Information System (INIS)

    Fontaine, M.; Devillez, A.; Dudzinski, D.

    2007-01-01

    This paper presents an efficient analytical model which makes it possible to simulate a wide range of milling operations. A geometrical description of common end mills and of their engagement in the workpiece material is proposed. The internal radius of the rounded part of the tool envelope is used to define the considered type of mill. The cutting edge position is described for a constant lead helix and for a constant local helix angle. A thermomechanical approach of oblique cutting is applied to predict forces acting on the tool, and these results are compared with experimental data obtained from milling tests on a 42CrMo4 steel for three classical types of mills. The influence of some of the tool's geometrical parameters on predicted cutting forces is presented in order to propose optimisation criteria for the design and selection of cutting tools.
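
    The thermomechanical force model of the paper is not reproduced here, but the overall structure (chip thickness from the engagement geometry, then a force law per engaged tooth) can be sketched with the much simpler classical Kienzle relation. All coefficients below are assumed placeholder values for a 42CrMo4-like steel, not the authors' data.

```python
# Simplified milling-force sketch: undeformed chip thickness from the tooth
# engagement angle, then the classical Kienzle law F_c = k_c1 * b * h**(1 - m).
# This stands in for, and is NOT, the paper's thermomechanical oblique-cutting model.
import math

k_c1 = 2500.0   # N/mm^2, specific cutting force at h = 1 mm (assumed)
m = 0.25        # Kienzle exponent (assumed)
b = 3.0         # chip width / axial depth of cut, mm
f_z = 0.1       # feed per tooth, mm

def tangential_force(phi):
    """Tangential cutting force for one tooth at engagement angle phi (rad)."""
    h = f_z * math.sin(phi)  # undeformed chip thickness
    if h <= 0:
        return 0.0
    return k_c1 * b * h ** (1 - m)

# Peak force for one tooth over a half revolution of engagement.
peak = max(tangential_force(math.radians(a)) for a in range(0, 181))
print(round(peak, 1))
```

    The peak occurs at 90 degrees of engagement, where the chip thickness equals the feed per tooth; in the paper this geometric step is where the end-mill envelope description enters.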

  5. Applications of Spatial Data Using Business Analytics Tools

    Directory of Open Access Journals (Sweden)

    Anca Ioana ANDREESCU

    2011-12-01

    Full Text Available This paper addresses the possibilities of using spatial data in business analytics tools, with emphasis on SAS software. Various kinds of map data sets containing spatial data are presented and discussed. Examples of map charts illustrating macroeconomic parameters demonstrate the application of spatial data for the creation of map charts in SAS Enterprise Guide. Extended features of map charts are exemplified by producing charts via SAS programming procedures.

  6. Developing Foucault's Discourse Analytic Methodology

    Directory of Open Access Journals (Sweden)

    Rainer Diaz-Bone

    2006-01-01

    Full Text Available A methodological position for a FOUCAULTian discourse analysis is presented. A sequence of analytical steps is introduced and an illustrating example is offered. It is emphasized that discourse analysis has to discover the system-level of discursive rules and the deeper structure of the discursive formation; otherwise the analysis will be unfinished. Michel FOUCAULT's work is theoretically grounded in French structuralism and (so-called) post-structuralism. In this paper, post-structuralism is not conceived as a means of overcoming structuralism, but as a way of critically continuing the structural perspective. In this way, discursive structures can be related to discursive practices and the concept of structure can be disclosed (e.g. to inter-discourse or DERRIDA's concept of structurality). In this way, the structural methodology is continued and radicalized, but not given up. In this paper, FOUCAULT's theory is combined with the works of Michel PÊCHEUX and (especially for the sociology of knowledge and the sociology of culture) Pierre BOURDIEU. The practice of discourse analysis is theoretically grounded. This practice can be conceived as a reflexive coupling of deconstruction and reconstruction in the material to be analyzed. This methodology can therefore be characterized as a reconstructive qualitative methodology. At the end of the article, forms of discourse analysis are criticized that do not intend to recover the system level of discursive rules and do not intend to discover the deeper structure of the discursive formation (i.e. episteme, socio-episteme). These forms are merely commentaries on discourses (not analyses of them); they remain phenomenological and are therefore pre-structuralist. URN: urn:nbn:de:0114-fqs060168

  7. Ultramicroelectrode Array Based Sensors: A Promising Analytical Tool for Environmental Monitoring

    Directory of Open Access Journals (Sweden)

    Cecilia Jiménez-Jorquera

    2010-01-01

    Full Text Available The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of the theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed.

  8. Ultramicroelectrode array based sensors: a promising analytical tool for environmental monitoring.

    Science.gov (United States)

    Orozco, Jahir; Fernández-Sánchez, César; Jiménez-Jorquera, Cecilia

    2010-01-01

    The particular analytical performance of ultramicroelectrode arrays (UMEAs) has attracted high interest from the research community and has led to the development of a variety of electroanalytical applications. UMEA-based approaches have been demonstrated to be powerful, simple, rapid and cost-effective analytical tools for environmental analysis compared to available conventional electrodes and standardised analytical techniques. An overview of the fabrication processes of UMEAs, their characterization and applications carried out by the Spanish scientific community is presented. A brief explanation of theoretical aspects that highlight their electrochemical behavior is also given. Finally, the applications of this transducer platform in the environmental field are discussed.

  9. Doehlert matrix: a chemometric tool for analytical chemistry-review.

    Science.gov (United States)

    Ferreira, Sérgio L C; Dos Santos, Walter N L; Quintella, Cristina M; Neto, Benício B; Bosque-Sendra, Juan M

    2004-07-08

    A review of the use of the Doehlert matrix as a chemometric tool for the optimization of methods in analytical chemistry and other sciences is presented. The theoretical principles of Doehlert designs are described, including the coded values for the use of this matrix with two, three, four and five variables. The advantages of this matrix in comparison with other response surface designs, such as central composite and Box-Behnken designs, are discussed. Finally, 57 references concerning the application of Doehlert matrices in the optimization of procedures involving spectroanalytical, electroanalytical and chromatographic techniques are considered.
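
    For two variables, the Doehlert design consists of a regular hexagon plus a centre point in coded units. The sketch below lists those standard coded levels and maps them onto real factor ranges; the pH and temperature ranges are invented for illustration.

```python
# Two-variable Doehlert design: the standard coded levels (hexagon + centre),
# mapped onto real factor ranges. The pH/temperature factors are hypothetical.
import math

s = math.sqrt(3) / 2  # 0.866..., the classic Doehlert coded level

coded = [(0, 0), (1, 0), (-1, 0), (0.5, s), (-0.5, -s), (0.5, -s), (-0.5, s)]

def decode(c, centre, half_range):
    """Map a coded level in [-1, 1] to a real factor value."""
    return centre + c * half_range

# Hypothetical factors: pH 4-8 (centre 6) and temperature 20-60 C (centre 40).
runs = [(decode(x1, 6.0, 2.0), decode(x2, 40.0, 20.0)) for x1, x2 in coded]
for ph, temp in runs:
    print(f"pH {ph:.2f}, T {temp:.1f} C")
```

    Note that the second variable is only spanned up to ±0.866 of its half-range, which reflects the Doehlert property that successive variables are studied at fewer levels.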

  10. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    …/purification. Of the analytical methods tested, Cryo-transmission electron microscopy (Cryo-TEM) and atomic force microscopy (AFM) turned out to be advantageous for polymersomes with diameters smaller than 200 nm, whereas confocal microscopy is ideal for diameters >400 nm. Polymersomes in the intermediate diameter range can be characterized using freeze-fracture Cryo-scanning electron microscopy (FF-Cryo-SEM) and nanoparticle tracking analysis (NTA). Small-angle X-ray scattering (SAXS) provides reliable data on bilayer thickness and internal structure, Cryo-TEM on multilamellarity. Taken together, these tools are valuable…
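
    The size-based guidance in the abstract can be written down as a small selection helper; the diameter cut-offs come from the text, while the function itself is only an illustrative sketch.

```python
# Illustrative lookup of suitable characterization tools by polymersome diameter,
# following the size ranges stated in the abstract (cut-offs: 200 nm and 400 nm).
def characterization_methods(diameter_nm):
    """Suggest imaging/characterization tools for polymersomes of a given diameter."""
    if diameter_nm < 200:
        return ["Cryo-TEM", "AFM"]
    if diameter_nm > 400:
        return ["confocal microscopy"]
    return ["FF-Cryo-SEM", "NTA"]

print(characterization_methods(150))   # small vesicles
print(characterization_methods(300))   # intermediate range
print(characterization_methods(500))   # large vesicles
```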

  11. Identity as an Analytical Tool to Explore Students’ Mathematical Writing

    DEFF Research Database (Denmark)

    Iversen, Steffen Møllegaard

    Learning to communicate in, with and about mathematics is a key part of learning mathematics (Niss & Højgaard, 2011). Therefore, understanding how students’ mathematical writing is shaped is important to mathematics education research. In this paper the notion of ‘writer identity’ (Ivanič, 1998; Burgess & Ivanič, 2010) is introduced and operationalised in relation to students’ mathematical writing, and through a case study it is illustrated how this analytic tool can facilitate valuable insights when exploring students’ mathematical writing.

  12. Development and Verification of Behavior of Tritium Analytic Code (BOTANIC)

    International Nuclear Information System (INIS)

    Park, Min Young; Kim, Eung Soo

    2014-01-01

    VHTR, one of the Generation IV reactor concepts, has a relatively high operation temperature and is usually suggested as a heat source for many industrial processes, including the hydrogen production process. It is therefore vital to trace tritium behavior in the VHTR system and the potential permeation rate to the industrial process. In other words, tritium is a crucial safety issue in the fission reactor system, so it is necessary to understand its behavior, and the development of a tool that enables this is vital. In this study, the Behavior of Tritium Analytic Code (BOTANIC), an analytic tool capable of analyzing tritium behavior, was developed using a chemical process code called gPROMS. BOTANIC was then verified against analytic solutions and benchmark codes such as the Tritium Permeation Analysis Code (TPAC) and COMSOL. The code has several distinctive features, including a non-diluted assumption, flexible applications and the adoption of a distributed permeation model. Due to these features, BOTANIC has the capability to analyze a wide range of tritium-level systems and has a higher accuracy, as it has the capacity to solve distributed models. The verification results showed very good agreement with the analytical solutions and with the calculation results of TPAC and COMSOL. Future work will be focused on total system verification.
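
    BOTANIC itself solves distributed permeation models in gPROMS; as a far simpler stand-in, the sketch below evaluates steady-state tritium permeation through a single metal wall using Richardson's law with an Arrhenius permeability. Every constant is an assumed placeholder, not a value from the code.

```python
# Steady-state permeation through a wall: flux = permeability * (sqrt(p_up) -
# sqrt(p_down)) / thickness, with an Arrhenius temperature dependence.
# All numerical constants are invented placeholders, not BOTANIC inputs.
import math

R = 8.314            # J/(mol K)
P0 = 1.0e-7          # pre-exponential permeability, mol/(m s Pa^0.5) (assumed)
E_a = 6.0e4          # activation energy, J/mol (assumed)
thickness = 2.0e-3   # wall thickness, m
T = 1173.0           # wall temperature, K (VHTR-like, assumed)

def permeation_flux(p_up, p_down):
    """Steady-state flux, mol/(m^2 s), for upstream/downstream T2 pressures in Pa."""
    perm = P0 * math.exp(-E_a / (R * T))
    return perm * (math.sqrt(p_up) - math.sqrt(p_down)) / thickness

flux = permeation_flux(100.0, 0.0)
print(f"{flux:.3e}")
```

    The square-root pressure dependence (diffusion-limited permeation of a dissociating gas) is what a distributed code like BOTANIC resolves along the wall rather than in one lumped step.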

  13. DSAT: Data Storage and Analytics Tool, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — The aim of this project is the development of a large data warehousing and analysis tool for air traffic management (ATM) research that can be accessed by users through...

  14. Analytical techniques for biopharmaceutical development

    National Research Council Canada - National Science Library

    Wehr, Tim; Rodríguez-Díaz, Roberto; Tuck, Stephen (Stephen F.)

    2005-01-01

    ... for biopharmaceutical development / Tim Wehr, Roberto Rodriguez-Diaz, Stephen Tuck, editors. p. ; cm. Includes bibliographical references and index. ISBN 0-8247-2667-7 (alk. paper) 1. Protein drugs--Analysis--Laboratory manuals. [DNLM: 1. Pharmaceutical Preparations--analysis--Laboratory Manuals. 2. Biopharmaceutics--methods--Laboratory Ma...

  15. Current research relevant to the improvement of γ-ray spectroscopy as an analytical tool

    International Nuclear Information System (INIS)

    Meyer, R.A.; Tirsell, K.G.; Armantrout, G.A.

    1976-01-01

    Four areas of research that will have significant impact on the further development of γ-ray spectroscopy as an accurate analytical tool are considered. The areas considered are: (1) automation; (2) accurate multigamma ray sources; (3) the accuracy of the current and future γ-ray energy scale; and (4) new solid state X- and γ-ray detectors.
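
    One place where accurate multigamma sources matter is calibration of the channel-to-energy scale of a spectrometer. A minimal sketch: fit a straight line to (channel, energy) pairs. The energies are well-known 152Eu lines; the channel numbers are invented for illustration.

```python
# Linear channel-to-energy calibration of a gamma spectrometer from a
# multigamma source. Energies are the familiar 152Eu lines (keV); the channel
# positions are invented example values, not measured data.
import numpy as np

channels = np.array([610.2, 1228.5, 1737.1, 3911.0, 7040.5])
energies = np.array([121.78, 244.70, 344.28, 778.90, 1408.01])  # keV

slope, intercept = np.polyfit(channels, energies, 1)

def channel_to_energy(ch):
    return slope * ch + intercept

# Residuals of the linear fit, in keV.
residuals = energies - channel_to_energy(channels)
print(round(float(np.abs(residuals).max()), 2))
```

    With an accurate multigamma source, nonzero residuals of a linear fit would motivate a quadratic calibration term; here they only reflect the invented channel values.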

  16. Development of Nuclear Analytical Technology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Yong Joon; Kim, J. Y.; Sohn, S. C. (and others)

    2007-06-15

    The pre-treatment and handling techniques for the micro-particles in swipe samples were developed for safeguards purposes. The development of a screening technique for the swipe samples has been established using the nuclear fission track method as well as the alpha track method. A laser ablation system to extract a nuclear particle present in a swipe was designed and constructed for the determination of the enrichment factors for uranium or plutonium, and its performance was tested in atmosphere as well as in vacuum. The optimum conditions for the synthesis of silica-based micro-particles were obtained for mass production. The optimum ion exchange resin was selected and the optimum conditions for uranium adsorption in the resin bead technique were established for the determination of the enrichment factor for nuclear particles in swipe samples. The established technique was applied to swipes taken directly from a nuclear facility and also to archive samples of the IAEA's environmental swipes. The evaluation of the dose rates of neutrons and secondary gamma-rays for the radiation shields was carried out to design the NIPS system, as well as the evaluation of the thermal neutron concentration effect by various reflectors. A D-D neutron generator was introduced as a neutron source for the NIPS system because it offers advantages over the 252Cf source, such as easier control and moderation capability. Simulated samples for explosives and chemical warfare agents were prepared to construct a prompt gamma-ray database. Based on the constructed database, a computer program for the detection of illicit chemical and nuclear materials was developed using the MATLAB software.

  17. Development of nuclear analytical technology

    International Nuclear Information System (INIS)

    Jee, Kwang Yong; Kim, W. H.; Park, Yeong J.; Park, Yong J.; Sohn, S. C.; Song, B. C.; Jeon, Y. S.; Pyo, H. Y.; Ha, Y. K.

    2004-04-01

    The objectives of this study are to develop the technology for the determination of isotopic ratios of nuclear particles detected in swipe samples and to develop the NIPS system. The R and D contents and results of this study are, firstly, the production of nuclear micro-particles (1-20 μm) and their standardization, the examination of variation in fission track characteristics according to nuclear particle size and enrichment (235U: 1-50%), the construction of a database and the application of this technique to swipe samples. If this technique is verified as superior by various field tests and an inter-laboratory comparison program with other institutes in developed countries, it will be possible to join the NWAL supervised by the IAEA and to export our technology abroad. Secondly, the characteristics of alpha tracks from the boron (n, α) nuclear reaction were studied to measure both total boron concentration and 10B enrichment. The correlation of the number of alpha tracks with various 10B concentrations was studied to evaluate the reliability of this method. In particular, a cadmium shielding technique was introduced to reduce the background of alpha tracks by covering the solid track detector, and a multi-dot detector plate was developed to increase the reproducibility of measurement by allowing the boron solution to dry evenly on the plate. The results of the alpha track method were found to agree with those of mass spectrometry within less than 10% deviation. Finally, the NIPS system using a 252Cf neutron source was developed and a prompt gamma spectrum and its background were obtained. The Monte Carlo method using the MCNP-4B code was utilized for the interpretation of the neutron and gamma-ray shielding conditions as well as the moderation of fast neutrons. Gamma-gamma coincidence was introduced to reduce the prompt gamma background. The counting efficiency of the HPGe detector was calibrated in the energy range from 50 keV to 10 MeV using radioisotope standards and prompt gamma rays of Cl for the

  18. New analytical tools combining gel electrophoresis and mass spectrometry

    OpenAIRE

    Tobolkina, Elena

    2014-01-01

    Proteomics has been one of the main projects challenging biological and analytical chemists for many years. The separation, identification and quantification of all the proteins expressed within biological systems remain the main objectives of proteomics. Due to sample complexity, the development of fractionation, separation, purification and detection techniques that possess appropriate resolution to separate a large number of proteins, as well as being sensitive and fast enough for high thr...

  19. Designing a Collaborative Visual Analytics Tool for Social and Technological Change Prediction.

    Energy Technology Data Exchange (ETDEWEB)

    Wong, Pak C.; Leung, Lai-Yung R.; Lu, Ning; Scott, Michael J.; Mackey, Patrick S.; Foote, Harlan P.; Correia, James; Taylor, Zachary T.; Xu, Jianhua; Unwin, Stephen D.; Sanfilippo, Antonio P.

    2009-09-01

    We describe our ongoing efforts to design and develop a collaborative visual analytics tool to interactively model social and technological change of our society in a future setting. The work involves an interdisciplinary team of scientists from atmospheric physics, electrical engineering, building engineering, social sciences, economics, public policy, and national security. The goal of the collaborative tool is to predict the impact of global climate change on the U.S. power grids and its implications for society and national security. These future scenarios provide critical assessment and information necessary for policymakers and stakeholders to help formulate a coherent, unified strategy toward shaping a safe and secure society. The paper introduces the problem background and related work, explains the motivation and rationale behind our design approach, presents our collaborative visual analytics tool and usage examples, and finally shares the development challenge and lessons learned from our investigation.

  20. Analytical Tools for Functional Assessment of Architectural Layouts

    Science.gov (United States)

    Bąkowski, Jarosław

    2017-10-01

    The functional layout of a building, understood as the layout or set of the facility's rooms (or groups of rooms) together with a system of internal communication, creates an environment and a place of mutual relations between the occupants of the object. Achieving a spatial arrangement that is optimal from the occupants' point of view is possible through activities that often go beyond the stage of architectural design. Adopted in architectural design, most often through a trial-and-error process or on the basis of previous experience (evidence-based design), a functional layout is subject to continuous evaluation and dynamic change from the beginning of its use. Such verification during the occupancy phase makes it possible to plan future transformations, as well as to develop model solutions for use in other settings. In broader terms, the research hypothesis is to examine whether and how the collected datasets concerning the facility and its utilization can be used to develop methods for assessing the functional layout of buildings. In other words, whether it is possible to develop an objective method of assessing functional layouts based on a set of the building's parameters: technical, technological and functional ones, and whether the method allows developing a set of tools enhancing the design methodology of complex functional objects. By linking the design with the construction phase it is possible to build parametric models of functional layouts, especially in the context of sustainable design or lean design in every aspect: ecological (by reducing the property's impact on the environment), economic (by optimizing its cost) and social (through the implementation of a high-performance work environment). Parameterization of the size and functional connections of the facility becomes part of the analyses, as well as an element of model solutions. The "lean" approach means the process of analysis of the existing scheme and consequently - finding weak points as well as means for eliminating these

  1. Laser-induced plasma spectrometry: truly a surface analytical tool

    International Nuclear Information System (INIS)

    Vadillo, Jose M.; Laserna, J.

    2004-01-01

    For a long period, analytical applications of laser-induced plasma spectrometry (LIPS) were mainly restricted to the overall quantitative determination of elemental composition in bulk solid samples. However, the introduction of new compact and reliable solid-state lasers and technological developments in multidimensional intensified detectors have made it possible to seek new analytical niches for LIPS in which its advantages (direct sampling from any material, irrespective of its conductive status, without sample preparation and with sensitivity adequate for many elements in different matrices) can be fully exploited. In this sense, the field of surface analysis could benefit from these advantages, given in addition the capability of LIPS for spot analysis, line scans, depth profiling, area analysis and compositional mapping with a single instrument in air at atmospheric pressure. This review paper outlines the fundamental principles of laser-induced plasma emission relevant to sample surface studies, discusses the experimental parameters governing the spatial (lateral and in-depth) resolution of LIPS analysis and presents applications concerning surface examination

  2. Distributed Engine Control Empirical/Analytical Verification Tools

    Science.gov (United States)

    DeCastro, Jonathan; Hettler, Eric; Yedavalli, Rama; Mitra, Sayan

    2013-01-01

    NASA's vision for an intelligent engine will be realized with the development of a truly distributed control system featuring highly reliable, modular, and dependable components capable of both surviving the harsh engine operating environment and providing decentralized functionality. A set of control system verification tools was developed and applied to a C-MAPSS40K engine model, and metrics were established to assess the stability and performance of these control systems on the same platform. A software tool was developed that allows designers to easily assemble a distributed control system in software and immediately assess its overall impact on the target (simulated) platform, allowing control system designers to converge rapidly on acceptable architectures with consideration of all required hardware elements. The software developed in this program will be installed on a distributed hardware-in-the-loop (DHIL) simulation tool to assist NASA and the Distributed Engine Control Working Group (DECWG) in integrating distributed engine control system (DCS) components onto existing and next-generation engines. The distributed engine control simulator blockset for MATLAB/Simulink and the hardware simulator provide the capability to simulate virtual subcomponents, as well as to swap in actual subcomponents for hardware-in-the-loop (HIL) analysis. Subcomponents can be the communication network, smart sensor or actuator nodes, or a centralized control system. The distributed engine control blockset for MATLAB/Simulink is a software development tool. The software includes an engine simulation, a communication network simulation, control algorithms, and analysis algorithms set up in a modular environment for rapid simulation of different network architectures; the hardware consists of an embedded device running parts of the C-MAPSS engine simulator and controlled through Simulink. The distributed engine control simulation, evaluation, and analysis technology provides unique

  3. Hyperspectral microscopy as an analytical tool for nanomaterials.

    Science.gov (United States)

    Roth, Gary A; Tahiliani, Sahil; Neu-Baker, Nicole M; Brenner, Sara A

    2015-01-01

    Hyperspectral microscopy is an advanced visualization technique that combines hyperspectral imaging with state-of-the-art optics and computer software to enable the rapid identification of materials at the micro- and nanoscales. Achieving this level of resolution has traditionally required time-consuming and costly electron microscopy techniques. While hyperspectral microscopy has already been applied to the analysis of bulk materials and biologicals, it shows extraordinary promise as an analytical tool to locate individual nanoparticles and aggregates in complex samples through rapid optical and spectroscopic identification. This technique can be used to not only screen for the presence of nanomaterials, but also to locate, identify, and characterize them. It could also be used to identify a subset of samples that would then move on for further analysis via other advanced metrology. This review will describe the science and origins of hyperspectral microscopy, examine current and emerging applications in life science, and examine potential applications of this technology that could improve research efficiency or lead to novel discoveries. © 2015 Wiley Periodicals, Inc.

  4. Prioritizing pesticide compounds for analytical methods development

    Science.gov (United States)

    Norman, Julia E.; Kuivila, Kathryn; Nowell, Lisa H.

    2012-01-01

    The U.S. Geological Survey (USGS) has a periodic need to re-evaluate pesticide compounds in terms of priorities for inclusion in monitoring and studies and, thus, must also assess the current analytical capabilities for pesticide detection. To meet this need, a strategy has been developed to prioritize pesticides and degradates for analytical methods development. Screening procedures were developed to separately prioritize pesticide compounds in water and sediment. The procedures evaluate pesticide compounds in existing USGS analytical methods for water and sediment and compounds for which recent agricultural-use information was available. Measured occurrence (detection frequency and concentrations) in water and sediment, predicted concentrations in water and predicted likelihood of occurrence in sediment, potential toxicity to aquatic life or humans, and priorities of other agencies or organizations, regulatory or otherwise, were considered. Several existing strategies for prioritizing chemicals for various purposes were reviewed, including those that identify and prioritize persistent, bioaccumulative, and toxic compounds, and those that determine candidates for future regulation of drinking-water contaminants. The systematic procedures developed and used in this study rely on concepts common to many previously established strategies. The evaluation of pesticide compounds resulted in the classification of compounds into three groups: Tier 1 for high priority compounds, Tier 2 for moderate priority compounds, and Tier 3 for low priority compounds. For water, a total of 247 pesticide compounds were classified as Tier 1 and, thus, are high priority for inclusion in analytical methods for monitoring and studies. Of these, about three-quarters are included in some USGS analytical method; however, many of these compounds are included on research methods that are expensive and for which there are few data on environmental samples. The remaining quarter of Tier 1
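    The tiering logic described above can be sketched as a simple scoring rule. The inputs and thresholds below are hypothetical illustrations, not the USGS screening criteria:

```python
def classify_tier(detection_freq, toxicity_concern, agency_priority):
    """Assign a pesticide compound to a priority tier.

    detection_freq   -- fraction of samples with detections (0..1)
    toxicity_concern -- True if aquatic-life or human-health benchmarks
                        are approached or exceeded
    agency_priority  -- True if flagged by another agency or organization
    All thresholds are illustrative, not the USGS procedure's.
    """
    score = 0
    if detection_freq > 0.10:
        score += 2
    elif detection_freq > 0.01:
        score += 1
    if toxicity_concern:
        score += 2
    if agency_priority:
        score += 1
    if score >= 3:
        return "Tier 1"   # high priority for methods development
    if score >= 1:
        return "Tier 2"   # moderate priority
    return "Tier 3"       # low priority
```

    Under this toy rule, a frequently detected compound with toxicity concerns lands in Tier 1, while one with no detections, no toxicity flags and no agency interest falls to Tier 3.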

  5. Enhancing Safeguards through Information Analysis: Business Analytics Tools

    International Nuclear Information System (INIS)

    Vincent, J.; Midwinter, J.

    2015-01-01

    For the past 25 years the IBM i2 Intelligence Analysis product portfolio has assisted over 4,500 organizations across law enforcement, defense, government agencies, and commercial private-sector businesses to maximize the value of the mass of information available to them: to discover and disseminate actionable intelligence that can help identify, investigate, predict, prevent, and disrupt criminal, terrorist, and fraudulent acts, safeguarding communities, organizations, infrastructures, and investments. The collaborative intelligence analysis environment delivered by i2 is specifically designed to be:
    · scalable: supporting business needs as well as operational and end-user environments
    · modular: an architecture which can deliver maximum operational flexibility with the ability to add complementary analytics
    · interoperable: integrating with existing environments and easing information sharing across partner agencies
    · extendable: providing an open-source developer essential toolkit, examples, and documentation for custom requirements
    i2 Intelligence Analysis brings clarity to complex investigations and operations by delivering industry-leading multidimensional analytics that can be run on demand across disparate data sets or across a single centralized analysis environment. The sole aim is to detect connections, patterns, and relationships hidden within high-volume, all-source data, and to create and disseminate intelligence products in near real time for faster informed decision making. (author)

  6. Android development tools for Eclipse

    CERN Document Server

    Shah, Sanjay

    2013-01-01

    A standard tutorial aimed at developing Android applications in a practical manner. Android Development Tools for Eclipse is aimed at beginners and existing developers who want to learn more about Android development. It is assumed that you have experience in Java programming and that you have used an IDE for development.

  7. Development of integrated analytical data management system

    International Nuclear Information System (INIS)

    Onishi, Koichi; Wachi, Isamu; Hiroki, Toshio

    1986-01-01

    The Analysis Subsection of the Technical Service Section, Tokai Reprocessing Plant, Tokai Works, is engaged in the analysis activities required for the management of processes and measurements in the plant. Currently, it is desired to increase the reliability of analytical data and to perform analyses more rapidly to cope with the increasing number of analysis tasks. To this end, on-line data processing has been promoted and advanced analytical equipment has been introduced in order to enhance automation. In the present study, an integrated analytical data management system is developed which serves to improve the reliability of analytical data as well as to allow rapid retrieval and automatic compilation of these data. Fabrication of a basic model of the system has been nearly completed and test operation has already started. In selecting the hardware to be used, examinations were made of ease of system extension, Japanese-language processing for improving the man-machine interface, large-capacity auxiliary memory, and database processing. The existing analysis workflow was reviewed in establishing the basic design of the system. According to this basic design, the system can perform such tasks as the handling of application slips received from clients as well as the recording, sending, filing and retrieval of analysis results. (Nogami, K.)

  8. Web analytics as tool for improvement of website taxonomies

    DEFF Research Database (Denmark)

    Jonasen, Tanja Svarre; Ådland, Marit Kristine; Lykke, Marianne

    The poster examines how web analytics can be used to provide information about users and inform design and redesign of taxonomies. It uses a case study of the website Cancer.dk by the Danish Cancer Society. The society is a private organization with an overall goal to prevent the development of cancer, improve patients' chances of recovery, and limit the physical, psychological and social side-effects of cancer. The website is the main channel for communication and knowledge sharing with patients, their relatives and professionals. The present study consists of two independent analyses, one ... provides information about e.g. subjects of interest, searching behaviour, browsing patterns in website structure as well as tag clouds, page views. The poster discusses benefits and challenges of the two web metrics, with a model of how to use search and tag data for the design of taxonomies, e.g. choice ...

  9. Ultrasonic trap as analytical tool; Die Ultraschallfalle als analytisches Werkzeug

    Energy Technology Data Exchange (ETDEWEB)

    Leiterer, Jork

    2009-11-30

    nanoparticles. This comprises fields of research like biomineralisation, protein agglomeration, distance-dependent effects of nanocrystalline quantum dots and the in situ observation of early crystallization stages. In summary, the results of this work open a broad area of application for the ultrasonic trap as an analytical tool. [German text, translated] The ultrasonic trap offers a unique means of handling samples on the microlitre scale. Acoustic levitation positions the sample contact-free in a gaseous environment, removing the influence of solid surfaces. In this work, the possibilities of the ultrasonic trap for use in analytics are investigated experimentally. By coupling the trap with typical contactless analysis methods such as spectroscopy and X-ray scattering, the advantages of this levitation technique are demonstrated on materials ranging from inorganic, organic and pharmaceutical substances to proteins, nano- and microparticles. It is shown that acoustic levitation reliably enables contact-free sample handling for spectroscopic methods (LIF, Raman) and, for the first time, for X-ray scattering (EDXD, SAXS, WAXS) and X-ray fluorescence (XRF, XANES) methods. For all of these methods the wall-less sample mounting proved advantageous: the results are comparable with those of conventional sample holders and in some cases surpass them in data quality. A particular success is the integration of the acoustic levitator into the experimental set-ups at the synchrotron beamlines. The use of the ultrasonic trap at BESSY was established within this work and currently forms the basis of intensive interdisciplinary research. In addition, the trap's potential for preconcentration was recognized and applied to the study of evaporation-controlled processes. The

  10. Neutron activation analysis as analytical tool of environmental issue

    International Nuclear Information System (INIS)

    Otoshi, Tsunehiko

    2004-01-01

    Neutron activation analysis (NAA) is applicable to samples from a wide range of research fields, such as materials science, biology and geochemistry. Given its advantages, NAA is best suited to small or precious samples, because it is capable of non-destructive trace analysis. In this paper, NAA of atmospheric particulate matter (PM) samples is discussed, with emphasis on the use of the resulting data as an analytical tool for environmental issues. The concentration of PM in air is usually very low, and it is not easy to collect a large amount of sample even with a high-volume air sampling device; highly sensitive NAA is therefore well suited to determining elements in PM samples. The main components of PM are crust-derived silicates in rural and remote areas, whereas carbonaceous materials and heavy metals are concentrated in urban PM because of automobile exhaust and other anthropogenic emission sources. The elemental pattern of PM reflects the condition of the air around the monitoring site, and trends in air pollution can be traced by periodic monitoring of PM by NAA. Elemental concentrations in air change with season: for example, crustal elements increase in the dry season, and sea-salt components increase when wind from the sea is dominant. Elements emitted from anthropogenic sources are mainly contained in the fine fraction of PM and increase in concentration during winter, when emission from heating systems is high and the air is stable. For further analysis and understanding of environmental issues, indicator elements for various emission sources, elemental concentration ratios of environmental samples, and source apportionment techniques are useful. (author)
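    One common source-indicator technique of the kind the author alludes to is the crustal enrichment factor, which normalizes an element's PM concentration to a crustal reference element (commonly Al or Sc). A minimal sketch, with approximate crustal abundances and a hypothetical sample:

```python
# Approximate average crustal abundances, mg/kg (illustrative values)
CRUST = {"Al": 8.23e4, "Fe": 5.63e4, "Pb": 12.5, "Zn": 70.0}

def enrichment_factor(sample, element, ref="Al"):
    """EF = (X/Ref)_aerosol / (X/Ref)_crust; EF >> 1 suggests a
    non-crustal (anthropogenic) source for element X."""
    return (sample[element] / sample[ref]) / (CRUST[element] / CRUST[ref])

pm = {"Al": 1000.0, "Fe": 684.0, "Pb": 5.0}   # hypothetical loadings, ng/m3
ef_pb = enrichment_factor(pm, "Pb")           # strongly enriched: anthropogenic
ef_fe = enrichment_factor(pm, "Fe")           # near 1: crustal origin
```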

  11. Developing new chemical tools for solvent extraction

    International Nuclear Information System (INIS)

    Moyer, B.A.; Baes, C.F.; Burns, J.H.; Case, G.N.; Sachleben, R.A.; Bryan, S.A.; Lumetta, G.J.; McDowell, W.J.; Sachleben, R.A.

    1993-01-01

    Prospects for innovation and for greater technological impact in the field of solvent extraction (SX) seem as bright as ever, despite the maturation of SX as an economically significant separation method and as an important technique in the laboratory. New industrial, environmental, and analytical problems provide compelling motivation for diversifying the application of SX, developing new solvent systems, and seeking improved properties. Toward this end, basic research must be dedicated to enhancing the tools of SX: physical tools for probing the basis of extraction and molecular tools for developing new SX chemistries. In this paper, the authors describe their progress in developing and applying the general tools of equilibrium analysis and of ion recognition in SX. Nearly half a century after the field of SX began in earnest, coordination chemistry continues to provide the impetus for important advancements in understanding SX systems and in controlling SX chemistry. In particular, the physical tools of equilibrium analysis, X-ray crystallography, and spectroscopy are elucidating the molecular basis of SX in unprecedented detail. Moreover, the principles of ion recognition are providing the molecular tools with which to achieve new selectivities and new applications

  12. ATTIRE (analytical tools for thermal infrared engineering): A sensor simulation and modeling package

    Science.gov (United States)

    Jaggi, S.

    1993-01-01

    The Advanced Sensor Development Laboratory (ASDL) at the Stennis Space Center develops, maintains and calibrates remote sensing instruments for the National Aeronautics & Space Administration (NASA). To perform system design trade-offs, analysis, and establish system parameters, ASDL has developed a software package for analytical simulation of sensor systems. This package called 'Analytical Tools for Thermal InfraRed Engineering' - ATTIRE, simulates the various components of a sensor system. The software allows each subsystem of the sensor to be analyzed independently for its performance. These performance parameters are then integrated to obtain system level information such as Signal-to-Noise Ratio (SNR), Noise Equivalent Radiance (NER), Noise Equivalent Temperature Difference (NETD) etc. This paper describes the uses of the package and the physics that were used to derive the performance parameters.
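    The system-level figures of merit named in the abstract are linked through the scene radiance: NETD, for instance, is the noise-equivalent radiance divided by the thermal derivative of the Planck radiance. A hedged sketch of that arithmetic (the function names are illustrative, not ATTIRE's API):

```python
import math

H = 6.626e-34   # Planck constant, J*s
C = 2.998e8     # speed of light, m/s
K = 1.381e-23   # Boltzmann constant, J/K

def planck_radiance(lam, T):
    """Spectral radiance, W / (m^2 * sr * m), at wavelength lam (m), temp T (K)."""
    x = H * C / (lam * K * T)
    return (2 * H * C**2 / lam**5) / math.expm1(x)

def dplanck_dT(lam, T):
    """Thermal contrast dL/dT at (lam, T)."""
    x = H * C / (lam * K * T)
    return planck_radiance(lam, T) * (x / T) * math.exp(x) / math.expm1(x)

def netd(ner, lam, T):
    """Noise-equivalent temperature difference from a given NER."""
    return ner / dplanck_dT(lam, T)
```

    At 10 µm and 300 K, a sensor whose noise-equivalent radiance equals the thermal contrast has, by construction, an NETD of 1 K.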

  13. Microplasmas for chemical analysis: analytical tools or research toys?

    International Nuclear Information System (INIS)

    Karanassios, Vassili

    2004-01-01

    An overview is presented of the activities of the research groups that have been involved in the fabrication, development and characterization of microplasmas for chemical analysis over the last few years. Microplasmas covered include: miniature inductively coupled plasmas (ICPs); capacitively coupled plasmas (CCPs); microwave-induced plasmas (MIPs); a dielectric barrier discharge (DBD); microhollow cathode discharge (MHCD) or microstructure electrode (MSE) discharges; other microglow discharges (such as those formed between 'liquid' electrodes); microplasmas formed in micrometer-diameter capillary tubes for gas chromatography (GC) or high-performance liquid chromatography (HPLC) applications; and a stabilized capacitive plasma (SCP) for GC applications. Sample introduction into microplasmas, in particular into a microplasma device (MPD), battery operation of an MPD and of a mini-in-torch vaporization (ITV) microsample introduction system for MPDs, and questions of microplasma portability for use on site (e.g., in the field) are also briefly addressed using examples of current research. To emphasize the significance of sample introduction into microplasmas, some previously unpublished results from the author's laboratory have also been included, and an overall assessment of the state of the art of analytical microplasma research is provided

  14. Factors Influencing Attitudes Towards the Use of CRM’s Analytical Tools in Organizations

    Directory of Open Access Journals (Sweden)

    Šebjan Urban

    2016-02-01

    Background and Purpose: Information solutions for analytical customer relationship management (aCRM IS) that include the use of analytical tools are becoming increasingly important, due to organizations' need for knowledge of their customers and the ability to manage big data. The objective of the research is, therefore, to determine how the organizations' orientations (process, innovation, and technology), as critical organizational factors, affect the attitude towards the use of the analytical tools of aCRM IS.

  15. Environmental tools in product development

    DEFF Research Database (Denmark)

    Wenzel, Henrik; Hauschild, Michael Zwicky; Jørgensen, Jørgen

    1994-01-01

    A precondition for design of environmentally friendly products is that the design team has access to methods and tools supporting the introduction of environmental criteria in product development. A large Danish program, EDIP, is being carried out by the Institute for Product Development, Technical...

  16. Urban Development Tools in Denmark

    DEFF Research Database (Denmark)

    Aunsborg, Christian; Enemark, Stig; Sørensen, Michael Tophøj

    2005-01-01

    The article contains the following sections: 1. Urbax and the Danish Planning System; 2. Main Challenges in Urban Development; 3. Coordination and Growth (Management) Policies and Spatial Planning Policies; 4. Coordination of Market Events and Spatial Planning; 5. The Application of Urban Development Tools...

  17. Hydrogen transport in containments: a survey of analytical tools and benchmark experiments

    International Nuclear Information System (INIS)

    Manno, V.P.; Golay, M.W.

    1984-01-01

    The Three Mile Island Unit 2 accident fostered renewed attention to the issue of hydrogen safety in reactor containments. This article reviews the principal analytical tools developed to simulate postulated events prior to combustion, as well as experimental programs designed to augment the state of knowledge in this area. The important physical mechanisms driving pre-combustion transport, such as source momentum, intercompartmental flow, forced convection, natural convection, and diffusion, are put in perspective. A number of lumped-parameter and/or continuum-formulation computer codes, including COBRA-NC, HECTR, HMS, LIMIT, RALOC, and TEMPEST, are described and compared in terms of their physical models, bounds of applicability, and reported validation. The Battelle Frankfurt and Hanford Engineering Development Laboratory hydrogen-mixing tests are reviewed, as well as some smaller-scale efforts. Finally, the overall state of knowledge is critiqued, and a qualitative specification of future analytical and experimental work is presented

  18. A Business Analytics Software Tool for Monitoring and Predicting Radiology Throughput Performance.

    Science.gov (United States)

    Jones, Stephen; Cournane, Seán; Sheehy, Niall; Hederman, Lucy

    2016-12-01

    Business analytics (BA) is increasingly being utilised by radiology departments to analyse and present data. It encompasses statistical analysis, forecasting and predictive modelling and is used as an umbrella term for decision support and business intelligence systems. The primary aim of this study was to determine whether utilising BA technologies could contribute towards improved decision support and resource management within radiology departments. A set of information technology requirements were identified with key stakeholders, and a prototype BA software tool was designed, developed and implemented. A qualitative evaluation of the tool was carried out through a series of semi-structured interviews with key stakeholders. Feedback was collated, and emergent themes were identified. The results indicated that BA software applications can provide visibility of radiology performance data across all time horizons. The study demonstrated that the tool could potentially assist with improving operational efficiencies and management of radiology resources.
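    The forecasting layer of such a tool can be as simple as exponential smoothing over historical throughput counts. A minimal stand-in (the smoothing constant, and the idea that this matches the prototype, are assumptions):

```python
def smoothed_forecast(history, alpha=0.3):
    """One-step-ahead exponential smoothing of, e.g., weekly exam counts.

    history -- sequence of past throughput values, oldest first
    alpha   -- smoothing constant in (0, 1]; the default is illustrative
    """
    if not history:
        raise ValueError("need at least one observation")
    level = float(history[0])
    for value in history[1:]:
        level = alpha * value + (1 - alpha) * level
    return level
```

    Exponential smoothing weights recent weeks more heavily than old ones, which suits throughput series whose level drifts with staffing and demand.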

  19. Advances in analytical tools for high throughput strain engineering

    DEFF Research Database (Denmark)

    Marcellin, Esteban; Nielsen, Lars Keld

    2018-01-01

    The emergence of inexpensive, base-perfect genome editing is revolutionising biology. Modern industrial biotechnology exploits the advances in genome editing in combination with automation, analytics and data integration to build high-throughput automated strain engineering pipelines, also known as biofoundries. Biofoundries replace the slow and inconsistent artisanal processes used to build microbial cell factories with an automated design–build–test cycle, considerably reducing the time needed to deliver commercially viable strains. Testing and hence learning remains relatively shallow, but recent advances in analytical chemistry promise to increase the depth of characterization possible. Analytics combined with models of cellular physiology in automated systems biology pipelines should enable deeper learning and hence a steeper pitch of the learning cycle. This review explores the progress

  20. Employability Skills Assessment Tool Development

    Science.gov (United States)

    Rasul, Mohamad Sattar; Rauf, Rose Amnah Abd; Mansor, Azlin Norhaini; Puvanasvaran, A. P.

    2012-01-01

    Research nationally and internationally found that technical graduates are lacking in employability skills. As employability skills are crucial in outcome-based education, the main goal of this research is to develop an Employability Skill Assessment Tool to help students and lecturers produce competent graduates in employability skills needed by…

  1. Micromechanical String Resonators: Analytical Tool for Thermal Characterization of Polymers

    DEFF Research Database (Denmark)

    Bose, Sanjukta; Schmid, Silvan; Larsen, Tom

    2014-01-01

    of the polymer, and a change in Q reflects the change in damping of the polymer-coated string. The frequency response of the microstring is validated with an analytical model. From the frequency independent tensile stress change, static Tg values of 40.6 and 57.6 °C were measured for PDLLA and PLLA, respectively...
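    The analytical model referred to is presumably the resonance of a pre-stressed string (an assumption here; the record states it only generically):

```latex
f_n \;=\; \frac{n}{2L}\sqrt{\frac{\sigma}{\rho}}, \qquad n = 1, 2, \dots
```

    with string length $L$, tensile stress $\sigma$ and mass density $\rho$: a polymer coating that relaxes the stress near its glass transition shifts $f_n$, while increased viscoelastic damping lowers $Q$.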

  2. Developing An Analytic Approach to Understanding the Patient Care Experience

    Science.gov (United States)

    Springman, Mary Kate; Bermeo, Yalissa; Limper, Heather M

    2016-01-01

    The amount of data available to health-care institutions regarding the patient care experience has grown tremendously. Purposeful approaches to condensing, interpreting, and disseminating these data are becoming necessary to further understand how clinical and operational constructs relate to patient satisfaction with their care, identify areas for improvement, and accurately measure the impact of initiatives designed to improve the patient experience. We set out to develop an analytic reporting tool deeply rooted in the patient voice that would compile patient experience data obtained throughout the medical center. PMID:28725852

  3. Kalisphera: an analytical tool to reproduce the partial volume effect of spheres imaged in 3D

    International Nuclear Information System (INIS)

    Tengattini, Alessandro; Andò, Edward

    2015-01-01

    In experimental mechanics, where 3D imaging is having a profound effect, spheres are commonly adopted for their simplicity and for the ease of their modeling. In this contribution we develop an analytical tool, ‘kalisphera’, to produce 3D raster images of spheres including their partial volume effect. This allows us to evaluate the metrological performance of existing image-based measurement techniques (knowing a priori the ground truth). An advanced application of ‘kalisphera’ is developed here to identify and accurately characterize spheres in real 3D x-ray tomography images with the objective of improving trinarization and contact detection. The effect of the common experimental imperfections is assessed and the overall performance of the tool tested on real images. (paper)
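    For intuition, the partial-volume value of a boundary voxel can be approximated by subsampling, although kalisphera itself computes it analytically. A brute-force sketch (unit voxels and the sampling density are assumptions made here):

```python
import itertools

def sphere_voxel_fill(voxel_corner, centre, radius, n=4):
    """Approximate the fraction of a unit voxel lying inside a sphere
    by testing n**3 regularly spaced subsample points."""
    x0, y0, z0 = voxel_corner
    cx, cy, cz = centre
    inside = 0
    for i, j, k in itertools.product(range(n), repeat=3):
        # midpoint of subcell (i, j, k) within the unit voxel
        px = x0 + (i + 0.5) / n
        py = y0 + (j + 0.5) / n
        pz = z0 + (k + 0.5) / n
        if (px - cx) ** 2 + (py - cy) ** 2 + (pz - cz) ** 2 <= radius ** 2:
            inside += 1
    return inside / n ** 3
```

    Grey levels strictly between 0 and 1 on boundary voxels are exactly the partial volume effect the analytical tool reproduces, but without this sketch's sampling error.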

  4. Analytical Design Package (ADP2): A computer aided engineering tool for aircraft transparency design

    Science.gov (United States)

    Wuerer, J. E.; Gran, M.; Held, T. W.

    1994-01-01

    The Analytical Design Package (ADP2) is being developed as a part of the Air Force Frameless Transparency Program (FTP). ADP2 is an integrated design tool consisting of existing analysis codes and Computer Aided Engineering (CAE) software. The objective of the ADP2 is to develop and confirm an integrated design methodology for frameless transparencies, related aircraft interfaces, and their corresponding tooling. The application of this methodology will generate high confidence for achieving a qualified part prior to mold fabrication. ADP2 is a customized integration of analysis codes, CAE software, and material databases. The primary CAE integration tool for the ADP2 is P3/PATRAN, a commercial-off-the-shelf (COTS) software tool. The open architecture of P3/PATRAN allows customized installations with different applications modules for specific site requirements. Integration of material databases allows the engineer to select a material, and those material properties are automatically called into the relevant analysis code. The ADP2 materials database will be composed of four independent schemas: CAE Design, Processing, Testing, and Logistics Support. The design of ADP2 places major emphasis on the seamless integration of CAE and analysis modules with a single intuitive graphical interface. This tool is being designed to serve and be used by an entire project team, i.e., analysts, designers, materials experts, and managers. The final version of the software will be delivered to the Air Force in Jan. 1994. The Analytical Design Package (ADP2) will then be ready for transfer to industry. The package will be capable of a wide range of design and manufacturing applications.

  5. Analytical tools for identifying bicycle route suitability, coverage, and continuity.

    Science.gov (United States)

    2012-05-01

    This report presents new tools created to assess bicycle suitability using geographic information systems (GIS). Bicycle suitability is a rating of how appropriate a roadway is for bicycle travel based on attributes of the roadway, such as vehi...

  6. Selecting analytical tools for characterization of polymersomes in aqueous solution

    DEFF Research Database (Denmark)

    Habel, Joachim Erich Otto; Ogbonna, Anayo; Larsen, Nanna

    2015-01-01

    Selecting the appropriate analytical methods for characterizing the assembly and morphology of polymer-based vesicles, or polymersomes, is required for them to reach their full potential in biotechnology. This work presents and compares 17 different techniques for their ability to adequately report size, ...... for characterizing polymersomes per se, but the comparative overview is also intended to serve as a starting point for selecting methods for characterizing polymersomes with encapsulated compounds or polymersomes with incorporated biomolecules (e.g. membrane proteins)....

  7. Information and Analytic Maintenance of Nanoindustry Development

    Directory of Open Access Journals (Sweden)

    Glushchenko Aleksandra Vasilyevna

    2015-05-01

    Full Text Available The success of nanotechnological development depends in many respects on the volume and quality of information provided to external and internal users. The objective of the present research is to reveal the information requirements of various groups of users for effective management of the nanotech industry and to define the ways of satisfying them most effectively. The authors also aim at developing a system of indicators characterizing the current state and the dynamic parameters of nanotech industry development. On the basis of the conducted research, the need for an information system for nanotech industry development is substantiated. The information interrelations between subjects of the nanotech industry are revealed, supporting the communicative function of accounting, which is becoming dominant in comparison with the control function. The information needs of users of financial and non-financial information are defined. The stages of introducing the system are described in detail, from determining the character, volume, list, and required timeliness of information to creating a system of administrative reporting, analysis, and control. The information and analytical system is focused on the general assessment of efficiency and the major economic indicators, the general tendencies of nanotech industry development, and possible reserves for increasing the efficiency of its functioning. The authors develop a system of indicators characterizing the advancement of the nanotech industry that makes it possible to estimate innovative activity in the sphere of nanotech industry, to calculate the intensity of nano-innovation costs, and to define the productivity and efficiency of the nanotech industry in a branch, a region, and the national economy in general.

  8. Can We Practically Bring Physics-based Modeling Into Operational Analytics Tools?

    Energy Technology Data Exchange (ETDEWEB)

    Granderson, Jessica [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Bonvini, Marco [Whisker Labs, Oakland, CA (United States); Piette, Mary Ann [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Page, Janie [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lin, Guanjing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hu, R. Lilly [Univ. of California, Berkeley, CA (United States)

    2017-08-11

    Analytics software is increasingly used to improve and maintain operational efficiency in commercial buildings. Energy managers, owners, and operators are using a diversity of commercial offerings often referred to as Energy Information Systems, Fault Detection and Diagnostic (FDD) systems, or more broadly Energy Management and Information Systems, to cost-effectively enable savings on the order of ten to twenty percent. Most of these systems use data from meters and sensors, with rule-based and/or data-driven models to characterize system and building behavior. In contrast, physics-based modeling uses first-principles and engineering models (e.g., efficiency curves) to characterize system and building behavior. Historically, these physics-based approaches have been used in the design phase of the building life cycle or in retrofit analyses. Researchers have begun exploring the benefits of integrating physics-based models with operational data analytics tools, bridging the gap between design and operations. In this paper, we detail the development and operator use of a software tool that uses hybrid data-driven and physics-based approaches to cooling plant FDD and optimization. Specifically, we describe the system architecture, models, and FDD and optimization algorithms; advantages and disadvantages with respect to purely data-driven approaches; and practical implications for scaling and replicating these techniques. Finally, we conclude with an evaluation of the future potential for such tools and future research opportunities.
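A toy illustration of the hybrid idea described above (not the paper's actual tool): a physics-based expected-power curve is compared against measured data, and large residuals are flagged as faults. The efficiency curve, ratings, and threshold below are all invented for the sketch.

```python
import numpy as np

# Hypothetical chiller model: expected electric power from an assumed
# part-load efficiency curve (a stand-in for a manufacturer curve).
def expected_power_kw(load_kw, rated_kw=500.0, rated_cop=6.0):
    plr = load_kw / rated_kw                              # part-load ratio
    cop = rated_cop * (0.2 + 1.6 * plr - 0.8 * plr ** 2)  # assumed curve
    return load_kw / cop

def detect_fault(load_kw, measured_kw, threshold=0.15):
    """Flag samples where measured power exceeds the physics-based
    prediction by more than `threshold` (relative residual)."""
    expected = expected_power_kw(load_kw)
    residual = (measured_kw - expected) / expected
    return residual > threshold

loads = np.array([250.0, 300.0, 350.0])
# Synthetic measurements: the last sample draws 30% excess power (a fault).
measured = expected_power_kw(loads) * np.array([1.02, 1.05, 1.30])
flags = detect_fault(loads, measured)
```

The same residual structure generalizes: the physics model supplies the baseline, the data-driven layer (here just a fixed threshold) decides what counts as abnormal.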

  9. Development of analytical procedures for the simultaneous ...

    African Journals Online (AJOL)

    The mechanical shaking extraction technique for the isolation of target analytes was optimised. Different ... Target analytes were quantified using a high capillary gas chromatograph (GC) equipped with an electron capture detector. Under the optimum GC ... and BB 153. It is efficient, moderately rapid and cost-effective.

  10. Analytic hierarchy process (AHP) as a tool in asset allocation

    Science.gov (United States)

    Zainol Abidin, Siti Nazifah; Mohd Jaffar, Maheran

    2013-04-01

    Allocating capital investment across different assets is the best way to balance risk and reward, and can prevent the loss of large amounts of money. Thus, the aim of this paper is to help investors make wise decisions in asset allocation. This paper proposes modifying and adapting the Analytic Hierarchy Process (AHP) model. The AHP model is widely used in various fields of study related to decision making. The results of the case studies show that the proposed model can categorize stocks and determine the portion of capital to invest. Hence, it can assist investors in the decision-making process and reduce the risk of loss in stock market investment.
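A minimal sketch of the AHP machinery the paper adapts, with an invented 3×3 pairwise comparison matrix for three assets: priority weights come from the principal eigenvector, and the consistency ratio (CR) checks whether the pairwise judgments are coherent enough to use.

```python
import numpy as np

# Pairwise comparison matrix (Saaty's 1-9 scale; values are illustrative):
# A[i, j] says how strongly asset i is preferred to asset j.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: normalized principal eigenvector of A.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Consistency ratio: CR < 0.1 is the usual acceptability threshold.
n = A.shape[0]
lam_max = eigvals.real[k]
ci = (lam_max - n) / (n - 1)   # consistency index
ri = 0.58                      # Saaty's random index for n = 3
cr = ci / ri
```

Here `w` gives the fraction of capital assigned to each asset under these judgments; a CR above 0.1 would signal that the comparison matrix should be revised before allocating anything.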

  11. Application of quantum dots as analytical tools in automated chemical analysis: a review.

    Science.gov (United States)

    Frigerio, Christian; Ribeiro, David S M; Rodrigues, S Sofia M; Abreu, Vera L R G; Barbosa, João A C; Prior, João A V; Marques, Karine L; Santos, João L M

    2012-07-20

    Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, however, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to take advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand binding ability, their aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence (providing the means for the implementation of renewable chemosensors), and even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis. Copyright © 2012 Elsevier B.V. All rights reserved.

  12. Application of quantum dots as analytical tools in automated chemical analysis: A review

    International Nuclear Information System (INIS)

    Frigerio, Christian; Ribeiro, David S.M.; Rodrigues, S. Sofia M.; Abreu, Vera L.R.G.; Barbosa, João A.C.; Prior, João A.V.; Marques, Karine L.; Santos, João L.M.

    2012-01-01

    Highlights: ► Review on quantum dots application in automated chemical analysis. ► Automation by using flow-based techniques. ► Quantum dots in liquid chromatography and capillary electrophoresis. ► Detection by fluorescence and chemiluminescence. ► Electrochemiluminescence and radical generation. - Abstract: Colloidal semiconductor nanocrystals or quantum dots (QDs) are one of the most relevant developments in the fast-growing world of nanotechnology. Initially proposed as luminescent biological labels, they are finding new important fields of application in analytical chemistry, where their photoluminescent properties have been exploited in environmental monitoring, pharmaceutical and clinical analysis and food quality control. Despite the enormous variety of applications that have been developed, however, the automation of QDs-based analytical methodologies by resorting to automation tools such as continuous flow analysis and related techniques is hitherto very limited. Such automation would make it possible to take advantage of particular features of the nanocrystals, such as their versatile surface chemistry and ligand binding ability, their aptitude to generate reactive species, the possibility of encapsulation in different materials while retaining native luminescence (providing the means for the implementation of renewable chemosensors), and even the utilisation of more drastic, stability-impairing reaction conditions. In this review, we provide insights into the analytical potential of quantum dots, focusing on prospects for their utilisation in automated flow-based and flow-related approaches and the future outlook of QD applications in chemical analysis.

  13. Analytics to Literacies: The Development of a Learning Analytics Framework for Multiliteracies Assessment

    Directory of Open Access Journals (Sweden)

    Shane Dawson

    2014-09-01

    Full Text Available The rapid advances in information and communication technologies, coupled with increased access to information and the formation of global communities, have resulted in interest among researchers and academics to revise educational practice to move beyond traditional ‘literacy’ skills towards an enhanced set of “multiliteracies” or “new media literacies”. Measuring the literacy of a population, in the light of its linkage to individual and community wealth and wellbeing, is essential to determining the impact of compulsory education. The opportunity now is to develop tools to assess individual and societal attainment of these new literacies. Drawing on the work of Jenkins and colleagues (2006) and notions of a participatory culture, this paper proposes a conceptual framework for how learning analytics can assist in measuring individual achievement of multiliteracies and how this evaluative process can be scaled to provide an institutional perspective of the educational progress in fostering these fundamental skills.

  14. An Analytical Study of Tools and Techniques for Movie Marketing

    Directory of Open Access Journals (Sweden)

    Garima Maik

    2014-08-01

    Full Text Available Bollywood, the Hindi movie industry, is one of the fastest growing sectors in the media and entertainment space, creating numerous business and employment opportunities. Movies in India are a major source of entertainment for all sections of society. They face competition not only from other movie industries and movies but from other sources of entertainment such as adventure sports, amusement parks, theatre and drama, and pubs and discothèques. A great deal of manpower, man-hours, creative talent, and money goes into building a quality feature film, and Bollywood continuously works to offer its audience something new. It is therefore important for a movie and its production team to stand out and grab the attention of the widest possible audience. Movie makers today employ various tools and techniques to market their movies, leaving no stone unturned: teasers, first looks, theatrical trailer releases, music launches, city tours, producer and director interviews, movie premieres, the release itself, post-release follow-ups, and more, all to pull viewers to the cineplex. Today's audience, which comprises mainly youth, wants photos, videos, meet-ups, gossip, debate, collaboration, and content creation, and these requirements are most fully met through digital platforms. However, traditional media like newspapers, radio, and television are far from obsolete: they reach a mass audience and play a major role in effective marketing. This study analyses these tools for their effectiveness, with the objectives fulfilled through a consumer survey. It brings out the effectiveness and relative importance of the various tools employed by movie marketers to generate maximum returns on investment, using data reduction techniques such as factor analysis and statistical techniques such as the chi-square test, with data visualization using pie charts.

  15. Improving web site performance using commercially available analytical tools.

    Science.gov (United States)

    Ogle, James A

    2010-10-01

    It is easy to accurately measure web site usage and to quantify key parameters such as page views, site visits, and more complex variables using commercially available tools that analyze web site log files and search engine use. This information can be used strategically to guide the design or redesign of a web site (templates, look-and-feel, and navigation infrastructure) to improve overall usability. The data can also be used tactically to assess the popularity and use of new pages and modules that are added and to rectify problems that surface. This paper describes software tools used to: (1) inventory search terms that lead to available content; (2) propose synonyms for commonly used search terms; (3) evaluate the effectiveness of calls to action; (4) conduct path analyses to targeted content. The American Academy of Orthopaedic Surgeons (AAOS) uses SurfRay's Behavior Tracking software (Santa Clara CA, USA, and Copenhagen, Denmark) to capture and archive the search terms that have been entered into the site's Google Mini search engine. The AAOS also uses Unica's NetInsight program to analyze its web site log files. These tools provide the AAOS with information that quantifies how well its web sites are operating and insights for making improvements to them. Although it is easy to quantify many aspects of an association's web presence, it also takes human involvement to analyze the results and then recommend changes. Without a dedicated resource to do this, the work often is accomplished only sporadically and on an ad hoc basis.

  16. Laser: a Tool for Optimization and Enhancement of Analytical Methods

    Energy Technology Data Exchange (ETDEWEB)

    Preisler, Jan [Iowa State Univ., Ames, IA (United States)

    1997-01-01

    In this work, we use lasers to enhance the possibilities of laser desorption methods and to optimize a coating procedure for capillary electrophoresis (CE). We use several different instrumental arrangements to characterize matrix-assisted laser desorption (MALD) at atmospheric pressure and in vacuum. In imaging mode, a 488-nm argon-ion laser beam is deflected by two acousto-optic deflectors to scan plumes desorbed at atmospheric pressure via absorption. All absorbing species, including neutral molecules, are monitored. Interesting features, e.g. differences between the initial plume and subsequent plumes desorbed from the same spot, or the formation of two plumes from one laser shot, are observed. Total plume absorbance can be correlated with the acoustic signal generated by the desorption event. A model equation for the plume velocity as a function of time is proposed. Alternatively, the use of a static laser beam for observation enables reliable determination of plume velocities even when they are very high. Static scattering detection reveals the negative influence of particle spallation on the MS signal. Ion formation during MALD was monitored using 193-nm light to photodissociate a portion of the insulin ion plume. These results define the optimal conditions for desorbing analytes from matrices, as opposed to achieving a compromise between efficient desorption and efficient ionization as is practiced in mass spectrometry. In the CE experiment, we examined changes in a poly(ethylene oxide) (PEO) coating by continuously monitoring the electroosmotic flow (EOF) in a fused-silica capillary during electrophoresis. An imaging CCD camera was used to follow the motion of a fluorescent neutral marker zone along the length of the capillary, excited by a 488-nm Ar-ion laser. The PEO coating was shown to reduce the velocity of EOF by more than an order of magnitude compared to a bare capillary at pH 7.0. The coating protocol was important, especially at an intermediate pH of 7.7.

  17. Analytical Modeling Tool for Design of Hydrocarbon Sensitive Optical Fibers

    Directory of Open Access Journals (Sweden)

    Khalil Al Handawi

    2017-09-01

    Full Text Available Pipelines are the main transportation means for oil and gas products across large distances. Due to the severe conditions they operate in, they are regularly inspected using conventional Pipeline Inspection Gages (PIGs) for corrosion damage. The motivation for researching a real-time distributed monitoring solution arose to mitigate costs and provide a proactive indication of potential failures. Fiber optic sensors with polymer claddings provide a means of detecting contact with hydrocarbons. By coating the fibers with a layer of metal similar in composition to that of the parent pipeline, corrosion of this coating may be detected when the polymer cladding underneath is exposed to the surrounding hydrocarbons contained within the pipeline. A Refractive Index (RI) change occurs in the polymer cladding causing a loss in intensity of a traveling light pulse due to a reduction in the fiber’s modal capacity. Intensity losses may be detected using Optical Time Domain Reflectometry (OTDR) while pinpointing the spatial location of the contact via time delay calculations of the back-scattered pulses. This work presents a theoretical model for the above sensing solution to provide a design tool for the fiber optic cable in the context of hydrocarbon sensing following corrosion of an external metal coating. Results are verified against the experimental data published in the literature.
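The time-delay localization step mentioned in the abstract reduces to a one-line relation, z = c·t / (2·n), where t is the round-trip delay and n the fiber's group index. A small sketch (the group index value is an assumed typical figure for silica fiber, not taken from the paper):

```python
C = 299_792_458.0   # speed of light in vacuum, m/s
N_GROUP = 1.468     # assumed group index of a silica fiber core

def event_distance_m(delay_s):
    """Distance from the OTDR unit to the scattering event, given the
    two-way time delay of the back-scattered pulse: z = c*t / (2*n)."""
    return C * delay_s / (2 * N_GROUP)

# A pulse returning after 10 microseconds corresponds to roughly 1 km.
d = event_distance_m(10e-6)
```

The factor of 2 accounts for the pulse travelling out to the loss event and back; dividing by the group index converts the vacuum speed of light to the speed inside the fiber.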

  18. Process simulation of heavy water plants - a powerful analytical tool

    International Nuclear Information System (INIS)

    Miller, A.I.

    1978-10-01

    The commercially conscious designs of Canadian GS (Girdler-Sulphide) have proved sensitive to process conditions. That, combined with the large scale of our units, has meant that computer simulation of their behaviour has been a natural and profitable development. Atomic Energy of Canada Limited has developed a family of steady state simulations to describe all of the Canadian plants. Modelling of plant conditions has demonstrated that the simulation description is very precise and it has become an integral part of the industry's assessments of both plant operation and decisions on capital expenditures. The simulation technique has also found extensive use in detailed designing of both the rehabilitated Glace Bay and the new La Prade plants. It has opened new insights into plant design and uncovered a radical and significant flowsheet change for future designs as well as many less dramatic but valuable lesser changes. (author)

  19. Chemometric classification techniques as a tool for solving problems in analytical chemistry.

    Science.gov (United States)

    Bevilacqua, Marta; Nescatelli, Riccardo; Bucci, Remo; Magrì, Andrea D; Magrì, Antonio L; Marini, Federico

    2014-01-01

    Supervised pattern recognition (classification) techniques, i.e., the family of chemometric methods whose aim is the prediction of a qualitative response on a set of samples, represent a very important assortment of tools for solving problems in several areas of applied analytical chemistry. This paper describes the theory behind the chemometric classification techniques most frequently used in analytical chemistry together with some examples of their application to real-world problems.

  20. Analytical sensitivity analysis of geometric errors in a three axis machine tool

    International Nuclear Information System (INIS)

    Park, Sung Ryung; Yang, Seung Han

    2012-01-01

    In this paper, an analytical method is used to perform a sensitivity analysis of geometric errors in a three axis machine tool. First, an error synthesis model is constructed for evaluating the position volumetric error due to the geometric errors, and then an output variable is defined, such as the magnitude of the position volumetric error. Next, the global sensitivity analysis is executed using an analytical method. Finally, the sensitivity indices are calculated using the quantitative values of the geometric errors
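The paper derives global sensitivity indices analytically; as a simplified stand-in, the sketch below uses an invented first-order rigid-body error model for the tool-tip position and estimates local sensitivities by central finite differences. The model, point, and error magnitudes are all hypothetical.

```python
import numpy as np

def volumetric_error(g, x=0.2, y=0.1, z=0.3):
    """Magnitude of the tool-tip position error at point (x, y, z) [m] for a
    toy 3-axis error model. g = (dx, dy, dz, ex, ey, ez): three linear
    offsets [m] and three small angular errors [rad]. (Illustrative model,
    not the paper's full error synthesis model.)"""
    dx, dy, dz, ex, ey, ez = g
    px = dx + ey * z - ez * y   # first-order rigid-body error terms
    py = dy + ez * x - ex * z
    pz = dz + ex * y - ey * x
    return np.sqrt(px ** 2 + py ** 2 + pz ** 2)

g0 = np.array([5e-6, -3e-6, 4e-6, 2e-5, -1e-5, 1.5e-5])

# Local sensitivity indices: numerical partial derivatives of the output
# with respect to each geometric error source.
h = 1e-9
sens = np.empty_like(g0)
for i in range(len(g0)):
    gp, gm = g0.copy(), g0.copy()
    gp[i] += h
    gm[i] -= h
    sens[i] = (volumetric_error(gp) - volumetric_error(gm)) / (2 * h)
```

Ranking the entries of `sens` identifies which error sources most influence the volumetric error at that point, which is the practical use the abstract describes.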

  1. Median of patient results as a tool for assessment of analytical stability

    DEFF Research Database (Denmark)

    Jørgensen, Lars Mønster; Hansen, Steen Ingemann; Petersen, Per Hyltoft

    2015-01-01

    BACKGROUND: In spite of the well-established external quality assessment and proficiency testing surveys of analytical quality performance in laboratory medicine, a simple tool to monitor the long-term analytical stability as a supplement to the internal control procedures is often needed. METHOD......: Patient data from daily internal control schemes was used for monthly appraisal of the analytical stability. This was accomplished by using the monthly medians of patient results to disclose deviations from analytical stability, and by comparing divergences with the quality specifications for allowable...... control procedures are the most reliable sources of material as they represent the genuine substance of the measurements and therefore circumvent the problems associated with non-commutable materials in external assessment. CONCLUSION: Patient medians in the monthly monitoring of analytical stability...
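The patient-median idea can be sketched as a simple monthly screening rule: compare each month's median of patient results against a long-term target and flag months that exceed the allowable bias specification. The data, target, and limit below are all invented.

```python
import numpy as np

def monthly_median_drift(results_by_month, target, allowable_bias_pct):
    """Flag months whose median of patient results deviates from the
    long-term target by more than the allowable bias specification.
    (Sketch of the patient-median idea; figures are illustrative.)"""
    flags = {}
    for month, values in results_by_month.items():
        med = float(np.median(values))
        bias_pct = 100.0 * (med - target) / target
        flags[month] = abs(bias_pct) > allowable_bias_pct
    return flags

rng = np.random.default_rng(0)
data = {
    "2015-01": rng.normal(5.0, 0.6, 400),   # stable month
    "2015-02": rng.normal(5.0, 0.6, 400),   # stable month
    "2015-03": rng.normal(5.4, 0.6, 400),   # simulated calibration shift
}
flags = monthly_median_drift(data, target=5.0, allowable_bias_pct=4.0)
```

Because the median is robust to the outliers that patient data inevitably contain, a sustained shift in the monthly median points at the analytical method rather than at the patients, which is what makes it useful as a long-term stability monitor.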

  2. Status of immunoassay as an analytical tool in environmental investigations

    International Nuclear Information System (INIS)

    Van Emon, J.M.

    2000-01-01

    Immunoassay methods were initially applied in clinical situations where their sensitivity and selectivity were utilized for diagnostic purposes. In the 1970s, pesticide chemists realized the potential benefits of immunoassay methods for compounds difficult to analyze by gas chromatography. This transition of the technology has extended to the analysis of soil, water, food and other matrices of environmental and human exposure significance, particularly for compounds difficult to analyze by chromatographic methods. The utility of radioimmunoassays and enzyme immunoassays for environmental investigations was recognized in the 1980s by the U.S. Environmental Protection Agency (U.S. EPA) with the initiation of an immunoassay development programme. The U.S. Department of Agriculture (USDA) and the U.S. Food and Drug Administration (FDA) have investigated immunoassays for the detection of residues in food both from an inspection and a contamination prevention perspective. Environmental immunoassays are providing rapid screening information as well as quantitative information to fulfill rigorous data quality objectives for monitoring programmes.

  3. Developing a Code of Practice for Learning Analytics

    Science.gov (United States)

    Sclater, Niall

    2016-01-01

    Ethical and legal objections to learning analytics are barriers to development of the field, thus potentially denying students the benefits of predictive analytics and adaptive learning. Jisc, a charitable organization that champions the use of digital technologies in UK education and research, has attempted to address this with the development of…

  4. Electron Tomography: A Three-Dimensional Analytic Tool for Hard and Soft Materials Research.

    Science.gov (United States)

    Ercius, Peter; Alaidi, Osama; Rames, Matthew J; Ren, Gang

    2015-10-14

    Three-dimensional (3D) structural analysis is essential to understand the relationship between the structure and function of an object. Many analytical techniques, such as X-ray diffraction, neutron spectroscopy, and electron microscopy imaging, are used to provide structural information. Transmission electron microscopy (TEM), one of the most popular analytic tools, has been widely used for structural analysis in both physical and biological sciences for many decades, in which 3D objects are projected into two-dimensional (2D) images. In many cases, 2D-projection images are insufficient to understand the relationship between the 3D structure and the function of nanoscale objects. Electron tomography (ET) is a technique that retrieves 3D structural information from a tilt series of 2D projections, and is gradually becoming a mature technology with sub-nanometer resolution. Distinct methods to overcome sample-based limitations have been separately developed in both physical and biological science, although they share some basic concepts of ET. This review discusses the common basis for 3D characterization, and specifies difficulties and solutions regarding both hard and soft materials research. It is hoped that novel solutions based on current state-of-the-art techniques for advanced applications in hybrid matter systems can be motivated. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Environmental equity research: review with focus on outdoor air pollution research methods and analytic tools.

    Science.gov (United States)

    Miao, Qun; Chen, Dongmei; Buzzelli, Michael; Aronson, Kristan J

    2015-01-01

    The objective of this study was to review environmental equity research on outdoor air pollution and, specifically, methods and tools used in research, published in English, with the aim of recommending the best methods and analytic tools. English language publications from 2000 to 2012 were identified in Google Scholar, Ovid MEDLINE, and PubMed. Research methodologies and results were reviewed and potential deficiencies and knowledge gaps identified. The publications show that exposure to outdoor air pollution differs by social factors, but findings are inconsistent in Canada. In terms of study designs, most were small and ecological and therefore prone to the ecological fallacy. Newer tools such as geographic information systems, modeling, and biomarkers offer improved precision in exposure measurement. Higher-quality research using large, individual-based samples and more precise analytic tools are needed to provide better evidence for policy-making to reduce environmental inequities.

  6. System analysis: Developing tools for the future

    Energy Technology Data Exchange (ETDEWEB)

    De Jong, K.; clever, J.; Draper, J.V.; Davies, B.; Lonks, A.

    1996-02-01

    This report introduces and evaluates system analysis tools that were developed, or are under development, for the Robotics Technology Development Program (RTDP). Additionally, it discusses system analysis work completed using these tools, aimed at a system analysis of the retrieval of waste from underground storage tanks on the Hanford Reservation near Richland, Washington. The tools developed and evaluated include a mixture of commercially available tools adapted to RTDP requirements and some tools developed in-house. The tools included in this report are: a Process Diagramming Tool, a Cost Modeling Tool, an Amortization Modeling Tool, a graphical simulation linked to the Cost Modeling Tool, a decision assistance tool, and a system thinking tool. Additionally, the importance of performance testing to the RTDP and the results of the testing executed are discussed. Further, the results of the Tank Waste Retrieval (TWR) System Diagram, the TWR Operations Cost Model, and the TWR Amortization Model are presented, and the implications of the results are discussed. Finally, the RTDP system analysis tools are assessed and some recommendations are made regarding continued development of the tools and process.
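As one example of the kind of calculation an amortization modeling tool performs, a standard capital-recovery sketch: spread a capital outlay over the equipment life at a given interest rate. All figures below are hypothetical, not taken from the TWR models.

```python
def annualized_cost(capital, salvage, life_years, rate):
    """Capital-recovery amortization: the equivalent uniform annual cost of
    a capital outlay with salvage value, at interest rate `rate`."""
    crf = rate * (1 + rate) ** life_years / ((1 + rate) ** life_years - 1)
    return (capital - salvage) * crf + salvage * rate

# A hypothetical $2.5M retrieval system, $100k salvage, 10-year life at 7%:
cost = annualized_cost(2_500_000, 100_000, 10, 0.07)
```

The capital recovery factor (CRF) converts a present cost into a uniform annual payment; adding `salvage * rate` accounts for the capital tied up in the recoverable salvage value.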

  7. Analytical Quality by Design Approach to Test Method Development and Validation in Drug Substance Manufacturing

    Directory of Open Access Journals (Sweden)

    N. V. V. S. S. Raman

    2015-01-01

    Full Text Available The pharmaceutical industry has been emerging rapidly over the last decade by focusing on product Quality, Safety, and Efficacy. Pharmaceutical firms have increased the number of product development projects by using scientific tools such as QbD (Quality by Design) and PAT (Process Analytical Technology). ICH guidelines Q8 to Q11 have discussed QbD implementation in API synthetic processes and formulation development, and ICH Q11 clearly discusses the QbD approach for API synthesis with examples. Generic companies are implementing the QbD approach in formulation development, and it is effectively mandatory from the USFDA perspective. As of now, there are no specific requirements for AQbD (Analytical Quality by Design) and PAT in analytical development from the regulatory agencies. In this review, the authors discuss the simultaneous implementation of QbD and AQbD for API synthetic processes and analytical method development. The key tools of AQbD are identification of the ATP (Analytical Target Profile), CQAs (Critical Quality Attributes) with risk assessment, method optimization and development with DoE, the MODR (method operable design region), a control strategy, AQbD method validation, and Continuous Method Monitoring (CMM). Simultaneous implementation of QbD activities in synthetic and analytical development will provide the highest quality product by minimizing risks, and it also provides very good input for the PAT approach.
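To make the DoE step of AQbD concrete, here is a minimal 2² full-factorial sketch with invented responses: two coded method parameters (imagined as pH and flow rate) and a least-squares fit of the main effects.

```python
import numpy as np
from itertools import product

# 2^2 full-factorial design in coded levels (-1/+1) for two hypothetical
# HPLC method parameters; the responses are fabricated resolutions.
X = np.array(list(product([-1, 1], repeat=2)), dtype=float)
y = np.array([1.8, 2.4, 2.1, 3.1])   # one measured response per run

# Fit y = b0 + b1*pH + b2*flow by least squares to estimate main effects.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b_ph, b_flow = coef
```

In a real AQbD study the fitted effects (and interaction/quadratic terms from a larger design) would define the method operable design region; here the orthogonal design makes the effect estimates exact column averages.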

  8. Recent developments in analytical detection methods for radiation processed foods

    International Nuclear Information System (INIS)

    Wu Jilan

    1993-01-01

    A short summary of the programmes of 'ADMIT' (FAO/IAEA) and of developments in analytical detection methods for radiation processed foods is given. It is suggested that, to promote the commercialization of radiation processed foods and to control their quality, more attention must be paid to the study of analytical detection methods for irradiated foods.

  9. Pulsatile microfluidics as an analytical tool for determining the dynamic characteristics of microfluidic systems

    DEFF Research Database (Denmark)

    Vedel, Søren; Olesen, Laurits Højgaard; Bruus, Henrik

    2010-01-01

    An understanding of all fluid dynamic time scales is needed to fully understand and hence exploit the capabilities of fluid flow in microfluidic systems. We propose the use of harmonically oscillating microfluidics as an analytical tool for the deduction of these time scales. Furthermore, we...
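One way to see how a harmonic measurement yields a time scale: if the microfluidic network responds like a first-order system (an assumption made here purely for illustration, not the authors' full model), its amplitude attenuation at a known driving frequency determines the time constant directly, since |H| = 1/√(1 + (ωτ)²).

```python
import numpy as np

def time_constant(omega, amplitude_ratio):
    """Deduce the time constant tau from one harmonic measurement,
    assuming a first-order (RC-like) response |H| = 1/sqrt(1+(w*tau)^2)."""
    return np.sqrt(1.0 / amplitude_ratio ** 2 - 1.0) / omega

# Synthetic round-trip check: a system with tau = 2 ms driven at 100 Hz.
tau_true = 2e-3
omega = 2 * np.pi * 100.0
ratio = 1.0 / np.sqrt(1.0 + (omega * tau_true) ** 2)
tau_est = time_constant(omega, ratio)
```

Sweeping the driving frequency and repeating the fit would expose additional time scales when the single-pole assumption breaks down, which is the diagnostic spirit of the proposed method.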

  10. Improved process analytical technology for Protein A chromatography using predictive principal component analysis tools.

    Science.gov (United States)

    Hou, Ying; Jiang, Canping; Shukla, Abhinav A; Cramer, Steven M

    2011-01-01

    Protein A chromatography is widely employed for the capture and purification of antibodies and Fc-fusion proteins. Due to the high cost of protein A resins, there is a significant economic driving force for using these chromatographic materials for a large number of cycles. The maintenance of column performance over the resin lifetime is also a significant concern in large-scale manufacturing. In this work, several statistical methods are employed to develop a novel principal component analysis (PCA)-based tool for predicting protein A chromatographic column performance over time. A method is developed to carry out detection of column integrity failures before their occurrence without the need for a separate integrity test. In addition, analysis of various transitions in the chromatograms was also employed to develop PCA-based models to predict both subtle and general trends in real-time protein A column yield decay. The developed approach has significant potential for facilitating timely and improved decisions in large-scale chromatographic operations in line with the process analytical technology (PAT) guidance from the Food and Drug Administration (FDA). © 2010 Wiley Periodicals, Inc.
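The transition-based monitoring idea lends itself to a compact sketch. The following is a minimal illustration, not the authors' implementation: PCA is fit on "healthy" breakthrough transitions, and a new cycle is flagged when its squared prediction error (the Q statistic common in PCA-based process monitoring) exceeds the historical range. All curves here are synthetic sigmoids standing in for real chromatogram transitions:

```python
import numpy as np

def fit_pca(X, n_components=2):
    """Fit PCA by SVD on mean-centred data; return (mean, loadings)."""
    mu = X.mean(axis=0)
    # rows = chromatographic cycles, columns = points on the transition curve
    U, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:n_components]

def spe(x, mu, P):
    """Squared prediction error (Q statistic) of one cycle against the model."""
    r = (x - mu) - P.T @ (P @ (x - mu))
    return float(r @ r)

# Synthetic data: 30 healthy cycles of a 50-point breakthrough transition,
# plus one degraded cycle with a broadened front (illustrative only).
rng = np.random.default_rng(0)
t = np.linspace(-3, 3, 50)
healthy = 1 / (1 + np.exp(-4 * t)) + 0.01 * rng.standard_normal((30, 50))
degraded = 1 / (1 + np.exp(-1.5 * t)) + 0.01 * rng.standard_normal(50)

mu, P = fit_pca(healthy)
limit = max(spe(c, mu, P) for c in healthy)  # crude control limit
flagged = spe(degraded, mu, P) > limit       # degraded cycle exceeds it
print(flagged)
```

A production tool would use a statistically derived control limit rather than the historical maximum, but the detection logic is the same.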

  11. Measuring the bright side of being blue: a new tool for assessing analytical rumination in depression.

    Directory of Open Access Journals (Sweden)

    Skye P Barbic

    Full Text Available BACKGROUND: Diagnosis and management of depression occurs frequently in the primary care setting. Current diagnostic and management of treatment practices across clinical populations focus on eliminating signs and symptoms of depression. However, there is debate that some interventions may pathologize normal, adaptive responses to stressors. Analytical rumination (AR is an example of an adaptive response of depression that is characterized by enhanced cognitive function to help an individual focus on, analyze, and solve problems. To date, research on AR has been hampered by the lack of theoretically-derived and psychometrically sound instruments. This study developed and tested a clinically meaningful measure of AR. METHODS: Using expert panels and an extensive literature review, we developed a conceptual framework for AR and 22 candidate items. Items were field tested to 579 young adults; 140 of whom completed the items at a second time point. We used Rasch measurement methods to construct and test the item set; and traditional psychometric analyses to compare items to existing rating scales. RESULTS: Data were high quality (0.81; evidence for divergent validity. Evidence of misfit for 2 items suggested that a 20-item scale with 4-point response categories best captured the concept of AR, fitting the Rasch model (χ2 = 95.26; df = 76, p = 0.07, with high reliability (rp = 0.86, ordered response scale structure, and no item bias (gender, age, time. CONCLUSION: Our study provides evidence for a 20-item Analytical Rumination Questionnaire (ARQ that can be used to quantify AR in adults who experience symptoms of depression. The ARQ is psychometrically robust and a clinically useful tool for the assessment and improvement of depression in the primary care setting. Future work is needed to establish the validity of this measure in people with major depression.

  12. Studying Behaviors Among Neurosurgery Residents Using Web 2.0 Analytic Tools.

    Science.gov (United States)

    Davidson, Benjamin; Alotaibi, Naif M; Guha, Daipayan; Amaral, Sandi; Kulkarni, Abhaya V; Lozano, Andres M

    Web 2.0 technologies (e.g., blogs, social networks, and wikis) are increasingly being used by medical schools and postgraduate training programs as tools for information dissemination. These technologies offer the unique opportunity to track metrics of user engagement and interaction. Here, we employ Web 2.0 tools to assess academic behaviors among neurosurgery residents. We performed a retrospective review of all educational lectures, part of the core Neurosurgery Residency curriculum at the University of Toronto, posted on our teaching website (www.TheBrainSchool.net). Our website was developed using publicly available Web 2.0 platforms. Lecture usage was assessed by the number of clicks, and associations were explored with lecturer academic position, timing of examinations, and lecture/subspecialty topic. The overall number of clicks on 77 lectures was 1079. Most of these clicks were occurring during the in-training examination month (43%). Click numbers were significantly higher on lectures presented by faculty (mean = 18.6, standard deviation ± 4.1) compared to those delivered by residents (mean = 8.4, standard deviation ± 2.1) (p = 0.031). Lectures covering topics in functional neurosurgery received the most clicks (47%), followed by pediatric neurosurgery (22%). This study demonstrates the value of Web 2.0 analytic tools in examining resident study behavior. Residents tend to "cram" by downloading lectures in the same month of training examinations and display a preference for faculty-delivered lectures. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
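The faculty-versus-resident click comparison reported above (p = 0.031) can be sketched with a two-sample test. The per-lecture click counts below are hypothetical values consistent with the reported summary statistics (faculty mean 18.6 ± 4.1, resident mean 8.4 ± 2.1), not the study's data, and Welch's t-test is one plausible choice of comparison:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    va, vb = variance(a) / len(a), variance(b) / len(b)
    t = (mean(a) - mean(b)) / math.sqrt(va + vb)
    df = (va + vb) ** 2 / (va ** 2 / (len(a) - 1) + vb ** 2 / (len(b) - 1))
    return t, df

# Hypothetical per-lecture click counts (illustrative only).
faculty = [18, 22, 15, 20, 17, 23, 16, 18]
resident = [8, 9, 7, 10, 6, 9, 8, 10]

t, df = welch_t(faculty, resident)
print(round(t, 1), round(df))
```

A large positive t with roughly 10 degrees of freedom corresponds to a very small p-value, matching the direction of the reported difference.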

  13. Lowering the Barrier to Reproducible Research by Publishing Provenance from Common Analytical Tools

    Science.gov (United States)

    Jones, M. B.; Slaughter, P.; Walker, L.; Jones, C. S.; Missier, P.; Ludäscher, B.; Cao, Y.; McPhillips, T.; Schildhauer, M.

    2015-12-01

    Scientific provenance describes the authenticity, origin, and processing history of research products and promotes scientific transparency by detailing the steps in computational workflows that produce derived products. These products include papers, findings, input data, software products to perform computations, and derived data and visualizations. The geosciences community values this type of information, and, at least theoretically, strives to base conclusions on computationally replicable findings. In practice, capturing detailed provenance is laborious and thus has been a low priority; beyond a lab notebook describing methods and results, few researchers capture and preserve detailed records of scientific provenance. We have built tools for capturing and publishing provenance that integrate into analytical environments that are in widespread use by geoscientists (R and Matlab). These tools lower the barrier to provenance generation by automating capture of critical information as researchers prepare data for analysis, develop, test, and execute models, and create visualizations. The 'recordr' library in R and the `matlab-dataone` library in Matlab provide shared functions to capture provenance with minimal changes to normal working procedures. Researchers can capture both scripted and interactive sessions, tag and manage these executions as they iterate over analyses, and then prune and publish provenance metadata and derived products to the DataONE federation of archival repositories. Provenance traces conform to the ProvONE model extension of W3C PROV, enabling interoperability across tools and languages. The capture system supports fine-grained versioning of science products and provenance traces. By assigning global identifiers such as DOIs, researchers can cite the computational processes used to reach findings. And, finally, DataONE has built a web portal to search, browse, and clearly display provenance relationships between input data, the software
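The core idea of automated provenance capture, recording what an execution used and what it generated so derived products can be traced, can be sketched in a few lines. This is a toy analogue of what libraries like 'recordr' automate, not their actual API; the record fields and the `normalize` example function are illustrative:

```python
import hashlib
import json
import time

def sha256(data: bytes) -> str:
    """Content digest used to identify inputs and outputs."""
    return hashlib.sha256(data).hexdigest()

def run_with_provenance(func, *inputs):
    """Run func on inputs and return (result, provenance_record).

    Each execution is tagged with input/output digests, in the spirit of
    PROV-style 'used' and 'generated' relations (field names are illustrative).
    """
    t0 = time.time()
    result = func(*inputs)
    record = {
        "activity": func.__name__,
        "used": [sha256(json.dumps(i, sort_keys=True).encode()) for i in inputs],
        "generated": sha256(json.dumps(result, sort_keys=True).encode()),
        "duration_s": round(time.time() - t0, 6),
    }
    return result, record

def normalize(xs):
    """Example analysis step: min-max scale a list of numbers."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

result, prov = run_with_provenance(normalize, [3, 5, 9])
print(prov["activity"], len(prov["used"]), prov["generated"][:8])
```

Publishing such records alongside the derived data, with global identifiers, is what makes a computation citable and replicable.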

  14. Stochastic airspace simulation tool development

    Science.gov (United States)

    2009-10-01

    Modeling and simulation is often used to study the physical world when observation may not be practical. The overall goal of a recent and ongoing simulation tool project has been to provide a documented, lifecycle-managed, multi-processor c...

  15. Development of Open Textbooks Learning Analytics System

    Science.gov (United States)

    Prasad, Deepak; Totaram, Rajneel; Usagawa, Tsuyoshi

    2016-01-01

    Textbook costs have skyrocketed in recent years, putting them beyond the reach of many students, but there are options which can mitigate this problem. Open textbooks, an open educational resource, have proven capable of making textbooks affordable to students. There have been few educational developments as promising as the development of open…

  16. Software Development Management: Empirical and Analytical Perspectives

    Science.gov (United States)

    Kang, Keumseok

    2011-01-01

    Managing software development is a very complex activity because it must deal with people, organizations, technologies, and business processes. My dissertation consists of three studies that examine software development management from various perspectives. The first study empirically investigates the impacts of prior experience with similar…

  17. Development of CAD implementing the algorithm of boundary elements’ numerical analytical method

    Directory of Open Access Journals (Sweden)

    Yulia V. Korniyenko

    2015-03-01

    Full Text Available Until recently, the algorithms of the numerical-analytical boundary elements method had been implemented as programs written in the MATLAB environment language. Each program had a local character, i.e. it was used to solve one particular problem: calculation of a beam, frame, arch, etc. Matrix construction in these programs was carried out "manually" and was therefore time-consuming. The research was aimed at a reasoned choice of programming language for a new CAD system that implements the algorithm of the numerical-analytical boundary elements method and provides visualization tools for the initial objects and calculation results. The research conducted shows that, among the wide variety of programming languages, the most efficient one for developing CAD based on the numerical-analytical boundary elements method is Java. This language provides tools not only for developing the computational part of the CAD system, but also for building the graphical interface for constructing geometrical models and interpreting calculated results.

  18. Program Development Tools and Infrastructures

    International Nuclear Information System (INIS)

    Schulz, M.

    2012-01-01

    Exascale class machines will exhibit a new level of complexity: they will feature an unprecedented number of cores and threads, will most likely be heterogeneous and deeply hierarchical, and offer a range of new hardware techniques (such as speculative threading, transactional memory, programmable prefetching, and programmable accelerators), which all have to be utilized for an application to realize the full potential of the machine. Additionally, users will be faced with less memory per core, fixed total power budgets, and sharply reduced MTBFs. At the same time, it is expected that the complexity of applications will rise sharply for exascale systems, both to implement new science possible at exascale and to exploit the new hardware features necessary to achieve exascale performance. This is particularly true for many of the NNSA codes, which are large and often highly complex integrated simulation codes that push the limits of everything in the system including language features. To overcome these limitations and to enable users to reach exascale performance, users will expect a new generation of tools that address the bottlenecks of exascale machines, that work seamlessly with the (set of) programming models on the target machines, that scale with the machine, that provide automatic analysis capabilities, and that are flexible and modular enough to overcome the complexities and changing demands of the exascale architectures. Further, any tool must be robust enough to handle the complexity of large integrated codes while keeping the user's learning curve low. With the ASC program, in particular the CSSE (Computational Systems and Software Engineering) and CCE (Common Compute Environment) projects, we are working towards a new generation of tools that fulfill these requirements and that provide our users as well as the larger HPC community with the necessary tools, techniques, and methodologies required to make exascale performance a reality.

  19. New Tools to Prepare ACE Cross-section Files for MCNP Analytic Test Problems

    International Nuclear Information System (INIS)

    Brown, Forrest B.

    2016-01-01

    Monte Carlo calculations using one-group cross sections, multigroup cross sections, or simple continuous energy cross sections are often used to: (1) verify production codes against known analytical solutions, (2) verify new methods and algorithms that do not involve detailed collision physics, (3) compare Monte Carlo calculation methods with deterministic methods, and (4) teach fundamentals to students. In this work we describe 2 new tools for preparing the ACE cross-section files to be used by MCNP® for these analytic test problems, simple_ace.pl and simple_ace_mg.pl.
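The verification use case in item (1) can be sketched without any cross-section files at all. With a single one-group total cross section and a purely absorbing slab, the uncollided transmission has the closed form exp(-Σt·L), which a minimal Monte Carlo estimate should reproduce (the cross-section and thickness values below are arbitrary illustrations):

```python
import math
import random

def mc_transmission(sigma_t, length, n, seed=42):
    """Monte Carlo estimate of uncollided transmission through a purely
    absorbing one-group slab: sample exponential path lengths and count
    particles whose first collision lies beyond the slab."""
    rng = random.Random(seed)
    passed = sum(
        1 for _ in range(n)
        if -math.log(1.0 - rng.random()) / sigma_t > length
    )
    return passed / n

sigma_t, length = 0.5, 2.0           # illustrative one-group data (1/cm, cm)
analytic = math.exp(-sigma_t * length)
estimate = mc_transmission(sigma_t, length, 100_000)
print(round(analytic, 4), round(estimate, 4))
```

Agreement within statistical uncertainty is exactly the kind of check these simple ACE files are meant to enable inside MCNP itself.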

  20. Design and Development of Decision Making System Using Fuzzy Analytic Hierarchy Process

    OpenAIRE

    Chin W. Cheong; Lee H. Jie; Mak C. Meng; Amy L.H. Lan

    2008-01-01

    This article aims to develop a fuzzy Multicriteria Decision Making (MCDM) tool equipped with an Analytic Hierarchy Process (AHP) framework to help users in semi-structured and unstructured decision-making tasks. The tool provides portability and adaptability by deploying the software on a web platform. In addition, the system provides an integrated domain reference channel via a database connection to assist the user in obtaining relevant information regarding the problem domain before con...
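The crisp AHP core underlying such a tool can be sketched briefly. Given a pairwise comparison matrix on Saaty's 1-9 scale, criterion weights are approximated here by the geometric-mean method, together with a consistency index; the matrix values are illustrative, not taken from the article:

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three criteria (Saaty scale).
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector -> weights.
gm = A.prod(axis=1) ** (1 / A.shape[0])
weights = gm / gm.sum()

# Consistency index from the estimated principal eigenvalue.
lam = float(np.mean((A @ weights) / weights))
ci = (lam - A.shape[0]) / (A.shape[0] - 1)
print(np.round(weights, 3), round(ci, 4))
```

A fuzzy AHP extension would replace the crisp entries with fuzzy numbers and defuzzify the resulting weights, but the priority-vector computation above is the common starting point.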

  1. Surveillance of emerging drugs of abuse in Hong Kong: validation of an analytical tool.

    Science.gov (United States)

    Tang, Magdalene H Y; Ching, C K; Tse, M L; Ng, Carol; Lee, Caroline; Chong, Y K; Wong, Watson; Mak, Tony W L

    2015-04-01

    To validate a locally developed chromatography-based method to monitor emerging drugs of abuse whilst performing regular drug testing in abusers. Cross-sectional study. Eleven regional hospitals, seven social service units, and a tertiary level clinical toxicology laboratory in Hong Kong. A total of 972 drug abusers and high-risk individuals were recruited from acute, rehabilitation, and high-risk settings between 1 November 2011 and 31 July 2013. A subset of the participants was of South Asian ethnicity. In total, 2000 urine or hair specimens were collected. Proof of concept that surveillance of emerging drugs of abuse can be performed whilst conducting routine drug of abuse testing in patients. The method was successfully applied to 2000 samples with three emerging drugs of abuse detected in five samples: PMMA (paramethoxymethamphetamine), TFMPP [1-(3-trifluoromethylphenyl)piperazine], and methcathinone. The method also detected conventional drugs of abuse, with codeine, methadone, heroin, methamphetamine, and ketamine being the most frequently detected drugs. Other findings included the observation that South Asians had significantly higher rates of using opiates such as heroin, methadone, and codeine; and that ketamine and cocaine had significantly higher detection rates in acute subjects compared with the rehabilitation population. This locally developed analytical method is a valid tool for simultaneous surveillance of emerging drugs of abuse and routine drug monitoring of patients at minimal additional cost and effort. Continued, proactive surveillance and early identification of emerging drugs will facilitate prompt clinical, social, and legislative management.

  2. Metrics Development for UML Tools evaluation

    OpenAIRE

    Dasso, Aristides; Funes, Ana; Peralta, Mario; Salgado, Carlos Humberto

    2005-01-01

    The Unified Modelling Language (UML) has become a de facto standard for software development practitioners. There are several tools that support the use of UML. Users of those tools must evaluate and compare different versions of the tools they intend to use or are using, to assess the possibility of changing or acquiring one. There are several ways to perform this evaluation, from simple rules of thumb to numeric or quantitative methods. We present an ongoing project that evaluates UML tools us...

  3. Capitalizing on App Development Tools and Technologies

    Science.gov (United States)

    Luterbach, Kenneth J.; Hubbell, Kenneth R.

    2015-01-01

    Instructional developers and others creating apps must choose from a wide variety of app development tools and technologies. Some app development tools have incorporated visual programming features, which enable some drag and drop coding and contextual programming. While those features help novices begin programming with greater ease, questions…

  4. Developing a Modeling Tool Using Eclipse

    NARCIS (Netherlands)

    Kirtley, Nick; Waqas Kamal, Ahmad; Avgeriou, Paris

    2008-01-01

    Tool development using an open source platform provides autonomy to users to change, use, and develop cost-effective software with freedom from licensing requirements. However, open source tool development poses a number of challenges, such as poor documentation and continuous evolution. In this

  5. EPIPOI: a user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series.

    Science.gov (United States)

    Alonso, Wladimir J; McCormick, Benjamin J J

    2012-11-15

    There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.
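The trend-and-seasonality extraction EPIPOI performs can be illustrated with a small harmonic-regression sketch in Python (EPIPOI itself is Matlab software; the weekly incidence series below is synthetic, and one annual sin/cos harmonic is an assumption of this illustration):

```python
import numpy as np

# Synthetic weekly incidence: linear trend + annual cycle + noise.
rng = np.random.default_rng(1)
t = np.arange(260)  # five years of weeks
series = 50 + 0.05 * t + 12 * np.sin(2 * np.pi * t / 52) \
    + rng.normal(0, 2, t.size)

# Design matrix: intercept, linear trend, and one annual harmonic pair,
# in the spirit of trend/seasonality decomposition of epidemic time series.
X = np.column_stack([
    np.ones_like(t, dtype=float),
    t,
    np.sin(2 * np.pi * t / 52),
    np.cos(2 * np.pi * t / 52),
])
coef, *_ = np.linalg.lstsq(X, series, rcond=None)
amplitude = np.hypot(coef[2], coef[3])  # recovered seasonal amplitude
print(round(float(amplitude), 1))
```

The fitted coefficients recover the trend slope and seasonal amplitude that generated the series; comparing such parameters across geographic regions is the kind of analysis the tool supports interactively.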

  6. EPIPOI: A user-friendly analytical tool for the extraction and visualization of temporal parameters from epidemiological time series

    Directory of Open Access Journals (Sweden)

    Alonso Wladimir J

    2012-11-01

    Full Text Available Abstract Background There is an increasing need for processing and understanding relevant information generated by the systematic collection of public health data over time. However, the analysis of those time series usually requires advanced modeling techniques, which are not necessarily mastered by staff, technicians and researchers working on public health and epidemiology. Here a user-friendly tool, EPIPOI, is presented that facilitates the exploration and extraction of parameters describing trends, seasonality and anomalies that characterize epidemiological processes. It also enables the inspection of those parameters across geographic regions. Although the visual exploration and extraction of relevant parameters from time series data is crucial in epidemiological research, until now it had been largely restricted to specialists. Methods EPIPOI is freely available software developed in Matlab (The Mathworks Inc) that runs both on PC and Mac computers. Its friendly interface guides users intuitively through useful comparative analyses including the comparison of spatial patterns in temporal parameters. Results EPIPOI is able to handle complex analyses in an accessible way. A prototype has already been used to assist researchers in a variety of contexts from didactic use in public health workshops to the main analytical tool in published research. Conclusions EPIPOI can assist public health officials and students to explore time series data using a broad range of sophisticated analytical and visualization tools. It also provides an analytical environment where even advanced users can benefit by enabling a higher degree of control over model assumptions, such as those associated with detecting disease outbreaks and pandemics.

  7. Maritime Analytics Prototype: Final Development Report

    Science.gov (United States)

    2014-04-01

    and data were assembled in a PostgreSQL database to support the demonstrations. An installation guide and demonstration instructions are also... MVAP in a PostgreSQL database. The following widgets were developed: • Vessel Summary Card: delivers a unique visual representation of each ship

  8. Observation Tools for Professional Development

    Science.gov (United States)

    Malu, Kathleen F.

    2015-01-01

    Professional development of teachers, including English language teachers, empowers them to change in ways that improve teaching and learning (Gall and Acheson 2011; Murray 2010). In their seminal research on staff development--professional development in today's terms--Joyce and Showers (2002) identify key factors that promote teacher change.…

  9. Analytics for smart energy management tools and applications for sustainable manufacturing

    CERN Document Server

    Oh, Seog-Chan

    2016-01-01

    This book introduces the issues and problems that arise when implementing smart energy management for sustainable manufacturing in the automotive manufacturing industry and the analytical tools and applications to deal with them. It uses a number of illustrative examples to explain energy management in automotive manufacturing, which involves most types of manufacturing technology and various levels of energy consumption. It demonstrates how analytical tools can help improve energy management processes, including forecasting, consumption, and performance analysis, emerging new technology identification as well as investment decisions for establishing smart energy consumption practices. It also details practical energy management systems, making it a valuable resource for professionals involved in real energy management processes, and allowing readers to implement the procedures and applications presented.

  10. Analytical and Empirical Modeling of Wear and Forces of CBN Tool in Hard Turning - A Review

    Science.gov (United States)

    Patel, Vallabh Dahyabhai; Gandhi, Anishkumar Hasmukhlal

    2017-08-01

    Machining of steel with hardness above 45 HRC (Rockwell C hardness) is referred to as hard turning. There are numerous models that should be scrutinized and implemented to gain optimum performance in hard turning. Various models of hard turning with cubic boron nitride tools have been reviewed, in an attempt to identify appropriate empirical and analytical models. Validation of the steady-state flank and crater wear model, Usui's wear model, forces from oblique cutting theory, the extended Lee and Shaffer force model, chip formation, and progressive flank wear are covered in this review paper. Effort has been made to understand the relationship between tool wear and tool force under different cutting conditions and tool geometries, so that the appropriate model can be selected according to user requirements in hard turning.
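Usui's characteristic wear-rate equation, one of the models reviewed above, has the form dW/dt = A·σt·Vs·exp(-B/T). A minimal sketch follows; the constants A and B are material-dependent, and the values used here (like the stress, speed, and temperature inputs) are purely illustrative, not fitted data:

```python
import math

def usui_wear_rate(sigma_t, v_s, temp_k, a=1.0e-8, b=2.5e3):
    """Usui's wear-rate form dW/dt = A * sigma_t * Vs * exp(-B/T).

    sigma_t: normal stress on the tool face, v_s: sliding velocity,
    temp_k: tool-chip interface temperature in kelvin. The constants
    a and b are material-dependent; these values are illustrative only.
    """
    return a * sigma_t * v_s * math.exp(-b / temp_k)

# Wear rate rises steeply with interface temperature, which is why higher
# cutting speeds in hard turning accelerate crater wear.
cool = usui_wear_rate(1200, 150, temp_k=900)
hot = usui_wear_rate(1200, 150, temp_k=1100)
print(hot / cool)
```

The exponential temperature dependence is the key property: a 200 K rise at the interface multiplies the predicted wear rate substantially even with stress and speed held fixed.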

  11. Visual Analytics Tools for Sustainable Lifecycle Design: Current Status, Challenges, and Future Opportunities.

    Science.gov (United States)

    Ramanujan, Devarajan; Bernstein, William Z; Chandrasegaran, Senthil K; Ramani, Karthik

    2017-01-01

    The rapid rise in technologies for data collection has created an unmatched opportunity to advance the use of data-rich tools for lifecycle decision-making. However, the usefulness of these technologies is limited by the ability to translate lifecycle data into actionable insights for human decision-makers. This is especially true in the case of sustainable lifecycle design (SLD), as the assessment of environmental impacts, and the feasibility of making corresponding design changes, often relies on human expertise and intuition. Supporting human sense-making in SLD requires the use of both data-driven and user-driven methods while exploring lifecycle data. A promising approach for combining the two is through the use of visual analytics (VA) tools. Such tools can leverage the ability of computer-based tools to gather, process, and summarize data along with the ability of human experts to guide analyses through domain knowledge or data-driven insight. In this paper, we review previous research that has created VA tools in SLD. We also highlight existing challenges and future opportunities for such tools in different lifecycle stages (design, manufacturing, distribution and supply chain, use phase, and end-of-life), as well as in life cycle assessment. Our review shows that while the number of VA tools in SLD is relatively small, researchers are increasingly focusing on the subject matter. Our review also suggests that VA tools can address existing challenges in SLD and that significant future opportunities exist.

  12. Impact of population and economic growth on carbon emissions in Taiwan using an analytic tool STIRPAT

    Directory of Open Access Journals (Sweden)

    Jong-Chao Yeh

    2017-01-01

    Full Text Available Carbon emissions have increasingly become an issue of global concern because of climate change. Unfortunately, Taiwan was listed among the top 20 countries for carbon emissions in 2014. In order to provide appropriate measures to control carbon emissions, there is an urgent need to address how factors such as population and economic growth impact the emission of carbon dioxide in a developing country. In addition to total population, both the percentage of the population living in urban areas (i.e., the urbanization percentage) and the non-dependent population may serve as limiting factors. On the other hand, total energy-driven gross domestic product (GDP) and the percentage of GDP generated by the manufacturing industries are assessed to see their respective degrees of impact on carbon emissions. Based on national data for the period 1990-2014 in Taiwan, the analytic tool Stochastic Impacts by Regression on Population, Affluence and Technology (STIRPAT) was employed to see how well the aforementioned factors describe their individual potential impacts on global warming, measured by the total amount of carbon emitted into the atmosphere. Seven scenarios of the STIRPAT model were proposed and tested statistically for significance. As a result, two models were suggested to predict carbon emissions due to population and economic growth by the year 2025 in Taiwan.
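STIRPAT models impact as I = a·P^b·A^c·T^d, which becomes a linear regression after taking logarithms: ln I = ln a + b·ln P + c·ln A + d·ln T. The sketch below fits this form by ordinary least squares on synthetic data with known elasticities (the series is illustrative, not Taiwan's actual data, and the variable definitions mirror the abstract's factors only loosely):

```python
import numpy as np

# Synthetic national series with known elasticities b=1.2, c=0.7, d=0.3.
rng = np.random.default_rng(3)
n = 25
P = 20 + rng.uniform(0, 3, n)    # population, millions
A = 10 + rng.uniform(0, 15, n)   # affluence: GDP per capita, thousand USD
T = 28 + rng.uniform(0, 4, n)    # technology: manufacturing share of GDP, %
I = 2.0 * P**1.2 * A**0.7 * T**0.3 * np.exp(rng.normal(0, 0.01, n))

# STIRPAT in log-log form: ln I = ln a + b ln P + c ln A + d ln T.
X = np.column_stack([np.ones(n), np.log(P), np.log(A), np.log(T)])
coef, *_ = np.linalg.lstsq(X, np.log(I), rcond=None)
b, c, d = coef[1:]  # recovered ecological elasticities
print(np.round([b, c, d], 2))
```

The fitted elasticities tell a decision-maker how a one-percent change in each driver translates into a percentage change in emissions, which is what the scenario comparisons in the study rest on.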

  13. The 'secureplan' bomb utility: A PC-based analytic tool for bomb defense

    International Nuclear Information System (INIS)

    Massa, R.J.

    1987-01-01

    This paper illustrates a recently developed, PC-based software system for simulating the effects of an infinite variety of hypothetical bomb blasts on structures and personnel in the immediate vicinity of such blasts. The system incorporates two basic rectangular geometries in which blast assessments can be made - an external configuration (highly vented) and an internal configuration (vented and unvented). A variety of explosives can be used - each is translated to an equivalent TNT weight. Provisions in the program account for bomb cases (person, satchel, case and vehicle), mixes of explosives and shrapnel aggregates and detonation altitudes. The software permits architects, engineers, security personnel and facility managers, without specific knowledge of explosives, to incorporate realistic construction hardening, screening programs, barriers and stand-off provisions in the design and/or operation of diverse facilities. System outputs - generally represented as peak incident or reflected overpressure or impulses - are both graphic and analytic and integrate damage threshold data for common construction materials including window glazing. The effects of bomb blasts on humans are estimated in terms of temporary and permanent hearing damage, lung damage (lethality) and whole body translation injury. The software system has been used in the field in providing bomb defense services to a number of commercial clients since July of 1986. In addition to the design of siting, screening and hardening components of bomb defense programs, the software has proven very useful in post-incident analysis and repair scenarios and as a teaching tool for bomb defense training
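The TNT-equivalence step described above feeds the standard Hopkinson-Cranz cube-root scaling used throughout blast assessment: overpressure curves are tabulated against the scaled distance Z = R / W^(1/3). A minimal sketch follows (this is the generic textbook scaling, not the 'Secureplan' utility's internal code; the charge mass and relative-effectiveness factor are illustrative):

```python
def tnt_equivalent_kg(mass_kg, relative_effectiveness):
    """Convert an explosive mass to its TNT-equivalent weight, the first
    step before any blast-effects lookup."""
    return mass_kg * relative_effectiveness

def scaled_distance(range_m, tnt_kg):
    """Hopkinson-Cranz scaled distance Z = R / W^(1/3) in m/kg^(1/3),
    the standard argument to incident-overpressure curves."""
    return range_m / tnt_kg ** (1 / 3)

# Illustrative satchel charge: 10 kg of an explosive with RE factor 1.3.
w = tnt_equivalent_kg(10, 1.3)  # 13 kg TNT equivalent
z_near = scaled_distance(5, w)
z_far = scaled_distance(20, w)
print(round(z_near, 2), round(z_far, 2))
```

Because Z scales linearly with stand-off range for a fixed charge, quadrupling the distance quadruples the scaled distance, which is why stand-off provisions are such an effective mitigation in the program's design studies.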

  14. Experiments with Analytic Centers: A confluence of data, tools and help in using them.

    Science.gov (United States)

    Little, M. M.; Crichton, D. J.; Hines, K.; Cole, M.; Quam, B. M.

    2017-12-01

    Traditional repositories have been primarily focused on data stewardship. Over the past two decades, data scientists have attempted to overlay a superstructure to make these repositories more amenable to analysis tasks, with limited success. This poster will summarize lessons learned and some realizations regarding what it takes to create an analytic center. As the volume of Earth Science data grows and the sophistication of analytic tools improves, a pattern has emerged: different science communities uniquely apply a selection of tools to the data to produce scientific results. Infrequently do the experiences of one group help steer other groups. How can the information technology community seed these domains with tools that conform to the thought processes and experiences of a particular science group? What types of successful technology infusions have occurred, and how does technology get adopted? AIST has been experimenting with the management of this analytic center process; this paper will summarize the results and indicate a direction for future infusion attempts.

  15. Awareness Development Across Perspectives Tool (ADAPT)

    Science.gov (United States)

    2010-10-01

    individualist and collectivist cultures are described and linked in the generic knowledge base, and the specific cultural aspects and how they relate to... This paper by Dr. A.J. van Vliet (TNO Human Factors) discusses the development of this Awareness Development Across Perspectives Tool (ADAPT). The Approach: Our research and development

  16. Development of analytical techniques for safeguards environmental samples at JAEA

    International Nuclear Information System (INIS)

    Sakurai, Satoshi; Magara, Masaaki; Usuda, Shigekazu; Watanabe, Kazuo; Esaka, Fumitaka; Hirayama, Fumio; Lee, Chi-Gyu; Yasuda, Kenichiro; Inagawa, Jun; Suzuki, Daisuke; Iguchi, Kazunari; Kokubu, Yoko S.; Miyamoto, Yutaka; Ohzu, Akira

    2007-01-01

    JAEA has been developing, under the auspices of the Ministry of Education, Culture, Sports, Science and Technology of Japan, analytical techniques for ultra-trace amounts of nuclear materials in environmental samples in order to contribute to the strengthened safeguards system. Essential techniques for bulk and particle analysis, as well as screening, of environmental swipe samples have been established as ultra-trace analytical methods for uranium and plutonium. In January 2003, JAEA was qualified, including its quality control system, as a member of the IAEA network of analytical laboratories for environmental samples. Since 2004, JAEA has conducted the analysis of domestic and IAEA samples, through which JAEA's analytical capability has been verified and improved. In parallel, advanced techniques have been developed in order to expand applicability to samples of various elemental compositions and impurities and to improve analytical accuracy and efficiency. This paper traces the technical development of environmental sample analysis at JAEA and reviews recent trends in research and development in this field. (author)

  17. Benefits and limitations of using decision analytic tools to assess uncertainty and prioritize Landscape Conservation Cooperative information needs

    Science.gov (United States)

    Post van der Burg, Max; Cullinane Thomas, Catherine; Holcombe, Tracy R.; Nelson, Richard D.

    2016-01-01

    The Landscape Conservation Cooperatives (LCCs) are a network of partnerships throughout North America that are tasked with integrating science and management to support more effective delivery of conservation at a landscape scale. In order to achieve this integration, some LCCs have adopted the approach of providing their partners with better scientific information in an effort to facilitate more effective and coordinated conservation decisions. Taking this approach has led many LCCs to begin funding research to provide the information for improved decision making. To ensure that funding goes to research projects with the highest likelihood of leading to more integrated broad-scale conservation, some LCCs have also developed approaches for prioritizing which information needs will be of most benefit to their partnerships. We describe two case studies in which decision analytic tools were used to quantitatively assess the relative importance of information for decisions made by partners in the Plains and Prairie Potholes LCC. The results of the case studies point toward a few valuable lessons in terms of using these tools with LCCs. Decision analytic tools tend to help shift focus away from research-oriented discussions and toward discussions about how information is used in making better decisions. However, many technical experts do not have enough knowledge about decision-making contexts to fully inform the latter type of discussion. When assessed in the right decision context, however, decision analyses can point out where uncertainties actually affect optimal decisions and where they do not. This helps technical experts understand that not all research is valuable for improving decision making. But perhaps most importantly, our results suggest that decision analytic tools may be more useful for LCCs as a way of developing integrated objectives for coordinating partner decisions across the landscape, rather than simply ranking research priorities.
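
    One quantity behind such prioritizations is the expected value of perfect information (EVPI): how much better the optimal decision could get if an uncertainty were fully resolved before acting. The sketch below is not from the case studies; the actions, states, payoffs, and probabilities are all invented for illustration.

    ```python
    # EVPI = E_state[ max_action payoff ] - max_action E_state[ payoff ]
    # A positive EVPI means resolving the uncertainty could change the choice;
    # zero means further research has no decision value here.

    def evpi(payoffs, probs):
        """payoffs[action][state] -> utility; probs[state] -> probability."""
        # Best expected payoff acting now, under current uncertainty.
        best_under_uncertainty = max(
            sum(p * u for p, u in zip(probs, row)) for row in payoffs
        )
        # Expected payoff if the true state were known before acting.
        with_perfect_info = sum(
            p * max(row[s] for row in payoffs) for s, p in enumerate(probs)
        )
        return with_perfect_info - best_under_uncertainty

    # Two hypothetical management actions, two possible states of the system.
    payoffs = [[10.0, 2.0],   # action A
               [4.0, 8.0]]    # action B
    probs = [0.6, 0.4]
    print(evpi(payoffs, probs))  # 2.4
    ```

    With a single action, or with one action dominant in every state, the same function returns zero, which is exactly the "not all research is valuable" point made above.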

  18. Topological data analysis: A promising big data exploration tool in biology, analytical chemistry and physical chemistry.

    Science.gov (United States)

    Offroy, Marc; Duponchel, Ludovic

    2016-03-03

    An important feature of experimental science is that data of various kinds are being produced at an unprecedented rate. This is mainly due to the development of new instrumental concepts and experimental methodologies. It is also clear that the nature of acquired data is significantly different. Indeed, in every area of science, data take the form of ever bigger tables, where all but a few of the columns (i.e. variables) turn out to be irrelevant to the questions of interest, and where we do not necessarily know which coordinates are the interesting ones. Big data in our biology, analytical chemistry, or physical chemistry labs is a future that might be closer than any of us suppose. It is in this sense that new tools have to be developed in order to explore and valorize such data sets. Topological data analysis (TDA) is one of these. It was developed recently by topologists who discovered that topological concepts could be useful for data analysis. The main objective of this paper is to answer the question of why topology is well suited for the analysis of big data sets in many areas, and is even more efficient than conventional data analysis methods. Raman analysis of single bacteria provides a good opportunity to demonstrate the potential of TDA for the exploration of various spectroscopic data sets under different experimental conditions (with high noise levels, with/without spectral preprocessing, with wavelength shift, with different spectral resolutions, with missing data). Copyright © 2016 Elsevier B.V. All rights reserved.
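
    The simplest layer of TDA, 0-dimensional persistence (how long connected components survive as the distance scale grows), can be sketched with a plain union-find over sorted pairwise distances. This is a toy illustration of that one idea only, not the full TDA machinery (no higher-dimensional features, no Mapper):

    ```python
    import math
    from itertools import combinations

    def persistence0(points):
        """Death times of 0-dimensional features (connected components) as the
        distance scale grows: a tiny union-find version of single-linkage merging."""
        parent = list(range(len(points)))

        def find(i):
            while parent[i] != i:
                parent[i] = parent[parent[i]]  # path halving
                i = parent[i]
            return i

        edges = sorted(
            (math.dist(points[i], points[j]), i, j)
            for i, j in combinations(range(len(points)), 2)
        )
        deaths = []
        for d, i, j in edges:
            ri, rj = find(i), find(j)
            if ri != rj:
                parent[ri] = rj
                deaths.append(d)  # one component dies at this scale
        return deaths

    # Two well-separated pairs: two short-lived features inside the pairs,
    # one long-lived feature spanning the gap between them.
    pts = [(0, 0), (0, 1), (10, 0), (10, 1)]
    print(persistence0(pts))  # [1.0, 1.0, 10.0]
    ```

    Long-lived features (here the 10.0) are the ones TDA treats as signal; short-lived ones are noise, which is why the approach is robust to the high noise levels mentioned above.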

  19. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    Development and Validation of Analytical Method for Losartan-Copper Complex Using UV-Vis Spectrophotometry. ... Tropical Journal of Pharmaceutical Research ... Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with ...
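
    The abstract above is truncated, but the quantitation step in any UV-Vis method of this kind rests on a Beer-Lambert calibration line (absorbance proportional to concentration). A generic sketch, with all standards and readings invented rather than taken from the paper:

    ```python
    # Fit A = m*c + b to calibration standards by least squares, then invert
    # the line to read off the concentration of an unknown sample.

    def fit_line(x, y):
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        m = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
            sum((xi - mx) ** 2 for xi in x)
        return m, my - m * mx   # slope, intercept

    conc = [2.0, 4.0, 6.0, 8.0]          # standards, µg/mL (made up)
    absorb = [0.11, 0.21, 0.31, 0.41]    # measured absorbance (made up)
    m, b = fit_line(conc, absorb)

    unknown_abs = 0.26
    print((unknown_abs - b) / m)  # ≈ 5.0 µg/mL for this invented data
    ```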

  20. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    Energy Technology Data Exchange (ETDEWEB)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z. [and others

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working in the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  1. Development of analytical cell support for vitrification at the West Valley Demonstration Project. Topical report

    International Nuclear Information System (INIS)

    Barber, F.H.; Borek, T.T.; Christopher, J.Z.

    1997-12-01

    Analytical and Process Chemistry (A&PC) support is essential to the high-level waste vitrification campaign at the West Valley Demonstration Project (WVDP). A&PC characterizes the waste, providing information necessary to formulate the recipe for the target radioactive glass product. High-level waste (HLW) samples are prepared and analyzed in the analytical cells (ACs) and Sample Storage Cell (SSC) on the third floor of the main plant. The high levels of radioactivity in the samples require handling them in the shielded cells with remote manipulators. The analytical hot cells and third floor laboratories were refurbished to ensure optimal uninterrupted operation during the vitrification campaign. New and modified instrumentation, tools, sample preparation and analysis techniques, and equipment and training were required for A&PC to support vitrification. Analytical Cell Mockup Units (ACMUs) were designed to facilitate method development, scientist and technician training, and planning for analytical process flow. The ACMUs were fabricated and installed to simulate the analytical cell environment and dimensions. New techniques, equipment, and tools could be evaluated in the ACMUs without the consequences of generating or handling radioactive waste. Tools were fabricated, handling and disposal of wastes was addressed, and spatial arrangements for equipment were refined. As a result of the work at the ACMUs, the remote preparation and analysis methods and the equipment and tools were ready for installation into the ACs and SSC in July 1995. Before use in the hot cells, all remote methods had been validated and four to eight technicians were trained on each. Fine-tuning of the procedures has been ongoing at the ACs based on input from A&PC technicians. Working in the ACs presents greater challenges than development at the ACMUs did. The ACMU work and further refinements in the ACs have resulted in a reduction in analysis turnaround time (TAT).

  2. Earth Science Data Analytics: Bridging Tools and Techniques with the Co-Analysis of Large, Heterogeneous Datasets

    Science.gov (United States)

    Kempler, Steve; Mathews, Tiffany

    2016-01-01

    The continuum of ever-evolving data management systems affords great opportunities for the enhancement of knowledge and the facilitation of science research. To take advantage of these opportunities, it is essential to understand and develop methods that enable data relationships to be examined and the information to be manipulated. This presentation describes the efforts of the Earth Science Information Partners (ESIP) Federation Earth Science Data Analytics (ESDA) Cluster to understand, define, and facilitate the implementation of ESDA to advance science research. Given the scarcity of published material on Earth science data analytics, the cluster has defined ESDA along with 10 goals to set the framework for a common understanding of the tools and techniques that are available, and those still needed, to support ESDA.

  3. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: an anticipated analytical tool for food safety.

    Science.gov (United States)

    Hervás, Miriam; López, Miguel Angel; Escarpa, Alberto

    2009-10-27

    In this work, an electrochemical immunoassay involving magnetic beads has been developed to determine zearalenone in selected food samples. The immunoassay scheme is based on a direct competitive format in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide as substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 microg L(-1) and an EC(50) of 0.079 microg L(-1) were obtained, confirming the method's suitability for detecting the zearalenone mycotoxin. In addition, excellent accuracy, with high recovery yields ranging between 95 and 108%, has been obtained. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
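
    The abstract reports an LOD without stating how it was derived. One common convention (not necessarily the authors' procedure) is the 3-sigma rule: three standard deviations of replicate blank responses divided by the slope of the response near zero concentration. A sketch with made-up numbers:

    ```python
    import statistics

    # 3-sigma LOD convention: LOD = 3 * SD(blank) / |slope|.
    # For a competitive immunoassay the signal falls as analyte rises,
    # hence the negative slope; the absolute value is what matters.

    def lod_3sigma(blank_signals, slope):
        return 3 * statistics.stdev(blank_signals) / abs(slope)

    blanks = [0.502, 0.498, 0.505, 0.495, 0.500]   # replicate blank responses (invented)
    slope = -1.0   # response change per µg/L near zero concentration (invented)
    print(lod_3sigma(blanks, slope))  # ≈ 0.0114 µg/L for these invented numbers
    ```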

  4. Adsorptive micro-extraction techniques--novel analytical tools for trace levels of polar solutes in aqueous media.

    Science.gov (United States)

    Neng, N R; Silva, A R M; Nogueira, J M F

    2010-11-19

    A novel enrichment technique, adsorptive μ-extraction (AμE), is proposed for trace analysis of polar solutes in aqueous media. The preparation, stability tests and development of the analytical devices using two geometrical configurations, i.e. bar adsorptive μ-extraction (BAμE) and multi-spheres adsorptive μ-extraction (MSAμE), are fully discussed. Of the several sorbent materials tested, activated carbons and polystyrene-divinylbenzene phases demonstrated the best stability and robustness and proved the most suitable for analytical purposes. The application of both BAμE and MSAμE devices showed remarkable performance for the determination of trace levels of polar solutes and metabolites (e.g. pesticides, disinfection by-products, drugs of abuse and pharmaceuticals) in water matrices and biological fluids. Compared with stir bar sorptive extraction based on a polydimethylsiloxane phase, AμE techniques attain great effectiveness, overcoming the limitations of the latter enrichment approach for the more polar solutes. Furthermore, convenient sensitivity and selectivity are reached through AμE techniques, since the great advantage of this new analytical technology is the possibility of choosing the most suitable sorbent for each particular type of application. The proposed enrichment techniques are cost-effective, easy to prepare and work up, and robust, proving to be a remarkable analytical tool for trace analysis of priority solutes in areas of recognized importance such as the environment, forensics and other related life sciences. Copyright © 2010 Elsevier B.V. All rights reserved.

  5. Quality Assurance Project Plan Development Tool

    Science.gov (United States)

    This tool contains information designed to assist in developing a Quality Assurance (QA) Project Plan that meets EPA requirements for projects that involve surface or groundwater monitoring and/or the collection and analysis of water samples.

  6. Developing a Parametric Urban Design Tool

    DEFF Research Database (Denmark)

    Steinø, Nicolai; Obeling, Esben

    2014-01-01

    Parametric urban design is a potentially powerful tool for collaborative urban design processes. Rather than making one-off designs which need to be redesigned from the ground up in case of changes, parametric design tools make it possible to keep the design open while at the same time allowing for a level of detailing which is high enough to facilitate an understanding of the generic qualities of proposed designs. Starting from a brief overview of parametric design, this paper presents initial findings from the development of a parametric urban design tool with regard to developing a structural logic which is flexible and expandable. It then moves on to outline and discuss further development work. Finally, it offers a brief reflection on the potentials and shortcomings of the software – CityEngine – which is used for developing the parametric urban design tool.

  7. Analytical protein A chromatography as a quantitative tool for the screening of methionine oxidation in monoclonal antibodies.

    Science.gov (United States)

    Loew, Caroline; Knoblich, Constanze; Fichtl, Jürgen; Alt, Nadja; Diepold, Katharina; Bulau, Patrick; Goldbach, Pierre; Adler, Michael; Mahler, Hanns-Christian; Grauschopf, Ulla

    2012-11-01

    The presence of oxidized methionine residues in therapeutic monoclonal antibodies can potentially impact drug efficacy, safety, as well as antibody half-life in vivo. Therefore, methionine oxidation of antibodies is a strong focus during pharmaceutical development and a well-known degradation pathway. The monitoring of methionine oxidation is currently routinely performed by peptide mapping/liquid chromatography-mass spectrometry techniques, which are laborious and time consuming. We have established analytical protein A chromatography as a method of choice for fast and quantitative screening of total Fc methionine oxidation during formulation and process development. The principle of this method relies on the lower binding affinity of protein A for immunoglobulin G-Fc domains containing oxidized methionines, compared with nonoxidized Fc domains. Our data reveal that highly conserved Fc methionines situated close to the binding site to protein A can serve as marker for the oxidation of other surface-exposed methionine residues. In case of poor separation of oxidized species by protein A chromatography, analytical protein G chromatography is proposed as alternative. We demonstrate that analytical protein A chromatography, and alternatively protein G chromatography, is a valuable tool for the screening of methionine oxidation in therapeutic antibodies during formulation and process development. Copyright © 2012 Wiley Periodicals, Inc.

  8. Data Mining and Optimization Tools for Developing Engine Parameters Tools

    Science.gov (United States)

    Dhawan, Atam P.

    1998-01-01

    This project was awarded for understanding the problem and developing a plan for data mining tools for use in designing and implementing an engine condition monitoring system. Tricia Erhardt and I studied the problem domain for developing an engine condition monitoring system using the sparse and non-standardized datasets to be available through a consortium at NASA Lewis Research Center. We visited NASA three times to discuss additional issues related to the dataset, which was not made available to us. We discussed and developed a general framework of data mining and optimization tools to extract useful information from sparse and non-standard datasets. These discussions led to the training of Tricia Erhardt to develop genetic algorithm (GA) based search programs, which were written in C++ and used to demonstrate the capability of GA-based search to find an optimal solution in noisy datasets. From the study and discussions with NASA LeRC personnel, we then prepared a proposal, which is being submitted to NASA, for future work on the development of data mining algorithms for engine condition monitoring. The proposed set of algorithms uses wavelet processing to create a multi-resolution pyramid of the data for GA-based multi-resolution optimal search.
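
    The genetic-algorithm search described above can be sketched in a few lines: evolve a population of candidate solutions by selection, crossover, and mutation. Everything here (bit-string encoding, population size, mutation rate, and the toy objective) is illustrative and not taken from the project's C++ programs.

    ```python
    import random

    def ga_search(fitness, n_bits=16, pop_size=30, generations=60, seed=1):
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
        for _ in range(generations):
            scored = sorted(pop, key=fitness, reverse=True)
            parents = scored[: pop_size // 2]            # truncation selection (elitist)
            children = []
            while len(children) < pop_size - len(parents):
                a, b = rng.sample(parents, 2)
                cut = rng.randrange(1, n_bits)           # one-point crossover
                child = a[:cut] + b[cut:]
                for i in range(n_bits):                  # bit-flip mutation
                    if rng.random() < 0.02:
                        child[i] ^= 1
                children.append(child)
            pop = parents + children
        return max(pop, key=fitness)

    # Toy objective: maximize the number of 1-bits ("onemax").
    best = ga_search(sum)
    print(sum(best))   # close to the optimum of 16
    ```

    Because selection only compares fitness values, the same loop tolerates a noisy objective, which is the property the project exercised on its datasets.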

  9. ANAlyte: A modular image analysis tool for ANA testing with indirect immunofluorescence.

    Science.gov (United States)

    Di Cataldo, Santa; Tonti, Simone; Bottino, Andrea; Ficarra, Elisa

    2016-05-01

    The automated analysis of indirect immunofluorescence (IIF) images for Anti-Nuclear Autoantibody (ANA) testing is a fairly recent field that is receiving ever-growing interest from the research community. ANA testing leverages the categorization of the intensity level and fluorescent pattern of IIF images of HEp-2 cells to perform a differential diagnosis of important autoimmune diseases. Nevertheless, it suffers from a tremendous lack of repeatability due to subjectivity in the visual interpretation of the images. Automating the analysis is seen as the only valid solution to this problem. Several works in the literature address individual steps of the work-flow; nonetheless, integrating such steps and assessing their effectiveness as a whole is still an open challenge. We present a modular tool, ANAlyte, able to characterize an IIF image in terms of fluorescent intensity level and fluorescent pattern without any user interaction. For this purpose, ANAlyte integrates the following: (i) an Intensity Classifier module, which categorizes the intensity level of the input slide based on multi-scale contrast assessment; (ii) a Cell Segmenter module, which splits the input slide into individual HEp-2 cells; (iii) a Pattern Classifier module, which determines the fluorescent pattern of the slide based on the patterns of the individual cells. To demonstrate the accuracy and robustness of our tool, we experimentally validated ANAlyte on two different public benchmarks of IIF HEp-2 images with a rigorous leave-one-out cross-validation strategy. We obtained overall accuracies for fluorescent intensity and pattern classification of around 85% and above 90%, respectively. We assessed all results by comparison with some of the most representative state-of-the-art works. Unlike most other works in the recent literature, ANAlyte aims at the automatization of all the major steps of ANA image analysis. Results on public benchmarks demonstrate that the tool can characterize HEp-2 slides in terms of
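
    The three-module composition described above can be sketched structurally. The stage functions below are stand-ins with hypothetical names and deliberately trivial logic (thresholds and majority votes instead of multi-scale contrast and real segmentation); only the way the modules chain together mirrors the tool.

    ```python
    # Structural sketch of a modular ANA pipeline: intensity gate first,
    # then per-cell segmentation, then slide-level pattern by majority vote.

    def classify_intensity(slide):
        # stand-in: threshold on mean pixel value instead of multi-scale contrast
        mean = sum(slide["pixels"]) / len(slide["pixels"])
        return "positive" if mean > 0.5 else "negative"

    def segment_cells(slide):
        # stand-in: treat each bright pixel as one "cell"
        return [p for p in slide["pixels"] if p > 0.5]

    def classify_pattern(cells):
        # stand-in: majority vote over per-cell labels
        labels = ["homogeneous" if c > 0.8 else "speckled" for c in cells]
        return max(set(labels), key=labels.count)

    def analyze(slide):
        intensity = classify_intensity(slide)
        if intensity == "negative":          # negative slides get no pattern call
            return {"intensity": intensity, "pattern": None}
        cells = segment_cells(slide)
        return {"intensity": intensity, "pattern": classify_pattern(cells)}

    print(analyze({"pixels": [0.9, 0.85, 0.7, 0.95, 0.6]}))
    ```

    The point of the modular split is that each stage can be validated and swapped independently, which is how the individual steps from the prior literature can be integrated.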

  10. Recent developments in analytical toxicology : for better or for worse

    NARCIS (Netherlands)

    de Zeeuw, RA

    1998-01-01

    When considering the state of the art in toxicology from an analytical perspective, the key developments relate to three major areas. (1) Forensic horizon: Today forensic analysis has broadened its scope dramatically, to include workplace toxicology, drug abuse testing, drugs and driving, doping,

  11. Predictive analytics tools to adjust and monitor performance metrics for the ATLAS Production System

    CERN Document Server

    Barreiro Megino, Fernando Harald; The ATLAS collaboration

    2017-01-01

    Having information such as an estimate of the processing time or of the possibility of a system outage (abnormal behaviour) helps operators monitor system performance and predict its next state. In the current cyber-infrastructure, contention for resources among high-priority data analyses happens routinely and might lead to significant workload and data handling interruptions. The inability to monitor and predict the behaviour of the analysis process (its duration) and of the system's state itself motivated the design of built-in situational awareness analytic tools.
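
    One minimal "situational awareness" primitive of the kind alluded to is flagging a task duration that sits far outside its recent rolling window. The ATLAS tools are far richer than this; the sketch below (window size, cutoff, and data all invented) only illustrates the idea.

    ```python
    import statistics
    from collections import deque

    def flag_anomalies(durations, window=5, z_cut=3.0):
        """Flag each duration whose z-score against the previous `window`
        observations exceeds `z_cut` (abnormal-behaviour candidate)."""
        recent = deque(maxlen=window)
        flags = []
        for d in durations:
            if len(recent) == window:
                mu = statistics.mean(recent)
                sd = statistics.stdev(recent) or 1e-9   # guard a flat window
                flags.append(abs(d - mu) / sd > z_cut)
            else:
                flags.append(False)   # not enough history yet
            recent.append(d)
        return flags

    times = [10, 11, 9, 10, 10, 11, 60, 10]   # one outage-like spike
    print(flag_anomalies(times))  # only the 60 is flagged
    ```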

  12. ANALYTICAL MODEL FOR LATHE TOOL DISPLACEMENTS CALCULUS IN THE MANUFACTURING PROCESS

    Directory of Open Access Journals (Sweden)

    Catălin ROŞU

    2014-01-01

    Full Text Available In this paper, we present an analytical model for the calculus of lathe tool displacements in the manufacturing process. We present step by step the methodology for the displacement calculus, and in the end we insert these relations into a program for automatic calculus and draw the conclusions. Only the effects of the bending moments are taken into account (because these produce the largest displacements). The simplifying assumptions and the calculus relations for the displacements (linear and angular ones) are presented in an original way.
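
    The paper derives its own displacement relations, which are not reproduced here. As a generic point of reference, the textbook formulas for the bending-induced linear and angular displacements at the tip of an end-loaded cantilever are delta = F*L^3/(3*E*I) and theta = F*L^2/(2*E*I). A sketch with rough, invented lathe-tool-like numbers:

    ```python
    # Textbook cantilever tip displacement under an end load F:
    #   delta = F*L**3 / (3*E*I)   (linear deflection)
    #   theta = F*L**2 / (2*E*I)   (angular displacement, i.e. slope)

    def cantilever_tip(F, L, E, I):
        delta = F * L**3 / (3 * E * I)    # m
        theta = F * L**2 / (2 * E * I)    # rad
        return delta, theta

    # Illustrative numbers: 1 kN cutting force, 50 mm overhang,
    # steel shank with a 20 mm x 20 mm rectangular section.
    E = 210e9                       # Pa, steel
    I = 0.02 * 0.02**3 / 12         # m^4, rectangular section b*h^3/12
    delta, theta = cantilever_tip(1000, 0.05, E, I)
    print(delta, theta)   # on the order of 15 µm and 0.45 mrad
    ```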

  13. The Blackbird Whistling or Just After? Vygotsky's Tool and Sign as an Analytic for Writing

    Science.gov (United States)

    Imbrenda, Jon-Philip

    2016-01-01

    Based on Vygotsky's theory of the interplay of the tool and sign functions of language, this study presents a textual analysis of a corpus of student-authored texts to illuminate aspects of development evidenced through the dialectical tension of tool and sign. Data were drawn from a series of reflective memos I authored during a seminar for new…

  14. Remote tool development for nuclear dismantling operations

    International Nuclear Information System (INIS)

    Craig, G.; Ferlay, J.C.; Ieracitano, F.

    2003-01-01

    Remote tool systems to undertake nuclear dismantling operations require careful design and development, not only to perform their given duty but to perform it safely within the constraints imposed by harsh environmental conditions. Framatome ANP NUCLEAR SERVICES has long developed and qualified equipment to undertake specific maintenance operations of nuclear reactors. The tool development methodology from this activity has since been adapted to resolve some very challenging reactor dismantling operations, which are demonstrated in this paper. Each nuclear decommissioning project is a unique case, and technical characterisation data are generally incomplete. The development of the dismantling methodology and associated equipment is by and large an iterative process combining design and simulation with feasibility and validation testing. The first stage of the development process involves feasibility testing of industrial tools and examining the adaptations necessary to control and deploy the tool remotely with respect to the chosen methodology and environmental constraints. This results in a prototype tool and deployment system to validate the basic process. The second stage involves detailed design, which integrates any remaining technical and environmental constraints. At the end of this stage, tools and deployment systems, operators and operating procedures are qualified on full-scale mock-ups. (authors)

  15. Second International Workshop on Teaching Analytics

    DEFF Research Database (Denmark)

    Vatrapu, Ravi; Reimann, Peter; Halb, Wolfgang

    2013-01-01

    Teaching Analytics is conceived as a subfield of learning analytics that focuses on the design, development, evaluation, and education of visual analytics methods and tools for teachers in primary, secondary, and tertiary educational settings. The Second International Workshop on Teaching Analytics (IWTA) 2013 seeks to bring together researchers and practitioners in the fields of education, learning sciences, learning analytics, and visual analytics to investigate the design, development, use, evaluation, and impact of visual analytical methods and tools for teachers' dynamic diagnostic decision

  16. Computational Tools to Accelerate Commercial Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David C

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  17. Process analytical tools for monitoring, understanding, and control of pharmaceutical fluidized bed granulation: A review.

    Science.gov (United States)

    Burggraeve, Anneleen; Monteyne, Tinne; Vervaet, Chris; Remon, Jean Paul; De Beer, Thomas

    2013-01-01

    Fluidized bed granulation is a widely applied wet granulation technique in the pharmaceutical industry to produce solid dosage forms. The process involves the spraying of a binder liquid onto fluidizing powder particles. As a result, the (wetted) particles collide with each other and form larger permanent aggregates (granules). After spraying the required amount of granulation liquid, the wet granules are rapidly dried in the fluid bed granulator. Since the FDA launched its Process Analytical Technology initiative (and even before), a wide range of analytical process sensors has been used for real-time monitoring and control of fluid bed granulation processes. By applying various data analysis techniques to the multitude of data collected from the process analyzers implemented in fluid bed granulators, a deeper understanding of the process has been achieved. This review gives an overview of the process analytical technologies used during fluid bed granulation to monitor and control the process. The fundamentals of the mechanisms contributing to wet granule growth and the characteristics of fluid bed granulation processing are briefly discussed. This is followed by a detailed overview of the in-line applied process analyzers, contributing to improved fluid bed granulation understanding, modeling, control, and endpoint detection. Analysis and modeling tools enabling the extraction of the relevant information from the complex data collected during granulation and the control of the process are highlighted. Copyright © 2012 Elsevier B.V. All rights reserved.

  18. XCluSim: a visual analytics tool for interactively comparing multiple clustering results of bioinformatics data

    Science.gov (United States)

    2015-01-01

    Background Though cluster analysis has become a routine analytic task for bioinformatics research, it is still arduous for researchers to assess the quality of a clustering result. To select the best clustering method and its parameters for a dataset, researchers have to run multiple clustering algorithms and compare them. However, such a comparison task with multiple clustering results is cognitively demanding and laborious. Results In this paper, we present XCluSim, a visual analytics tool that enables users to interactively compare multiple clustering results based on the Visual Information Seeking Mantra. We build a taxonomy for categorizing existing techniques of clustering results visualization in terms of the Gestalt principles of grouping. Using the taxonomy, we choose the most appropriate interactive visualizations for presenting individual clustering results from different types of clustering algorithms. The efficacy of XCluSim is shown through case studies with a bioinformatician. Conclusions Compared to other relevant tools, XCluSim enables users to compare multiple clustering results in a more scalable manner. Moreover, XCluSim supports diverse clustering algorithms and dedicated visualizations and interactions for different types of clustering results, allowing more effective exploration of details on demand. Through case studies with a bioinformatics researcher, we received positive feedback on the functionalities of XCluSim, including its ability to help identify stably clustered items across multiple clustering results. PMID:26328893
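
    One quantitative backbone for comparing two clustering results is the Rand index: the fraction of item pairs on which the two labelings agree (grouped together in both, or separated in both). XCluSim's visual comparisons go well beyond this, but pairwise agreement is the usual numeric starting point; the labelings below are invented.

    ```python
    from itertools import combinations

    def rand_index(labels_a, labels_b):
        """Fraction of item pairs treated the same way by both clusterings."""
        agree = total = 0
        for i, j in combinations(range(len(labels_a)), 2):
            same_a = labels_a[i] == labels_a[j]
            same_b = labels_b[i] == labels_b[j]
            agree += same_a == same_b
            total += 1
        return agree / total

    a = [0, 0, 1, 1, 2, 2]        # result from algorithm 1
    b = [0, 0, 1, 2, 2, 2]        # result from algorithm 2
    print(rand_index(a, b))  # 0.8
    ```

    Items that end up in agreeing pairs across many runs are the "stably clustered" items the case-study feedback mentions.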

  19. The role of big data and advanced analytics in drug discovery, development, and commercialization.

    Science.gov (United States)

    Szlezák, N; Evers, M; Wang, J; Pérez, L

    2014-05-01

    In recent years, few ideas have captured the imagination of health-care practitioners as much as the advent of "big data" and the advanced analytical methods and technologies used to interpret it; it is a trend seen as having the potential to revolutionize biology, medicine, and health care.(1,2,3) As new types of data and tools become available, a unique opportunity is emerging for smarter and more effective discovery, development, and commercialization of innovative biopharmaceutical drugs.

  20. Development of a transportation planning tool

    International Nuclear Information System (INIS)

    Funkhouser, B.R.; Moyer, J.W.; Ballweg, E.L.

    1994-01-01

    This paper describes the application of simulation modeling and logistics techniques to the development of a planning tool for the Department of Energy (DOE). The focus of the Transportation Planning Model (TPM) tool is to aid DOE and Sandia analysts in the planning of future fleet sizes, driver and support personnel sizes, base site locations, and resource balancing among the base sites. The design approach is to develop a rapid modeling environment which will allow analysts to easily set up a shipment scenario and perform multiple "what if" evaluations. The TPM is being developed on personal computers using commercial off-the-shelf (COTS) software tools under the Windows® operating environment. Prototype development of the TPM has been completed.
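
    A "what if" fleet-sizing evaluation of the kind such a tool supports can be reduced to a back-of-the-envelope calculation. The sizing rule and every number below are hypothetical, meant only to show the shape of the scenario sweep, not the TPM's actual logistics model.

    ```python
    import math

    def fleet_size(shipments_per_week, trip_days, capacity_per_trip,
                   drivers_per_truck=2, utilization=0.8):
        """Minimum trucks and drivers for a weekly shipment load (toy rule)."""
        trips_needed = math.ceil(shipments_per_week / capacity_per_trip)
        # each truck can start 7/trip_days trips a week at the given utilization
        trips_per_truck = (7 / trip_days) * utilization
        trucks = math.ceil(trips_needed / trips_per_truck)
        return trucks, trucks * drivers_per_truck

    # The "what if" sweep: how do fleet needs scale with shipment volume?
    for shipments in (20, 40, 80):
        trucks, drivers = fleet_size(shipments, trip_days=3.5, capacity_per_trip=1)
        print(shipments, trucks, drivers)
    ```

    Wrapping such a rule in a loop over scenarios is exactly the rapid set-up-and-evaluate workflow the abstract describes; a real model would replace the rule with the simulation.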

  1. International Collaboration Tools for Industrial Development

    CSIR Research Space (South Africa)

    Dan, Nagy

    2017-10-01

    Full Text Available This presentation discusses countries that are ready for Industry 4.0, international collaboration tools, and industrial development, by Dan Nagy at the 6th CSIR Conference: Ideas that work for industrial development, 5-6 October 2017, CSIR...

  2. Information technology tools for curriculum development

    NARCIS (Netherlands)

    McKenney, Susan; Nieveen, N.M.; Strijker, A.; Voogt, Joke; Knezek, Gerald

    2008-01-01

    The widespread introduction and use of computers in the workplace began in the early 1990s. Since then, computer-based tools have been developed to support a myriad of task types, including the complex process of curriculum development. This chapter begins by briefly introducing two concepts that

  3. Visualization and analytics tools for infectious disease epidemiology: a systematic review.

    Science.gov (United States)

    Carroll, Lauren N; Au, Alan P; Detwiler, Landon Todd; Fu, Tsung-Chieh; Painter, Ian S; Abernethy, Neil F

    2014-10-01

    A myriad of new tools and algorithms have been developed to help public health professionals analyze and visualize the complex data used in infectious disease control. To better understand approaches to meet these users' information needs, we conducted a systematic literature review focused on the landscape of infectious disease visualization tools for public health professionals, with a special emphasis on geographic information systems (GIS), molecular epidemiology, and social network analysis. The objectives of this review are to: (1) identify public health user needs and preferences for infectious disease information visualization tools; (2) identify existing infectious disease information visualization tools and characterize their architecture and features; (3) identify commonalities among approaches applied to different data types; and (4) describe tool usability evaluation efforts and barriers to the adoption of such tools. We identified articles published in English from January 1, 1980 to June 30, 2013 from five bibliographic databases. Articles with a primary focus on infectious disease visualization tools, needs of public health users, or usability of information visualizations were included in the review. A total of 88 articles met our inclusion criteria. Users were found to have diverse needs, preferences and uses for infectious disease visualization tools, and the existing tools are correspondingly diverse. The architecture of the tools was inconsistently described, and few tools in the review discussed the incorporation of usability studies or plans for dissemination. Many studies identified concerns regarding data sharing, confidentiality and quality. Existing tools offer a range of features and functions that allow users to explore, analyze, and visualize their data, but the tools are often for siloed applications. Commonly cited barriers to widespread adoption included lack of organizational support, access issues, and misconceptions about tool

  4. Development of quality-by-design analytical methods.

    Science.gov (United States)

    Vogt, Frederick G; Kord, Alireza S

    2011-03-01

    Quality-by-design (QbD) is a systematic approach to drug development, which begins with predefined objectives, and uses science and risk management approaches to gain product and process understanding and ultimately process control. The concept of QbD can be extended to analytical methods. QbD mandates the definition of a goal for the method, and emphasizes thorough evaluation and scouting of alternative methods in a systematic way to obtain optimal method performance. Candidate methods are then carefully assessed in a structured manner for risks, and are challenged to determine if robustness and ruggedness criteria are satisfied. As a result of these studies, the method performance can be understood and improved if necessary, and a control strategy can be defined to manage risk and ensure the method performs as desired when validated and deployed. In this review, the current state of analytical QbD in the industry is detailed with examples of the application of analytical QbD principles to a range of analytical methods, including high-performance liquid chromatography, Karl Fischer titration for moisture content, vibrational spectroscopy for chemical identification, quantitative color measurement, and trace analysis for genotoxic impurities. Copyright © 2010 Wiley-Liss, Inc.

  5. Windows Developer Power Tools Turbocharge Windows development with more than 170 free and open source tools

    CERN Document Server

    Avery, James

    2007-01-01

    Software developers need to work harder and harder to bring value to their development process in order to build high quality applications and remain competitive. Developers can accomplish this by improving their productivity, quickly solving problems, and writing better code. A wealth of open source and free software tools are available for developers who want to improve the way they create, build, deploy, and use software. Tools, components, and frameworks exist to help developers at every point in the development process. Windows Developer Power Tools offers an encyclopedic guide to m

  6. Developments and retrospectives in Lie theory geometric and analytic methods

    CERN Document Server

    Penkov, Ivan; Wolf, Joseph

    2014-01-01

    This volume reviews and updates a prominent series of workshops in representation/Lie theory, and reflects the widespread influence of those workshops in such areas as harmonic analysis, representation theory, differential geometry, algebraic geometry, and mathematical physics. Many of the contributors have had leading roles in both the classical and modern developments of Lie theory and its applications. This work, entitled Developments and Retrospectives in Lie Theory, and comprising 26 articles, is organized in two volumes: Algebraic Methods and Geometric and Analytic Methods. This is the Geometric and Analytic Methods volume. The Lie Theory Workshop series, founded by Joe Wolf and Ivan Penkov and joined shortly thereafter by Geoff Mason, has been running for over two decades. Travel to the workshops has usually been supported by the NSF, and local universities have provided hospitality. The workshop talks have been seminal in describing new perspectives in the field covering broad areas of current re...

  7. Analytic solution to leading order coupled DGLAP evolution equations: A new perturbative QCD tool

    International Nuclear Information System (INIS)

    Block, Martin M.; Durand, Loyal; Ha, Phuoc; McKay, Douglas W.

    2011-01-01

    We have analytically solved the LO perturbative QCD singlet DGLAP equations [V. N. Gribov and L. N. Lipatov, Sov. J. Nucl. Phys. 15, 438 (1972)] [G. Altarelli and G. Parisi, Nucl. Phys. B126, 298 (1977)] [Y. L. Dokshitzer, Sov. Phys. JETP 46, 641 (1977)] using Laplace transform techniques. Newly developed, highly accurate numerical inverse Laplace transform algorithms [M. M. Block, Eur. Phys. J. C 65, 1 (2010)] [M. M. Block, Eur. Phys. J. C 68, 683 (2010)] allow us to write fully decoupled solutions for the singlet structure function F_s(x,Q^2) and the gluon distribution G(x,Q^2) as F_s(x,Q^2) = F_s(F_s0(x_0), G_0(x_0)) and G(x,Q^2) = G(F_s0(x_0), G_0(x_0)), where the x_0 are the Bjorken x values at Q_0^2. Here F_s and G are known functions, found using the LO DGLAP splitting functions, of the initial boundary conditions F_s0(x) ≡ F_s(x,Q_0^2) and G_0(x) ≡ G(x,Q_0^2), i.e., the chosen starting functions at the virtuality Q_0^2. For both G(x) and F_s(x), we are able to either devolve or evolve each separately and rapidly, with very high numerical accuracy: a computational fractional precision of O(10^-9). Armed with this powerful new tool in the perturbative QCD arsenal, we compare our numerical results from the above equations with the published MSTW2008 and CTEQ6L LO gluon and singlet F_s distributions [A. D. Martin, W. J. Stirling, R. S. Thorne, and G. Watt, Eur. Phys. J. C 63, 189 (2009)], starting from their initial values at Q_0^2 = 1 GeV^2 and 1.69 GeV^2, respectively, using their choice of α_s(Q^2). This allows an important independent check on the accuracies of their evolution codes and, therefore, the computational accuracies of their published parton distributions. Our method completely decouples the two LO distributions, at the same time guaranteeing that both G and F_s satisfy the singlet coupled DGLAP equations. It also allows one to easily obtain the effects of the starting functions on the evolved gluon and singlet structure functions, as functions of both Q

  8. A PSYCHOSOCIAL TOOL FOR CHILD'S DEVELOPMENT Nicholas

    African Journals Online (AJOL)

    the entire life span. Developmental psychology in children includes issues such as the gradual accumulation of knowledge versus stage-like development or the extent to which children are ... The Practical use of Dance as a Psychological tool for Child's ..... stories to children change their moods and help them to identify.

  9. Research tools | IDRC - International Development Research Centre

    International Development Research Centre (IDRC) Digital Library (Canada)

    Through training materials and guides, we aim to build skills and knowledge to enhance the quality of development research. We also offer free access to our database of funded research projects, known as IDRIS+, and our digital library. Our research tools include: Guide to research databases at IDRC: How to access and ...

  10. Latest Developments in PVD Coatings for Tooling

    Directory of Open Access Journals (Sweden)

    Gabriela Strnad

    2010-06-01

    Full Text Available The paper presents recent developments in the field of PVD coatings for manufacturing tools. A review of monoblock, multilayer, nanocomposite, DLC and oxinitride coatings is discussed, with emphasis on coatings which enable manufacturers to implement high-productivity processes such as high speed cutting and dry machining.

  11. Online analytical processing (OLAP): a fast and effective data mining tool for gene expression databases.

    Science.gov (United States)

    Alkharouf, Nadim W; Jamison, D Curtis; Matthews, Benjamin F

    2005-06-30

    Gene expression databases contain a wealth of information, but current data mining tools are limited in their speed and effectiveness in extracting meaningful biological knowledge from them. Online analytical processing (OLAP) can be used as a supplement to cluster analysis for fast and effective data mining of gene expression databases. We used Analysis Services 2000, a product that ships with SQL Server 2000, to construct an OLAP cube that was used to mine a time series experiment designed to identify genes associated with resistance of soybean to the soybean cyst nematode, a devastating pest of soybean. The data for these experiments are stored in the soybean genomics and microarray database (SGMD). A number of candidate resistance genes and pathways were found. Compared to traditional cluster analysis of gene expression data, OLAP was more effective and faster in finding biologically meaningful information. OLAP is available from a number of vendors and can work with any relational database management system through OLE DB.
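The OLAP roll-up and slice operations the abstract relies on can be sketched without any OLAP server; the records, dimension names, and values below are invented for illustration, not taken from SGMD:

```python
from collections import defaultdict

# Hypothetical expression records: (gene, time_point, treatment, expression).
records = [
    ("G1", 6, "infected", 2.1), ("G1", 6, "control", 1.0),
    ("G1", 12, "infected", 3.4), ("G1", 12, "control", 1.1),
    ("G2", 6, "infected", 0.9), ("G2", 12, "infected", 1.0),
]

def roll_up(records, dims):
    """Mean expression aggregated over the chosen dimensions (an OLAP roll-up).
    Keys of the result are tuples of dimension values."""
    index = {"gene": 0, "time": 1, "treatment": 2}
    sums, counts = defaultdict(float), defaultdict(int)
    for rec in records:
        key = tuple(rec[index[d]] for d in dims)
        sums[key] += rec[3]
        counts[key] += 1
    return {k: sums[k] / counts[k] for k in sums}

# Roll-up: mean expression per gene across all times and treatments.
by_gene = roll_up(records, ["gene"])
# Dice: mean per (gene, treatment) cell of the cube.
by_gene_treatment = roll_up(records, ["gene", "treatment"])
```

A real OLAP engine precomputes such aggregates across all dimension combinations, which is what makes the interactive queries fast.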

  12. The combined use of analytical tools for exploring tetanus toxin and tetanus toxoid structures.

    Science.gov (United States)

    Bayart, Caroline; Peronin, Sébastien; Jean, Elisa; Paladino, Joseph; Talaga, Philippe; Borgne, Marc Le

    2017-06-01

    Aldehyde detoxification is a process used to convert a toxin into a toxoid for vaccine applications. In the case of tetanus toxin (TT), formaldehyde is used to obtain the tetanus toxoid (TTd), which is used either for the tetanus vaccine or as a carrier protein in conjugate vaccines. Several studies have already been conducted to better understand the exact mechanism of this detoxification. Those studies led to the identification of a number of formaldehyde-induced modifications on lab-scale TTd samples. To obtain greater insight into the changes induced by formaldehyde, we used three industrial TTd batches to identify repeatable modifications in the detoxification process. Our strategy was to combine seven analytical tools to map these changes. Mass spectrometry (MS), a colorimetric test and amino acid analysis (AAA) were used to study modifications on amino acids. SDS-PAGE, asymmetric flow field flow fractionation (AF4), fluorescence spectroscopy and circular dichroism (CD) were used to study formaldehyde modifications on the whole protein structure. We identified 41 formaldehyde-induced modifications across the 1315-amino-acid primary sequence of TT. Of these, five modifications on lysine residues were repeatable across TTd batches. Changes in protein conformation were also observed using the SDS-PAGE, AF4 and CD techniques. Each analytical tool brought a piece of information regarding formaldehyde-induced modifications, and all together, these methods provided a comprehensive overview of the structural changes that occurred with detoxification. These results could be the first step leading to site-directed TT mutagenesis studies that may enable the production of a non-toxic equivalent protein without using formaldehyde. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Analytic autoethnography: a tool to inform the lecturer’s use of self when teaching mental health nursing?

    OpenAIRE

    Struthers, John

    2012-01-01

    This research explores the value of analytic autoethnography to develop the lecturer’s use of self when teaching mental health nursing. Sharing the lecturer’s self-understanding developed through analytic reflexivity focused on their autoethnographic narrative offers a pedagogical approach to contribute to the nursing profession’s policy drive to increase the use of reflective practices. The research design required me to develop my own analytic autoethnography. Four themes emerged from the da...

  14. Developing automated analytical methods for scientific environments using LabVIEW.

    Science.gov (United States)

    Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard

    2010-01-15

    The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
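The script-command idea described above, one generic program driven by simple commands rather than per-instrument code, can be sketched as follows; the command names, instrument, and arguments are hypothetical, not those of the actual LabVIEW program:

```python
class Instrument:
    """Stand-in for a physical device; records the actions sent to it."""
    def __init__(self, name):
        self.name = name
        self.log = []

commands = {}  # registry of script commands -> handler functions

def command(name):
    """Decorator registering a handler for a script command keyword."""
    def register(fn):
        commands[name] = fn
        return fn
    return register

@command("VALVE")
def set_valve(inst, pos):
    inst.log.append(f"valve -> {pos}")

@command("PUMP")
def run_pump(inst, volume_ul):
    inst.log.append(f"pump {volume_ul} uL")

def run_script(inst, script):
    """Execute newline-separated 'COMMAND arg...' lines without code changes."""
    for line in script.strip().splitlines():
        name, *args = line.split()
        commands[name](inst, *args)

sia = Instrument("SIA-CE")
run_script(sia, "VALVE 3\nPUMP 50")
```

New hardware is supported by registering another handler, while existing scripts keep working unchanged, which is the portability the abstract claims for the LabVIEW program.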

  15. Visualization tool for advanced laser system development

    Science.gov (United States)

    Crockett, Gregg A.; Brunson, Richard L.

    2002-06-01

    Simulation development for Laser Weapon Systems design and system trade analyses has progressed to new levels with the advent of object-oriented software development tools and PC processor capabilities. These tools allow rapid visualization of upcoming laser weapon system architectures and the ability to rapidly respond to what-if scenario questions from potential user commands. These simulations can solve very intensive problems in short time periods to investigate the parameter space of a newly emerging weapon system concept, or can address user mission performance for many different scenario engagements. Equally important to the rapid solution of complex numerical problems is the ability to rapidly visualize the results of the simulation, and to effectively interact with visualized output to glean new insights into the complex interactions of a scenario. Boeing has applied these ideas to develop a tool called the Satellite Visualization and Signature Tool (SVST). This Windows application is based upon a series of C++ coded modules that have evolved from several programs at Boeing-SVS. The SVST structure, extensibility, and some recent results of applying the simulation to weapon system concepts and designs will be discussed in this paper.

  16. 100-B/C Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    R.W. Ovink

    2010-03-18

    This report documents the process used to identify source area target analytes in support of the 100-B/C remedial investigation/feasibility study addendum to DOE/RL-2008-46. This report also establishes the analyte exclusion criteria applicable for 100-B/C use and the analytical methods needed to analyze the target analytes.

  17. Development of special analytical system for determination of free acid

    International Nuclear Information System (INIS)

    Zhang Lihua; Wu Jizong; Liu Huanliang; Liu Quanwei; Fan Dejun; Su Tao

    2008-01-01

    The determination of free acid plays an important role in spent fuel reprocessing analysis, accounting for about 30% of all analytical work in reprocessing. It is therefore necessary to develop a special fast analytical system for the determination of free acid. The special analytical system is particularly applicable to the determination of free acid in a high-level radioactive environment, and is composed of an optical fiber spectrophotometer and an automatic sample-in device. Its advantages are the small sample volume needed, a fast procedure, easy operation and physical protection. All performance parameters satisfy the requirements of spent fuel reprocessing control analysis. For long-distance determination, the optical fiber spectrophotometer is connected with a 4.5 m long optical fiber. To resolve a change of 0.1 mol/L in acidity, the measuring optical path is 2 cm. A bundle of optical fibers 10-20 μm in diameter is assembled. The optical fiber probe is composed of a reflecting mirror and a concave mirror on the top of the optical fibers. To eliminate the interference of external light, a stainless steel measuring chamber is used. The automatic sample-in device is composed of a state valve, quantifying pump and pipe. The sample-in precision of the 15 μl and 35 μl quantifying loops is better than 0.5%. The special analytical system takes less than 7 minutes to complete one measurement. The linear range is 0.5-3.5 mol/L. The relative standard deviation is better than 2.0% when the concentration of the free acid is about 2.0 mol/L. For samples in different media, the results are comparable with the pH titration method of determining free acid in reprocessing. (authors)
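Spectrophotometric determination of this kind ultimately rests on the Beer-Lambert law, A = ε·l·c; a minimal sketch, using the 2 cm path length from the abstract but a purely hypothetical molar absorptivity:

```python
def concentration_from_absorbance(absorbance, molar_absorptivity, path_cm):
    """Beer-Lambert law A = eps * l * c, solved for concentration c (mol/L)."""
    return absorbance / (molar_absorptivity * path_cm)

# Illustrative numbers only: eps is an assumed absorptivity for the
# acid-sensitive chromophore, not a value from the paper.
eps = 0.25   # L mol^-1 cm^-1 (hypothetical)
A = 1.0      # measured absorbance (hypothetical)
c = concentration_from_absorbance(A, eps, 2.0)  # 2 cm path, per the abstract
```

The longer the optical path, the larger the absorbance change per 0.1 mol/L of acidity, which is why the path length is a tuning parameter for the required resolution.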

  18. Development of bore tools for pipe inspection

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Nakahira, Masataka; Taguchi, Kou; Ito, Akira [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), replacement and maintenance of in-vessel components requires that all connected cooling pipes be cut and removed, that a new component be installed, and that all cooling pipes be rewelded. After welding is completed, the welded area must be inspected for soundness. These tasks require a new work concept for securing the shielded area and access from narrow ports. Tools had to be developed for nondestructive inspection and leak testing to evaluate pipe welding soundness by accessing areas from inside the pipes with autonomous locomotion, as was done for the welding and cutting tools. A system was proposed for nondestructive inspection of branch pipes and the main pipe after passing through pipe curves, the same as for the welding and cutting tool development. Nondestructive inspection and leak testing sensors were developed and the basic parameters were obtained. In addition, inspection systems which can move inside pipes and conduct the nondestructive inspection and leak testing were developed. In this paper, an introduction will be given to the current situation concerning the development of nondestructive inspection and leak testing machines for the branch pipes. (author)

  19. Stable isotope ratio analysis: A potential analytical tool for the authentication of South African lamb meat.

    Science.gov (United States)

    Erasmus, Sara Wilhelmina; Muller, Magdalena; van der Rijst, Marieta; Hoffman, Louwrens Christiaan

    2016-02-01

    Stable isotope ratios (¹³C/¹²C and ¹⁵N/¹⁴N) of South African Dorper lambs from farms with different vegetation types were measured by isotope ratio mass spectrometry (IRMS), to evaluate it as a tool for the authentication of origin and feeding regime. Homogenised and defatted meat of the Longissimus lumborum (LL) muscle of lambs from seven different farms was assessed. The δ¹³C values were affected by the origin of the meat, mainly reflecting the diet. The Rûens and Free State farms had the lowest (p ⩽ 0.05) δ¹⁵N values, followed by the Northern Cape farms, with Hantam Karoo/Calvinia having the highest δ¹⁵N values. Discriminant analysis showed δ¹³C and δ¹⁵N differences as promising results for the use of IRMS as a reliable analytical tool for lamb meat authentication. The results suggest that diet, linked to origin, is an important factor to consider regarding region of origin classification for South African lamb. Copyright © 2015 Elsevier Ltd. All rights reserved.
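The δ values reported above follow standard delta notation: the per mil (‰) deviation of a sample's isotope ratio from a reference standard (atmospheric N₂ for nitrogen). A minimal sketch, with a hypothetical measured ratio:

```python
def delta_per_mil(r_sample, r_standard):
    """Delta notation: per mil deviation of an isotope ratio from a standard."""
    return (r_sample / r_standard - 1.0) * 1000.0

# Commonly cited 15N/14N ratio of atmospheric N2 (the nitrogen reference).
R_AIR = 0.0036765
sample_ratio = 0.0037  # hypothetical measured 15N/14N of a meat sample
d15N = delta_per_mil(sample_ratio, R_AIR)  # about +6.4 per mil
```

Diet differences shift these ratios by only a few per mil, which is why IRMS precision, rather than the size of the signal, determines whether farms can be discriminated.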

  20. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    Energy Technology Data Exchange (ETDEWEB)

    Irsyad, M. Indra al [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Ministry of Energy and Mineral Resources, Jakarta (Indonesia); Halog, Anthony Basco, E-mail: a.halog@uq.edu.au [School of Earth and Environmental Science, University of Queensland, Brisbane, QLD (Australia); Nepal, Rabindra [Massey Business School, Massey University, Palmerston North (New Zealand); Koesrindartoto, Deddy P. [School of Business and Management, Institut Teknologi Bandung, Bandung (Indonesia)

    2017-12-20

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  1. Selecting Tools for Renewable Energy Analysis in Developing Countries: An Expanded Review

    International Nuclear Information System (INIS)

    Irsyad, M. Indra al; Halog, Anthony Basco; Nepal, Rabindra; Koesrindartoto, Deddy P.

    2017-01-01

    Renewable energy planners in developing countries should be cautious in using analytical tools formulated in developed countries. Traditional energy consumption, economic and demographic transitions, high income inequality, and the informal economy are characteristics of developing countries that may contradict the assumptions of mainstream, widely used analytical tools. In this study, we synthesize the debate in previous review studies on energy models for developing countries and then extend the scope of those studies by highlighting emerging methods of systems thinking, life cycle thinking, and decision support analysis. We then discuss how these tools have been used for renewable energy analysis in developing countries and find that not all studies are aware of the emerging critical issues in developing countries. We offer guidance on selecting the most appropriate analytical tool, mainly when dealing with energy modeling and analysis for developing countries. We also suggest potential future improvements to analytical tools for renewable energy modeling and analysis in developing countries.

  2. ProteoLens: a visual analytic tool for multi-scale database-driven biological network data mining.

    Science.gov (United States)

    Huan, Tianxiao; Sivachenko, Andrey Y; Harrison, Scott H; Chen, Jake Y

    2008-08-12

    New systems biology studies require researchers to understand how interplay among myriads of biomolecular entities is orchestrated in order to achieve high-level cellular and physiological functions. Many software tools have been developed in the past decade to help researchers visually navigate large networks of biomolecular interactions with built-in template-based query capabilities. To further advance researchers' ability to interrogate global physiological states of cells through multi-scale visual network explorations, new visualization software tools still need to be developed to empower the analysis. A robust visual data analysis platform driven by database management systems to perform bi-directional data processing-to-visualization with declarative querying capabilities is needed. We developed ProteoLens as a Java-based visual analytic software tool for creating, annotating and exploring multi-scale biological networks. It supports direct database connectivity to either Oracle or PostgreSQL database tables/views, on which SQL statements using both Data Definition Language (DDL) and Data Manipulation Language (DML) may be specified. The robust query languages embedded directly within the visualization software help users to bring their network data into a visualization context for annotation and exploration. ProteoLens supports graph/network-represented data in the standard Graph Modeling Language (GML) format, and this enables interoperation with a wide range of other visual layout tools. The architectural design of ProteoLens enables the de-coupling of complex network data visualization tasks into two distinct phases: 1) creating network data association rules, which are mapping rules between network node IDs or edge IDs and data attributes such as functional annotations, expression levels, scores, synonyms, descriptions etc.; 2) applying network data association rules to build the network and perform the visual annotation of graph nodes and edges
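The two-phase "association rule" design described in the abstract, first define a mapping from node IDs to data attributes, then apply it to annotate the network, can be sketched in miniature; the network, gene names, attribute name, and values below are invented for illustration:

```python
# Toy network and attribute table (in ProteoLens the table would come from
# an Oracle/PostgreSQL view; here it is a plain dict).
network = {"nodes": ["TP53", "MDM2"], "edges": [("TP53", "MDM2")]}
expression = {"TP53": 2.3, "MDM2": 0.4}  # hypothetical expression levels

def apply_association_rule(network, attribute_table, attr_name):
    """Phase 2: attach the rule's attribute to every node it covers."""
    annotated = {}
    for node in network["nodes"]:
        annotated[node] = {attr_name: attribute_table.get(node)}
    return annotated

annotated = apply_association_rule(network, expression, "expression_level")
```

Keeping the mapping separate from the graph means the same network can be re-annotated from a different table (scores, synonyms, annotations) without rebuilding it, which is the de-coupling the architecture aims for.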

  3. Web Analytics

    Science.gov (United States)

    EPA’s Web Analytics Program collects, analyzes, and provides reports on traffic, quality assurance, and customer satisfaction metrics for EPA’s website. The program uses a variety of analytics tools, including Google Analytics and CrazyEgg.

  4. Developing Expert Tools for the LHC

    CERN Document Server

    AUTHOR|(CDS)2160780; Timkó, Helga

    2017-10-12

    This thesis describes software tools developed for automated, precision setting-up of low-power level radio frequency (LLRF) loops, which will help expert users to have better control and faster setting-up of the radio-frequency (RF) system of the Large Hadron Collider (LHC). The aim was to completely redesign the software architecture, to add new features, to improve certain algorithms, and to increase the automation.

  5. Development of configuration risk management tool

    International Nuclear Information System (INIS)

    Masuda, Takahiro; Doi, Eiji

    2003-01-01

    Tokyo Electric Power Company (referred to as TEPCO hereinafter), and other Japanese utilities as well, have been trying to improve the capacity factor of their Nuclear Power Plants (NPPs) through modernization of their operation and maintenance (O and M) strategy. TEPCO intends to apply risk information to the O and M field while maintaining or even improving both safety and production efficiency. Under these circumstances, TEPCO and some other BWR utilities started to develop a Configuration Risk Management (CRM) tool that can estimate risk in various plant conditions due to configuration changes during an outage. Moreover, we also intend to apply CRM to on-line maintenance (OLM) in the near future. This tool can calculate the Core Damage Frequency (CDF) according to a given plant condition, such as SSC availability, decay heat level and the inventory of coolant, in both the outage state and full-power operation. From a deterministic viewpoint, it can also check whether a certain configuration meets the related requirements of the Technical Specifications. A user-friendly interface is one of the important features of this tool, because it enables site engineers with little experience in PSA to quantify and utilize risk information. (author)
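The configuration-dependent CDF calculation described above can be illustrated with a toy fault-tree model (not TEPCO's actual model); the initiator frequency and train unavailabilities below are invented:

```python
# Toy model: core damage requires an initiating event AND failure of both of
# two redundant cooling trains (an AND gate over train unavailabilities).
def core_damage_frequency(init_freq, train_unavailability, trains_out_of_service=0):
    """CDF = initiator frequency * product of unavailabilities of the trains.
    A train removed for maintenance is treated as unavailable (probability 1)."""
    p = 1.0
    for i in range(2):
        p *= 1.0 if i < trains_out_of_service else train_unavailability
    return init_freq * p

base = core_damage_frequency(1e-2, 1e-3)       # both trains available: 1e-8/yr
maint = core_damage_frequency(1e-2, 1e-3, 1)   # one train out for maintenance
risk_increase = maint / base                   # factor of 1000 in this toy case
```

This is the kind of "what configuration change costs how much risk" answer a CRM tool quantifies, which is why removing redundancy during an outage dominates the instantaneous risk profile.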

  6. Development of a flexible visualization tool

    Energy Technology Data Exchange (ETDEWEB)

    Emoto, M. E-mail: emo@nifs.ac.jp; Shibata, K.; Watanabe, K.; Ohdachi, S.; Ida, K.; Sudo, S

    2002-06-01

    User-friendly visualization tools are indispensable for quick recognition of experimental data. One such tool, the dwscope component of the MDS-Plus system, is widely used to visualize the data that MDS-Plus acquires. However, the National Institute for Fusion Science does not use MDS-Plus, so our researchers on the Large Helical Device (LHD) project cannot use dwscope without modification. Therefore, we developed a new visualization tool, NIFScope. The user interface of NIFScope is based on JavaScope, which is a Java version of dwscope, but NIFScope has its own unique characteristics, including the following: (1) the GUI toolkit is GTK+; (2) Ruby is the equation evaluator; and (3) data loaders are provided as Ruby modules. With these features, NIFScope becomes a multi-purpose and flexible visualization tool. For example, because GTK+ is a multi-platform open source GUI toolkit, NIFScope can run on both MS-Windows and UNIX, and it can be delivered freely. The second characteristic enables users to plot various equations besides experimental data. Furthermore, Ruby is an object-oriented scripting language that is widely used, and it serves not only as an equation evaluator but also as an ordinary programming language. This means users can easily add new data loaders for their own data formats.

  7. Surrogate Analysis and Index Developer (SAID) tool

    Science.gov (United States)

    Domanski, Marian M.; Straub, Timothy D.; Landers, Mark N.

    2015-10-01

    The use of acoustic and other parameters as surrogates for suspended-sediment concentrations (SSC) in rivers has been successful in multiple applications across the Nation. Tools to process and evaluate the data are critical to advancing the operational use of surrogates, along with the subsequent development of regression models from which real-time sediment concentrations can be made available to the public. Recent developments in both areas are having an immediate impact on surrogate research and on surrogate monitoring sites currently (2015) in operation.
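A surrogate rating of the kind SAID supports, regressing log-transformed SSC on an acoustic parameter, can be sketched as follows; the paired observations below are hypothetical, chosen only to make the fit exact:

```python
import math

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed-form slope/intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b

# Hypothetical paired observations: acoustic backscatter (dB) vs SSC (mg/L).
backscatter = [60.0, 70.0, 80.0, 90.0]
ssc = [10.0, 100.0, 1000.0, 10000.0]
a, b = fit_line(backscatter, [math.log10(s) for s in ssc])

def predict_ssc(db):
    """Rating curve: convert a real-time backscatter reading to SSC (mg/L)."""
    return 10 ** (a + b * db)
```

Fitting in log space is the usual choice because SSC spans orders of magnitude; back-transforming the prediction (often with a bias correction, omitted here) yields the real-time concentration estimate.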

  8. Electrochemical immunoassay using magnetic beads for the determination of zearalenone in baby food: An anticipated analytical tool for food safety

    International Nuclear Information System (INIS)

    Hervas, Miriam; Lopez, Miguel Angel; Escarpa, Alberto

    2009-01-01

    In this work, an electrochemical immunoassay involving magnetic beads to determine zearalenone in selected food samples has been developed. The immunoassay scheme is based on a direct competitive immunoassay method in which antibody-coated magnetic beads were employed as the immobilisation support and horseradish peroxidase (HRP) was used as the enzymatic label. Amperometric detection has been achieved through the addition of hydrogen peroxide substrate and hydroquinone as mediator. The analytical performance of the electrochemical immunoassay has been evaluated by analysis of a maize certified reference material (CRM) and selected baby food samples. A detection limit (LOD) of 0.011 μg L⁻¹ and an EC₅₀ of 0.079 μg L⁻¹ were obtained, allowing sensitive detection of the zearalenone mycotoxin. In addition, excellent accuracy, with high recovery yields ranging between 95 and 108%, has been obtained. These analytical features show the proposed electrochemical immunoassay to be a very powerful and timely screening tool for the food safety scene.
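The EC₅₀ quoted above is the midpoint parameter of a sigmoidal calibration curve; a minimal sketch of the four-parameter logistic (4PL) model commonly used for competitive immunoassays, where only the EC₅₀ comes from the abstract and the bottom/top/hill values are assumed:

```python
def four_pl(x, bottom, top, ec50, hill):
    """Four-parameter logistic calibration curve for a competitive immunoassay:
    the signal falls from `top` toward `bottom` as analyte concentration rises."""
    return bottom + (top - bottom) / (1.0 + (x / ec50) ** hill)

# Hypothetical calibration parameters; EC50 = 0.079 ug/L, per the abstract.
signal_mid = four_pl(0.079, bottom=0.1, top=1.0, ec50=0.079, hill=1.2)
# At x = EC50 the response sits halfway between top and bottom: 0.55 here.
```

In practice the four parameters are fitted to standards of known concentration, and unknown samples are read off the inverted curve; the LOD is then set from the blank signal's variability.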

  9. SE Requirements Development Tool User Guide

    Energy Technology Data Exchange (ETDEWEB)

    Benson, Faith Ann [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    The LANL Systems Engineering Requirements Development Tool (SERDT) is a data collection tool created in InfoPath for use with the Los Alamos National Laboratory’s (LANL) SharePoint sites. Projects can fail if a clear definition of the final product requirements is not performed. For projects to be successful, requirements must be defined early in the project, and those requirements must be tracked during execution of the project to ensure the goals of the project are met. Therefore, the focus of this tool is requirements definition. The content of this form is based on International Council on Systems Engineering (INCOSE) and Department of Defense (DoD) process standards and allows for single or collaborative input. The “Scoping” section is where project information is entered by the project team prior to requirements development, and includes definitions and examples to assist the user in completing the forms. The data entered will be used to define the requirements; once the form is filled out, a “Requirements List” is automatically generated and a Word document is created and saved to a SharePoint document library. SharePoint also includes the ability to download the requirements data defined in the InfoPath form into an Excel spreadsheet. This User Guide will assist you in navigating through the data entry process.

  10. Process-Based Quality (PBQ) Tools Development

    Energy Technology Data Exchange (ETDEWEB)

    Cummins, J.L.

    2001-12-03

    The objective of this effort is to benchmark the development of process-based quality tools for application in CAD (computer-aided design) model-based applications. The processes of interest are design, manufacturing, and quality process applications. A study was commissioned addressing the impact, current technologies, and known problem areas in the application of 3D MCAD (3-dimensional mechanical computer-aided design) models and model integrity on downstream manufacturing and quality processes. The downstream manufacturing and product quality processes are profoundly influenced by and dependent on model quality and modeling process integrity. The goal is to demonstrate and expedite modeling and downstream model-based technologies, using available or conceptual methods and tools, to achieve maximum economic advantage and advance process-based quality concepts.

  11. LitPathExplorer: A Confidence-based Visual Text Analytics Tool for Exploring Literature-Enriched Pathway Models.

    Science.gov (United States)

    Soto, Axel J; Zerva, Chrysoula; Batista-Navarro, Riza; Ananiadou, Sophia

    2017-12-08

    Pathway models are valuable resources that help us understand the various mechanisms underpinning complex biological processes. Their curation is typically carried out through manual inspection of published scientific literature to find information relevant to a model, which is a laborious and knowledge-intensive task. Furthermore, models curated manually cannot be easily updated and maintained with new evidence extracted from the literature without automated support. We have developed LitPathExplorer, a visual text analytics tool that integrates advanced text mining, semi-supervised learning and interactive visualization, to facilitate the exploration and analysis of pathway models using statements (i.e., events) extracted automatically from the literature and organized according to levels of confidence. LitPathExplorer supports pathway modellers and curators alike by: 1) extracting events from the literature that corroborate existing models with evidence; 2) discovering new events which can update models; and 3) providing a confidence value for each event that is automatically computed based on linguistic features and article metadata. Our evaluation of event extraction showed a precision of 89% and a recall of 71%. Evaluation of our confidence measure, when used for ranking sampled events, showed an average precision ranging between 61% and 73%, which can be improved to 95% when the user is involved in the semi-supervised learning process. Qualitative evaluation using pair analytics based on the feedback of three domain experts confirmed the utility of our tool within the context of pathway model exploration. LitPathExplorer is available at http://nactem.ac.uk/LitPathExplorer_BI/. sophia.ananiadou@manchester.ac.uk. Supplementary data are available at Bioinformatics online.

  12. USING ANALYTIC HIERARCHY PROCESS (AHP) METHOD IN RURAL DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Tülay Cengiz

    2003-04-01

    Full Text Available Rural development is a body of economic and social policies aimed at improving living conditions in rural areas by enabling the rural population to enjoy the economic, social, cultural and technological benefits of city life in place, without migrating. As this description suggests, rural development is a very broad concept. Therefore, in development efforts the problem should be stated clearly and analyzed, and many criteria must be evaluated by experts. The Analytic Hierarchy Process (AHP) can be utilized at these stages of development efforts. AHP is a multi-criteria decision-making method. After decomposing a problem into smaller pieces, the relative importance of each pair of compared elements is determined. The method allows the evaluation of both qualitative and quantitative factors, and it permits the ideas of many experts to be gathered and used in the decision process. Because of these features, the AHP method can be used in rural development work. In this article, cultural factors, an important component of rural development that is often ignored in many studies, were evaluated as an example. As a result of these applications and evaluations, it is concluded that the AHP method can be helpful in rural development efforts.
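
The pairwise-comparison step described in this abstract can be sketched numerically. The matrix below is a hypothetical three-criterion example on Saaty's 1-9 scale (the criteria and judgments are illustrative, not taken from the article); weights are computed with the geometric-mean method and checked with Saaty's consistency ratio.

```python
import numpy as np

# Hypothetical pairwise comparison matrix for three rural-development
# criteria; A[i, j] = how much more important criterion i is than j
# on Saaty's 1-9 scale. Reciprocal entries mirror the upper triangle.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights via the geometric-mean (logarithmic least squares) method.
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w = w / w.sum()

# Consistency check: lambda_max from A @ w, random index RI = 0.58 for n = 3.
n = A.shape[0]
lam = float(np.mean((A @ w) / w))
CI = (lam - n) / (n - 1)
CR = CI / 0.58  # a CR below 0.1 is conventionally considered acceptable
```

With these judgments the first criterion dominates (w ≈ [0.65, 0.23, 0.12]) and the matrix is highly consistent, so the judgments would be accepted under the usual CR < 0.1 rule.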

  13. Applicability of bioanalysis of multiple analytes in drug discovery and development: review of select case studies including assay development considerations.

    Science.gov (United States)

    Srinivas, Nuggehally R

    2006-05-01

    The development of sound bioanalytical method(s) is of paramount importance during the process of drug discovery and development culminating in a marketing approval. Although the bioanalytical procedure(s) originally developed during the discovery stage may not necessarily be fit to support the drug development scenario, they may be suitably modified and validated, as deemed necessary. Several reviews have appeared over the years describing analytical approaches including various techniques, detection systems, automation tools that are available for an effective separation, enhanced selectivity and sensitivity for quantitation of many analytes. The intention of this review is to cover various key areas where analytical method development becomes necessary during different stages of drug discovery research and development process. The key areas covered in this article with relevant case studies include: (a) simultaneous assay for parent compound and metabolites that are purported to display pharmacological activity; (b) bioanalytical procedures for determination of multiple drugs in combating a disease; (c) analytical measurement of chirality aspects in the pharmacokinetics, metabolism and biotransformation investigations; (d) drug monitoring for therapeutic benefits and/or occupational hazard; (e) analysis of drugs from complex and/or less frequently used matrices; (f) analytical determination during in vitro experiments (metabolism and permeability related) and in situ intestinal perfusion experiments; (g) determination of a major metabolite as a surrogate for the parent molecule; (h) analytical approaches for universal determination of CYP450 probe substrates and metabolites; (i) analytical applicability to prodrug evaluations-simultaneous determination of prodrug, parent and metabolites; (j) quantitative determination of parent compound and/or phase II metabolite(s) via direct or indirect approaches; (k) applicability in analysis of multiple compounds in select

  14. Nonnegative matrix factorization: an analytical and interpretive tool in computational biology.

    Directory of Open Access Journals (Sweden)

    Karthik Devarajan

    2008-07-01

    Full Text Available In the last decade, advances in high-throughput technologies such as DNA microarrays have made it possible to simultaneously measure the expression levels of tens of thousands of genes and proteins. This has resulted in large amounts of biological data requiring analysis and interpretation. Nonnegative matrix factorization (NMF) was introduced as an unsupervised, parts-based learning paradigm involving the decomposition of a nonnegative matrix V into two nonnegative matrices, W and H, via a multiplicative updates algorithm. In the context of a p × n gene expression matrix V consisting of observations on p genes from n samples, each column of W defines a metagene, and each column of H represents the metagene expression pattern of the corresponding sample. NMF has been primarily applied in an unsupervised setting in image and natural language processing. More recently, it has been successfully utilized in a variety of applications in computational biology. Examples include molecular pattern discovery, class comparison and prediction, cross-platform and cross-species analysis, functional characterization of genes and biomedical informatics. In this paper, we review this method as a data analytical and interpretive tool in computational biology with an emphasis on these applications.
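
The decomposition V ≈ WH described above can be illustrated with a minimal NumPy sketch of the Lee-Seung multiplicative updates for the Frobenius objective (a generic illustration of the algorithm, not the code used in the paper; the matrix sizes and iteration count are arbitrary).

```python
import numpy as np

def nmf(V, k, n_iter=200, eps=1e-9, seed=0):
    """Factor a nonnegative p x n matrix V into W (p x k) and H (k x n)
    using Lee-Seung multiplicative updates for the Frobenius norm."""
    rng = np.random.default_rng(seed)
    p, n = V.shape
    W = rng.random((p, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Each update multiplies by a nonnegative ratio, so W and H
        # stay nonnegative throughout.
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy "expression matrix": 20 genes x 10 samples, rank-3 factorization.
V = np.random.default_rng(1).random((20, 10))
W, H = nmf(V, k=3)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

In the gene-expression reading of the abstract, each of the k columns of W would be a metagene and each column of H the metagene expression pattern of one sample.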

  15. Raman-in-SEM, a multimodal and multiscale analytical tool: performance for materials and expertise.

    Science.gov (United States)

    Wille, Guillaume; Bourrat, Xavier; Maubec, Nicolas; Lahfid, Abdeltif

    2014-12-01

    The availability of Raman spectroscopy in a powerful analytical scanning electron microscope (SEM) allows morphological, elemental, chemical, physical and electronic analysis without moving the sample between instruments. This paper documents the metrological performance of the SEMSCA commercial Raman interface operated in a low-vacuum SEM. It provides multiscale and multimodal analyses such as Raman/EDS, Raman/cathodoluminescence or Raman/STEM (STEM: scanning transmission electron microscopy), as well as Raman spectroscopy on nanomaterials. Since Raman spectroscopy in a SEM can be influenced by several SEM-related phenomena, this paper first presents a comparison of this new tool with a conventional micro-Raman spectrometer. Then, some possible artefacts are documented, which are due to the impact of electron beam-induced contamination or the cathodoluminescence contribution to the Raman spectra, especially with geological samples. These effects are easily overcome by changing or adapting the Raman spectrometer and the SEM settings and methodology. The adverse effect of cathodoluminescence is eliminated by using a SEM beam shutter during Raman acquisition. In contrast, this interface provides the ability to record the cathodoluminescence (CL) spectrum of a phase. In the second part, this study highlights the interest and efficiency of the coupling in characterizing micrometric phases at the same point. This multimodal approach is illustrated with various issues encountered in geosciences.

  16. Two-dimensional electrophoresis protein profiling as an analytical tool for human acute leukemia classification.

    Science.gov (United States)

    Cui, Jiu-Wei; Wang, Jie; He, Kun; Jin, Bao-Feng; Wang, Hong-Xia; Li, Wei; Kang, Li-Hua; Hu, Mei-Ru; Li, Hui-Yan; Yu, Ming; Shen, Bei-Fen; Wang, Guan-Jun; Zhang, Xue-Min

    2005-01-01

    Two-dimensional electrophoresis (2-DE) was used to profile the proteins of leukemic cells from 61 cases of acute leukemia (AL) characterized by the French-American-British (FAB) classification. The differentially expressed protein spots were identified by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) and electrospray ionization-tandem MS (ESI-MS/MS). The distinct protein profiles (DPPs) of AL FAB subtypes were explored successfully, including acute myeloid leukemia (AML), its subtypes (M2, M3, and M5), and acute lymphoid leukemia (ALL); the profiles were homogeneous within different samples of the same subgroup but clearly differed from all other subgroups. We also found a group of proteins differentially expressed between AL cells and normal white blood cells. Among the DPPs of AL subtypes, some proteins have been reported previously, but most are reported here for the first time to mark AML differentiation and to discriminate AML from ALL. These data show that 2-DE protein profiling could be used as an analytical tool for facilitating the molecular definition of human AL classification and understanding the mechanism of leukemogenesis, and extension of the present analysis to currently less well-defined AL may identify additional subgroups and promote the identification of new targets for specific treatment approaches.

  17. Room temperature phosphorescence in the liquid state as a tool in analytical chemistry

    International Nuclear Information System (INIS)

    Kuijt, Jacobus; Ariese, Freek; Brinkman, Udo A.Th.; Gooijer, Cees

    2003-01-01

    A wide-ranging overview of room temperature phosphorescence in the liquid state (RTPL) is presented, with a focus on recent developments. RTPL techniques like micelle-stabilized (MS)-RTP, cyclodextrin-induced (CD)-RTP, and heavy atom-induced (HAI)-RTP are discussed. These techniques are mainly applied in the stand-alone format, but coupling with some separation techniques appears to be feasible. Applications of direct, sensitized and quenched phosphorescence are also discussed. As regards sensitized and quenched RTP, emphasis is on the coupling with liquid chromatography (LC) and capillary electrophoresis (CE), but stand-alone applications are also reported. Further, the application of RTPL in immunoassays and in RTP optosensing - the optical sensing of analytes based on RTP - is reviewed. Next to the application of RTPL in quantitative analysis, its use for the structural probing of protein conformations and for time-resolved microscopy of labelled biomolecules is discussed. Finally, an overview is presented of the various analytical techniques which are based on the closely related phenomenon of long-lived lanthanide luminescence. The paper closes with a short evaluation of the state-of-the-art in RTP and a discussion on future perspectives.

  18. A Tool for Conceptualising in PSS development

    DEFF Research Database (Denmark)

    Matzen, Detlef; McAloone, Timothy Charles

    2006-01-01

    This paper introduces a tool for conceptualising in the development of product/service-systems (PSS), based upon the modelling of service activities. Our argumentation is built on two previous articles by the same author, previously presented at the 16th Symposium “Design for X” [1] and the 9th International Design Conference [2]. In this contribution, we take the step from a fundamental understanding of the phenomenon to creating a normative exploitation of this understanding for PSS concept development. The developed modelling technique is based on the Customer Activity Cycle (CAC) model, enabling the integrated consideration of the customers’ activities, possible PSS offerings and beneficial partnering options (i.e. between different supplier companies) within the delivery value chain.

  19. Basic Conceptual Systems (BCSs)--Tools for Analytic Coding, Thinking and Learning: A Concept Teaching Curriculum in Norway

    Science.gov (United States)

    Hansen, Andreas

    2009-01-01

    The role of basic conceptual systems (for example, colour, shape, size, position, direction, number, pattern, etc.) as psychological tools for analytic coding, thinking, learning is emphasised, and a proposal for a teaching order of BCSs in kindergarten and primary school is introduced. The first part of this article explains briefly main aspects…

  20. Detection of UV-treatment effects on plankton by rapid analytic tools for ballast water compliance monitoring immediately following treatment

    Science.gov (United States)

    Bradie, Johanna; Gianoli, Claudio; He, Jianjun; Lo Curto, Alberto; Stehouwer, Peter; Veldhuis, Marcel; Welschmeyer, Nick; Younan, Lawrence; Zaake, André; Bailey, Sarah

    2018-03-01

    Non-indigenous species seriously threaten native biodiversity. To reduce establishments, the International Maritime Organization established the Convention for the Control and Management of Ships' Ballast Water and Sediments, which limits organism concentrations at discharge under regulation D-2. Most ships will comply by using on-board treatment systems to disinfect their ballast water. Port state control officers will need simple, rapid methods to detect compliance. Appropriate monitoring methods may depend on treatment type, since different treatments affect organisms by a variety of mechanisms. Many indicative tools have been developed, but they must be examined to ensure the measured variable is an appropriate signal for the response of the organisms to the applied treatment. We assessed the abilities of multiple analytic tools to rapidly detect the effects of a ballast water treatment system based on UV disinfection. All devices detected a large decrease in the concentrations of vital organisms ≥ 50 μm following treatment, as examined herein for UV-C treatment systems.

  1. Life cycle assessment as an analytical tool in strategic environmental assessment. Lessons learned from a case study on municipal energy planning in Sweden

    International Nuclear Information System (INIS)

    Björklund, Anna

    2012-01-01

    Life cycle assessment (LCA) is explored as an analytical tool in strategic environmental assessment (SEA), illustrated by a case where a previously developed SEA process was applied to municipal energy planning in Sweden. The process integrated decision-making tools for scenario planning, public participation and environmental assessment. This article describes the use of LCA for environmental assessment in this context, with a focus on methodology and practical experiences. While LCA provides a systematic framework for the environmental assessment and a wider systems perspective than what is required in SEA, LCA cannot address all aspects of environmental impact required, and therefore needs to be complemented by other tools. The integration of LCA with tools for public participation and scenario planning posed certain methodological challenges, but provided an innovative approach to designing the scope of the environmental assessment and defining and assessing alternatives. - Research highlights: ► LCA was explored as analytical tool in an SEA process of municipal energy planning. ► The process also integrated LCA with scenario planning and public participation. ► Benefits of using LCA were a systematic framework and wider systems perspective. ► Integration of tools required some methodological challenges to be solved. ► This proved an innovative approach to define alternatives and scope of assessment.

  2. saltPAD: A New Analytical Tool for Monitoring Salt Iodization in Low Resource Settings

    Directory of Open Access Journals (Sweden)

    Nicholas M. Myers

    2016-03-01

    Full Text Available We created a paper test card that measures a common iodizing agent, iodate, in salt. To test the analytical metrics, usability, and robustness of the paper test card when it is used in low resource settings, the South African Medical Research Council and GroundWork performed independent validation studies of the device. The accuracy and precision metrics from both studies were comparable. In the SAMRC study, more than 90% of the test results (n=1704) were correctly classified as corresponding to adequately or inadequately iodized salt. The cards are suitable for market and household surveys to determine whether salt is adequately iodized. Further development of the cards will improve their utility for monitoring salt iodization during production.

  3. Accept or Decline? An Analytics-Based Decision Tool for Kidney Offer Evaluation.

    Science.gov (United States)

    Bertsimas, Dimitris; Kung, Jerry; Trichakis, Nikolaos; Wojciechowski, David; Vagefi, Parsia A

    2017-12-01

    When a deceased-donor kidney is offered to a waitlisted candidate, the decision to accept or decline the organ relies primarily upon a practitioner's experience and intuition. Such decisions must achieve a delicate balance between estimating the immediate benefit of transplantation and the potential for future higher-quality offers. However, the current experience-based paradigm lacks scientific rigor and is subject to the inaccuracies that plague anecdotal decision-making. A data-driven analytics-based model was developed to predict whether a patient will receive an offer for a deceased-donor kidney at Kidney Donor Profile Index thresholds of 0.2, 0.4, and 0.6, and at timeframes of 3, 6, and 12 months. The model accounted for Organ Procurement Organization, blood group, wait time, DR antigens, and prior offer history to provide accurate and personalized predictions. Performance was evaluated on data sets spanning various lengths of time to understand the adaptability of the method. Using United Network for Organ Sharing match-run data from March 2007 to June 2013, out-of-sample area under the receiver operating characteristic curve was approximately 0.87 for all Kidney Donor Profile Index thresholds and timeframes considered for the 10 most populous Organ Procurement Organizations. As more data becomes available, area under the receiver operating characteristic curve values increase and subsequently level off. The development of a data-driven analytics-based model may assist transplant practitioners and candidates during the complex decision of whether to accept or forgo a current kidney offer in anticipation of a future high-quality offer. The latter holds promise to facilitate timely transplantation and optimize the efficiency of allocation.
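
The out-of-sample area under the ROC curve reported above can be computed directly from model scores and observed outcomes via the Mann-Whitney rank-sum identity. The sketch below is generic (the outcome and score values are made up for illustration; this is not the authors' model or data).

```python
import numpy as np

def roc_auc(y_true, scores):
    """Area under the ROC curve via the Mann-Whitney rank-sum identity;
    tied scores receive their average rank."""
    y_true = np.asarray(y_true)
    scores = np.asarray(scores, dtype=float)
    order = scores.argsort()
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    for v in np.unique(scores):          # average the ranks of tied scores
        tied = scores == v
        ranks[tied] = ranks[tied].mean()
    n_pos = int(y_true.sum())
    n_neg = len(y_true) - n_pos
    return (ranks[y_true == 1].sum() - n_pos * (n_pos + 1) / 2) / (n_pos * n_neg)

# Hypothetical data: 1 = candidate received an offer in the timeframe,
# scores = predicted offer probabilities from some model.
outcomes = [0, 0, 1, 0, 1, 1]
scores = [0.1, 0.3, 0.35, 0.4, 0.7, 0.9]
auc = roc_auc(outcomes, scores)
```

An AUC near 0.87, as in the abstract, means a randomly chosen candidate who received an offer is scored higher than a randomly chosen candidate who did not about 87% of the time.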

  4. Implementing analytics a blueprint for design, development, and adoption

    CERN Document Server

    Sheikh, Nauman

    2013-01-01

    Implementing Analytics demystifies the concept, technology and application of analytics and breaks its implementation down into repeatable and manageable steps, making widespread adoption possible across all functions of an organization. Implementing Analytics simplifies and helps democratize a very specialized discipline to foster business efficiency and innovation without investing in multi-million dollar technology and manpower. It offers a technology-agnostic methodology that breaks down complex tasks like model design and tuning and emphasizes business decisions rather than the technology behind them.

  5. Development and first application of an operating events ranking tool

    International Nuclear Information System (INIS)

    Šimić, Zdenko; Zerger, Benoit; Banov, Reni

    2015-01-01

    Highlights: • A method using the analytical hierarchy process for ranking operating events is developed and tested. • The method is applied to 5 years of U.S. NRC Licensee Event Reports (1453 events). • Uncertainty and sensitivity of the ranking results are evaluated. • Assessment of real events shows the potential of the method for operating experience feedback. - Abstract: Operating experience feedback is important for maintaining and improving safety and availability in nuclear power plants. Detailed investigation of all events is challenging since it requires excessive resources, especially in the case of large event databases. This paper presents an event-group ranking method to complement the analysis of individual operating events. The basis for the method is the use of an internationally accepted event characterization scheme that allows different ways of grouping and ranking events. The ranking method itself consists of implementing the analytical hierarchy process (AHP) by means of a custom-developed tool which allows events to be ranked based on ranking indexes pre-determined by expert judgment. Following the development phase, the tool was applied to analyze a complete set of 5 years of real nuclear power plant operating events (1453 events). The paper presents the potential of this ranking method to identify possible patterns throughout the event database and therefore to give additional insights into the events, as well as quantitative input for the prioritization of further, more detailed investigation of selected event groups.

  6. Chemometrics in analytical chemistry-part I: history, experimental design and data analysis tools.

    Science.gov (United States)

    Brereton, Richard G; Jansen, Jeroen; Lopes, João; Marini, Federico; Pomerantsev, Alexey; Rodionova, Oxana; Roger, Jean Michel; Walczak, Beata; Tauler, Romà

    2017-10-01

    Chemometrics has achieved major recognition and progress in the analytical chemistry field. In the first part of this tutorial, major achievements and contributions of chemometrics to some of the more important stages of the analytical process, like experimental design, sampling, and data analysis (including data pretreatment and fusion), are summarised. The tutorial is intended to give a general updated overview of the chemometrics field to further contribute to its dissemination and promotion in analytical chemistry.
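
As a toy instance of the experimental-design stage this tutorial covers, a two-level full factorial design matrix can be generated in a few lines. The factor names are hypothetical and the code is a generic illustration, not taken from the tutorial.

```python
import itertools
import numpy as np

# Two-level (-1 / +1) full factorial design for three hypothetical
# factors in an analytical method: temperature, pH, solvent fraction.
factors = ["temperature", "pH", "solvent_fraction"]
design = np.array(list(itertools.product([-1, 1], repeat=len(factors))))

# Each row is one experimental run; 2^3 = 8 runs cover every
# combination of factor levels.
```

A full factorial design at two levels is balanced (each column sums to zero) and orthogonal (the columns are mutually uncorrelated), which is what makes the estimated factor effects independent of one another.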

  7. Development of analytical techniques for water and environmental samples (2)

    Energy Technology Data Exchange (ETDEWEB)

    Eum, Chul Hun; Jeon, Chi Wan; Jung, Kang Sup; Song, Kyung Sun; Kim, Sang Yeon [Korea Institute of Geology Mining and Materials, Taejon (Korea)

    1998-12-01

    The purpose of this study is to develop new analytical methods with good detection limits for toxic inorganic and organic compounds. The analyses of CN, organic acids, and particulate materials in environmental samples have been carried out using several methods, such as ion chromatography, SPE, SPME, GC/MS, GC/FID, and SPLITT (split-flow thin cell fractionation), during the second year of this project. The advantages and disadvantages of several distillation methods (KS, JIS, EPA) for CN analysis in wastewater were investigated. As a result, we proposed a new distillation apparatus for CN analysis, which proved to be simpler and faster and to give better recovery than the conventional apparatus. An ion chromatography/pulsed amperometric detection (IC/PAD) system, instead of colorimetry, was set up for CN detection to overcome matrix interference. SPE (solid phase extraction) and SPME (solid phase microextraction), as liquid-solid extraction techniques, were applied to the analysis of phenols in wastewater. Optimum experimental conditions and factors influencing the analytical results were determined. From these results, it could be concluded that C{sub 18} cartridges and polystyrene-divinylbenzene disks in the SPE method, and polyacrylate fibers in SPME, were suitable solid phase adsorbents for phenol. Optimum conditions to analyze phenol derivatives simultaneously were established. In addition, Continuous SPLITT (split-flow thin cell) Fractionation (CSF) is a new preparative separation technique that is useful for fractionation of particulate and macromolecular materials. CSF is carried out in a thin ribbon-like channel equipped with two splitters at both the inlet and outlet of the channel. In this work, we set up a new CSF system and tested it using polystyrene latex standard particles. We then fractionated particles contained in air and underground water based on their sedimentation coefficients using CSF. (author). 27 refs., 13 tabs., 31 figs.

  8. Development of analytical procedures for coprocessing. Final technical report

    Energy Technology Data Exchange (ETDEWEB)

    Anderson, R.P.; Green, J.B.; Vogh, J.W.

    1991-07-01

    One phase of improving understanding of the fundamental chemistry of coprocessing involves development of the ability to distinguish between products originating from coal versus those originating from petroleum resid. A primary objective of this project was to develop analytical techniques to determine the source (coal versus resid) of the various compound types found in coprocessing products. A corollary objective was to develop an expanded knowledge of the detailed composition of coprocessing products. Two approaches were evaluated for distinguishing between products originating from coal and those originating from petroleum resid. One was based on the use of carbon isotope ratios and the other was based on variations in compound classes in response to changes in the ratio of coal to resid in the coprocessing feed. Other researchers using carbon isotope ratios to determine the origin of products have typically examined distillation fractions. This project involved determination of the origin of chemical classes (e.g., saturates, neutral aromatics, phenols, indoles, etc.) rather than distillate classes. Maya resid and Illinois No. 6 coal (with coal feed varying from 2 to 40 percent) were coprocessed in a batch autoclave to obtain products for detailed analysis.

  9. Innovative analytical tools to characterize prebiotic carbohydrates of functional food interest.

    Science.gov (United States)

    Corradini, Claudio; Lantano, Claudia; Cavazza, Antonella

    2013-05-01

    Functional foods are one of the most interesting areas of research and innovation in the food industry. A functional food or functional ingredient is considered to be any food or food component that provides health benefits beyond basic nutrition. Recently, consumers have shown interest in natural bioactive compounds as functional ingredients in the diet owing to their various beneficial effects for health. Water-soluble fibers and nondigestible oligosaccharides and polysaccharides can be defined as functional food ingredients. Fructooligosaccharides (FOS) and inulin are resistant to direct metabolism by the host and reach the caecocolon, where they are used by selected groups of beneficial bacteria. Furthermore, they are able to improve physical and structural properties of food, such as hydration, oil-holding capacity, viscosity, texture, sensory characteristics, and shelf-life. This article reviews major innovative analytical developments to screen and identify FOS, inulins, and the most employed nonstarch carbohydrates added or naturally present in functional food formulations. High-performance anion-exchange chromatography with pulsed electrochemical detection (HPAEC-PED) is one of the most employed analytical techniques for the characterization of those molecules. Mass spectrometry is also of great help, in particular matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS), which is able to provide extensive information regarding the molecular weight and length profiles of oligosaccharides and polysaccharides. Moreover, MALDI-TOF-MS in combination with HPAEC-PED has been shown to be of great value for the complementary information it can provide. Some other techniques, such as NMR spectroscopy, are also discussed, with relevant examples of recent applications. A number of articles have appeared in the literature in recent years regarding the analysis of inulin, FOS, and other carbohydrates of interest in the field and

  10. Evolutionary developments in x ray and electron energy loss microanalysis instrumentation for the analytical electron microscope

    Science.gov (United States)

    Zaluzec, Nester J.

    Developments in instrumentation for both X-ray Energy Dispersive and Electron Energy Loss Spectroscopy (XEDS/EELS) over the last ten years have given the experimentalist a greatly enhanced set of analytical tools for characterization. Microanalysts have waited for nearly two decades now in the hope of getting a true analytical microscope, and the development of 300 to 400 kV instruments should have allowed us to attain this goal. Unfortunately, this has not generally been the case. While there have been some major improvements in the techniques, there has also been some devolution in the modern AEM (Analytical Electron Microscope). In XEDS, the majority of today's instruments are still plagued by the hole count effect, which was first described in detail over fifteen years ago. The magnitude of this problem can still reach the 20 percent level for medium atomic number species in a conventional off-the-shelf intermediate voltage AEM. This is an absurd situation, and the manufacturers should be severely criticized. Part of the blame, however, also rests on the AEM community for not having come up with a universally agreed upon standard test procedure. Fortunately, such a test procedure is in the early stages of refinement. The proposed test specimen consists of an evaporated Cr film approx. 500 to 1000 Å thick supported upon a 3 mm diameter molybdenum 200 micron aperture.

  11. Using fuzzy analytical hierarchy process (AHP) to evaluate web development platform

    Directory of Open Access Journals (Sweden)

    Ahmad Sarfaraz

    2012-01-01

    Full Text Available Web development plays an important role in business plans and people's lives. One of the key decisions on which both the short-term and long-term success of a project depends is choosing the right development platform. Its criticality can be judged by the fact that once a platform is chosen, one has to live with it throughout the software development life cycle. The entire shape of the project depends on the language, operating system, tools, frameworks, etc. — in short, on the web development platform chosen. In addition, choosing the right platform is a multi-criteria decision making (MCDM) problem. We propose a fuzzy analytical hierarchy process model to solve the MCDM problem. We try to tap the real-life modeling potential of fuzzy logic and combine it with the widely used and powerful AHP modeling method.
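The abstract does not spell out the computation, but the crisp AHP core that the fuzzy variant extends can be sketched as follows. The geometric-mean (row) method is one common way to approximate the priority vector; the three platforms and their pairwise judgments below are purely hypothetical.

```python
import math

def ahp_weights(matrix):
    """Approximate the AHP priority vector via the geometric-mean (row) method."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical pairwise judgments for three candidate platforms (Saaty 1-9 scale);
# matrix[i][j] expresses how strongly platform i is preferred over platform j.
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(pairwise)  # normalized priorities, summing to 1
```

In a fuzzy AHP, each crisp judgment would be replaced by a triangular fuzzy number and defuzzified before (or after) this aggregation step.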

  12. The Broad Application of Data Science and Analytics: Essential Tools for the Liberal Arts Graduate

    Science.gov (United States)

    Cárdenas-Navia, Isabel; Fitzgerald, Brian K.

    2015-01-01

    New technologies and data science are transforming a wide range of organizations into analytics-intensive enterprises. Despite the resulting demand for graduates with experience in the application of analytics, undergraduate education has been slow to change. The academic and policy communities have engaged in a decade-long conversation…

  13. Development of the SOFIA Image Processing Tool

    Science.gov (United States)

    Adams, Alexander N.

    2011-01-01

    The Stratospheric Observatory for Infrared Astronomy (SOFIA) is a Boeing 747SP carrying a 2.5 meter infrared telescope capable of operating at altitudes of between twelve and fourteen kilometers, which is above more than 99 percent of the water vapor in the atmosphere. The ability to make observations above most water vapor, coupled with the ability to make observations from anywhere, anytime, makes SOFIA one of the world's premier infrared observatories. SOFIA uses three visible light CCD imagers to assist in pointing the telescope. The data from these imagers are stored in archive files, as is housekeeping data, which contains information such as boresight and area of interest locations. A tool that could both extract and process data from the archive files was developed.

  14. Evaluating the Development of Science Research Skills in Work-Integrated Learning through the Use of Workplace Science Tools

    Science.gov (United States)

    McCurdy, Susan M.; Zegwaard, Karsten E.; Dalgety, Jacinta

    2013-01-01

    Concept understanding, the development of analytical skills, and a research mindset are explored through the use of academic tools common in tertiary science education and relevant work-integrated learning (WIL) experiences. The use and development of three such tools (the laboratory book, the technical report, and the literature review) are examined by way of…

  15. Developing a business analytics methodology: a case study in the foodbank sector

    OpenAIRE

    Hindle, Giles; Vidgen, Richard

    2017-01-01

    The current research seeks to address the following question: how can organizations align their business analytics development projects with their business goals? To pursue this research agenda we adopt an action research framework to develop and apply a business analytics methodology (BAM). The four-stage BAM (problem situation structuring, business model mapping, analytics leverage analysis, and analytics implementation) is not a prescription. Rather, it provides a logical structure and log...

  16. Tools for the quantitative analysis of sedimentation boundaries detected by fluorescence optical analytical ultracentrifugation.

    Directory of Open Access Journals (Sweden)

    Huaying Zhao

    Full Text Available Fluorescence optical detection in sedimentation velocity analytical ultracentrifugation allows the study of macromolecules at nanomolar concentrations and below. This has significant promise, for example, for the study of systems of high-affinity protein interactions. Here we describe adaptations of the direct boundary modeling analysis approach implemented in the software SEDFIT that were developed to accommodate unique characteristics of the confocal fluorescence detection system. These include spatial gradients of signal intensity due to scanner movements out of the plane of rotation, temporal intensity drifts due to instability of the laser and fluorophores, and masking of the finite excitation and detection cone by the sample holder. In an extensive series of experiments with enhanced green fluorescent protein ranging from low nanomolar to low micromolar concentrations, we show that the experimental data provide sufficient information to determine the parameters required for first-order approximation of the impact of these effects on the recorded data. Systematic deviations of fluorescence optical sedimentation velocity data analyzed using conventional sedimentation models developed for absorbance and interference optics are largely removed after these adaptations, resulting in excellent fits that highlight the high precision of fluorescence sedimentation velocity data, thus allowing a more detailed quantitative interpretation of the signal boundaries that is otherwise not possible for this system.

  17. Technology development for high temperature logging tools

    Energy Technology Data Exchange (ETDEWEB)

    Veneruso, A.F.; Coquat, J.A.

    1979-01-01

    A set of prototype, high temperature logging tools (temperature, pressure and flow) were tested successfully to temperatures up to 275°C in a Union geothermal well during November 1978 as part of the Geothermal Logging Instrumentation Development Program. This program is being conducted by Sandia Laboratories for the Department of Energy's Division of Geothermal Energy. The progress and plans of this industry based program to develop and apply the high temperature instrumentation technology needed to make reliable geothermal borehole measurements are described. Specifically, this program is upgrading existing sondes for improved high temperature performance, as well as applying new materials (elastomers, polymers, metals and ceramics) and developing component technology such as high temperature cables, cableheads and electronics to make borehole measurements such as formation temperature, flow rate, high resolution pressure and fracture mapping. In order to satisfy critical existing needs, the near term goal is for operation up to 275°C and 7000 psi by the end of FY80. The long term goal is for operation up to 350°C and 20,000 psi by the end of FY84.

  18. Prioritizing of effective factors on development of medicinal plants cultivation using analytic network process

    Directory of Open Access Journals (Sweden)

    Ghorbanali Rassam

    2014-07-01

    Full Text Available For the overall development of medicinal plant cultivation in Iran, there is a need to identify the various factors affecting it, and a proper method for identifying the most influential of these factors is essential. This research was conducted to prioritize the criteria affecting the development of medicinal plant cultivation in North Khorasan province, Iran, using the Analytic Network Process (ANP). Multi-criteria decision making (MCDM) is suggested as a viable approach to factor selection, and the ANP was used as the MCDM tool. For this purpose, a list of factors was offered to an expert group. Pairwise comparison questionnaires were then distributed among relevant researchers and local producer experts of the province to obtain their opinions on the priority of the criteria and sub-criteria. The questionnaires were analyzed using the Super Decisions software. We illustrate the use of the ANP by ranking the main factors affecting the development of medicinal plants, such as economic factors, educational-extension services, cultural-social factors, and supportive policies. The main objective of the present study was to develop the ANP as a decision-making tool for prioritizing factors affecting the development of medicinal plant cultivation. Results showed that the ANP methodology was well suited to tackling the complex interrelations involved in factor selection in this case. The results also revealed that, among the factors, supporting the cultivation of medicinal plants, building the infrastructure for marketing support, having educated farmers, and easy access to production inputs have the greatest impact on the development of medicinal plant cultivation.
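The distinctive step of the ANP over the AHP is the limit supermatrix: a column-stochastic matrix encoding interdependence among factors is raised to successive powers until it converges, and any column of the limit gives the global priorities. A minimal pure-Python sketch, with an entirely hypothetical 3x3 weighted supermatrix:

```python
def limit_supermatrix(W, tol=1e-10, max_iter=50):
    """Repeatedly square a column-stochastic supermatrix until it converges;
    each column of the limit matrix gives the global ANP priorities."""
    n = len(W)
    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]
    M = W
    for _ in range(max_iter):
        M2 = matmul(M, M)
        if all(abs(M2[i][j] - M[i][j]) < tol for i in range(n) for j in range(n)):
            return M2
        M = M2
    return M

# Hypothetical weighted supermatrix (columns sum to 1) coupling three factor
# clusters, e.g. economic, educational-extension, and supportive-policy factors.
W = [[0.2, 0.5, 0.3],
     [0.5, 0.3, 0.4],
     [0.3, 0.2, 0.3]]
M = limit_supermatrix(W)
priorities = [row[0] for row in M]  # any column of the limit matrix
```

Because every entry of this toy matrix is positive, repeated squaring converges to a rank-one matrix with identical columns, which is what makes reading priorities off a single column valid.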

  19. The Development of An Analytical Overlay Design Procedure

    Directory of Open Access Journals (Sweden)

    Djunaedi Kosasih

    2008-01-01

    Full Text Available Pavement structural evaluation using pavement modulus values resulting from a back-calculation process on non-destructive deflection data has been adopted to quantify objectively the condition of existing pavements under various traffic loading and environmental conditions. However, such an advanced technique is not yet widely matched by advances in analytical overlay design procedures. One possible reason is perhaps its requirement to perform complex computations. A new module of the computer program BackCalc has been developed for that task, based on the allowable maximum deflection criterion specified by the Asphalt Institute’83. The rationale is that an adequate overlay thickness is computed by iteration until the theoretical maximum deflection closely matches the specified allowable maximum deflection. This paper outlines the major components of the program module, illustrated with a practical example. The overlay thickness obtained was found to be comparable with that of the known AASHTO’93 method.
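The iteration described above — adjust overlay thickness until the theoretical maximum deflection meets the allowable maximum deflection — can be sketched generically. The deflection model below is a made-up illustrative stand-in, not BackCalc's actual mechanistic model:

```python
import math

def required_overlay(allowable_defl, defl_model, t_lo=0.0, t_hi=500.0, tol=0.01):
    """Bisect on overlay thickness (mm) until the modeled maximum deflection
    just satisfies the allowable maximum deflection criterion."""
    while t_hi - t_lo > tol:
        t_mid = 0.5 * (t_lo + t_hi)
        if defl_model(t_mid) > allowable_defl:
            t_lo = t_mid   # still deflecting too much: need a thicker overlay
        else:
            t_hi = t_mid
    return t_hi

# Hypothetical deflection model for illustration only: maximum deflection (mm)
# decaying exponentially with overlay thickness t (mm).
model = lambda t: 1.2 * math.exp(-t / 150.0)
t_req = required_overlay(0.6, model)  # thinnest thickness meeting the criterion
```

Any monotonically decreasing deflection-versus-thickness model can be plugged in for `defl_model`; the bisection itself is independent of the pavement mechanics.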

  20. Near-infrared reflectance spectroscopy as a process analytical technology tool in Ginkgo biloba extract qualification.

    Science.gov (United States)

    Rosa, Sílvia S; Barata, Pedro A; Martins, José M; Menezes, José C

    2008-06-09

    Here, we describe the use of near-infrared diffuse reflectance spectroscopy for qualification of Ginkgo biloba extract as a raw material for use in pharmaceutical products. G. biloba extract shows unpredicted and uncontrolled variability in some of its quality specifications, intrinsic to its natural origin, which influences the manufacturing process of solid dosage forms (viz. granulation and compression). Some of these properties could not be determined by conventional quality control tests, so we investigated the use of NIR to qualify batches of Ginkgo extract according to their different features and to establish a relationship with the behaviour of some manufacturing steps based on that qualification. Several approaches were evaluated, and the NIR method developed was shown to be sensitive to changes in important quality specifications and therefore adequate to qualify incoming batches of G. biloba extract. This can be considered a process analytical technology (PAT) application since it: (1) establishes the source of variability in a qualitative way, (2) explains its propagation to the final product quality attributes and (3) lays the basis for a control strategy to be applied in the manufacturing process.

  1. SlicerAstro: A 3-D interactive visual analytics tool for HI data

    Science.gov (United States)

    Punzo, D.; van der Hulst, J. M.; Roerdink, J. B. T. M.; Fillion-Robin, J. C.; Yu, L.

    2017-04-01

    SKA precursors are capable of detecting hundreds of galaxies in HI in a single 12 h pointing. In deeper surveys one will more easily probe faint HI structures, typically located in the vicinity of galaxies, such as tails, filaments, and extraplanar gas. The importance of interactive visualization in data exploration has been demonstrated by the wide use of tools (e.g. Karma, Casaviewer, VISIONS) that help users to receive immediate feedback when manipulating the data. We have developed SlicerAstro, a 3-D interactive viewer with new analysis capabilities, based on traditional 2-D input/output hardware. These capabilities enhance the data inspection, allowing faster analysis of complex sources than with traditional tools. SlicerAstro is an open-source extension of 3DSlicer, a multi-platform open source software package for visualization and medical image processing. We demonstrate the capabilities of the current stable binary release of SlicerAstro, which offers the following features: (i) handling of FITS files and astronomical coordinate systems; (ii) coupled 2-D/3-D visualization; (iii) interactive filtering; (iv) interactive 3-D masking; (v) and interactive 3-D modeling. In addition, SlicerAstro has been designed with a strong, stable and modular C++ core, and its classes are also accessible via Python scripting, allowing great flexibility for user-customized visualization and analysis tasks.

  2. Analytical Models Development of Compact Monopole Vortex Flows

    Directory of Open Access Journals (Sweden)

    Pavlo V. Lukianov

    2017-09-01

    Conclusions. The article contains a series of the latest analytical models describing both the laminar and turbulent dynamics of monopole vortex flows, which have not been reflected in traditional publications to date. Further research should be directed toward analytical models for coherent vortical structures in flows of viscous fluids, particularly near curved surfaces, where the “wall law” known in hydromechanics is violated and heat and mass transfer anomalies take place.

  3. Development of analytical methods for iodine speciation in fresh water

    International Nuclear Information System (INIS)

    Takaku, Yuichi; Ohtsuka, Yoshihito; Hisamatsu, Shun'ichi

    2007-01-01

    Analytical methods for physicochemical speciation of iodine in fresh water samples were developed to elucidate its behavior in the environment. The methods combined inductively coupled plasma mass spectrometry (ICP-MS) with size exclusion high performance liquid chromatography (SEC) or capillary electrophoresis (CE). Freshwater samples were collected from Lake Towada and rivers surrounding the lake. After filtration through a 0.45 μm pore size membrane filter, iodine in the water samples was pre-concentrated with an ultra-filtration filter which had a cut-off size of 10 kDa. The fraction with molecular size over 10 kDa was concentrated 100-fold relative to the original water and then introduced into the SEC-ICP-MS. Molecular size chromatograms of all river and lake water samples showed two peaks for iodine concentrations: 40 kDa and 20 kDa. A method for separately determining the two valence states of inorganic iodine, I⁻ and IO₃⁻, was also developed using the CE-ICP-MS system, and it was successfully applied to the fresh water samples. Analysis of surface water samples from Lake Towada and the rivers surrounding the lake indicated that the chemical form of inorganic iodine in all samples was IO₃⁻. Additional lake water samples were collected from Lake O-ike-higashi in the Juni-ko area at Shirakami-Sanchi, a UNESCO natural world heritage site. The lake has a strong thermocline during all seasons; its bottom layer is in a highly reductive state. Depth profiles of I⁻ and IO₃⁻ clearly showed that I⁻ was not detected in the surface layer but was predominant in the bottom layer, and vice versa for IO₃⁻. As this separation method is rapid and sensitive, it will be widely used in the future. (author)

  4. Development of a Framework for Sustainable Outsourcing: Analytic Balanced Scorecard Method (A-BSC)

    Directory of Open Access Journals (Sweden)

    Fabio De Felice

    2015-06-01

    Full Text Available Nowadays, many enterprises choose to outsource their non-core business to other enterprises to reduce costs and increase efficiency. In particular, many enterprises choose to outsource their supply chain management (SCM), leaving it to a third-party organization in order to improve their services. The paper proposes an integrated, multicriteria tool useful for monitoring and improving performance in an outsourced supply chain. The Analytic Balanced Scorecard method (A-BSC) is proposed as an effective method for analyzing strategic performance within an outsourced supply chain. The aim of the paper is to present the integration of two methodologies: the Balanced Scorecard, a multiple-perspective framework for performance assessment, and the Analytic Hierarchy Process, a decision-making tool used to prioritize multiple performance perspectives and to generate a unified metric. The framework is intended to provide a performance analysis to achieve better sustainability performance of the supply chain. A real case study concerning a typical value chain is presented.

  5. Thermodynamics and structure of liquid surfaces investigated directly with surface analytical tools

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Gunther [Flinders Univ., Adelaide, SA (Australia). Centre for NanoScale Science and Technology; Morgner, Harald [Leipzig Univ. (Germany). Wilhelm Ostwald Inst. for Physical and Theoretical Chemistry

    2017-06-15

    Measuring directly the composition, the distribution of constituents as a function of depth, and the orientation of molecules at liquid surfaces is essential for determining the physicochemical properties of liquid surfaces. While the experimental tools that have been developed for analyzing solid surfaces can in principle be applied to liquid surfaces, it turned out that they had to be adjusted to the particular challenges imposed by liquid samples, e.g. the unavoidable vapor pressure and the mobility of the constituting atoms/molecules. In the present work it is shown how electron spectroscopy and ion scattering spectroscopy have been used for analyzing liquid surfaces. The emphasis of this review is on using the structural information gained to determine the physicochemical properties of liquid surfaces. (copyright 2017 by WILEY-VCH Verlag GmbH and Co. KGaA, Weinheim)

  6. Development of robot arm for automatic analytical operation in nuclear reprocessing plant

    Energy Technology Data Exchange (ETDEWEB)

    Shibayama, S.; Ohnishi, K. [Mitsubishi Heavy Industries Ltd., Kobe (Japan); Hayashibara, H. [Mitsubishi Heavy Industries Ltd., Takasago Research and Development Center, Takasago-shi, Hyogo-Ken (Japan)

    1998-07-01

    Analytical work in a nuclear reprocessing plant plays a very important role in operating the plant normally and safely. A new compact robot arm has been developed for the automatic analytical system installed in an analytical box with heavy shielding, and its suitability for this system was confirmed by the results of several validation tests. (author)

  7. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    Science.gov (United States)

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  8. BUDGETARY SAFETY PASSPORT AS A TOOL FOR IMPROVING AN ANALYTICAL COMPONENT OF ENSURING COUNTRY’S BUDGETARY SAFETY

    Directory of Open Access Journals (Sweden)

    Oksana Bolduieva

    2017-11-01

    Full Text Available The purpose of the paper is to explore the feasibility of using the budgetary safety passport as a tool for improving the analytical component of ensuring the country’s budgetary safety and fiscal planning. Methodology. The research uses general scientific methods of studying economic facts and processes in their continuous development and interrelation: logical analysis, scientific abstraction, induction, deduction, optimization, grouping, economic modelling, and comparison, as well as tabular methods. Results. The work defines the Budgetary Safety Passport as a document that contains information on the quality of the budget process and an integrated assessment of the country’s budget potential with regard to internal and external threats. The Passport is structured into 15 sections, which are combined into five groups according to their contents: revenue potential of the state budget; core indicators of the state budget; monitoring of state budget performance; forecasting of revenue from national taxes and duties; and monitoring of threats to the budgetary safety of the country, together with assessment of the efficiency of threat prevention and neutralization programs. Practical implications. The article systematizes the main results aimed at the practical application of the suggested Budgetary Safety Passport. Value/originality. It is concluded that the Budgetary Safety Passport is an illustrative, systemic, and fundamentally new tool for comprehensive evaluation of the state and prospects of the country’s budgetary system, which makes it possible to raise the responsibility of government and administrative bodies for the budgetary safety of Ukraine and the efficiency of budget performance, and to improve the quality and accessibility of public information.

  9. Bio-electrosprays: from bio-analytics to a generic tool for the health sciences.

    Science.gov (United States)

    Jayasinghe, Suwan N

    2011-03-07

    Electrosprays or electrospraying is a process by which an aerosol is generated between two charged electrodes. This aerosol generation methodology has been known for well over a century and has undergone exploration from aerosol and materials science to many other areas of research and development. In one such exploration, electrosprays were partnered with mass spectrometry for the accurate characterisation of molecules. This technology, now widely referred to as electrospray ionisation mass spectrometry (ESI-MS), significantly contributes to molecular analysis and cancer biology, to name a few. In fact, these findings were recognised by the Chemistry Nobel Committee in 2002 and have catapulted electrosprays into many areas of research and development. In this review, the author wishes to introduce and discuss another such recent discovery, where electrosprays have been investigated for directly handling living cells and whole organisms. Over the past few years these electrosprays, now referred to as "bio-electrosprays", have undergone rigorous developmental studies, both in terms of understanding all the associated physical, chemical and biological sciences and in completely assessing their effects, if any, on the direct handling of living biological materials. Therefore, the review will bring together all the work that has contributed to fully understanding that bio-electrosprays are an inert technology for directly handling living biological materials, while elucidating some unique features they possess over competing technologies. This demonstrates the approach as a flexible methodology for a wide range of applications, spanning bio-analytics and diagnostics to the possible creation of synthetic tissues for repairing and replacing damaged/ageing tissues, and the targeted and controlled delivery of personalised medicine through experimental and/or medical cells and/or genes, elucidating the far-reaching ramifications bio-electrosprays have for our health sciences.

  10. European Institutional and Organisational Tools for Maritime Human Resources Development

    OpenAIRE

    Dragomir Cristina

    2012-01-01

    Seafarers need to continuously develop their careers at all stages of their professional life. This paper presents some tools for institutional and organisational career development. At the institutional level, vocational education and training tools provided by European Union institutions are presented, while at the organisational level some tools used by private crewing companies for maritime human resources assessment and development are exemplified.

  11. Big data analytics from strategic planning to enterprise integration with tools, techniques, NoSQL, and graph

    CERN Document Server

    Loshin, David

    2013-01-01

    Big Data Analytics will assist managers in providing an overview of the drivers for introducing big data technology into the organization and for understanding the types of business problems best suited to big data analytics solutions, understanding the value drivers and benefits, strategic planning, developing a pilot, and eventually planning to integrate back into production within the enterprise. Guides the reader in assessing the opportunities and value proposition; overview of big data hardware and software architectures; presents a variety of te

  12. Development of a brachytherapy audit checklist tool.

    Science.gov (United States)

    Prisciandaro, Joann; Hadley, Scott; Jolly, Shruti; Lee, Choonik; Roberson, Peter; Roberts, Donald; Ritter, Timothy

    2015-01-01

    To develop a brachytherapy audit checklist that could be used to prepare for Nuclear Regulatory Commission or agreement state inspections, to aid in readiness for a practice accreditation visit, or to be used as an annual internal audit tool. Six board-certified medical physicists and one radiation oncologist conducted a thorough review of brachytherapy-related literature and practice guidelines published by professional organizations and federal regulations. The team members worked at two facilities that are part of a large, academic health care center. Checklist items were given a score based on their judged importance. Four clinical sites performed an audit of their program using the checklist. The sites were asked to score each item based on a defined severity scale for their noncompliance, and final audit scores were tallied by summing the products of importance score and severity score for each item. The final audit checklist, which is available online, contains 83 items. The audit scores from the beta sites ranged from 17 to 71 (out of 690) and identified a total of 7-16 noncompliance items. The total time to conduct the audit ranged from 1.5 to 5 hours. A comprehensive audit checklist was developed which can be implemented by any facility that wishes to perform a program audit in support of their own brachytherapy program. The checklist is designed to allow users to identify areas of noncompliance and to prioritize how these items are addressed to minimize deviations from nationally-recognized standards. Copyright © 2015 American Brachytherapy Society. All rights reserved.
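The scoring rule described above (final audit score = sum of each item's importance score times its observed severity score) is simple enough to sketch directly; the checklist fragment and its numbers below are hypothetical, not drawn from the published 83-item checklist:

```python
def audit_score(items):
    """Total audit score: sum of (importance x severity) over checklist items,
    mirroring the scoring scheme described for the brachytherapy checklist."""
    return sum(importance * severity for importance, severity in items)

# Hypothetical checklist fragment as (importance, severity-of-noncompliance);
# severity 0 means the item is fully compliant and contributes nothing.
items = [(10, 0), (10, 2), (5, 1), (3, 0), (8, 3)]
score = audit_score(items)  # → 49
```

Because compliant items contribute zero, the total directly ranks programs by the weighted severity of their noncompliance, which is what makes the score usable for prioritizing corrective actions.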

  13. Development of balanced key performance indicators for emergency departments strategic dashboards following analytic hierarchical process.

    Science.gov (United States)

    Safdari, Reza; Ghazisaeedi, Marjan; Mirzaee, Mahboobeh; Farzi, Jebrail; Goodini, Azadeh

    2014-01-01

    Dynamic reporting tools, such as dashboards, should be developed to measure emergency department (ED) performance. However, choosing an effective, balanced set of performance measures and key performance indicators (KPIs) is a main challenge in accomplishing this. The aim of this study was to develop a balanced set of KPIs for use in ED strategic dashboards following an analytic hierarchical process. The study was carried out in 2 phases: constructing ED performance measures based on balanced scorecard perspectives, and incorporating them into an analytic hierarchical process framework to select the final KPIs. The respondents placed most importance on the ED internal processes perspective, especially on measures related to timeliness and accessibility of care in the ED. Some measures from the financial, customer, and learning and growth perspectives were also selected as top KPIs. Measures of care effectiveness and care safety were placed as the next priorities. The respondents placed least importance on disease-/condition-specific "time to" measures. The methodology can be presented as a reference model for the development of KPIs in various performance-related areas based on a consistent and fair approach. Dashboards designed based on such a balanced set of KPIs will help to establish comprehensive performance measurements and fair benchmarks and comparisons.

  14. Monte Carlo simulation: tool for the calibration in analytical determination of radionuclides

    International Nuclear Information System (INIS)

    Gonzalez, Jorge A. Carrazana; Ferrera, Eduardo A. Capote; Gomez, Isis M. Fernandez; Castro, Gloria V. Rodriguez; Ricardo, Niury Martinez

    2013-01-01

    This work shows how the traceability of analytical determinations is established using this calibration method. It highlights the advantages offered by Monte Carlo simulation for applying corrections for differences in chemical composition, density, and height of the samples analyzed. Likewise, the results obtained by the LVRA in two exercises organized by the International Atomic Energy Agency (IAEA) are presented. In these exercises (an intercomparison and a proficiency test), all reported analytical results were obtained based on efficiency calibrations by Monte Carlo simulation using the DETEFF program
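The abstract does not give the simulation details, but the core idea behind Monte Carlo efficiency calibration can be illustrated with a toy geometric-efficiency estimate: sample isotropic emission directions from a point source on the axis of a circular detector face and count the fraction that intersect it. All geometry below is hypothetical and far simpler than what a code like DETEFF models:

```python
import math
import random

def mc_geometric_efficiency(n, radius, distance, seed=1):
    """Toy Monte Carlo estimate of the geometric detection efficiency of a
    circular detector face seen from an on-axis point source."""
    rng = random.Random(seed)
    # Directions with cos(theta) above this cutoff hit the detector face.
    cos_cut = distance / math.hypot(radius, distance)
    hits = 0
    for _ in range(n):
        cos_theta = rng.uniform(-1.0, 1.0)  # isotropic emission over 4*pi
        if cos_theta > cos_cut:
            hits += 1
    return hits / n

eff = mc_geometric_efficiency(200_000, radius=3.0, distance=5.0)
# Closed-form solid-angle result for the same geometry, for comparison.
analytic = 0.5 * (1.0 - 5.0 / math.hypot(3.0, 5.0))
```

A full efficiency calibration would additionally transport each photon through the sample matrix and detector (attenuation, density, sample height), which is exactly where the composition and geometry corrections mentioned above enter.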

  15. Recent developments in analytical techniques for characterization of ...

    Indian Academy of Sciences (India)

    With the continual decrease of geometries used in modern IC devices, trace metal impurities in the process materials and chemicals used in their manufacture are moving to increasingly lower levels, i.e. ng/g and pg/g. An attempt is made to give a brief overview of the use of different analytical techniques in the analysis of ...

  16. Recent Developments and Applications of Analytic Number Theory ...

    African Journals Online (AJOL)

    Mathematics Subject Classification (1991): 11-02, 11N, 11T55 Keywords: analytic number theory, research exposition, multiplicative number theory, arithmetic theory of polynomial rings over finite fields, arithmetical semigroups, semisimple finite rings, Lie, symmetric Riemannian manifolds, finite topological spaces, finite ...

  17. 100-K Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-K Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  18. 100-F Target Analyte List Development for Soil

    Energy Technology Data Exchange (ETDEWEB)

    Ovink, R.

    2012-09-18

    This report documents the process used to identify source area target analytes in support of the 100-F Area remedial investigation/feasibility study (RI/FS) addendum to the Integrated 100 Area Remedial Investigation/Feasibility Study Work Plan (DOE/RL-2008-46, Rev. 0).

  19. Online Programs as Tools to Improve Parenting: A meta-analytic review

    NARCIS (Netherlands)

    Prof. Dr. Jo J.M.A Hermanns; Prof. Dr. Ruben R.G. Fukkink; dr. Christa C.C. Nieuwboer

    2013-01-01

    Background. A number of parenting programs, aimed at improving parenting competencies, have recently been adapted or designed with the use of online technologies. Although web-based services have been claimed to hold promise for parent support, a meta-analytic review of online parenting interventions

  20. Instruments of value: using the analytic tools of public value theory in teaching and practice

    NARCIS (Netherlands)

    de Jong, Jorrit; Douglas, S.C.; Sicilia, Mariafrancesca; Radnor, Zoe; Noordegraaf, M.; Debus, Peter

    2017-01-01

    The tools of public value management – such as the strategic triangle and the public value account – are increasingly used by scholars and practitioners alike. At the same time, some confusion remains regarding their functionality in action. Based on our experiences with these tools in classrooms

  1. The Efficacy of Violence Prediction: A Meta-Analytic Comparison of Nine Risk Assessment Tools

    Science.gov (United States)

    Yang, Min; Wong, Stephen C. P.; Coid, Jeremy

    2010-01-01

    Actuarial risk assessment tools are used extensively to predict future violence, but previous studies comparing their predictive accuracies have produced inconsistent findings as a result of various methodological issues. We conducted meta-analyses of the effect sizes of 9 commonly used risk assessment tools and their subscales to compare their…

  2. Evaluation of Heat Flux Measurement as a New Process Analytical Technology Monitoring Tool in Freeze Drying.

    Science.gov (United States)

    Vollrath, Ilona; Pauli, Victoria; Friess, Wolfgang; Freitag, Angelika; Hawe, Andrea; Winter, Gerhard

    2017-05-01

    This study investigates the suitability of heat flux measurement as a new technique for monitoring product temperature and critical end points during freeze drying. The heat flux sensor is tightly mounted on the shelf and measures non-invasively (no contact with the product) the heat transferred from shelf to vial. Heat flux data were compared to comparative pressure measurement, thermocouple readings, and Karl Fischer titration as current state of the art monitoring techniques. The whole freeze drying process including freezing (both by ramp freezing and controlled nucleation) and primary and secondary drying was considered. We found that direct measurement of the transferred heat enables more insights into thermodynamics of the freezing process. Furthermore, a vial heat transfer coefficient can be calculated from heat flux data, which ultimately provides a non-invasive method to monitor product temperature throughout primary drying. The end point of primary drying determined by heat flux measurements was in accordance with the one defined by thermocouples. During secondary drying, heat flux measurements could not indicate the progress of drying as monitoring the residual moisture content. In conclusion, heat flux measurements are a promising new non-invasive tool for lyophilization process monitoring and development using energy transfer as a control parameter. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  3. Evaluation And Selection Process of Suppliers Through Analytical Framework: An Emprical Evidence of Evaluation Tool

    Directory of Open Access Journals (Sweden)

    Imeri Shpend

    2015-09-01

    Full Text Available The supplier selection process is very important to companies, as selecting the right suppliers that fit a company's strategic needs brings substantial savings. This paper therefore addresses the key area of supplier evaluation from the supplier review perspective. The purpose was to identify the most important criteria for supplier evaluation and to develop an evaluation tool based on the surveyed criteria. The research was conducted through a structured questionnaire, and the sample focused on small to medium-sized enterprises (SMEs) in Greece. In total, eighty companies participated in the survey, answering the full questionnaire, which asked whether these companies use supplier evaluation criteria and, if so, which criteria are applied. The main statistical instrument used in the study is Principal Component Analysis (PCA). The research has shown that the main criteria are: the attitude of the vendor towards the customer, supplier delivery time, product quality, and price. Conclusions are drawn on the suitability and usefulness of supplier evaluation criteria and the way they are applied in enterprises.
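    The PCA step can be sketched as follows: standardize the ratings, eigendecompose the correlation matrix, and read off explained variance and loadings. The survey scores below are randomly generated stand-ins, not the study's data:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical survey: 80 companies rating 6 evaluation criteria on a 1-5 scale.
    scores = rng.integers(1, 6, size=(80, 6)).astype(float)

    # PCA via eigendecomposition of the correlation matrix of standardized ratings.
    Z = (scores - scores.mean(axis=0)) / scores.std(axis=0)
    corr = (Z.T @ Z) / (len(Z) - 1)
    eigvals, eigvecs = np.linalg.eigh(corr)

    # Sort components by decreasing explained variance.
    order = np.argsort(eigvals)[::-1]
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]

    explained = eigvals / eigvals.sum()        # variance share per component
    loadings = eigvecs * np.sqrt(eigvals)      # criterion loadings per component
    ```

    Criteria with high loadings on the leading components are the ones a study like this would retain as the dominant evaluation criteria.
    
    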

  4. Moving standard deviation and moving sum of outliers as quality tools for monitoring analytical precision.

    Science.gov (United States)

    Liu, Jiakai; Tan, Chin Hon; Badrick, Tony; Loh, Tze Ping

    2018-02-01

    An increase in analytical imprecision (expressed as CVa) can introduce additional variability (i.e. noise) into patient results, which poses a challenge to the optimal management of patients. Relatively little work has been done to address the need for continuous monitoring of analytical imprecision. Through numerical simulations, we describe the use of the moving standard deviation (movSD) and a recently described moving sum of outliers (movSO) of patient results as means for detecting increased analytical imprecision, and compare their performance against internal quality control (QC) and average of normals (AoN) approaches. The power to detect an increase in CVa is suboptimal under routine internal QC procedures. The AoN technique almost always had the highest average number of patient results affected before error detection (ANPed), indicating that it generally had the worst capability for detecting an increased CVa. On the other hand, the movSD and movSO approaches were able to detect an increased CVa at significantly lower ANPed, particularly for measurands that displayed a relatively small ratio of biological variation to CVa. CONCLUSION: The movSD and movSO approaches are effective in detecting an increase in CVa for high-risk measurands with small biological variation. Their performance is relatively poor when the biological variation is large. However, the clinical risks of an increase in analytical imprecision are attenuated for these measurands, as the increased analytical imprecision adds only marginally to the total variation and is less likely to impact clinical care. Copyright © 2017 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
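    The movSD idea can be sketched with simulated patient results; the window size, distribution parameters, and the doubling of imprecision below are arbitrary illustrative choices, not the paper's simulation design:

    ```python
    import random
    from collections import deque
    from statistics import stdev

    def moving_sd(results, window=200):
        """Yield the standard deviation of the most recent `window` patient
        results, once the window is full."""
        buf = deque(maxlen=window)
        for x in results:
            buf.append(x)
            if len(buf) == window:
                yield stdev(buf)

    random.seed(7)
    # Simulated patient results: in-control imprecision (SD 5), then a
    # doubled analytical imprecision (SD 10) around the same mean.
    baseline = [random.gauss(100, 5) for _ in range(2000)]
    degraded = [random.gauss(100, 10) for _ in range(2000)]

    sds = list(moving_sd(baseline + degraded, window=200))
    # In practice an alarm limit would be set from the in-control
    # distribution of movSD; values drifting above it flag increased CVa.
    ```

    Note that the moving SD here reflects total (biological plus analytical) variation, which is why the approach works best when biological variation is small relative to CVa, as the abstract concludes.
    
    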

  5. An analytical model for the study of a small LFR core dynamics: development and benchmark

    International Nuclear Information System (INIS)

    Bortot, S.; Cammi, A.; Lorenzi, S.; Moisseytsev, A.

    2011-01-01

    An analytical model for the study of the control-oriented dynamics of a small Lead-cooled Fast Reactor (LFR) has been developed, aimed at providing a useful, very flexible and straightforward, though accurate, tool allowing relatively quick transient design-basis and stability analyses. A simplified lumped-parameter approach has been adopted to couple neutronics and thermal-hydraulics: the point-kinetics approximation has been employed and an average-temperature heat-exchange model has been implemented. The reactor transient responses following postulated accident initiators such as Unprotected Control Rod Withdrawal (UTOP), Loss of Heat Sink (ULOHS) and Loss of Flow (ULOF) have been studied for a MOX-fuelled and a metal-fuelled core in the Beginning of Cycle (BoC) and End of Cycle (EoC) configurations. A benchmark analysis has then been performed by means of the SAS4A/SASSYS-1 Liquid Metal Reactor Code System, in which a core model based on three representative channels was built to verify the analytical outcomes and to indicate how they relate to more realistic one-dimensional calculations. As a general result, the responses of the main core characteristics (namely, power, reactivity, etc.) turned out to be mutually consistent in terms of both steady-state absolute figures and transient developments, showing discrepancies of only a few percent and thus confirming very satisfactory agreement. (author)
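    The point-kinetics approximation at the heart of such a model can be sketched with one delayed-neutron group and explicit Euler integration; the kinetics parameters below are generic illustrative values, not the paper's LFR data:

    ```python
    # One-delayed-group point kinetics, integrated with explicit Euler.
    # beta: delayed-neutron fraction, lam: precursor decay constant (1/s),
    # Lambda: neutron generation time (s). Illustrative values only.
    beta, lam, Lambda = 0.0035, 0.08, 4.0e-7

    def power_after_step(rho, t_end=1.0, dt=1.0e-5):
        """Normalized power n(t_end) after a step reactivity insertion rho,
        starting from the equilibrium state n = 1."""
        n = 1.0
        c = beta / (Lambda * lam)  # equilibrium precursor concentration
        for _ in range(int(t_end / dt)):
            dn = ((rho - beta) / Lambda) * n + lam * c
            dc = (beta / Lambda) * n - lam * c
            n += dn * dt
            c += dc * dt
        return n
    ```

    For a sub-prompt-critical step (rho < beta) the power undergoes a prompt jump of roughly beta / (beta - rho) and then rises on the much slower delayed-neutron time scale; a control-oriented model couples these equations to reactivity feedback from the fuel and coolant temperatures.
    
    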

  6. Analytical tools and functions of GIS in the process control and decision support of mining company

    Directory of Open Access Journals (Sweden)

    Semrád Peter

    2001-12-01

    Full Text Available The development of computer techniques and the increasing demands for professional and ever-faster data processing, as well as for fluent and efficient gathering, exchange, and provision of information, strongly influenced the formation of a new generation of information technologies: Geographic Information Systems (GIS), which arose in the second half of the twentieth century. Advancement in this area is still progressing, and GIS are gradually finding application in individual fields where they play a great role in process control and decision support. There are now applications in mining and geology, where GIS are used especially for processing and evaluating mining-geological documentation, optimizing mining and technical processes, and planning, distributing, and managing mining operations, as well as for the economic analyses that are important for investment decisions in the mining business. GIS are systems for the effective storage, updating, processing, analysis, modelling, simulation, and presentation of geographically oriented information. They can be identified as computer systems that help solve real problems which would normally require a human expert. Well-equipped GIS have graphic capabilities and likewise manage descriptive (attribute) data. They are able to maintain the connection between graphical and descriptive data and, in addition, command countless functions that enable spatial analysis. This fact is very important in mining and geological applications. The most exploited capabilities are geostatistical analysis (e.g. modelling the distribution of valuable and harmful components of a mineral resource in a deposit), surface modelling and surface model analysis (e.g. modelling the subsidence of a mining territory), and different methods of creating spatial and attribute queries against the database to find necessary data (e.g. to find all mining blocks of a deposit that meet required conditions and to

  7. Workshop on Software Development Tools for Petascale Computing

    Energy Technology Data Exchange (ETDEWEB)

    Vetter, Jeffrey [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Georgia Inst. of Technology, Atlanta, GA (United States)

    2007-08-01

    Petascale computing systems will soon be available to the DOE science community. Recent studies in the productivity of HPC platforms point to better software environments as a key enabler to science on these systems. To prepare for the deployment and productive use of these petascale platforms, the DOE science and general HPC community must have the software development tools, such as performance analyzers and debuggers that meet application requirements for scalability, functionality, reliability, and ease of use. In this report, we identify and prioritize the research opportunities in the area of software development tools for high performance computing. To facilitate this effort, DOE hosted a group of 55 leading international experts in this area at the Software Development Tools for PetaScale Computing (SDTPC) Workshop, which was held in Washington, D.C. on August 1 and 2, 2007. Software development tools serve as an important interface between the application teams and the target HPC architectures. Broadly speaking, these roles can be decomposed into three categories: performance tools, correctness tools, and development environments. Accordingly, this SDTPC report has four technical thrusts: performance tools, correctness tools, development environment infrastructures, and scalable tool infrastructures. The last thrust primarily targets tool developers per se, rather than end users. Finally, this report identifies non-technical strategic challenges that impact most tool development. The organizing committee emphasizes that many critical areas are outside the scope of this charter; these important areas include system software, compilers, and I/O.

  8. Analytical, critical and creative thinking development of the gifted children in the USA schools

    OpenAIRE

    Kuvarzina, Anna

    2013-01-01

    Teachers of gifted students should not only provide an enrichment and acceleration program for them but also pay attention to the development of analytical, critical, and creative thinking skills. Despite great interest in this issue in recent years, the topic of analytical and creative thinking is poorly covered in textbooks for the gifted. In this article some methods, materials, and programs for the development of analytical, critical, and creative thinking skills, which are used in the US...

  9. Analytical developments in ICP-MS for arsenic and selenium speciation. Application to granitic waters

    International Nuclear Information System (INIS)

    Garraud, Herve

    1999-01-01

    Nuclear waste storage in geological formations requires an understanding of the physico-chemistry of groundwater interactions with the surrounding rocks. Redox potential measurements and speciation calculated from geochemical modelling are not sufficient to determine water reactivity. We therefore chose to carry out experimental speciation by developing sensitive analytical tools that respect the chemical identity of each species. We studied two redox indicators in waters from reference sites (thermal waters from the Pyrenees, France): arsenic and selenium. First, we determined the concentrations of the major ions (sulphide, sulphate, chloride, fluoride, carbonate, Na, K, Ca). Speciation was carried out by HPLC hyphenated to quadrupole ICP-MS and to high-resolution ICP-MS. These analyses revealed the presence of two new arsenic species in solution, as well as a high reactivity of these waters during stability studies. A sampling, storage, and analysis method is described. (author) [fr

  10. Specialized case tools for the development of the accounting ...

    African Journals Online (AJOL)

    The paper presents an approach to building specialized CASE tools for the development of accounting applications. These tools form an integrated development environment allowing the computer aided development of the different applications in this field. This development environment consists of a formula interpreter, ...

  11. The development of a visualization tool for displaying analysis and test results

    International Nuclear Information System (INIS)

    Uncapher, W.L.; Ammerman, D.J.; Ludwigsen, J.S.; Wix, S.D.

    1995-01-01

    The evaluation and certification of packages for the transportation of radioactive materials are performed by analysis, testing, or a combination of both. Within the last few years, many transport packages have been certified using a combination of analysis and testing. The ability to combine and display both kinds of data with interactive graphical tools allows a faster and more complete understanding of the response of the package to these environments. Sandia National Laboratories has developed an initial version of a visualization tool that allows the comparison and display of test and analytical data as part of a Department of Energy-sponsored program to support advanced analytical techniques and test methodologies. The capability of the tool extends to both mechanical (structural) and thermal data

  12. Visualizing the Geography of the Diseases of China: Western Disease Maps from Analytical Tools to Tools of Empire, Sovereignty, and Public Health Propaganda, 1878-1929.

    Science.gov (United States)

    Hanson, Marta

    2017-09-01

    Argument This article analyzes for the first time the earliest western maps of diseases in China, spanning fifty years from the late 1870s to the end of the 1920s. The 24 featured disease maps present a visual history of the major transformations in modern medicine, from medical geography to laboratory medicine, wrought on Chinese soil. These medical transformations occurred within new political formations from the Qing dynasty (1644-1911) to colonialism in East Asia (Hong Kong, Taiwan, Manchuria, Korea) and hypercolonialism within China (Tianjin, Shanghai, Amoy) as well as the new Republican Chinese nation state (1912-49). As a subgenre of persuasive graphics, physicians marshaled disease maps for various rhetorical functions within these different political contexts. Disease maps in China changed from being mostly analytical tools to functioning as tools of empire, national sovereignty, and public health propaganda, legitimating new medical concepts, public health interventions, and political structures governing human and non-human populations.

  13. Development and in-line validation of a Process Analytical Technology to facilitate the scale up of coating processes.

    Science.gov (United States)

    Wirges, M; Funke, A; Serno, P; Knop, K; Kleinebudde, P

    2013-05-05

    Incorporation of an active pharmaceutical ingredient (API) into the coating layer of film-coated tablets is a method mainly used to formulate fixed-dose combinations. Uniform and precise spray-coating of an API represents a substantial challenge, which can be overcome by applying Raman spectroscopy as a process analytical tool. In the pharmaceutical industry, Raman spectroscopy is still mainly used as a benchtop laboratory analytical method and is usually not implemented in the production process. Many scientific approaches to in-process application stop at the level of feasibility studies and never make the step to production-scale process applications. The present work focuses on the scale-up of an active coating process, a step of the highest importance during pharmaceutical development. Active coating experiments were performed at lab and production scale. Using partial least squares (PLS), a multivariate model was constructed by correlating in-line measured Raman spectral data with the coated amount of API. By transferring this model, implemented for a lab-scale process, to a production-scale process, the robustness of this analytical method, and thus its applicability as a Process Analytical Technology (PAT) tool for correct endpoint determination in pharmaceutical manufacturing, could be shown. Finally, the method was validated according to the European Medicines Agency (EMA) guideline with respect to the special requirements of the applied in-line model development strategy. Copyright © 2013 Elsevier B.V. All rights reserved.
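    The PLS calibration step can be sketched with a minimal NIPALS PLS1 implementation. The synthetic "spectra" below are placeholders for in-line Raman data, and y stands in for the coated API amount; both are generated for illustration, not taken from the paper:

    ```python
    import numpy as np

    def pls1_fit(X, y, n_components):
        """Minimal NIPALS PLS1: returns the regression vector B and the
        training means needed for prediction (illustrative sketch only)."""
        x_mean, y_mean = X.mean(axis=0), y.mean()
        Xc, yc = X - x_mean, y - y_mean
        W, P, q = [], [], []
        for _ in range(n_components):
            w = Xc.T @ yc                 # weight vector from covariance
            w /= np.linalg.norm(w)
            t = Xc @ w                    # scores
            tt = t @ t
            p = Xc.T @ t / tt             # X loadings
            qk = (yc @ t) / tt            # y loading
            Xc = Xc - np.outer(t, p)      # deflate X and y
            yc = yc - qk * t
            W.append(w); P.append(p); q.append(qk)
        W, P, q = np.array(W).T, np.array(P).T, np.array(q)
        B = W @ np.linalg.solve(P.T @ W, q)
        return B, x_mean, y_mean

    def pls1_predict(X, B, x_mean, y_mean):
        return (X - x_mean) @ B + y_mean
    ```

    In the paper's setting, the model fitted on lab-scale spectra is applied to production-scale spectra; the predicted coated amount then serves as the endpoint criterion for the coating process.
    
    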

  14. Learning analytics as a tool for closing the assessment loop in higher education

    Directory of Open Access Journals (Sweden)

    Karen D. Mattingly

    2012-09-01

    Full Text Available This paper examines learning and academic analytics and their relevance to distance education in undergraduate and graduate programs as they impact students, teaching faculty, and academic institutions. The focus is to explore the measurement, collection, analysis, and reporting of data as predictors of student success and drivers of departmental processes and program curricula. Learning and academic analytics in higher education are used to predict student success by examining how and what students learn and how success is supported by academic programs and institutions. The paper examines what is being done to support students, whether or not it is effective, and if not, why, and what educators can do. The paper also examines how these data can be used to create new metrics and inform a continuous cycle of improvement. It presents examples of working models from a sample of institutions of higher education: the Graduate School of Medicine at the University of Wollongong, the University of Michigan, Purdue University, and the University of Maryland, Baltimore County. Finally, the paper identifies considerations and recommendations for using analytics and offers suggestions for future research.

  15. Developing Learning Analytics Design Knowledge in the "Middle Space": The Student Tuning Model and Align Design Framework for Learning Analytics Use

    Science.gov (United States)

    Wise, Alyssa Friend; Vytasek, Jovita Maria; Hausknecht, Simone; Zhao, Yuting

    2016-01-01

    This paper addresses a relatively unexplored area in the field of learning analytics: how analytics are taken up and used as part of teaching and learning processes. Initial steps are taken towards developing design knowledge for this "middle space," with a focus on students as analytics users. First, a core set of challenges for…

  16. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Science.gov (United States)

    Heesen, Christoph; Gaissmaier, Wolfgang; Nguyen, Franziska; Stellmann, Jan-Patrick; Kasper, Jürgen; Köpke, Sascha; Lederer, Christian; Neuhaus, Anneke; Daumer, Martin

    2013-01-01

    Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic estimates and clarify its usefulness for patients and physicians.

  17. Prognostic risk estimates of patients with multiple sclerosis and their physicians: comparison to an online analytical risk counseling tool.

    Directory of Open Access Journals (Sweden)

    Christoph Heesen

    Full Text Available BACKGROUND: Prognostic counseling in multiple sclerosis (MS) is difficult because of the high variability of disease progression. Simultaneously, patients and physicians are increasingly confronted with making treatment decisions at an early stage, which requires taking individual prognoses into account to strike a good balance between benefits and harms of treatments. It is therefore important to understand how patients and physicians estimate prognostic risk, and whether and how these estimates can be improved. An online analytical processing (OLAP) tool based on pooled data from placebo cohorts of clinical trials offers short-term prognostic estimates that can be used for individual risk counseling. OBJECTIVE: The aim of this study was to clarify if personalized prognostic information as presented by the OLAP tool is considered useful and meaningful by patients. Furthermore, we used the OLAP tool to evaluate patients' and physicians' risk estimates. Within this evaluation process we assessed short-term prognostic risk estimates of patients with MS (final n = 110) and their physicians (n = 6) and compared them with the estimates of OLAP. RESULTS: Patients rated the OLAP tool as understandable and acceptable, but to be only of moderate interest. It turned out that patients, physicians, and the OLAP tool ranked patients similarly regarding their risk of disease progression. Both patients' and physicians' estimates correlated most strongly with those disease covariates that the OLAP tool's estimates also correlated with most strongly. Exposure to the OLAP tool did not change patients' risk estimates. CONCLUSION: While the OLAP tool was rated understandable and acceptable, it was only of modest interest and did not change patients' prognostic estimates. The results suggest, however, that patients had some idea regarding their prognosis and which factors were most important in this regard. Future work with OLAP should assess long-term prognostic

  18. An Analytic Hierarchy Process for School Quality and Inspection: Model Development and Application

    Science.gov (United States)

    Al Qubaisi, Amal; Badri, Masood; Mohaidat, Jihad; Al Dhaheri, Hamad; Yang, Guang; Al Rashedi, Asma; Greer, Kenneth

    2016-01-01

    Purpose: The purpose of this paper is to develop an analytic hierarchy planning-based framework to establish criteria weights and to develop a school performance system commonly called school inspections. Design/methodology/approach: The analytic hierarchy process (AHP) model uses pairwise comparisons and a measurement scale to generate the…

  19. Ceramic cutting tools materials, development and performance

    CERN Document Server

    Whitney, E Dow

    1994-01-01

    Interest in ceramics as a high-speed cutting tool material is based primarily on favorable material properties. As a class of materials, ceramics possess high melting points, excellent hardness, and good wear resistance. Unlike most metals, hardness levels in ceramics generally remain high at elevated temperatures, which means that cutting-tip integrity is relatively unaffected at high cutting speeds. Ceramics are also chemically inert against most work metals.

  20. System Architecture Development for Energy and Water Infrastructure Data Management and Geovisual Analytics

    Science.gov (United States)

    Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.

    2017-12-01

    Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse, complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform, the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF), is being developed. This platform utilizes several enterprise-grade software design concepts and standards such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) environment at Oak Ridge National Laboratory (ORNL).

  1. Role of analytical chemistry in the development of nuclear fuels

    International Nuclear Information System (INIS)

    Ramakumar, K.L.

    2012-01-01

    Analytical chemistry is indispensable and plays a pivotal role in the entire gamut of nuclear fuel cycle activities, from ore refining, conversion, and nuclear fuel fabrication through reactor operation and nuclear fuel reprocessing to waste management. As the fuel is the most critical component of the reactor, where the fissions take place to produce power, extreme care must be taken to qualify the fuel. In nuclear fuel fabrication, for example, the fuel must be selected according to the reactor system. The fuel for thermal reactors is normally uranium oxide, either natural or slightly enriched. For research reactors it can be uranium metal or an alloy. The fuel for an FBR can be a metal, alloy, oxide, carbide, or nitride. India is planning an advanced heavy water reactor for the utilization of the country's vast thorium resources. Research is also under way to identify suitable metallic/alloy fuels for our future fast reactors and for possible use in the fast breeder test reactor. Other advanced fuel materials are being investigated for thermal reactors to realize increased performance levels. For example, advanced fuels made from UO2 doped with Cr2O3 and Al2O3 are being suggested for LWR applications; these dopants have been shown to facilitate pellet densification during sintering and to enlarge the pellet grain size. The chemistry of these materials has to be understood so that they can be prepared to stringent specifications. A number of analytical parameters need to be determined as part of the chemical quality control of nuclear materials, and a myriad of analytical techniques, from classical methods to sophisticated instrumental techniques, are available for this purpose. The analytical chemist's insatiable urge for improvement enables the devising and adoption of new and superior methodologies in terms of reduced analysis time, improved measurement precision and accuracy, simplicity of the technique itself, etc. Chemical quality control provides a means to ensure that the

  2. Infrared Spectroscopy as a Versatile Analytical Tool for the Quantitative Determination of Antioxidants in Agricultural Products, Foods and Plants

    Science.gov (United States)

    Cozzolino, Daniel

    2015-01-01

    Spectroscopic methods provide very useful qualitative and quantitative information about the biochemistry and chemistry of antioxidants. Near infrared (NIR) and mid infrared (MIR) spectroscopy are considered powerful, fast, accurate, and non-destructive analytical tools that can serve as a replacement for traditional chemical analysis. In recent years, several reports can be found in the literature demonstrating the usefulness of these methods in the analysis of antioxidants in different organic matrices. This article reviews recent applications of infrared (NIR and MIR) spectroscopy in the analysis of antioxidant compounds in a wide range of samples such as agricultural products, foods and plants. PMID:26783838

  3. Long range manipulator development and experiments with dismantling tools

    International Nuclear Information System (INIS)

    Mueller, K.

    1993-01-01

    An existing handling system (EMIR) was used as a carrier system for various tools for concrete dismantling and radiation protection monitoring. It combined the advantages of long reach and high payload with highly dexterous kinematics. The system was enhanced mechanically to allow the use of different tools. Tool attachment devices for automatic tool exchange were investigated, as were the interfaces (electric, hydraulic, compressed air, cooling water and signals). The control system was improved with regard to accuracy and sensor data processing, and programmable logic controller functions for tool control were incorporated. A free-field mockup of the EMIR was built that allowed close simulation of dismantling scenarios without radioactive inventory. Aged concrete was provided for the integration tests. The development schedule included the basic concept investigation; the development of tools and sensors; the EMIR hardware enhancement, including a tool exchanger; the adaptation of tools and mockup; and the final evaluation of the system during experiments

  4. Preparation of standard hair material and development of analytical methodology

    International Nuclear Information System (INIS)

    Gangadharan, S.; Walvekar, A.P.; Ali, M.M.; Thantry, S.S.; Verma, R.; Devi, R.

    1995-01-01

    The concept of the use of human scalp hair as a first level indicator of exposure to inorganic pollutants has been established by us earlier. Efforts towards the preparation of a hair reference material are described. The analytical approaches for the determination of total mercury by cold vapour AAS and INAA and of methylmercury by extraction combined with gas chromatography coupled to an ECD are summarized with results on some of the samples analyzed, including the stability of values over a period of time of storage. (author)

  5. Development of analytical techniques of vanadium isotope in seawater

    Science.gov (United States)

    Huang, T.; Owens, J. D.; Sarafian, A.; Sen, I. S.; Huang, K. F.; Blusztajn, J.; Nielsen, S.

    2015-12-01

    Vanadium (V) is a transition metal with two isotopes, 50V and 51V, and oxidation states of +2, +3, +4 and +5. Its average concentration in seawater is 1.9 ppb, which results in a marine residence time of ~50 kyr. Its various oxidation states make it a potential tool for investigating redox conditions in the ocean and sediments, because redox changes alter the valence state of vanadium. In turn, chemical equilibrium between the different oxidation states of V will likely cause isotopic fractionation that can potentially be utilized to quantify past ocean redox states. In order to apply V isotopes as a paleo-redox tracer, we must know the isotopic composition of seawater and its relation to the marine sources and sinks of V. We developed a novel method for pre-concentrating V and measuring its isotope ratio in seawater samples. In our method, we used four ion exchange chromatography columns to separate vanadium from seawater matrix elements, in particular titanium and chromium, which both have an isobaric interference on 50V. The first column uses the NOBIAS resin, which effectively separates V and other transition metals from the majority of the seawater matrix. Subsequent columns are identical to those utilized when separating V from silicate samples (Nielsen et al, Geostand. Geoanal. Res., 2011). The isotopic composition of the purified V is measured using a Thermo Scientific Neptune multiple collector inductively coupled plasma mass spectrometer (MC-ICP-MS) in medium-resolution mode. This setup resolves all molecular interferences on masses 49, 50, 51, 52 and 53, including S-O species on mass 50. To test the new method, we spiked an open ocean seawater sample from the Bermuda Atlantic Time Series (BATS) station with 10-25 μg of Alfa Aesar vanadium solution, which has an isotopic composition of δ51V = 0, where δ51V = 1000 × [(51V/50V)sample - (51V/50V)AA] / (51V/50V)AA. The average of six spiked samples is -0.03±0.19‰, which is within error of the true
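The per-mil delta notation defined in this abstract is a one-line calculation; the sketch below shows the arithmetic. The ratio values are hypothetical, chosen only for illustration (they are not measured 51V/50V ratios):

```python
# Per-mil delta notation for vanadium isotopes, as defined in the abstract:
# d51V = 1000 * (R_sample - R_standard) / R_standard, with R = 51V/50V and
# the Alfa Aesar (AA) solution taken as the d51V = 0 standard.

def delta51V(r_sample: float, r_standard: float) -> float:
    """Return d51V in permil relative to the standard ratio."""
    return 1000.0 * (r_sample - r_standard) / r_standard

# Hypothetical ratios, for illustration only (not measured values):
R_AA = 247.8        # assumed 51V/50V of the standard
R_sample = 247.75   # assumed 51V/50V of a spiked seawater sample
print(f"d51V = {delta51V(R_sample, R_AA):+.2f} permil")
```

A sample with the same ratio as the standard gives d51V = 0 by construction; a ratio slightly below the standard's gives a small negative value, as in the spiked-sample test described above.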

  6. Energy-dispersive X-ray fluorescence systems as analytical tool for assessment of contaminated soils.

    Science.gov (United States)

    Vanhoof, Chris; Corthouts, Valère; Tirez, Kristof

    2004-04-01

    To determine the heavy metal content of soil samples at contaminated locations, a static and time-consuming procedure is used in most cases: soil samples are collected and analyzed in the laboratory at high quality and high analytical cost. Demand is growing from government and consultants for a more dynamic approach, and from customers for analyses performed in the field with immediate feedback of the analytical results. Field analyses are especially advisable during the follow-up of remediation projects or when determining the sampling strategy. For this purpose four types of ED-XRF systems, ranging from portable up to high-performance laboratory systems, were evaluated. The evaluation criteria were based on the performance characteristics of the ED-XRF systems, such as limit of detection, accuracy and measurement uncertainty, on the one hand, and the influence of sample pretreatment on the results on the other. The study proved that the field-portable system and the bench-top system, placed in a mobile van, can be applied as field techniques, yielding semi-quantitative analytical results. A limited homogenization of the analyzed sample significantly increases the representativeness of the soil sample. The ED-XRF systems can be differentiated by their limits of detection, which are a factor of 10 to 20 higher for the portable system; the accuracy of the results and the measurement uncertainty also improve with the bench-top system. The selection criteria for the applicability of both field systems are therefore based on the required detection level and the required accuracy of the results.
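The factor-of-10-to-20 difference in detection limits can be illustrated with the 3σ criterion commonly used for XRF systems, LOD = 3·√N_b / S, where N_b is the background count and S the sensitivity (counts per unit concentration). All values below are hypothetical, picked only to show how a lower-sensitivity portable unit produces a higher LOD:

```python
import math

# 3-sigma detection-limit criterion for counting instruments such as ED-XRF.
# The background counts and sensitivities are invented for illustration;
# they are not measurements from the study.

def lod_3sigma(background_counts: float, sensitivity: float) -> float:
    """Detection limit, in the concentration units implied by `sensitivity`."""
    return 3.0 * math.sqrt(background_counts) / sensitivity

lod_bench = lod_3sigma(background_counts=10_000, sensitivity=600.0)    # bench-top
lod_portable = lod_3sigma(background_counts=10_000, sensitivity=40.0)  # portable
print(f"bench: {lod_bench:.2f}, portable: {lod_portable:.2f}, "
      f"ratio: {lod_portable / lod_bench:.0f}x")
```

With equal backgrounds, the LOD ratio reduces to the inverse sensitivity ratio, so a portable unit with 15x lower sensitivity sits squarely in the 10-20x range reported above.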

  7. Development of novel tools to measure food neophobia in children

    DEFF Research Database (Denmark)

    Damsbo-Svendsen, Marie; Frøst, Michael Bom; Olsen, Annemarie

    2017-01-01

    The main tool currently used to measure food neophobia (the Food Neophobia Scale, FNS, developed by Pliner & Hobden, 1992) may not remain optimal forever. It was developed around 25 years ago, and the perception and availability of “novel” and “ethnic” foods may have changed in the meantime....... Consequently, there is a need for developing updated tools for measuring food neophobia....

  8. The development of a post occupancy evaluation tool for primary schools: learner comfort assessment tool (LCAT)

    CSIR Research Space (South Africa)

    Motsatsi, L

    2015-12-01

    Full Text Available in order to facilitate teaching and learning. The aim of this study was to develop a Post Occupancy Evaluation (POE) tool to assess learner comfort in relation to indoor environmental quality in the classroom. The development of the POE tool followed a...

  9. Developing A SPOT CRM Debriefing Tool

    Science.gov (United States)

    Martin, Lynne; Villeda, Eric; Orasanu, Judith; Connors, Mary M. (Technical Monitor)

    1998-01-01

    In a study of CRM LOFT briefings published in 1997, Dismukes, McDonnell & Jobe reported that briefings were not being utilized as fully as they could be and that crews may not be getting the full possible benefit from LOFT. On the basis of their findings, they suggested a set of general guidelines for briefings for the industry. Our work builds on this study to provide a specific debriefing tool that gives a focus to the strategies that Dismukes et al suggest.

  10. Analytical tool for measuring emissions impact of acceleration and deceleration lanes : final report.

    Science.gov (United States)

    2001-04-01

    Air quality has become one of the important factors to be considered in making transportation improvement : decisions. Thus, tools are expected to help such decision-makings. On the other hand, MOBILE5 model, which : has been widely used in evaluatin...

  11. Challenges in the development of analytical soil compaction models

    DEFF Research Database (Denmark)

    Keller, Thomas; Lamandé, Mathieu

    2010-01-01

    Soil compaction can cause a number of environmental and agronomic problems (e.g. flooding, erosion, leaching of agrochemicals to recipient waters, emission of greenhouse gases to the atmosphere, crop yield losses), resulting in significant economic damage to society and agriculture. Strategies...... and recommendations for the prevention of soil compaction often rely on simulation models. This paper highlights some issues that need further consideration in order to improve soil compaction modelling, with the focus on analytical models. We discuss the different issues based on comparisons between experimental...... to stress propagation, an anomaly that needs further attention. We found large differences between soil stress-strain behaviour obtained from in situ measurements during wheeling experiments and those measured on cylindrical soil samples in standard laboratory tests. We concluded that the main reason...

  12. Combining Tools to Design and Develop Software Support for Capabilities

    Directory of Open Access Journals (Sweden)

    Martin Henkel

    2017-04-01

    Full Text Available Analyzing, designing and implementing software systems based on the concept of capabilities has several benefits, such as the ability to design efficient monitoring of capabilities and their execution context. Today, there exist new model-driven methods and development tools that support capability-based analysis, design, and implementation. However, there is also a plethora of existing, efficient development tools currently in use by organizations. In this article, we examine how a new set of capability-based tools, the Capability Driven Development (CDD) environment, can be combined with model-driven development tools to leverage both novel capability-based functionality and the proven functionality of existing tools. We base the examination on a case study in which an existing model-driven tool is combined with the CDD environment.

  13. Artificial intelligence tool development and applications to nuclear power

    International Nuclear Information System (INIS)

    Naser, J.A.

    1987-01-01

    Two parallel efforts are being performed at the Electric Power Research Institute (EPRI) to help the electric utility industry take advantage of expert system technology. The first effort is the development of expert system building tools tailored to electric utility industry applications. The second effort is the development of expert system applications. These two efforts complement each other: the application development tests the tools and identifies additional tool capabilities that are required, while the tool development helps define the applications that can be successfully developed. Artificial intelligence, as demonstrated by the developments described, is being established as a credible technological tool for the electric utility industry. The challenge in transferring artificial intelligence technology, and an understanding of its potential, to the electric utility industry is to gain an understanding of the problems that reduce power plant performance and identify which can be successfully addressed using artificial intelligence

  14. On-line near infrared spectroscopy as a Process Analytical Technology (PAT) tool to control an industrial seeded API crystallization.

    Science.gov (United States)

    Schaefer, C; Lecomte, C; Clicq, D; Merschaert, A; Norrant, E; Fotiadu, F

    2013-09-01

    The final step of an active pharmaceutical ingredient (API) manufacturing synthesis process consists of a crystallization during which the API and residual solvent contents have to be quantified precisely in order to reach a predefined seeding point. A feasibility study was conducted to demonstrate the suitability of on-line NIR spectroscopy to control this step in line with the new version of the European Medicines Agency (EMA) guideline [1]. A quantitative method was developed at laboratory scale using statistical design of experiments (DOE) and multivariate data analysis such as principal component analysis (PCA) and partial least squares (PLS) regression. NIR models were built to quantify the API in the range of 9-12% (w/w) and the residual methanol in the range of 0-3% (w/w). To improve the predictive ability of the models, the development procedure encompassed outlier elimination, optimum model rank definition, and spectral range and spectral pre-treatment selection. Conventional criteria, such as the number of PLS factors, R(2), and the root mean square errors of calibration, cross-validation and prediction (RMSEC, RMSECV, RMSEP), enabled the selection of three candidate models. These models were tested in the industrial pilot plant during three technical campaigns, and the results of the most suitable models were evaluated against the chromatographic reference methods. A maximum relative bias of 2.88% was obtained for the API target content, and absolute biases of 0.01 and 0.02% (w/w) were achieved at methanol content levels of 0.10 and 0.13% (w/w), respectively. The repeatability was assessed as sufficient for the on-line monitoring of the two analytes. The present feasibility study confirmed the possibility of using on-line NIR spectroscopy as a PAT tool to monitor both the API and the residual methanol contents in real time, in order to control the seeding of an API crystallization at industrial scale. Furthermore, the successful scale-up of the method proved its capability to be
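The model-selection criteria named above (RMSEC for calibration, RMSEP for prediction) can be sketched with a minimal one-component PLS1 regression (the first NIPALS factor) in plain NumPy. The synthetic "spectra", the split, and all thresholds are invented for illustration and are unrelated to the paper's NIR models:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "spectra": 60 samples x 100 channels, with the analyte content
# encoded linearly in a few absorbing bands plus noise. Illustrative only.
n, p = 60, 100
conc = rng.uniform(9.0, 12.0, n)          # analyte content, % w/w
bands = np.zeros(p)
bands[40:45] = 1.0                         # channels carrying the signal
X = conc[:, None] * bands[None, :] + rng.normal(0.0, 0.05, (n, p))

# Calibration / prediction split
X_cal, y_cal = X[:45], conc[:45]
X_tst, y_tst = X[45:], conc[45:]

# One-factor PLS1: first NIPALS component on mean-centered data.
x_mean, y_mean = X_cal.mean(axis=0), y_cal.mean()
Xc, yc = X_cal - x_mean, y_cal - y_mean
w = Xc.T @ yc
w /= np.linalg.norm(w)                     # weight vector
t = Xc @ w                                 # scores
b = (t @ yc) / (t @ t)                     # inner regression coefficient

def predict(X_new):
    return y_mean + b * ((X_new - x_mean) @ w)

def rmse(y_true, y_pred):
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

rmsec = rmse(y_cal, predict(X_cal))        # calibration error (RMSEC)
rmsep = rmse(y_tst, predict(X_tst))        # prediction error (RMSEP)
print(f"RMSEC={rmsec:.3f}  RMSEP={rmsep:.3f}")
```

In practice RMSEC alone flatters the model; the paper's procedure of comparing RMSEC, RMSECV and RMSEP across candidate models guards against overfitting the calibration set.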

  15. Development and Validation of a Learning Analytics Framework: Two Case Studies Using Support Vector Machines

    Science.gov (United States)

    Ifenthaler, Dirk; Widanapathirana, Chathuranga

    2014-01-01

    Interest in collecting and mining large sets of educational data on student background and performance to conduct research on learning and instruction has developed as an area generally referred to as learning analytics. Higher education leaders are recognizing the value of learning analytics for improving not only learning and teaching but also…

  16. Some key issues in the development of ergonomic intervention tools

    DEFF Research Database (Denmark)

    Edwards, Kasper; Winkel, Jørgen

    2016-01-01

    Literature reviews suggest that tools facilitating ergonomic intervention processes should be integrated into rationalization tools, particularly if such tools are participative. Such a tool has recently been developed as an add-in module to the Lean tool “Value Stream Mapping” (VSM). However......, in the investigated context this module seems not to have any direct impact on the generation of proposals with ergonomic considerations. Contextual factors of importance seem to be, e.g., the allocation of sufficient resources and whether work environment issues are generally accepted as part of the VSM methodology...

  17. Advanced reach tool (ART) : Development of the mechanistic model

    NARCIS (Netherlands)

    Fransman, W.; Tongeren, M. van; Cherrie, J.W.; Tischer, M.; Schneider, T.; Schinkel, J.; Kromhout, H.; Warren, N.; Goede, H.; Tielemans, E.

    2011-01-01

    This paper describes the development of the mechanistic model within a collaborative project, referred to as the Advanced REACH Tool (ART) project, to develop a tool to model inhalation exposure for workers sharing similar operational conditions across different industries and locations in Europe.

  18. Development of a piano learning tool

    OpenAIRE

    Baloh, Matevž

    2012-01-01

    This thesis analyzes the appropriateness of the formula defined by the game 'Guitar Hero' for an application that aims to help its users learn how to play the piano. The appropriateness is determined through the development of an application. The thesis first describes an attempt at developing a game whose primary intention is to be fun, with the secondary purpose of teaching how to play the piano. After this, it describes an attempt at the development of an application, the p...

  19. Design and development of progressive tool for manufacturing washer

    Science.gov (United States)

    Annigeri, Ulhas K.; Raghavendra Ravi Kiran, K.; Deepthi, Y. P.

    2017-07-01

    In a progressive tool the raw material is worked at successive stations to finally fabricate the component, which makes it an economical tool for the mass production of components. Many automobile and other transport industries develop progressive tools for component production. The design of such a tool involves a great deal of planning, and a comparable amount of process-planning skill is required in its fabrication. The design also draws on rules of thumb and standard elements, as per experience gained in practice. Manufacturing the press tool is a laborious task, as special jigs and fixtures have to be designed for the purpose, and assembling all the press tool elements requires accurate measuring instruments for the alignment of the various tool elements. In the present study, a progressive press tool for the production of a washer was designed and fabricated, and the tool was tried out on a mechanical press. The components produced conformed to the specified dimensions.

  20. DEVELOPING A TOOL FOR ENVIRONMENTALLY PREFERABLE PURCHASING

    Science.gov (United States)

    LCA-based guidance was developed by EPA under the Framework for Responsible Environmental Decision Making (FRED) effort to demonstrate how to conduct a relative comparison between product types to determine environmental preferability. It identifies data collection needs and iss...

  1. DEVELOPMENT OF SOLUBILITY PRODUCT VISUALIZATION TOOLS

    Energy Technology Data Exchange (ETDEWEB)

    T.F. Turner; A.T. Pauli; J.F. Schabron

    2004-05-01

    Western Research Institute (WRI) has developed software for the visualization of data acquired from solubility tests. The work was performed in conjunction with AB Nynas Petroleum, Nynashamn, Sweden, which participated as the corporate cosponsor for this Jointly Sponsored Research (JSR) task. Efforts in this project were split between software development and solubility test development. The Microsoft Windows-compatible software that was developed inputs up to three solubility data sets, calculates the parameters of six solid-body types to fit the data, and interactively displays the results in three dimensions. Several infrared spectroscopy techniques were examined for potential use in determining bitumen solubility in various solvents. Reflectance, time-averaged absorbance, and transmittance techniques were applied to bitumen samples in single and binary solvent systems. None of the techniques was found to have wide applicability.

  2. Development of a tool for evaluating multimedia for surgical education.

    Science.gov (United States)

    Coughlan, Jane; Morar, Sonali S

    2008-09-01

    Educational multimedia has been designed to provide surgical trainees with expert operative information outside of the operating theater. The effectiveness of multimedia (e.g., CD-ROMs) for learning has been a common research topic since the 1990s. To date, however, little discussion has taken place on the mechanisms for evaluating the quality of multimedia-driven teaching. This may be because of a lack of research into the development of appropriate tools for evaluating multimedia, especially for surgical education. This paper reports on a small-scale pilot and exploratory study (n = 12) that developed a tool for surgical multimedia evaluation. The validity of the developed tool was established through the adaptation of an existing tool, which was reviewed by experts in surgery, usability, and education. The reliability of the developed tool was tested with surgical trainees, who used it to assess a multimedia CD-ROM created for teaching basic surgical skills. The findings contribute to an understanding of surgical trainees' experience of using educational multimedia, in terms of the characteristics of the learning material (interface design and content), and to the process of developing evaluation tools, in terms of the inclusion of appropriate assessment criteria. The increasing use of multimedia in medical education necessitates the development of standardized tools for determining the quality of teaching and learning. Little research exists into the development of such tools, and so the present work stimulates discussion on how to evaluate surgical training.

  3. 105-KE Basin isolation barrier leak rate test analytical development. Revision 1

    International Nuclear Information System (INIS)

    Irwin, J.J.

    1995-01-01

    This document provides an analytical development in support of the proposed leak rate test of the 105-KE Basin. The analytical basis upon which the K-basin leak test results will be used to determine the basin leakage rates is developed in this report. The leakage of the K-Basin isolation barriers under postulated accident conditions will be determined from the test results. There are two fundamental flow regimes that may exist in the postulated K-Basin leakage: viscous laminar and turbulent flow. An analytical development is presented for each flow regime. The basic geometry and nomenclature of the postulated leak paths are denoted
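The two flow regimes named above can be contrasted with textbook leak-rate forms: laminar capillary flow scales linearly with the pressure difference, while turbulent orifice flow scales with its square root. These generic relations (Hagen-Poiseuille and incompressible orifice flow) and all parameter values below are illustrative assumptions, not the correlations developed in the report:

```python
import math

# Generic leak-rate forms for the two regimes. Parameter values are
# hypothetical; they do not describe the actual K-Basin leak paths.

def laminar_leak(dp, radius, length, mu):
    """Hagen-Poiseuille volumetric flow through a capillary: Q scales with dp."""
    return math.pi * radius**4 * dp / (8.0 * mu * length)

def turbulent_leak(dp, area, rho, cd=0.62):
    """Incompressible orifice flow: Q scales with sqrt(dp)."""
    return cd * area * math.sqrt(2.0 * dp / rho)

# Water at ~20 C through a hypothetical 0.1 mm crack, 0.1 m long, 10 kPa head:
q_lam = laminar_leak(dp=1.0e4, radius=1.0e-4, length=0.1, mu=1.0e-3)
q_turb = turbulent_leak(dp=1.0e4, area=math.pi * (1.0e-4) ** 2, rho=1000.0)
print(f"laminar: {q_lam:.3e} m^3/s, turbulent: {q_turb:.3e} m^3/s")
```

The different pressure dependence is what makes the regime distinction matter when extrapolating test results to postulated accident conditions: doubling the head doubles a laminar leak but increases a turbulent one by only √2.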

  4. Developing a 300C Analog Tool for EGS

    Energy Technology Data Exchange (ETDEWEB)

    Normann, Randy

    2015-03-23

    This paper covers the development of a 300°C geothermal well monitoring tool to support future EGS (enhanced geothermal systems) power production. It is the first of three tools planned: an analog tool designed for monitoring well pressure and temperature. There is discussion of three different circuit topologies and of the development of the supporting surface electronics and software, along with information on testing electronic circuits and components. One of the major components is the cable used to connect the analog tool to the surface.

  5. Developing a Tool for Digital Transformations:How to Improve Service Quality in the RelocationIndustry

    OpenAIRE

    IVARSSON, ADAM; Lindstrand, Dag

    2014-01-01

    The purpose of this study was to develop a tool that shows relocation companies how they should digitally transform their services. This purpose was addressed by dividing the research into two studies. Study 1 conducted a qualitative literature review of the fields of blue ocean strategy and service quality to develop an analytical tool that could provide relocation companies with a strategic direction for improving service quality through digital transformation. Study 2 tested th...

  6. Magnetic optical sensor particles: a flexible analytical tool for microfluidic devices.

    Science.gov (United States)

    Ungerböck, Birgit; Fellinger, Siegfried; Sulzer, Philipp; Abel, Tobias; Mayr, Torsten

    2014-05-21

    In this study we evaluate magnetic optical sensor particles (MOSePs) with incorporated sensing functionalities regarding their applicability in microfluidic devices. MOSePs can be separated from the surrounding solution to form in situ sensor spots within microfluidic channels, while read-out is accomplished outside the chip. These magnetic sensor spots exhibit benefits of sensor layers (high brightness and convenient usage) combined with the advantages of dispersed sensor particles (ease of integration). The accumulation characteristics of MOSePs with different diameters were investigated as well as the in situ sensor spot stability at varying flow rates. Magnetic sensor spots were stable at flow rates specific to microfluidic applications. Furthermore, MOSePs were optimized regarding fiber optic and imaging read-out systems, and different referencing schemes were critically discussed on the example of oxygen sensors. While the fiber optic sensing system delivered precise and accurate results for measurement in microfluidic channels, limitations due to analyte consumption were found for microscopic oxygen imaging. A compensation strategy is provided, which utilizes simple pre-conditioning by exposure to light. Finally, new application possibilities were addressed, being enabled by the use of MOSePs. They can be used for microscopic oxygen imaging in any chip with optically transparent covers, can serve as flexible sensor spots to monitor enzymatic activity or can be applied to form fixed sensor spots inside microfluidic structures, which would be inaccessible to integration of sensor layers.

  7. Thermal Lens Spectroscopy as a 'new' analytical tool for actinide determination in nuclear reprocessing processes

    International Nuclear Information System (INIS)

    Canto, Fabrice; Couston, Laurent; Magnaldo, Alastair; Broquin, Jean-Emmanuel; Signoret, Philippe

    2008-01-01

    Thermal Lens Spectroscopy (TLS) consists of measuring the effects induced by the relaxation of molecules excited by photons. The CEA already worked on TLS twenty years ago, but technological limitations impeded its use. Today, the need for sensitive analytical methods coupled with very low sample volumes (for example, traces of Np in the COEX(TM) process), together with the drive to reduce nuclear wastes, encourages us to revisit this method, taking advantage of improvements in optoelectronic technologies. One can also imagine coupling TLS with micro-fluidic technologies, decreasing the experimental cost significantly. Generally two laser beams are used for TLS: one for the selective excitation by molecular absorption (inducing the thermal lens) and one for probing the thermal lens. They can be combined in different geometries, collinear or perpendicular, depending on the application and on the laser mode. Many measurement schemes have also been studied to detect the thermal lens signal: interferometry, direct intensity variations, deflection, etc. In this paper, one geometrical configuration and two measurements have been theoretically evaluated. For single-photodiode detection (z-scan) the limit of detection is calculated to be near 5×10⁻⁶ mol·L⁻¹ for Np(IV) in dodecane. (authors)

  8. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    Energy Technology Data Exchange (ETDEWEB)

    Constantinescu, B.; Cristea-Stan, D. [National Institute for Nuclear Physics and Engineering Horia Hulubei, Bucharest-Magurele (Romania); Kovács, I.; Szõkefalvi-Nagy, Z. [Wigner Research Centre for Phyics, Institute for Particle and Nuclear Physics, Budapest (Hungary)

    2013-07-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded, even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied to elemental concentration ratio determinations on some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west - Oradea (near the border with Hungary, Slovakia and Ukraine), centre - Cluj, and south-west - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, which replaced the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered for obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains. As provenance 'fingerprints', we focused on Ti to Mn, and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D

  9. External beam milli-PIXE as analytical tool for Neolithic obsidian provenance studies

    International Nuclear Information System (INIS)

    Constantinescu, B.; Cristea-Stan, D.; Kovács, I.; Szõkefalvi-Nagy, Z.

    2013-01-01

    Full text: Obsidian is the most important archaeological material used for tools and weapons before the appearance of metals. Its geological sources are limited and concentrated in a few geographical zones: Armenia, Eastern Anatolia, the Italian islands of Lipari and Sardinia, the Greek islands of Melos and Yali, and the Hungarian and Slovak Tokaj Mountains. Because of this, in the Mesolithic and Neolithic periods obsidian was the first archaeological material intensively traded, even over long distances. To determine the geological provenance of obsidian and to identify the prehistoric long-range trade routes and possible population migrations, elemental concentration ratios can help a lot, since each geological source has its 'fingerprints'. In this work the external milli-PIXE technique was applied to elemental concentration ratio determinations on some Neolithic tools found in Transylvania and in the Iron Gates region near the Danube, and on a few relevant geological samples (Slovak Tokaj Mountains, Lipari, Armenia). In Transylvania (the north-western part of Romania, a region surrounded by the Carpathian Mountains), Neolithic obsidian tools were discovered mainly in three regions: north-west - Oradea (near the border with Hungary, Slovakia and Ukraine), centre - Cluj, and south-west - Banat (near the border with Serbia). A special case is the Iron Gates - Mesolithic and Early Neolithic sites directly related to the appearance of agriculture, which replaced the Mesolithic economy based on hunting and fishing. Three long-distance trade routes could be considered for obsidian: from the Caucasus Mountains via the north of the Black Sea, from the Greek islands or Asia Minor via the ex-Yugoslavia area or via Greece-Bulgaria, or from Central Europe - the Tokaj Mountains. As provenance 'fingerprints', we focused on Ti to Mn, and Rb-Sr-Y-Zr ratios. The measurements were performed at the external milli-PIXE beam-line of the 5MV VdG accelerator of the Wigner RCP, with a proton energy of 3 MeV and beam currents in the range of 1 ±1 D

  10. Networks as Tools for Sustainable Urban Development

    DEFF Research Database (Denmark)

    Jensen, Jesper Ole; Tollin, Nicola

    . By applying the GREMI2 theories of “innovative milieux” (Aydalot, 1986; Camagni, 1991) to the case study, we will suggest some reasons for the benefits achieved by the Dogme network compared to other networks. This analysis will point to the existence of an “innovative milieu” on sustainability within......, strategies and actions. There has been little theoretical development on the subject. In practice, networks for sustainable development can be seen as combining different theoretical approaches to networks, including governance, urban competition and innovation. To give a picture of the variety...

  11. Developing Multilateral Surveillance Tools in the EU

    NARCIS (Netherlands)

    de Ruiter, R.

    2008-01-01

    The development of the infrastructure of the Open Method of Coordination (OMC) is an unaddressed topic in scholarly debates. On the basis of secondary literature on the European Employment Strategy, it is hypothesised that a conflict between an incentive and reluctance to act on the EU level on the

  12. Awareness Development Across Perspectives Tool (ADAPT)

    NARCIS (Netherlands)

    Petiet, P.; Maanen, P.P. van; Bemmel, I.E. van; Vliet, A.J. van

    2010-01-01

    Reality can be viewed from several perspectives or disciplines. Due to their background, training and education, soldiers developed a military perspective which is not solely restricted to kinetic activities. In current missions, military personnel is confronted with a reality in which other

  13. Tools for educational change | IDRC - International Development ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    2011-02-03

    Feb 3, 2011 ... One floor above Pinto's office is a relatively new computer lab, part of SchoolNet Mozambique, a project supported by Canada's International Development Research Centre (IDRC) to link schools via the Internet to enhance learning opportunities for students, teachers, and the surrounding community.

  14. Tools for Nanotechnology Education Development Program

    Energy Technology Data Exchange (ETDEWEB)

    Dorothy Moore

    2010-09-27

    The overall focus of this project was the development of reusable, cost-effective educational modules for use with the table top scanning electron microscope (TTSEM). The goal of this project's outreach component was to increase students' exposure to the science and technology of nanoscience.

  15. 78 FR 68459 - Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug...

    Science.gov (United States)

    2013-11-14

    ... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration [Docket No. FDA-2013-D-1279] Medical Device Development Tools; Draft Guidance for Industry, Tool Developers, and Food and Drug Administration Staff; Availability AGENCY: Food and Drug Administration, HHS. ACTION: Notice. SUMMARY: The Food...

  16. EXPERT SYSTEMS - DEVELOPMENT OF AGRICULTURAL INSURANCE TOOL

    Directory of Open Access Journals (Sweden)

    NAN Anca-Petruţa

    2013-07-01

    Full Text Available Because specialty agricultural assistance is not always available when farmers need it, we identified expert systems as a strong instrument with extended potential in agriculture. Their use has recently started to grow in scale across all socio-economic fields of activity; their role is to collect data on different aspects from human experts in order to assist the user through the steps needed to solve problems at the performance level of the expert, making the expert's acquired knowledge and experience available. We opted for a general presentation of expert systems and of their necessity because a solution for developing the agricultural system can come from artificial intelligence: implementing expert systems in the field of agricultural insurance, promoting existing insurance products, and helping farmers find options matching their needs and possibilities. The objective of this article is to collect data on specific areas of interest of agricultural insurance, prepare the database, and give a conceptual presentation of a pilot version that will grow steadily richer with the answers received from agricultural producers, with the clearest possible exposure of the knowledge base. The choice of this theme is justified by the fact that, even though agricultural insurance plays a very important role in agricultural development, the results registered from it are modest, which is why solutions need to be found for developing the agricultural sector. The importance of this work lies in proposing an immediately viable solution corresponding to the current needs of agricultural producers, and an innovative one: the implementation of expert systems in agricultural insurance as a way of promoting insurance products. Our research, even though it treats the subject at a conceptual level, aims to undertake an ...

  17. Libraries Are Dynamic Tools for National Development

    Directory of Open Access Journals (Sweden)

    Amaoge Dorathy Agbo

    2014-12-01

    Full Text Available Building an ideal nation requires a holistic approach: all facets of human activity must be harnessed and all indices of nation building attended to. All academic and professional disciplines are involved in this, and libraries are no exception. This paper looks at the various types of libraries and their basic functions, their roles in national development and, in particular, the challenges facing library services in Nigeria, such as inadequately trained staff to meet the increasing demands of users.

  18. Development and Validation of Analytical Method for Losartan ...

    African Journals Online (AJOL)

    HP

    Purpose: To develop a new spectrophotometric method for the analysis of losartan potassium in pharmaceutical formulations by making its complex with copper. Method: A coloured complex based on UV/Vis spectroscopic method was developed for the determination of losartan potassium concentration in pharmaceutical ...

  19. Supporting medical technology development with the analytic hierarchy process

    NARCIS (Netherlands)

    Hummel, J. Marjan

    2001-01-01

    This thesis aims to develop an adequate method of CTA to influence decision making about the development and clinical application of a medical technology. The adequacy of this method is related to the timing of its application, the information used in the assessment, and the consensus formation about

  20. Seductive Atmospheres: Using tools to effectuate spaces for Leadership Development

    DEFF Research Database (Denmark)

    Elmholdt, Kasper Trolle; Clausen, Rune Thorbjørn; Madsen, Mona T

    2018-01-01

    This study applies an affordance lens to understand the use of management tools and how atmospheres for change and development are created and exploited. Drawing on an ethnographic case study of a consultant-facilitated change intervention among a group of research leaders at a Danish Public Hospital, this study investigates how a business game is used as a tool to effectuate episodic spaces for leadership development. The study reveals three tool affordances, discusses how they enable and constrain episodic spaces for development, and develops the notion of seductive atmospheres as an important mechanism. The article suggests that a broader understanding of the use of tools and the role of atmospheres is essential for understanding how episodic spaces for development come to work in relation to organizational change and development.

  1. Technology of developing subject-oriented analytical systems

    International Nuclear Information System (INIS)

    Kumykov, B.Kh.

    2013-01-01

    One of the problem areas of modern information technology is the development of data warehouse (DW) architecture. This task is always topical because the purely software-oriented solutions are cheaper and admit a broader range of application. A new DW architecture is proposed, which allows developers to get advantage in some aspects: time to implement the system, time to consolidate data, and time to change the algorithms in the system [ru

  2. Photography by Cameras Integrated in Smartphones as a Tool for Analytical Chemistry Represented by an Butyrylcholinesterase Activity Assay.

    Science.gov (United States)

    Pohanka, Miroslav

    2015-06-11

    Smartphones are popular devices frequently equipped with sensitive sensors and great computational ability. Despite the widespread availability of smartphones, practical uses in analytical chemistry are limited, though some papers have proposed promising applications. In the present paper, a smartphone is used as a tool for the determination of cholinesterasemia, i.e., the determination of the biochemical marker butyrylcholinesterase (BChE). The work demonstrates the suitability of a smartphone-integrated camera for analytical purposes. Paper strips soaked with indoxylacetate were used for the determination of BChE activity, while the standard Ellman's assay was used as a reference measurement. In the smartphone-based assay, BChE converted indoxylacetate to indigo blue and the coloration was photographed using the phone's integrated camera. An RGB color model was analyzed and color values for the individual color channels were determined. The assay was verified using plasma samples and samples containing pure BChE, and validated against Ellman's assay. The smartphone assay proved to be reliable and applicable for routine diagnoses where BChE serves as a marker (liver function tests, some poisonings, etc.). It can be concluded that the assay is expected to be of practical applicability because of the relevance of the results.
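The colour-readout step described in this record (photograph the strip, average the RGB channels over a region of interest, map one channel to activity through a calibration curve) can be sketched as follows. This is a minimal illustration assuming NumPy; the region coordinates, slope and intercept are made-up placeholders, not the paper's calibration.

```python
import numpy as np

def mean_rgb(image, box):
    """Mean R, G, B values over a rectangular region of interest.

    image: H x W x 3 uint8 array (as decoded from a phone photo);
    box: (row0, row1, col0, col1) region covering the test strip.
    """
    r0, r1, c0, c1 = box
    roi = image[r0:r1, c0:c1, :].astype(float)
    return roi.mean(axis=(0, 1))  # [R_mean, G_mean, B_mean]

def bche_activity(red_mean, slope=-0.05, intercept=12.0):
    """Map the red-channel mean to BChE activity via a linear
    calibration. slope/intercept are hypothetical placeholders; a
    real assay would fit them to standards of known activity."""
    return slope * red_mean + intercept

# Synthetic 100x100 "photo" of an indigo-blue strip
img = np.zeros((100, 100, 3), dtype=np.uint8)
img[:, :, 0] = 60   # R
img[:, :, 1] = 70   # G
img[:, :, 2] = 160  # B

r, g, b = mean_rgb(img, (10, 90, 10, 90))
print(bche_activity(r))  # activity estimate from the red channel
```

In practice the red channel is a sensible choice for an indigo-blue product, since blue coloration absorbs in the red part of the spectrum; a real application would also correct for lighting between photographs.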

  3. Evaluation and selection of CASE tool for SMART OTS development

    International Nuclear Information System (INIS)

    Park, K. O; Seo, S. M.; Seo, Y. S.; Koo, I. S.; Jang, M. H.

    1999-01-01

    CASE (Computer-Aided Software Engineering) tools are software that aids software engineering activities such as requirement analysis, design, testing, configuration management, and project management. The evaluation and selection of commercial CASE tools for a specific software development project is not easy, because it requires technical ability on the part of the evaluator and maturity of the software development organization. In this paper, we discuss the selection strategies, characteristics survey, evaluation criteria, and the result of CASE tool selection for the development of the SMART (System-integrated Modular Advanced ReacTor) OTS (Operator Training Simulator)

  4. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, Edwin J.; Frambach, Ruud T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies' turnover, (2) MR companies' awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers' perceptions of the influence of client

  5. Developing Tool Support for Problem Diagrams with CPN and VDM++

    DEFF Research Database (Denmark)

    Tjell, Simon; Lassen, Kristian Bisgaard

    2008-01-01

    In this paper, we describe ongoing work on the development of tool support for formal description of domains found in Problem Diagrams. The purpose of the tool is to handle the generation of a CPN model based on a collection of Problem Diagrams. The Problem Diagrams are used for representing the ...

  6. Market research companies and new product development tools

    NARCIS (Netherlands)

    Nijssen, E.J.; Frambach, R.T.

    1998-01-01

    This research investigates (1) the share of new product development (NPD) research services in market research (MR) companies’ turnover, (2) MR companies’ awareness and use of NPD tools and the modifications made to these NPD tools, and (3) MR company managers’ perceptions of the influence of client

  7. Evaluating IMU communication skills training programme: assessment tool development.

    Science.gov (United States)

    Yeap, R; Beevi, Z; Lukman, H

    2008-08-01

    This article describes the development of four assessment tools designed to evaluate the communication skills training (CST) programme at the International Medical University (IMU). The tools measure pre-clinical students' 1) perceived competency in basic interpersonal skills, 2) attitude towards patient-centred communication, 3) conceptual knowledge on doctor-patient communication, and 4) acceptance of the CST programme.

  8. Analytical Tools for the Application of Operational Culture: A Case Study in the Trans-Sahel

    Science.gov (United States)

    2011-03-28

    Examples of simple contexts may be certain culturally specific behaviors such as how to greet someone, whether an offer of food and drink should be... Colombia for the Pan American Development Foundation. These projects covered agribusiness and agricultural development, infrastructure construction and...conversations. We are very energized by what we have learned to date and look forward to working with you on this important study. See you all shortly

  9. The Development of a Decision Aid with a Multi Criterial Analytic Approach for Women with Pelvic Organ Prolapse

    DEFF Research Database (Denmark)

    Hulbæk, Mette; Primdahl, Jette; Nielsen, Jesper Bo

    The Development of a Decision Aid with a Multi Criterial Analytic Approach for Women with Pelvic Organ Prolapse.

  10. Designing a tool for curriculum leadership development in postgraduate programs

    Directory of Open Access Journals (Sweden)

    M Avizhgan

    2016-07-01

    Full Text Available Introduction: Leadership in the area of curriculum development is increasingly important as we look for ways to improve our programmes and practices. In curriculum studies, leadership has received little attention. Considering the lack of an evaluation tool with objective criteria for the postgraduate curriculum leadership process, this study aimed to design a specific tool and determine its validity and reliability. Method: This is a methodological study. At first, the domains and items of the tool were determined through expert interviews and a literature review. Then, using the Delphi technique, 54 important criteria were developed. A panel of experts was used to confirm content and face validity. Reliability was determined by a descriptive study involving 30 faculty members from two universities in Isfahan and was estimated by internal consistency. The data were analyzed with SPSS software, using the Pearson correlation coefficient and reliability analysis. Results: Based on the definition of curriculum leadership, the domains and items of the tool were determined and a primary tool was developed. Expert faculty views were used at different stages of development and psychometric evaluation. The internal consistency of the tool, as measured by Cronbach's alpha, was 0.965; this was also determined for each domain separately. Conclusion: Applying this instrument can improve the effectiveness of curriculum leadership. Identifying the characteristics of successful and effective leaders, and utilizing this knowledge in developing and implementing curricula, might help us better respond to the changing needs of our students, teachers and schools of tomorrow.
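The internal-consistency estimate reported in this record is Cronbach's alpha, computed from the item variances and the variance of each respondent's total score: alpha = k/(k-1) · (1 − Σ s²_item / s²_total). A minimal sketch with made-up toy ratings (not the study's data), using only the Python standard library:

```python
import statistics

def cronbach_alpha(items):
    """Cronbach's alpha for a tool's items.

    items: list of k lists, each holding the scores that the n
    respondents gave to one item (sample variances throughout).
    """
    k = len(items)
    n = len(items[0])
    item_vars = [statistics.variance(col) for col in items]
    # Total score of each respondent across all k items
    totals = [sum(col[i] for col in items) for i in range(n)]
    total_var = statistics.variance(totals)
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

# Toy data: 3 items rated by 5 respondents (illustrative only)
items = [
    [4, 5, 3, 4, 5],
    [4, 4, 3, 5, 5],
    [5, 5, 2, 4, 4],
]
print(round(cronbach_alpha(items), 3))
```

Values above about 0.9, such as the 0.965 reported here, indicate very high internal consistency (and sometimes item redundancy).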

  11. Development of tools for optimization of HWC

    International Nuclear Information System (INIS)

    Wikmark, Gunnar; Lundgren, Klas; Wijkstroem, Hjalmar; Pein, Katarina; Ullberg, Mats

    2004-06-01

    An ECP model for the Swedish Boiling Water Reactors (BWRs) was developed in a previous project sponsored by the Swedish Nuclear Power Inspectorate. The present work is an extension of that effort, in which the model has been extended in three ways. Some potential problem areas of the ECP sub-model have been treated in full detail. A comprehensive calibration data set has been assembled from plant data and from laboratory and in-plant experiments. The model has been fitted to the calibration data set and the model parameters adjusted. The work on the ECP sub-model has demonstrated that the generalised Butler-Volmer equation, as previously used, adequately describes the electrochemistry. Thus, there is no need to treat the system surface oxides as semiconductors or to take double-layer effects into account. The existence of a pseudo-potential for the reaction of oxygen on stainless steel is confirmed. The concentration and temperature dependence of the exchange current densities are still unclear, so an experimental investigation of these is desirable. An interesting alternative to a conventional experimental set-up is to combine modelling with simpler and more easily controlled experiments. In addition to providing a calibration data set, the survey of plant data has also led to an improved understanding of the necessary parameters of an ECP model. For example, variations of the H2 injection rate at constant reactor power level and constant recirculation flow rate were traced to variations in the relative power level of the fuel elements in the core periphery. The power level in the core periphery determines the dose rate in the downcomer and controls the recombination reaction that is fundamental to Hydrogen Water Chemistry (HWC). To accurately model ECP as a function of hydrogen injection rate and other plant parameters, the relative power level of the core periphery is a necessary model parameter that has to be regularly updated from core management codes
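The electrochemistry underlying such an ECP model treats the corrosion potential as the mixed potential at which the Butler-Volmer partial currents of the competing redox couples sum to zero. The sketch below illustrates that idea only; the couples, equilibrium potentials, exchange current densities and transfer coefficients are invented placeholders, not the calibrated parameters of this report.

```python
import math

F, R, T = 96485.0, 8.314, 561.0  # Faraday, gas constant, ~288 degC coolant

def bv_current(E, E_eq, i0, alpha_a=0.5, alpha_c=0.5):
    """Butler-Volmer partial current density (anodic positive)
    for one redox couple at electrode potential E."""
    eta = E - E_eq
    f = F / (R * T)
    return i0 * (math.exp(alpha_a * f * eta) - math.exp(-alpha_c * f * eta))

# Hypothetical couples: (E_eq in V vs. SHE, i0 in A/m^2) --
# illustrative stand-ins for metal dissolution and oxidant reduction.
couples = [(-0.50, 1e-3), (0.10, 5e-4), (0.25, 2e-4)]

def net_current(E):
    return sum(bv_current(E, E_eq, i0) for E_eq, i0 in couples)

def ecp(lo=-1.0, hi=1.0, tol=1e-6):
    """Mixed (corrosion) potential: bisection on net current = 0.
    net_current is monotonically increasing in E, so bisection works."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if net_current(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

E_corr = ecp()
print(round(E_corr, 3), "V")
```

Lowering the oxidant exchange current densities (as hydrogen injection removes oxidants via the recombination reaction) shifts the computed mixed potential toward more negative values, which is the qualitative behaviour an HWC ECP model must reproduce.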

  12. Water Loss Management: Tools and Methods for Developing Countries

    NARCIS (Netherlands)

    Mutikanga, H.E.

    2012-01-01

    Water utilities in developing countries are struggling to provide customers with a reliable level of service due to their peculiar water distribution characteristics including poorly zoned networks with irregular supply operating under restricted budgets. These unique conditions demand unique tools

  13. Water Loss Management : Tools and Methods for Developing Countries

    NARCIS (Netherlands)

    Mutikanga, H.E.

    2012-01-01

    Water utilities in developing countries are struggling to provide customers with a reliable level of service due to their peculiar water distribution characteristics including poorly zoned networks with irregular supply operating under restricted budgets. These unique conditions demand unique tools

  14. Psychometric properties of a Mental Health Team Development Audit Tool.

    LENUS (Irish Health Repository)

    Roncalli, Silvia

    2013-02-01

    To assist in improving team working in Community Mental Health Teams (CMHTs), the Mental Health Commission formulated a user-friendly but yet-to-be validated 25-item Mental Health Team Development Audit Tool (MHDAT).

  15. Development of a multi-analyte integrated optical sensor platform for indoor air-quality monitoring

    Science.gov (United States)

    McGaughey, Orla; Nooney, Robert; McEvoy, Aisling K.; McDonagh, Colette; MacCraith, Brian D.

    2005-11-01

    The major trends driving optical chemical sensor technology are miniaturisation and multi-parameter functionality on a single platform (so-called multi-analyte sensing). A multi-analyte sensor chip device based on miniature waveguide structures, porous sensor materials and compact optoelectronic components has been developed. One of the major challenges in fluorescence-based optical sensor design is the efficient capture of emitted fluorescence from a fluorophore and the effective detection of the signal. In this work, the sensor platform has been fabricated using poly(methyl methacrylate), PMMA, as the waveguide material. These platforms employ a novel optical configuration along with rapid prototyping technology, which facilitates the production of an effective sensor platform. Sensing films for oxygen, carbon dioxide and humidity have been developed. These films consist of a fluorescent indicator dye entrapped in a porous immobilisation matrix. The analyte diffuses through the porous matrix and reacts with the indicator dye, causing changes in the detected fluorescence. The reaction between the dye and the analyte is completely reversible with no degradation of the signal after detection of different concentrations of the analyte. A single LED excitation source is used for all three analytes, and the sensor platform is housed in a compact unit containing the excitation source, filters and detector. The simultaneous detection of several analytes is a major requirement for fields such as food packaging, environmental quality control and biomedical diagnostics. The current sensor chip is designed for use in indoor air-quality monitoring.

  16. BAC-Dkk3-EGFP Transgenic Mouse: An In Vivo Analytical Tool for Dkk3 Expression

    Directory of Open Access Journals (Sweden)

    Yuki Muranishi

    2012-01-01

    Full Text Available Dickkopf (DKK) family proteins are secreted modulators of the Wnt signaling pathway and are capable of regulating the development of many organs and tissues. We previously identified Dkk3 as a molecule predominantly expressed in the mouse embryonic retina. However, which cells express Dkk3 in the developing and mature mouse retina remained to be elucidated. To examine the precise expression of the Dkk3 protein, we generated BAC-Dkk3-EGFP transgenic mice that express EGFP integrated into the Dkk3 gene in a BAC plasmid. Expression analysis using the BAC-Dkk3-EGFP transgenic mice revealed that Dkk3 is expressed in retinal progenitor cells (RPCs at embryonic stages and in Müller glial cells in the adult retina. Since Müller glial cells may play a potential role in retinal regeneration, BAC-Dkk3-EGFP mice could be useful for retinal regeneration studies.

  17. ISS Biotechnology Facility - Overview of Analytical Tools for Cellular Biotechnology Investigations

    Science.gov (United States)

    Jeevarajan, A. S.; Towe, B. C.; Anderson, M. M.; Gonda, S. R.; Pellis, N. R.

    2001-01-01

    The ISS Biotechnology Facility (BTF) platform provides scientists with a unique opportunity to carry out diverse experiments in a microgravity environment for an extended period of time. Although considerable progress has been made in preserving cells on the ISS for long periods for later return to Earth, future biotechnology experiments would desirably monitor, process, and analyze cells in a timely way on-orbit. One aspect of our work has been directed towards developing biochemical sensors for pH, glucose, oxygen, and carbon dioxide for a perfused bioreactor system developed at Johnson Space Center. Another aspect is the examination and identification of new and advanced commercial biotechnologies that may have applications to on-orbit experiments.

  18. Completely Analytical Tools for the Next Generation of Surface and Coating Optimization

    Directory of Open Access Journals (Sweden)

    Norbert Schwarzer

    2014-04-01

    Full Text Available Considerable effort is usually required to obtain tribological parameters such as Archard’s wear depth parameter kd: complex tribological experiments have to be performed and analyzed. This paper features an approach in which such parameters are extracted from effective interaction potentials in combination with more physically oriented measurements, such as nanoindentation and physical scratch tests; the effective potentials are built up and fed from such tests. Using effective material potentials, one can derive the critical loading situations leading to failure (decomposition strength) for any contact situation. Connecting these decomposition or failure states with the corresponding stress or strain distributions then allows the development of rather comprehensive tribological parameter models applicable in wear and fatigue simulations, as demonstrated in this work. On this basis, a new, relatively general wear model has been developed from the effective indenter concept, using the extended Hertzian approach for a great variety of loading situations. The models not only allow the analysis of certain tribological experiments, such as the well-known pin-on-disk test or the more recently developed nano-fretting test, but also the forward simulation of such tests; they can even give hints for structured optimization and result in better component lifetime prediction. The work shows how the procedure is applied in general, and a small selection of practical examples is presented.

  19. Effectiveness of operation tools developed by KEKB operators

    International Nuclear Information System (INIS)

    Sugino, K.; Satoh, Y.; Kitabayashi, T.

    2004-01-01

    The main tasks of KEKB (High Energy Accelerator Research Organization B-physics) operators are beam tuning and injection, operation logging, monitoring of accelerator conditions, and safety management. New beam tuning methods are frequently applied to KEKB in order to achieve high luminosity. In this situation, various operation tools have been developed by the operators to realize efficient operation. In this paper, we describe the effectiveness of the tools developed by the operators. (author)

  20. Measuring vaccine hesitancy: The development of a survey tool.

    OpenAIRE

    Larson, HJ; Jarrett, C; Schulz, WS; Chaudhuri, M; Zhou, Y; Dube, E; Schuster, M; MacDonald, NE; Wilson, R; SAGE Working Group on Vaccine Hesitancy; Collaborators: Eskola, J; Liang, X; Chaudhuri, M; Dubé, E

    2015-01-01

    In March 2012, the SAGE Working Group on Vaccine Hesitancy was convened to define the term "vaccine hesitancy", as well as to map the determinants of vaccine hesitancy and develop tools to measure and address the nature and scale of hesitancy in settings where it is becoming more evident. The definition of vaccine hesitancy and a matrix of determinants guided the development of a survey tool to assess the nature and scale of hesitancy issues. Additionally, vaccine hesitancy questi...

  1. Developments and automation in purex process control analytical measurement systems (Preprint no. IT-20)

    International Nuclear Information System (INIS)

    Ramanujam, A.

    1991-02-01

    The fuel reprocessing facility based on the purex process depends on efficient process-control analytical measurement systems for its successful operation. The process control laboratory plays a vital role in catering to these requirements. This paper describes the various efforts made to improve its performance capabilities in three major areas of operation, viz. sample handling, analysis, and data processing. In developing automation aids and analytical techniques, apart from the special emphasis on reducing personnel exposure to radiation and the time required for analysis, due consideration has been given to the operational reliability and safety of the system. (author). 15 refs., 4 tabs., 3 figs

  2. Review of the Development of Learning Analytics Applied in College-Level Institutes

    Directory of Open Access Journals (Sweden)

    Ken-Zen Chen

    2014-07-01

    Full Text Available This article focuses on the recent development of Learning Analytics using higher-education institutional big data. It addresses the current state of Learning Analytics, creates a shared understanding, and clarifies misconceptions about the field. The article also reviews prominent examples from peer institutions that are conducting analytics, identifies their data and methodological frameworks, and comments on market vendors and not-for-profit initiatives. Finally, it suggests an implementation agenda for potential institutions and their stakeholders by drafting the necessary preparations and creating iterative implementation flows.

  3. Work and Learner Identity -Developing an analytical framework

    DEFF Research Database (Denmark)

    Kondrup, Sissel

    ... to comprehend work situations as crucial spaces for learning and for the continuing development, maintenance or transformation of identity. It is therefore necessary to bring work to the forefront of analysis and to focus on people's work-life experiences when trying to understand how they perceive themselves ... condition people have to comply with the demand for engagement in lifelong learning. Secondly, I argue that people's engagement in work must be considered essential when seeking to understand and examine how learner identities are formed, maintained or even transformed throughout people's life course ... between work and identity. Based on Archer's and Salling-Olesen's concepts, I finally outline a theoretical framework enabling researchers to understand and examine the dialectical nature of learner identities, formed through people's ongoing engagement in specific historical, social and material work ...

  4. Active content determination of pharmaceutical tablets using near infrared spectroscopy as Process Analytical Technology tool.

    Science.gov (United States)

    Chavez, Pierre-François; Sacré, Pierre-Yves; De Bleye, Charlotte; Netchacovitch, Lauranne; Mantanus, Jérôme; Motte, Henri; Schubert, Martin; Hubert, Philippe; Ziemons, Eric

    2015-11-01

    The aim of this study was to develop near infrared (NIR) methods to determine the active content of non-coated pharmaceutical tablets manufactured from a proportional tablet formulation. These NIR methods are intended for monitoring the active content of tablets during the tableting process. Firstly, methods were developed in transmission and reflection modes to quantify the API content of the lowest dosage strength. Secondly, these methods were fully validated over a concentration range of 70-130% of the target active content using the accuracy profile approach based on β-expectation tolerance intervals. The model using the transmission mode showed a better ability to predict the correct active content than the reflection one; however, the ability of the reflection mode to quantify the API content of the highest dosage strength was also assessed. Furthermore, the NIR method based on the transmission mode was successfully used to monitor the tablet active content at-line during the tableting process, providing better insight into the API content during the process. The improved control of product quality provided by this PAT method is fully in line with the Quality by Design (QbD) concept. Finally, the transfer of the transmission model from the off-line to an on-line spectrometer was successfully investigated. Copyright © 2015 Elsevier B.V. All rights reserved.
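The β-expectation tolerance interval behind the accuracy profile approach can be illustrated with a short calculation: for n recovery results at one concentration level, the interval is x̄ ± t·s·√(1 + 1/n), and the method is accepted at that level if the interval stays inside the acceptance limits. The recoveries below are made-up numbers, not the study's data; the t value 2.776 is the tabled two-sided 95% Student t for 4 degrees of freedom.

```python
import math
import statistics

def beta_expectation_interval(values, t_crit):
    """beta-expectation tolerance interval: mean +/- t * s * sqrt(1 + 1/n).

    t_crit is the two-sided Student t critical value for the chosen
    beta and n-1 degrees of freedom (taken from tables here, so the
    sketch needs no non-stdlib dependency).
    """
    n = len(values)
    m = statistics.mean(values)
    s = statistics.stdev(values)
    half = t_crit * s * math.sqrt(1 + 1 / n)
    return m - half, m + half

# Illustrative recoveries (% of target content) at one level;
# t = 2.776 for beta = 95% and n - 1 = 4 degrees of freedom.
recoveries = [99.1, 100.4, 98.7, 101.2, 100.1]
lo, hi = beta_expectation_interval(recoveries, 2.776)

# Accept the method at this level if the interval lies inside the
# acceptance limits, here taken as 95-105% of target content.
print(95 <= lo and hi <= 105)  # prints True
```

The accuracy profile of the study plots such intervals across the 70-130% validated range and accepts the method wherever they remain within the acceptance limits.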

  5. A Decision-Analytic Feasibility Study of Upgrading Machinery at a Tools Workshop

    Directory of Open Access Journals (Sweden)

    M. L. Chew Hernandez

    2012-04-01

    Full Text Available This paper presents an evaluation, from a Decision Analysis point of view, of the feasibility of upgrading machinery at an existing metal-forming workshop. The Integral Decision Analysis (IDA) methodology is applied to clarify the decision and develop a decision model. One of the key advantages of IDA is its careful selection of the problem frame, allowing a correct problem definition. While following most of the original IDA methodology, this work proposes an addition to it: using the strategic Means-Ends Objective Network as a backbone for the development of the decision model. The constructed decision model uses influence diagrams to include factual operator and vendor expertise, simulation to evaluate the alternatives, and a utility function to take into account the risk attitude of the decision maker. Three alternatives are considered: Base (no modification), CNC (installing an automatic lathe) and CF (installing an automatic milling machine). The results are presented as a graph showing the zones in which each alternative should be selected, demonstrating the potential of IDA to tackle technical decisions that are otherwise approached without due care.
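The way a utility function lets such a decision model encode risk attitude can be shown with a toy calculation: under an exponential utility, a risk-averse decision maker may prefer an alternative with lower expected profit but less spread. The alternatives below reuse the study's labels, but the profit lotteries and the risk-tolerance parameter are invented for illustration and are not the study's figures.

```python
import math

# Hypothetical annual-profit lotteries, as (probability, profit) pairs
alternatives = {
    "Base": [(1.0, 100.0)],
    "CNC":  [(0.6, 180.0), (0.4, 60.0)],
    "CF":   [(0.5, 260.0), (0.5, 20.0)],
}

def utility(x, rho=150.0):
    """Exponential utility; rho is the risk tolerance
    (smaller rho means more risk-averse)."""
    return 1 - math.exp(-x / rho)

def expected_utility(lottery, rho=150.0):
    return sum(p * utility(x, rho) for p, x in lottery)

best = max(alternatives, key=lambda a: expected_utility(alternatives[a]))
print(best)  # prints CNC
```

With these numbers CF has the highest expected profit (140 vs. 132 for CNC), yet the risk-averse decision maker picks CNC: the utility function penalizes CF's wide spread of outcomes, which is exactly the effect the study's decision model captures.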

  6. Metal-hexacyanoferrate films: a tool in analytical chemistry

    Directory of Open Access Journals (Sweden)

    Ivanildo Luiz de Mattos

    2001-04-01

    Full Text Available Chemically modified electrodes based on hexacyanometalate films are presented as a tool in analytical chemistry. The use of amperometric sensors and/or biosensors based on metal-hexacyanoferrate films is a growing trend. This article reviews applications of these films for the analytical determination of both inorganic (e.g., As3+, S2O32-) and organic (e.g., cysteine, hydrazine, ascorbic acid, glutathione, glucose) compounds.

  7. Development of an analytical microbial consortia method for enhancing performance monitoring at aerobic wastewater treatment plants.

    Science.gov (United States)

    Razban, Behrooz; Nelson, Kristina Y; McMartin, Dena W; Cullimore, D Roy; Wall, Michelle; Wang, Dunling

    2012-01-01

An analytical method to produce profiles of bacterial biomass fatty acid methyl esters (FAME) was developed employing rapid agitation followed by static incubation (RASI) using selective media of wastewater microbial communities. The results were compiled to produce a unique library for comparison and performance analysis at a Wastewater Treatment Plant (WWTP). A total of 146 samples from the aerated WWTP, comprising 73 samples each of secondary and tertiary effluent, were analyzed. For comparison purposes, all samples were evaluated via a similarity index (SI), with secondary effluents producing an SI of 0.88 with 2.7% variation and tertiary samples producing an SI of 0.86 with 5.0% variation. The results also highlighted significant differences between the fatty acid profiles of the tertiary and secondary effluents, indicating considerable shifts in the bacterial community profile between these treatment phases. The WWTP performance results using this method were highly replicable and reproducible, indicating that the protocol has potential as a performance-monitoring tool for aerated WWTPs. The results quickly and accurately reflect shifts in dominant bacterial communities that result when process operations and performance change.
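The abstract does not give the exact similarity-index formula used; as an illustration of one common way to compare FAME composition profiles, here is a cosine-similarity sketch in Python. All fatty acid names and abundances below are hypothetical:

```python
import math

def similarity_index(profile_a, profile_b):
    """Cosine similarity between two FAME percent-composition profiles.
    Profiles are dicts mapping fatty acid name -> relative abundance (%).
    Returns 1.0 for identical profiles, values near 0 for dissimilar ones."""
    acids = set(profile_a) | set(profile_b)
    a = [profile_a.get(k, 0.0) for k in acids]
    b = [profile_b.get(k, 0.0) for k in acids]
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical secondary vs. tertiary effluent profiles.
secondary = {"16:0": 32.1, "18:1 w9c": 24.5, "18:0": 12.3, "14:0": 8.0}
tertiary  = {"16:0": 30.5, "18:1 w9c": 20.1, "18:0": 15.8, "14:0": 6.2}
si = similarity_index(secondary, tertiary)
```

Comparing each new sample against a reference library of such profiles is the kind of screening the abstract describes.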

  8. THz spectroscopy: An emerging technology for pharmaceutical development and pharmaceutical Process Analytical Technology (PAT) applications

    Science.gov (United States)

    Wu, Huiquan; Khan, Mansoor

    2012-08-01

As an emerging technology, THz spectroscopy has gained increasing attention in the pharmaceutical area during the last decade. This attention is due to the fact that (1) it provides a promising alternative approach for in-depth understanding of both intermolecular interactions among pharmaceutical molecules and pharmaceutical product quality attributes; (2) it provides a promising alternative approach for enhanced process understanding of certain pharmaceutical manufacturing processes; and (3) it aligns with the FDA pharmaceutical quality initiatives, most notably the Process Analytical Technology (PAT) initiative. In this work, the current status of and progress made so far on using THz spectroscopy for pharmaceutical development and pharmaceutical PAT applications are reviewed. In the spirit of demonstrating the utility of a first-principles modeling approach for addressing the model validation challenge and reducing unnecessary model validation "burden" to facilitate THz pharmaceutical PAT applications, two scientific case studies based on published THz spectroscopy measurement results are created and discussed. Furthermore, other technical challenges and opportunities associated with adopting THz spectroscopy as a pharmaceutical PAT tool are highlighted.

  9. Analytical tools for the study of cellular glycosylation in the immune system

    Directory of Open Access Journals (Sweden)

    Yvette van Kooyk

    2013-12-01

Full Text Available It is becoming increasingly clear that glycosylation plays an important role in intercellular communication within the immune system. Glycosylation-dependent interactions are crucial for the innate and adaptive immune system and regulate immune cell trafficking, synapse formation, activation, and survival. These functions take place through the cis or trans interaction of lectins with glycans. Classical immunological and biochemical methods have been used for the study of lectin function; however, the investigation of their counterparts, glycans, requires very specialized methodologies that have been extensively developed in the past decade within the Glycobiology scientific community. This Mini-Review summarizes the available technology for the study of glycan biosynthesis, regulation and characterization, and its application to the study of glycans in Immunology.

  10. Temperature-controlled micro-TLC: a versatile green chemistry and fast analytical tool for separation and preliminary screening of steroids fraction from biological and environmental samples.

    Science.gov (United States)

    Zarzycki, Paweł K; Slączka, Magdalena M; Zarzycka, Magdalena B; Bartoszuk, Małgorzata A; Włodarczyk, Elżbieta; Baran, Michał J

    2011-11-01

This paper is a continuation of our previous research focusing on the development of micro-TLC methodology under temperature-controlled conditions. The main goal of the present paper is to demonstrate the separation and detection capability of the micro-TLC technique using simple analytical protocols without multi-step sample pre-purification. One of the advantages of planar chromatography over its column counterpart is that each TLC run can be performed on a previously unused stationary phase. Therefore, it is possible to fractionate or separate complex samples characterized by heavy biological matrix loading. In the present studies, components of interest, mainly steroids, were isolated from biological samples such as fish bile using single pre-treatment steps involving direct organic liquid extraction and/or deproteinization by a freeze-drying method. Low-molecular-mass compounds with polarity ranging from estetrol to progesterone derived from environmental samples (lake water, untreated and treated sewage waters) were concentrated using optimized solid-phase extraction (SPE). Specific band patterns for samples derived from surface water of the Middle Pomerania region in northern Poland can easily be observed on the obtained micro-TLC chromatograms. This approach can be useful as a simple and inexpensive complementary method for fast control and screening of treated sewage water discharged by municipal wastewater treatment plants. Moreover, our experimental results show the potential of micro-TLC as an efficient tool for retention measurements of a wide range of steroids under reversed-phase (RP) chromatographic conditions. These data can be used for further optimization of SPE or HPLC systems working under RP conditions. Furthermore, we also demonstrated that a micro-TLC-based analytical approach can be applied as an effective method for the internal standard (IS) substance search.
Generally, the described methodology can be applied for fast fractionation or screening of the

  11. Development Of Remote Hanford Connector Gasket Replacement Tooling For DWPF

    International Nuclear Information System (INIS)

    Krementz, D.; Coughlin, Jeffrey

    2009-01-01

The Defense Waste Processing Facility (DWPF) requested the Savannah River National Laboratory (SRNL) to develop tooling and equipment to remotely replace gaskets in mechanical Hanford connectors, to reduce personnel radiation exposure compared to the current hands-on method. It is also expected that radiation levels will continually increase with future waste streams. The equipment is operated in the Remote Equipment Decontamination Cell (REDC), which is equipped with compressed air, two master-slave manipulators (MSMs) and an electro-mechanical manipulator (EMM) arm for operation of the remote tools. The REDC does not provide access to electrical power, so the equipment must be manually or pneumatically operated. The MSMs have a load limit at full extension of ten pounds, which limited the weight of the installation tool. In order to remotely replace Hanford connector gaskets, several operations must be performed remotely; these include removal of the spent gasket and retaining ring (also called a snap ring), loading the new snap ring and gasket into the installation tool, and installation of the new gasket into the Hanford connector. SRNL developed and tested tools that successfully perform all of the necessary tasks. Removal of snap rings from horizontal and vertical connectors is performed by separate air-actuated retaining ring removal tools, manipulated in the cell by the MSM. In order to install a new gasket, the snap ring loader is used to load a new snap ring into a groove in the gasket installation tool. A new gasket is placed on the installation tool and retained by custom springs. An MSM lifts the installation tool and presses the mounted gasket against the connector block. Once the installation tool is in position, the gasket and snap ring are installed onto the connector by pneumatic actuation. All of the tools are located on a custom work table with a pneumatic valve station that directs compressed air to the desired tool and

  12. DEVELOPMENT OF REMOTE HANFORD CONNECTOR GASKET REPLACEMENT TOOLING FOR DWPF

    Energy Technology Data Exchange (ETDEWEB)

    Krementz, D.; Coughlin, Jeffrey

    2009-05-05

The Defense Waste Processing Facility (DWPF) requested the Savannah River National Laboratory (SRNL) to develop tooling and equipment to remotely replace gaskets in mechanical Hanford connectors, to reduce personnel radiation exposure compared to the current hands-on method. It is also expected that radiation levels will continually increase with future waste streams. The equipment is operated in the Remote Equipment Decontamination Cell (REDC), which is equipped with compressed air, two master-slave manipulators (MSMs) and an electro-mechanical manipulator (EMM) arm for operation of the remote tools. The REDC does not provide access to electrical power, so the equipment must be manually or pneumatically operated. The MSMs have a load limit at full extension of ten pounds, which limited the weight of the installation tool. In order to remotely replace Hanford connector gaskets, several operations must be performed remotely; these include removal of the spent gasket and retaining ring (also called a snap ring), loading the new snap ring and gasket into the installation tool, and installation of the new gasket into the Hanford connector. SRNL developed and tested tools that successfully perform all of the necessary tasks. Removal of snap rings from horizontal and vertical connectors is performed by separate air-actuated retaining ring removal tools, manipulated in the cell by the MSM. In order to install a new gasket, the snap ring loader is used to load a new snap ring into a groove in the gasket installation tool. A new gasket is placed on the installation tool and retained by custom springs. An MSM lifts the installation tool and presses the mounted gasket against the connector block. Once the installation tool is in position, the gasket and snap ring are installed onto the connector by pneumatic actuation. All of the tools are located on a custom work table with a pneumatic valve station that directs compressed air to the desired

  13. Analytic calculation of radio emission from parametrized extensive air showers: A tool to extract shower parameters

    Science.gov (United States)

    Scholten, O.; Trinh, T. N. G.; de Vries, K. D.; Hare, B. M.

    2018-01-01

    The radio intensity and polarization footprint of a cosmic-ray induced extensive air shower is determined by the time-dependent structure of the current distribution residing in the plasma cloud at the shower front. In turn, the time dependence of the integrated charge-current distribution in the plasma cloud, the longitudinal shower structure, is determined by interesting physics which one would like to extract, such as the location and multiplicity of the primary cosmic-ray collision or the values of electric fields in the atmosphere during thunderstorms. To extract the structure of a shower from its footprint requires solving a complicated inverse problem. For this purpose we have developed a code that semianalytically calculates the radio footprint of an extensive air shower given an arbitrary longitudinal structure. This code can be used in an optimization procedure to extract the optimal longitudinal shower structure given a radio footprint. On the basis of air-shower universality we propose a simple parametrization of the structure of the plasma cloud. This parametrization is based on the results of Monte Carlo shower simulations. Deriving the parametrization also teaches which aspects of the plasma cloud are important for understanding the features seen in the radio-emission footprint. The calculated radio footprints are compared with microscopic CoREAS simulations.
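The paper's own parametrization of the plasma cloud is not reproduced in the abstract, but longitudinal shower structures of the kind it builds on are commonly described by the Gaisser-Hillas profile. A hedged Python sketch with illustrative parameter values (not taken from the paper):

```python
import math

def gaisser_hillas(X, Nmax, Xmax, X0, lam):
    """Gaisser-Hillas longitudinal shower profile: number of charged
    particles at slant depth X (g/cm^2), peaking at Nmax for X = Xmax.
    X0 is the (effective) first-interaction depth, lam a width scale."""
    if X <= X0:
        return 0.0
    return Nmax * ((X - X0) / (Xmax - X0)) ** ((Xmax - X0) / lam) \
           * math.exp((Xmax - X) / lam)

# Illustrative profile: a shower with Xmax = 750 g/cm^2 and 1e9 particles at peak.
profile = [gaisser_hillas(float(X), Nmax=1e9, Xmax=750.0, X0=0.0, lam=70.0)
           for X in range(0, 1200, 10)]
```

Fitting the few parameters of such a profile to a measured radio footprint is the inverse problem the semianalytic code is designed to make tractable.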

  14. SASfit: a tool for small-angle scattering data analysis using a library of analytical expressions.

    Science.gov (United States)

    Breßler, Ingo; Kohlbrecher, Joachim; Thünemann, Andreas F

    2015-10-01

    SASfit is one of the mature programs for small-angle scattering data analysis and has been available for many years. This article describes the basic data processing and analysis workflow along with recent developments in the SASfit program package (version 0.94.6). They include (i) advanced algorithms for reduction of oversampled data sets, (ii) improved confidence assessment in the optimized model parameters and (iii) a flexible plug-in system for custom user-provided models. A scattering function of a mass fractal model of branched polymers in solution is provided as an example for implementing a plug-in. The new SASfit release is available for major platforms such as Windows, Linux and MacOS. To facilitate usage, it includes comprehensive indexed documentation as well as a web-based wiki for peer collaboration and online videos demonstrating basic usage. The use of SASfit is illustrated by interpretation of the small-angle X-ray scattering curves of monomodal gold nanoparticles (NIST reference material 8011) and bimodal silica nanoparticles (EU reference material ERM-FD-102).
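As an illustration of the kind of analytical scattering expression a SASfit model (or user plug-in) evaluates, here is a Python sketch of the classic homogeneous-sphere form factor. The function names and parameter values are illustrative only and do not reflect SASfit's actual C plug-in API:

```python
import math

def sphere_form_factor(q, R):
    """Normalized form-factor amplitude F(q, R) of a homogeneous sphere
    of radius R; F(0) = 1 by construction."""
    x = q * R
    if x < 1e-6:            # small-qR limit to avoid 0/0
        return 1.0
    return 3.0 * (math.sin(x) - x * math.cos(x)) / x**3

def sphere_intensity(q, R, scale=1.0, background=0.0):
    """Scattered intensity I(q) = scale * F(q,R)^2 + background."""
    return scale * sphere_form_factor(q, R) ** 2 + background

# Illustrative model curve for R = 5 nm over q = 0.01 .. 1.0 nm^-1.
curve = [(q, sphere_intensity(q, R=5.0))
         for q in (0.01 * i for i in range(1, 101))]
```

A size-distribution model, such as the bimodal silica particles mentioned above, would sum such intensities weighted by the distribution.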

  15. Analytical support of plant specific SAMG development validation of SAMG using MELCOR 1.8.5

    International Nuclear Information System (INIS)

    Duspiva, Jiri

    2006-01-01

There are two NPPs in operation in the Czech Republic. Both have already implemented EOPs, developed in collaboration with WESE. The project on SAMG development has started and follows the previous one for EOPs, also with WESE as the leading organization. Plant-specific SAMGs for the Temelin and Dukovany NPPs are based on the WOG generic SAMGs. The analytical support of plant-specific SAMG development is performed by NRI Rez within the validation process. The basic requirements, and how NRI Rez fulfils them, concern the analysts, the analytical tools and their applications. A more detailed description is given of the approach to preparing the MELCOR code application for the evaluation of hydrogen risk, validation of the recent set of hydrogen passive autocatalytic recombiners, and definition of proposals to amend the hydrogen removal system. Such parametric calculations will require a very wide set of runs. This is not feasible with the whole-plant model; decoupling the calculation by storing the mass and energy sources into the containment is the only practical way. An example of this decoupling for a LOCA scenario is shown. It includes seven sources: heat losses from the primary and secondary circuits, fluid blowdown through the cold leg break, fission products blowdown through the cold leg break, fluid blowdown through a break in the reactor pressure vessel bottom head, fission products through a break in the reactor pressure vessel bottom head, melt ejection from the reactor pressure vessel to the cavity, and gas masses and heat losses from corium in the cavity. The stand-alone containment analysis was tested in two configurations, with and without taking fission products into account. Testing showed very good agreement of all calculations until lower head failure and acceptable agreement after that. Some problematic features also appeared.
The stand-alone test with fission products was possible only after changes in the source code

  16. CorRECTreatment: a web-based decision support tool for rectal cancer treatment that uses the analytic hierarchy process and decision tree.

    Science.gov (United States)

    Suner, A; Karakülah, G; Dicle, O; Sökmen, S; Çelikoğlu, C C

    2015-01-01

The selection of appropriate rectal cancer treatment is a complex multi-criteria decision-making process, in which clinical decision support systems might be used to assist and enrich physicians' decision making. The objective of the study was to develop a web-based clinical decision support tool for physicians in the selection of potentially beneficial treatment options for patients with rectal cancer. The decision support model, developed in our previous study by combining the Analytic Hierarchy Process (AHP) method, which determines the priorities of the criteria, with a decision tree formed using these priorities, was updated and applied to data from 388 patients collected retrospectively. The updated decision model contained 8 and 10 criteria in the first and second steps, respectively. A web-based decision support tool named corRECTreatment was then developed. The compatibility of the treatment recommendations given by expert opinion and by the decision support tool was examined for consistency. Two surgeons were asked to recommend a treatment and an overall survival value for 20 cases that we selected and turned into scenarios from among the most common and rare treatment options in the patient data set. In the AHP analyses of the criteria, it was found that the matrices generated for both decision steps were consistent (consistency ratio < 0.1). When compared with the decisions of the experts, the consistency value for the most frequent cases was found to be 80% for the first decision step and 100% for the second decision step. Similarly, for rare cases consistency was 50% for the first decision step and 80% for the second decision step. The decision model and corRECTreatment, developed by applying it to real patient data, are expected to provide potential users with decision support in rectal cancer treatment processes and to facilitate making projections about treatment options.
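The AHP step described above depends on checking that each pairwise comparison matrix is consistent. A minimal Python sketch of the standard consistency-ratio computation (the 3x3 matrix below is hypothetical, not taken from the study's criteria):

```python
# Saaty's random-consistency indices for matrices of order n.
RI = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
      6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}

def ahp_weights(A):
    """Priority vector via the column-normalization (approximate eigenvector) method."""
    n = len(A)
    col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
    return [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

def consistency_ratio(A):
    """Saaty's CR = CI / RI; values below 0.1 are conventionally acceptable."""
    n = len(A)
    w = ahp_weights(A)
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam_max = sum(Aw[i] / w[i] for i in range(n)) / n
    return (lam_max - n) / ((n - 1) * RI[n])

# Hypothetical, perfectly consistent comparison matrix (weight ratios 0.6:0.3:0.1).
A = [[1.0, 2.0, 6.0],
     [0.5, 1.0, 3.0],
     [1/6, 1/3, 1.0]]
cr = consistency_ratio(A)  # near 0.0, i.e. well below the 0.1 threshold
```

Real expert judgments are rarely perfectly consistent, which is why the study reports a consistency check for each decision step.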

  17. Development and Application of Predictive Tools for MHD Stability Limits in Tokamaks

    Energy Technology Data Exchange (ETDEWEB)

    Brennan, Dylan [Princeton Univ., NJ (United States); Miller, G. P. [Univ. of Tulsa, Tulsa, AZ (United States)

    2016-10-03

This is a project to develop and apply analytic and computational tools to answer physics questions relevant to the onset of non-ideal magnetohydrodynamic (MHD) instabilities in toroidal magnetic confinement plasmas. The focused goal of the research is to develop predictive tools for these instabilities, including an inner-layer solution algorithm, a resistive wall with control coils, and energetic particle effects. The production phase compares studies of instabilities in such systems using analytic techniques, PEST-III and NIMROD. Two important physics puzzles are targeted as guiding thrusts for the analyses. The first is to form an accurate description of the physics determining whether the resistive wall mode or a tearing mode will appear first as β is increased at low rotation and low error fields in DIII-D. The second is to understand the physical mechanism behind recent NIMROD results indicating strong damping and stabilization from energetic particle effects on linear resistive modes. The work seeks to develop a highly relevant predictive tool for ITER, advance the theoretical description of this physics in general, and analyze these instabilities in experiments such as ASDEX Upgrade, DIII-D, JET, JT-60U and NSTX. The awardee on this grant is the University of Tulsa. The research efforts are supervised principally by Dr. Brennan. Support is included for two graduate students, and a strong collaboration with Dr. John M. Finn of LANL. The work includes several ongoing collaborations with General Atomics, PPPL, and the NIMROD team, among others.

  18. Embedded Systems Development Tools: A MODUS-oriented Market Overview

    Directory of Open Access Journals (Sweden)

    Loupis Michalis

    2014-03-01

Full Text Available Background: Embedded systems technology has perhaps been the most dominant technology in high-tech industries in the past decade. The industry has correctly identified the potential of this technology and has put its efforts into exploring its full potential. Objectives: The goal of the paper is to explore the versatility of applications in embedded system development, based on one FP7-SME project. Methods/Approach: Embedded applications normally demand high resilience and quality, as well as conformity to quality standards and rigid performance. As a result, embedded system developers have adopted software methods that yield high quality. A qualitative approach to examining embedded systems development tools has been applied in this work. Results: This paper presents a MODUS-oriented market analysis in the domains of Formal Verification tools, HW/SW co-simulation tools, Software Performance Optimization tools and Code Generation tools. Conclusions: The versatility of applications this technology serves is remarkable. With all this performance potential, the technology has carried with it a large number of issues which the industry needs to resolve in order to harness the full potential contained. The MODUS project toolset addressed four discrete domains of the ESD Software Market, in which corresponding open tools were developed.

  19. Analytical Validation of Accelerator Mass Spectrometry for Pharmaceutical Development: the Measurement of Carbon-14 Isotope Ratio.

    Energy Technology Data Exchange (ETDEWEB)

    Keck, B D; Ognibene, T; Vogel, J S

    2010-02-05

Accelerator mass spectrometry (AMS) is an isotope-based measurement technology that utilizes carbon-14 labeled compounds in the pharmaceutical development process to measure compounds at very low concentrations, empowers microdosing as an investigational tool, and extends the utility of {sup 14}C labeled compounds to dramatically lower levels. It is a form of isotope ratio mass spectrometry that can provide either measurements of total compound equivalents or, when coupled to a separation technology such as chromatography, quantitation of specific compounds. The properties of AMS as a measurement technique are investigated here, and the parameters of method validation are shown. AMS, independent of any separation technique to which it may be coupled, is shown to be accurate, linear, precise, and robust. As the sensitivity and universality of AMS are constantly being explored and expanded, this work underpins many areas of pharmaceutical development, including drug metabolism as well as absorption, distribution and excretion of pharmaceutical compounds, as a fundamental step in drug development. The validation parameters for pharmaceutical analyses were examined for the accelerator mass spectrometry measurement of the {sup 14}C/C ratio, independent of chemical separation procedures. The isotope ratio measurement was specific (owing to the {sup 14}C label), stable across sample storage conditions for at least one year, and linear over 4 orders of magnitude, with an analytical range from one tenth Modern to at least 2000 Modern (instrument specific). Further, accuracy was excellent, between 1 and 3 percent, while precision, expressed as coefficient of variation, was between 1 and 6%, determined primarily by radiocarbon content and the time spent analyzing a sample. Sensitivity, expressed as LOD and LLOQ, was 1 and 10 attomoles of carbon-14 (which can be expressed as compound equivalents), and for a typical small molecule labeled at 10% incorporation with {sup 14}C corresponds to 30 fg

  20. Measurement of very low amounts of arsenic in soils and waters: is ICP-MS the indispensable analytical tool?

    Science.gov (United States)

    López-García, Ignacio; Marín-Hernández, Juan Jose; Perez-Sirvent, Carmen; Hernandez-Cordoba, Manuel

    2017-04-01

The toxicity of arsenic and its wide distribution in nature need no emphasis nowadays, and the value of reliable analytical tools for arsenic determination at very low levels is clear. Leaving aside atomic fluorescence spectrometers specifically designed for this purpose, the task is currently carried out using inductively coupled plasma mass spectrometry (ICP-MS), a powerful but expensive technique that is not available in all laboratories. However, as the recent literature clearly shows, a similar or even better analytical performance for the determination of several elements can be achieved by replacing the ICP-MS instrument with an AAS spectrometer (which is commonly present in any laboratory and involves low acquisition and maintenance costs), provided that a simple microextraction step is used to preconcentrate the sample. This communication reports the optimization and results of a new analytical procedure based on this idea and focused on the determination of very low concentrations of arsenic in waters and extracts from soils and sediments. The procedure is based on a micro-solid-phase extraction process for the separation and preconcentration of arsenic that uses magnetic particles covered with silver nanoparticles functionalized with the sodium salt of 2-mercaptoethane-sulphonate (MESNa). This composite is easily obtained in the laboratory. After the sample is treated with a low amount (only a few milligrams) of the magnetic material, the solid phase is separated by means of a magnetic field and then introduced into an electrothermal atomizer (ETAAS) for arsenic determination. The preconcentration factor is close to 200, with a detection limit below 0.1 µg L-1 arsenic. Speciation of As(III) and As(V) can be achieved by means of two extractions carried out at different acidities. The results for total arsenic are verified using certified reference materials.
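The figures of merit quoted above (a preconcentration factor close to 200 and a detection limit below 0.1 µg L-1) can be estimated from calibration data in the usual way. A Python sketch with invented ETAAS numbers, chosen only to be of a plausible order of magnitude:

```python
import statistics

def detection_limit(blank_signals, slope):
    """IUPAC-style LOD: 3 x standard deviation of blank signals
    divided by the calibration slope."""
    return 3.0 * statistics.stdev(blank_signals) / slope

def enrichment_factor(slope_preconc, slope_direct):
    """Ratio of calibration slopes with and without the
    microextraction/preconcentration step."""
    return slope_preconc / slope_direct

# Hypothetical ETAAS absorbance data for arsenic.
blanks = [0.0021, 0.0018, 0.0024, 0.0019, 0.0022, 0.0020]
slope_preconc = 0.0150    # absorbance per ug/L, after preconcentration
slope_direct  = 0.000075  # absorbance per ug/L, without preconcentration

lod = detection_limit(blanks, slope_preconc)       # ug/L
ef = enrichment_factor(slope_preconc, slope_direct)
```

With these illustrative numbers the enrichment factor is 200 and the LOD falls below 0.1 µg/L, matching the order of the figures reported.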

  1. Developing Free and Open Source Interactive Teaching Tools

    Science.gov (United States)

    Nelson, E.

    2016-12-01

Online learning has become an embedded component of education, but existing resources are often provided as institution-hosted content management systems (that may or may not be closed source). Creating interactive online applets to enhance student education is an alternative to these limited-customization systems that can be accomplished on a small budget. This presentation will break down the anatomy of author-developed online teaching tools created with open source packages, providing a survey of the development tools utilized, from the underlying website framework to interfacing with the scientific data. Options for hosting and maintaining interactive teaching tools, whether static or dynamic, on no- or low-cost platforms will also be discussed. By constructing an interactive teaching tool from the ground up, scientists and educators are afforded complete flexibility and creativity in the design.

  2. The environment power system analysis tool development program

    Science.gov (United States)

    Jongeward, Gary A.; Kuharski, Robert A.; Kennedy, Eric M.; Stevens, N. John; Putnam, Rand M.; Roche, James C.; Wilcox, Katherine G.

    1990-01-01

The Environment Power System Analysis Tool (EPSAT) is being developed to provide space power system design engineers with an analysis tool for determining the performance of power systems in both naturally occurring and self-induced environments. The program is producing an easy-to-use computer-aided engineering (CAE) tool general enough to provide a vehicle for technology transfer from space scientists and engineers to power system design engineers. The results of the project after two years of a three-year development program are given. The EPSAT approach separates the CAE tool into three distinct functional units: a modern user interface to present information; a data dictionary interpreter to coordinate analysis; and a database for storing system designs and results of analysis.

  3. The Development of a Tool for Sustainable Building Design:

    DEFF Research Database (Denmark)

    Tine Ring Hansen, Hanne; Knudstrup, Mary-Ann

    2009-01-01

    architecture will gain more focus in the coming years, thus, establishing the need for the development of a new tool and methodology, The paper furthermore describes the background and considerations involved in the development of a design support tool for sustainable building design. A tool which considers...... for sustainable buildings, as well as, an analysis of the relationship between the different approaches (e.g. low-energy, environmental, green building, solar architecture, bio-climatic architecture etc.) to sustainable building design and these indicators. The paper furthermore discusses how sustainable...... the context that the building is located in, as well as, a tool which facilitates the discussion of which type of sustainability is achieved in specific projects....

  4. Developing and Validating a New Classroom Climate Observation Assessment Tool.

    Science.gov (United States)

    Leff, Stephen S; Thomas, Duane E; Shapiro, Edward S; Paskewich, Brooke; Wilson, Kim; Necowitz-Hoffman, Beth; Jawad, Abbas F

    2011-01-01

    The climate of school classrooms, shaped by a combination of teacher practices and peer processes, is an important determinant for children's psychosocial functioning and is a primary factor affecting bullying and victimization. Given that there are relatively few theoretically-grounded and validated assessment tools designed to measure the social climate of classrooms, our research team developed an observation tool through participatory action research (PAR). This article details how the assessment tool was designed and preliminarily validated in 18 third-, fourth-, and fifth-grade classrooms in a large urban public school district. The goals of this study are to illustrate the feasibility of a PAR paradigm in measurement development, ascertain the psychometric properties of the assessment tool, and determine associations with different indices of classroom levels of relational and physical aggression.

  5. Computational Tools for Accelerating Carbon Capture Process Development

    Energy Technology Data Exchange (ETDEWEB)

    Miller, David

    2013-01-01

    The goals of the work reported are: to develop new computational tools and models to enable industry to more rapidly develop and deploy new advanced energy technologies; to demonstrate the capabilities of the CCSI Toolset on non-proprietary case studies; and to deploy the CCSI Toolset to industry. Challenges of simulating carbon capture (and other) processes include: dealing with multiple scales (particle, device, and whole process scales); integration across scales; verification, validation, and uncertainty; and decision support. The tools cover: risk analysis and decision making; validated, high-fidelity CFD; high-resolution filtered sub-models; process design and optimization tools; advanced process control and dynamics; process models; basic data sub-models; and cross-cutting integration tools.

  6. Development of a data capture tool for researching tech entrepreneurship

    DEFF Research Database (Denmark)

    Andersen, Jakob Axel Bejbro; Howard, Thomas J.; McAloone, Tim C.

    2014-01-01

    Startups play a crucial role in exploiting the commercial advantages created by new, advanced technologies. Surprisingly, the processes by which the entrepreneur commercialises these technologies are largely undescribed - partly due to the absence of appropriate process data capture tools....... This paper elucidates the requirements for such tools by drawing on knowledge of the entrepreneurial phenomenon and by building on the existing research tools used in design research. On this basis, the development of a capture method for tech startup processes is described and its potential discussed....

  7. Large scale experiments as a tool for numerical model development

    DEFF Research Database (Denmark)

    Kirkegaard, Jens; Hansen, Erik Asp; Fuchs, Jesper

    2003-01-01

    Experimental modelling is an important tool for the study of hydrodynamic phenomena. The applicability of experiments can be expanded by the use of numerical models, and experiments are important for documentation of the validity of numerical tools. In other cases numerical tools can be applied for improvement of the reliability of physical model results. This paper demonstrates by examples that numerical modelling benefits in various ways from experimental studies (in large and small laboratory facilities). The examples range from very general hydrodynamic descriptions of wave phenomena to specific hydrodynamic interaction with structures. The examples also show that numerical model development benefits from international co-operation and sharing of high quality results.

  8. Waste Tank Organic Safety Program: Analytical methods development. Progress report, FY 1994

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.A.; Grant, K.E.

    1994-09-01

    The objectives of this task are to develop and document extraction and analysis methods for organics in waste tanks, and to extend these methods to the analysis of actual core samples to support the Waste Tank Organic Safety Program. This report documents progress at Pacific Northwest Laboratory during FY 1994 on methods development, the analysis of waste from Tank 241-C-103 (Tank C-103) and Tank T-111, and the transfer of documented, developed analytical methods to personnel in the Analytical Chemistry Laboratory (ACL) and the 222-S laboratory. This report is intended as an annual report, not a completed work.

  9. Microfield exposure tool enables advances in EUV lithography development

    Energy Technology Data Exchange (ETDEWEB)

    Naulleau, Patrick

    2009-09-07

    With demonstrated resist resolution of 20 nm half pitch, the SEMATECH Berkeley EUV microfield exposure tool continues to push crucial advances in the areas of EUV resists and masks. The ever-progressing shrink in computer chip feature sizes has been fueled over the years by a continual reduction in the wavelength of light used to pattern the chips. Recently, this trend has been threatened by the unavailability of lens materials suitable for wavelengths shorter than 193 nm. To circumvent this roadblock, a reflective technology utilizing a significantly shorter extreme ultraviolet (EUV) wavelength (13.5 nm) has been under development for the past decade. The dramatic wavelength shrink was required to compensate for optical design limitations intrinsic to mirror-based systems compared to refractive lens systems. With this significant reduction in wavelength comes a variety of new challenges, including developing sources of adequate power; photoresists with suitable resolution, sensitivity, and line-edge roughness characteristics; and the fabrication of reflection masks with zero defects. While source development can proceed in the absence of available exposure tools, in order for progress to be made in the areas of resists and masks it is crucial to have access to advanced exposure tools with resolutions equal to or better than that expected from initial production tools. These advanced development tools, however, need not be full-field tools. Also, implementing such tools at synchrotron facilities allows them to be developed independent of the availability of reliable stand-alone EUV sources. One such tool is the SEMATECH Berkeley microfield exposure tool (MET). The most unique attribute of the SEMATECH Berkeley MET is its use of a custom-coherence illuminator made possible by its implementation on a synchrotron beamline. With only conventional illumination and conventional binary masks, the resolution limit of the 0.3-NA optic is approximately 25 nm, however

  10. Method Engineering: Engineering of Information Systems Development Methods and Tools

    OpenAIRE

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e. the configuration of a project approach that is tuned to the project at hand. A language and support tool for the engineering of situational methods are discussed.

  11. DEVELOPMENT OF A WIRELINE CPT SYSTEM FOR MULTIPLE TOOL USAGE

    Energy Technology Data Exchange (ETDEWEB)

    Stephen P. Farrington; Martin L. Gildea; J. Christopher Bianchi

    1999-08-01

    The first phase of development of a wireline cone penetrometer system for multiple tool usage was completed under DOE award number DE-AR26-98FT40366. Cone penetrometer technology (CPT) has received widespread interest and is becoming more commonplace as a tool for environmental site characterization activities at several Department of Energy (DOE) facilities. Although CPT already offers many benefits for site characterization, the wireline system can improve CPT technology by offering greater utility and increased cost savings. Currently the use of multiple CPT tools during a site characterization (i.e. piezometric cone, chemical sensors, core sampler, grouting tool) must be accomplished by withdrawing the entire penetrometer rod string to change tools. This results in multiple penetrations being required to collect the data and samples that may be required during characterization of a site, and to subsequently seal the resulting holes with grout. The wireline CPT system allows multiple CPT tools to be interchanged during a single penetration, without withdrawing the CPT rod string from the ground. The goal of the project is to develop and demonstrate a system by which various tools can be placed at the tip of the rod string depending on the type of information or sample desired. Under the base contract, an interchangeable piezocone and grouting tool was designed, fabricated, and evaluated. The results of the evaluation indicate that success criteria for the base contract were achieved. In addition, the wireline piezocone tool was validated against ASTM standard cones, the depth capability of the system was found to compare favorably with that of conventional CPT, and the reliability and survivability of the system were demonstrated.

  12. Development of parallel/serial program analyzing tool

    International Nuclear Information System (INIS)

    Watanabe, Hiroshi; Nagao, Saichi; Takigawa, Yoshio; Kumakura, Toshimasa

    1999-03-01

    Japan Atomic Energy Research Institute has been developing 'KMtool', a parallel/serial program analyzing tool, in order to promote the parallelization of science and engineering computation programs. KMtool analyzes the performance of programs written in FORTRAN77 and MPI, and it reduces the effort required for parallelization. This paper describes the development purpose, design, utilization and evaluation of KMtool. (author)

  13. Computer-based tools to support curriculum developers

    NARCIS (Netherlands)

    Nieveen, N.M.; Gustafson, Kent

    2000-01-01

    Since the early 1990s, an increasing number of people have been interested in supporting the complex tasks of the curriculum development process with computer-based tools. ‘Curriculum development’ refers to an intentional process or activity directed at (re) designing, developing and

  14. ANALYTICAL, CRITICAL AND CREATIVE THINKING DEVELOPMENT OF THE GIFTED CHILDREN IN THE USA SCHOOLS

    Directory of Open Access Journals (Sweden)

    Anna Yurievna Kuvarzina

    2013-11-01

    Full Text Available Teachers of gifted students should not only create an enrichment and acceleration program for them but also pay attention to the development of analytical, critical and creative thinking skills. Despite great interest in this issue in recent years, the topic of analytical and creative thinking is poorly covered in textbooks for the gifted. In this article some methods, materials and programs for developing analytical, critical and creative thinking skills, as used in the USA, are described. The author analyses and systematizes these methods and also suggests some ways of using them in the Russian educational system. Purpose: to analyze and systematize methods, materials and programs that are used in the USA for teaching gifted children analytical, critical and creative thinking, and for developing their capacities for problem-solving and decision-making. Methods and methodology of the research: analysis, comparison, the principle of the unity of the historical and logical approaches. Results: positive results of employing methods for developing analytical, critical and creative thinking were shown in the practical experience of teaching and educating gifted children in the USA educational system. Field of application of the results: the educational system of the Russian Federation: schools, special classes and courses for gifted children. DOI: http://dx.doi.org/10.12731/2218-7405-2013-7-42

  15. Flammable gas safety program. Analytical methods development: FY 1994 progress report

    International Nuclear Information System (INIS)

    Campbell, J.A.; Clauss, S.; Grant, K.; Hoopes, V.; Lerner, B.; Lucke, R.; Mong, G.; Rau, J.; Wahl, K.; Steele, R.

    1994-09-01

    This report describes the status of developing analytical methods to account for the organic components in Hanford waste tanks, with particular focus on tanks assigned to the Flammable Gas Watch List. The methods that have been developed are illustrated by their application to samples obtained from Tank 241-SY-101 (Tank 101-SY).

  16. The development and application of advanced analytical methods to commercial ICF reactor chambers. Final report

    International Nuclear Information System (INIS)

    Cousseau, P.; Engelstad, R.; Henderson, D.L.

    1997-10-01

    Progress is summarized in this report for each of the following tasks: (1) multi-dimensional radiation hydrodynamics computer code development; (2) 2D radiation-hydrodynamic code development; (3) ALARA: Analytic and Laplacian Adaptive Radioactivity Analysis, a complete package for analysis of induced activation; (4) structural dynamics modeling of ICF reactor chambers; and (5) analysis of self-consistent target chamber clearing.

  17. The Application of State-of-the-Art Analytic Tools (Biosensors and Spectroscopy) in Beverage and Food Fermentation Process Monitoring

    Directory of Open Access Journals (Sweden)

    Shaneel Chandra

    2017-09-01

    Full Text Available The production of several agricultural products and foods is linked with fermentation. Traditional methods used to control and monitor the quality of the products and processes are based on simple chemical analyses. However, these methods are time-consuming and do not provide sufficient relevant information about the chemical changes during the process. Commonly used methods applied in the agriculture and food industries to monitor fermentation are those based on simple or single-point sensors, where only one parameter is measured (e.g., temperature or density). These sensors are used several times per day and are often the only source of data available from which the conditions and rate of fermentation are monitored. In the modern food industry, an ideal method to control and monitor the fermentation process should enable a direct, rapid, precise, and accurate determination of several target compounds, with minimal to no sample preparation or reagent consumption. Here, state-of-the-art advancements in both the application of sensors and analytical tools to monitor beverage and food fermentation processes will be discussed.

  18. Soft x-ray microscopy - a powerful analytical tool to image magnetism down to fundamental length and time scales

    Energy Technology Data Exchange (ETDEWEB)

    Fischer, Peter

    2008-08-01

    The magnetic properties of low-dimensional solid state matter are of the utmost interest both scientifically and technologically. In addition to the charge of the electron, which is the basis of current electronics, taking the spin degree of freedom into account opens a new avenue for future spintronics applications. Progress towards a better physical understanding of the mechanisms and principles involved, as well as potential applications of nanomagnetic devices, can only be achieved with advanced analytical tools. Soft X-ray microscopy, providing a spatial resolution towards 10 nm, a time resolution currently in the sub-ns regime, and inherent elemental sensitivity, is a very promising technique for that. This article reviews the recent achievements of magnetic soft X-ray microscopy with selected examples of spin torque phenomena, stochastic behavior on the nanoscale, and spin dynamics in magnetic nanopatterns. The future potential with regard to addressing fundamental magnetic length and time scales, e.g. imaging femtosecond spin dynamics at upcoming X-ray sources, is pointed out.

  19. Recent developments and future trends in solid phase microextraction techniques towards green analytical chemistry.

    Science.gov (United States)

    Spietelun, Agata; Marcinkowski, Łukasz; de la Guardia, Miguel; Namieśnik, Jacek

    2013-12-20

    Solid phase microextraction techniques find increasing application in the sample preparation step before chromatographic determination of analytes in samples with a complex composition. These techniques allow for integrating several operations, such as sample collection, extraction, analyte enrichment above the detection limit of a given measuring instrument, and the isolation of analytes from the sample matrix. In this work, information about novel methodological and instrumental solutions for different variants of solid phase extraction techniques, solid-phase microextraction (SPME), stir bar sorptive extraction (SBSE) and magnetic solid phase extraction (MSPE), is presented, including practical applications of these techniques and a critical discussion of their advantages and disadvantages. The proposed solutions fulfill the requirements resulting from the concept of sustainable development, and specifically from the implementation of green chemistry principles in analytical laboratories. Therefore, particular attention was paid to the description of possible uses of novel, selective stationary phases in extraction techniques, inter alia, polymeric ionic liquids, carbon nanotubes, and silica- and carbon-based sorbents. The methodological solutions, together with properly matched sampling devices for collecting analytes from samples with varying matrix composition, enable us to reduce the number of errors during sample preparation prior to chromatographic analysis as well as to limit the negative impact of this analytical step on the natural environment and the health of laboratory employees. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Development of bore tools for pipe welding and cutting

    Energy Technology Data Exchange (ETDEWEB)

    Oka, Kiyoshi; Ito, Akira; Takiguchi, Yuji [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    In the International Thermonuclear Experimental Reactor (ITER), replacement and maintenance of in-vessel components requires that connected cooling pipes be cut and removed beforehand and that new components be installed, to which the cooling pipes must be rewelded. All welding must be inspected for soundness after completion. These tasks require a new task concept for ensuring shielded areas and access from narrow ports. Thus, it became necessary to develop autonomously moving welding and cutting tools that access branch and main pipes from inside the pipe; a system was proposed that cuts and welds branch and main pipes after passing through pipe curves, and the elemental technologies were developed. This paper introduces current developments in tools for welding and cutting branch pipes and other tools for welding and cutting the main pipe. (author)

  1. High-resolution continuum source electrothermal atomic absorption spectrometry - An analytical and diagnostic tool for trace analysis

    International Nuclear Information System (INIS)

    Welz, Bernhard; Borges, Daniel L.G.; Lepri, Fabio G.; Vale, Maria Goreti R.; Heitmann, Uwe

    2007-01-01

    The literature about applications of high-resolution continuum source atomic absorption spectrometry (HR-CS AAS) with electrothermal atomization is reviewed. The historic development of HR-CS AAS is briefly summarized and the main advantages of this technique, notably the 'visibility' of the spectral environment around the analytical line at high resolution and the unequaled simultaneous background correction, are discussed. Simultaneous multielement CS AAS has been realized only in a very limited number of cases. The direct analysis of solid samples appears to have gained a lot from the special features of HR-CS AAS, and the examples from the literature suggest that calibration can be carried out against aqueous standards. Low-temperature losses of nickel and vanadyl porphyrins could be detected and avoided in the analysis of crude oil due to the superior background correction system. The visibility of the spectral environment around the analytical line revealed that the absorbance signal measured for phosphorus at the 213.6 nm non-resonance line without a modifier is mostly due to the PO molecule, and not to atomic phosphorus. The future possibility of applying high-resolution continuum source molecular absorption for the determination of non-metals is discussed.

  2. Development of thick wall welding and cutting tools for ITER

    Energy Technology Data Exchange (ETDEWEB)

    Nakahira, Masataka; Takahashi, Hiroyuki; Akou, Kentaro; Koizumi, Koichi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment

    1998-04-01

    The Vacuum Vessel, which is a core component of the International Thermonuclear Experimental Reactor (ITER), is required to be exchanged remotely in the case of an accident such as a superconducting coil failure. The in-vessel components such as the blanket and divertor are planned to be exchanged or repaired. In these exchange or maintenance operations, thick wall welding and cutting are inevitable and remote handling tools are necessary. The thick wall welding and cutting tools for the blanket are under development in the ITER R and D program. The design requirement is to weld or cut stainless steel of 70 mm thickness in a narrow space. Tungsten inert gas (TIG) arc welding, plasma cutting and iodine laser welding/cutting are selected as the primary options. Element welding and cutting tests, design of small tools to satisfy the space requirement, test fabrication and performance tests were performed. This paper reports the tool design and an overview of the welding and cutting tests. (author)

  3. Development of a Safety Management Web Tool for Horse Stables.

    Science.gov (United States)

    Leppälä, Jarkko; Kolstrup, Christina Lunner; Pinzke, Stefan; Rautiainen, Risto; Saastamoinen, Markku; Särkijärvi, Susanna

    2015-11-12

    Managing a horse stable involves risks, which can have serious consequences for the stable, employees, clients, visitors and horses. Existing industrial or farm production risk management tools are not directly applicable to horse stables and they need to be adapted for use by managers of different types of stables. As a part of the InnoEquine project, an innovative web tool, InnoHorse, was developed to support horse stable managers in business, safety, pasture and manure management. A literature review, empirical horse stable case studies, expert panel workshops and stakeholder interviews were carried out to support the design. The InnoHorse web tool includes a safety section containing a horse stable safety map, stable safety checklists, and examples of good practices in stable safety, horse handling and rescue planning. This new horse stable safety management tool can also help in organizing work processes in horse stables in general.

  4. Developing shape analysis tools to assist complex spatial decision making

    International Nuclear Information System (INIS)

    Mackey, H.E.; Ehler, G.B.; Cowen, D.

    1996-01-01

    The objective of this research was to develop and implement a shape identification measure within a geographic information system, specifically one that incorporates analytical modeling for site location planning. The application that was developed incorporated a location model within a raster-based GIS, which helped address critical performance issues for the decision support system. Binary matrices, which approximate the object's geometrical form, are passed over the gridded data structure and allow identification of irregularly and regularly shaped objects. Lastly, the issue of shape rotation is addressed and resolved by constructing unique matrices corresponding to the object's orientation.
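
    As an illustrative sketch (not taken from the record), the idea of passing a binary matrix over a gridded data structure can be expressed as a brute-force template match; the grid, template, and function name below are hypothetical:

```python
import numpy as np

def match_shape(grid, template):
    """Slide a binary template over a binary raster and return the
    top-left positions where every cell matches exactly."""
    gh, gw = grid.shape
    th, tw = template.shape
    hits = []
    for r in range(gh - th + 1):
        for c in range(gw - tw + 1):
            if np.array_equal(grid[r:r + th, c:c + tw], template):
                hits.append((r, c))
    return hits

# Toy raster containing one L-shaped object, and the binary matrix
# approximating that object's geometrical form.
grid = np.array([
    [0, 0, 0, 0],
    [0, 1, 0, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
])
template = np.array([[1, 0],
                     [1, 1]])
print(match_shape(grid, template))  # -> [(1, 1)]
```

    Rotation, as discussed in the record, could be handled by matching additional matrices per orientation (e.g. `np.rot90(template, k)` for k = 1..3).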

  5. Development of IFC-based fire safety assessment tools

    DEFF Research Database (Denmark)

    Taciuc, Anca; Karlshøj, Jan; Dederichs, Anne

    2016-01-01

    changes need to be implemented, involving supplementary work and costs with negative impact on the client. The aim of this project is to create a set of automatic compliance checking rules for prescriptive design and to develop a web application tool for performance-based design that retrieves data from Building Information Models (BIM) to evaluate the safety level in the building during the conceptual design stage. The findings show that the developed tools can be useful in the AEC industry. Integrating BIM from the conceptual design stage for analyzing the fire safety level can ensure precision in further

  6. Health Heritage© a web-based tool for the collection and assessment of family health history: initial user experience and analytic validity.

    Science.gov (United States)

    Cohn, W F; Ropka, M E; Pelletier, S L; Barrett, J R; Kinzie, M B; Harrison, M B; Liu, Z; Miesfeldt, S; Tucker, A L; Worrall, B B; Gibson, J; Mullins, I M; Elward, K S; Franko, J; Guterbock, T M; Knaus, W A

    2010-01-01

    A detailed family health history is currently the most potentially useful tool for diagnosis and risk assessment in clinical genetics. We developed and evaluated the usability and analytic validity of a patient-driven web-based family health history collection and analysis tool. Health Heritage(©) guides users through the collection of their family health history by relative, generates a pedigree, completes risk assessment, stratification, and recommendations for 89 conditions. We compared the performance of Health Heritage to that of Usual Care using a nonrandomized cohort trial of 109 volunteers. We contrasted the completeness and sensitivity of family health history collection and risk assessments derived from Health Heritage and Usual Care to those obtained by genetic counselors and genetic assessment teams. Nearly half (42%) of the Health Heritage participants reported discovery of health risks; 63% found the information easy to understand and 56% indicated it would change their health behavior. Health Heritage consistently outperformed Usual Care in the completeness and accuracy of family health history collection, identifying 60% of the elevated risk conditions specified by the genetic team versus 24% identified by Usual Care. Health Heritage also had greater sensitivity than Usual Care when comparing the identification of risks. These results suggest a strong role for automated family health history collection and risk assessment and underscore the potential of these data to serve as the foundation for comprehensive, cost-effective personalized genomic medicine. Copyright © 2010 S. Karger AG, Basel.

  7. Experimental design and multiple response optimization. Using the desirability function in analytical methods development.

    Science.gov (United States)

    Candioti, Luciana Vera; De Zan, María M; Cámara, María S; Goicoechea, Héctor C

    2014-06-01

    A review about the application of response surface methodology (RSM) when several responses have to be simultaneously optimized in the field of analytical methods development is presented. Several critical issues like response transformation, multiple response optimization and modeling with least squares and artificial neural networks are discussed. The most recent analytical applications are presented in the context of analytical methods development, especially in multiple response optimization procedures using the desirability function. Copyright © 2014 Elsevier B.V. All rights reserved.
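
    For readers unfamiliar with the desirability function mentioned above, a minimal sketch of the Derringer-Suich approach follows; the response values, limits, and variable names are illustrative only, not taken from the review:

```python
import math

def desirability_max(y, lo, hi, s=1.0):
    """One-sided larger-is-better desirability: 0 below lo, 1 above hi,
    a power-law ramp in between (s shapes the ramp)."""
    if y <= lo:
        return 0.0
    if y >= hi:
        return 1.0
    return ((y - lo) / (hi - lo)) ** s

def overall_desirability(ds):
    """Geometric mean of the individual desirabilities; any d = 0
    drives the overall value to 0."""
    return math.prod(ds) ** (1.0 / len(ds))

# Two hypothetical method responses: chromatographic resolution (aim
# for >= 2.0) and a score where higher values reflect shorter run time.
d1 = desirability_max(1.8, lo=1.0, hi=2.0)    # resolution -> 0.8
d2 = desirability_max(12.0, lo=5.0, hi=15.0)  # run-time score -> 0.7
D = overall_desirability([d1, d2])
print(D)
```

    The experimental conditions maximizing D over the response surface models would then be selected as the optimum.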

  8. Process development and tooling design for intrinsic hybrid composites

    Science.gov (United States)

    Riemer, M.; Müller, R.; Drossel, W. G.; Landgrebe, D.

    2017-09-01

    Hybrid parts, which combine the advantages of different material classes, are moving into the focus of lightweight applications. This development is amplified by their high potential for use in crash-relevant structures. In the current state of the art, hybrid parts are mainly made in separate, subsequent forming and joining processes. By using the concept of an intrinsic hybrid, the shaping of the part and the joining of the different materials are performed in a single process step, shortening the overall processing time and thereby reducing the manufacturing costs. The investigated hybrid part is made from continuous fibre reinforced plastic (FRP), into which a metallic reinforcement structure is integrated. The connection between these layered components is realized by a combination of adhesive bonding and a geometrical form fit. The form fit elements are intrinsically generated during the forming process. This contribution addresses the development of the forming process and the design of the forming tool for the single-step production of a hybrid part. To this end a forming tool, which combines the thermo-forming and the metal forming process, is developed. The main challenge in designing the tool is the temperature management of the tool elements for the variothermal forming process. The process parameters are determined in basic tests and finite element (FE) simulation studies. On the basis of these investigations a control concept for steering the motion axes and the tool temperature is developed. Forming tests are carried out with the developed tool and the manufactured parts are analysed by computer assisted tomography (CT) scans.

  9. Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making - Proceedings of a Workshop

    Science.gov (United States)

    Hogan, Dianna; Arthaud, Greg; Pattison, Malka; Sayre, Roger G.; Shapiro, Carl

    2010-01-01

    The analytical framework for understanding ecosystem services in conservation, resource management, and development decisions is multidisciplinary, encompassing a combination of the natural and social sciences. This report summarizes a workshop on 'Developing an Analytical Framework: Incorporating Ecosystem Services into Decision Making,' which focused on the analytical process and on identifying research priorities for assessing ecosystem services, their production and use, their spatial and temporal characteristics, their relationship with natural systems, and their interdependencies. Attendees discussed research directions and solutions to key challenges in developing the analytical framework. The discussion was divided into two sessions: (1) the measurement framework: quantities and values, and (2) the spatial framework: mapping and spatial relationships. This workshop was the second of three preconference workshops associated with ACES 2008 (A Conference on Ecosystem Services): Using Science for Decision Making in Dynamic Systems. These three workshops were designed to explore the ACES 2008 theme on decision making and how the concept of ecosystem services can be more effectively incorporated into conservation, restoration, resource management, and development decisions. Preconference workshop 1, 'Developing a Vision: Incorporating Ecosystem Services into Decision Making,' was held on April 15, 2008, in Cambridge, MA. In preconference workshop 1, participants addressed what would have to happen for ecosystem services to be used more routinely and effectively in conservation, restoration, resource management, and development decisions, and they identified some key challenges in developing the analytical framework.
Preconference workshop 3, 'Developing an Institutional Framework: Incorporating Ecosystem Services into Decision Making,' was held on October 30, 2008, in Albuquerque, NM; participants examined the relationship between the institutional framework and

  10. Comparative study between univariate spectrophotometry and multivariate calibration as analytical tools for simultaneous quantitation of Moexipril and Hydrochlorothiazide

    Science.gov (United States)

    Tawakkol, Shereen M.; Farouk, M.; Elaziz, Omar Abd; Hemdan, A.; Shehata, Mostafa A.

    2014-12-01

    Three simple, accurate, reproducible, and selective methods have been developed and subsequently validated for the simultaneous determination of Moexipril (MOX) and Hydrochlorothiazide (HCTZ) in pharmaceutical dosage form. The first method is the new extended ratio subtraction method (EXRSM) coupled to the ratio subtraction method (RSM) for determination of both drugs in commercial dosage form. The second and third methods are multivariate calibration methods, namely Principal Component Regression (PCR) and Partial Least Squares (PLS). A detailed validation of the methods was performed following the ICH guidelines, and the standard curves were found to be linear in the range of 10-60 and 2-30 for MOX and HCTZ in the EXRSM method, respectively, with well-accepted mean correlation coefficients for each analyte. The intra-day and inter-day precision and accuracy results were well within the acceptable limits.
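
    As a hedged illustration of the PCR idea named in the abstract (a generic sketch, not the authors' actual implementation), mean-centred spectra are projected onto the leading principal components and concentrations are regressed on the scores; the synthetic two-component "spectra" below are invented for the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic calibration set: 20 "spectra" over 50 wavelengths built as
# linear mixtures of two hypothetical pure-component profiles plus noise.
wav = np.linspace(0, 1, 50)
pure = np.vstack([np.exp(-((wav - 0.3) / 0.05) ** 2),
                  np.exp(-((wav - 0.7) / 0.08) ** 2)])
C = rng.uniform(0.5, 2.0, size=(20, 2))        # concentrations of 2 analytes
X = C @ pure + rng.normal(0, 0.01, (20, 50))   # measured spectra

# PCR step 1: PCA of the mean-centred spectra via SVD.
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 2                        # retain two components (two analytes)
T = Xc @ Vt[:k].T            # scores of the calibration spectra

# PCR step 2: ordinary least squares of concentrations on the scores.
B, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(T)), T]), C, rcond=None)

# Predict concentrations for a new (noiseless) mixture spectrum.
x_new = np.array([1.2, 0.8]) @ pure
t_new = (x_new - X.mean(axis=0)) @ Vt[:k].T
c_pred = np.concatenate([[1.0], t_new]) @ B
print(c_pred)  # close to the true mixture [1.2, 0.8]
```

    PLS differs in that the components are chosen to maximize covariance with the concentrations rather than variance of the spectra alone.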

  11. Applying decision-making tools to national e-waste recycling policy: an example of Analytic Hierarchy Process.

    Science.gov (United States)

    Lin, Chun-Hsu; Wen, Lihchyi; Tsai, Yue-Mi

    2010-05-01

    As policy making is in essence a process of discussion, decision-making tools have in many cases been proposed to resolve differences of opinion among the different parties. In our project that sought to promote a country's performance in recycling, we used the Analytic Hierarchy Process (AHP) to evaluate the possibilities and determine the priority of adding new mandatory recycled waste, also referred to as Due Recycled Wastes, from candidate waste appliances. The evaluation process started with the collection of data based on telephone interviews and field investigations to understand the behavior of consumers as well as their overall opinions regarding the disposal of certain waste appliances. With the data serving as background information, the research team then implemented the Analytic Hierarchy Process using the information that formed an incomplete hierarchy structure in order to determine the priority for recycling. Since the number of objects to be evaluated exceeded the number that the AHP researchers had suggested, we reclassified the objects into four groups and added one more level of pair-wise comparisons, which substantially reduced the inconsistency in the judgment of the AHP participants. The project was found to serve as a flexible and achievable application of AHP to the environmental policy-making process. In addition, based on the project's outcomes as a whole, the research team drew conclusions regarding the government's need to take back 15 of the items evaluated, and suggested instruments that could be used or recycling regulations that could be changed in the future. Further analysis of the top three items recommended by the results of the evaluation for recycling, namely Compact Disks, Cellular Phones and Computer Keyboards, was then conducted to clarify their concrete feasibility. After the trial period for recycling ordered by the Taiwan Environmental Protection Administration, only Computer
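
    The priority-determination and consistency-checking steps of AHP referred to above can be sketched with the common geometric-mean approximation of the principal eigenvector together with Saaty's consistency ratio; the pairwise comparison values below are illustrative only and do not come from the study:

```python
import numpy as np

# Illustrative pairwise comparison matrix (Saaty 1-9 scale) for three
# hypothetical criteria; reciprocal entries encode inverse preferences.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation of the principal eigenvector,
# normalized so the priority weights sum to 1.
w = np.prod(A, axis=1) ** (1.0 / A.shape[0])
w /= w.sum()

# Consistency check: estimate lambda_max from A w = lambda w, then
# compute Saaty's consistency index (CI) and consistency ratio (CR).
lam_max = float(np.mean((A @ w) / w))
n = A.shape[0]
ci = (lam_max - n) / (n - 1)
ri = 0.58            # Saaty's random index for n = 3
cr = ci / ri
print(w, cr)         # CR < 0.1 indicates acceptably consistent judgments
```

    In practice each AHP participant's judgments yield such a matrix; reclassifying objects into groups, as the study did, keeps each matrix small enough for consistent pairwise comparison.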

  12. Synthesis and Development of Diagnostic Tools for Medical Imaging

    DEFF Research Database (Denmark)

    Schaarup-Jensen, Henrik

    The need for novel diagnostic tools in medical imaging is increasing since they can improve the positive therapeutic outcome as well as patient compliance. In this thesis different diagnostic tools were developed within an interdisciplinary project, whereas the main work reported in this thesis...... of injectable fiducial tissue markers for surgical guidance of non-palpable tumors and brachytherapy. As radioactive tracer, radioiodinated SAIB-derivatives were developed based on the regioselective ipso-iodination of aryl-TMS moieties. Radioiodination was conducted under carrier free conditions in high...... was synthesized. Remote loading of one candidate was successful; however, the proper contrast level was not sufficient to be visible by CT-imaging. Another diagnostic tool for blood pool imaging is DOTA-modified pluronic/cyclodextrin (CD)-based polyrotaxanes (PRs). With the previously reported chelation of Gd and...

  13. Development of culturally sensitive dialog tools in diabetes education

    Directory of Open Access Journals (Sweden)

    Nana Folmann Hempler

    2015-01-01

    Full Text Available Person-centeredness is a goal in diabetes education, and cultural influences are important to consider in this regard. This report describes the use of a design-based research approach to develop culturally sensitive dialog tools to support person-centered dietary education targeting Pakistani immigrants in Denmark with type 2 diabetes. The approach appears to be a promising method to develop dialog tools for patient education that are culturally sensitive, thereby increasing their acceptability among ethnic minority groups. The process also emphasizes the importance of adequate training and competencies in the application of dialog tools and of alignment between researchers and health care professionals with regard to the educational philosophy underlying their use.

  14. Assessment Tool Development for Extracurricular Smet Programs for Girls

    Science.gov (United States)

    House, Jody; Johnson, Molly; Borthwick, Geoffrey

    Many different programs have been designed to increase girls' interest in and exposure to science, mathematics, engineering, and technology (SMET). Two of these programs are discussed and contrasted in the dimensions of length, level of science content, pedagogical approach, degree of self- vs. parent-selected participants, and amount of community-building content. Two different evaluation tools were used. For one program, a modified version of the University of Pittsburgh's undergraduate engineering attitude assessment survey was used. Program participants' responses were compared to those from a fifth grade, mixed-sex science class. The only gender difference found was in the area of parental encouragement. The girls in the special class were more encouraged to participate in SMET areas. For the second program, a new age-appropriate tool developed specifically for these types of programs was used, and the tool itself was evaluated. The results indicate that the new tool has construct validity. On the basis of these preliminary results, a long-term plan for the continued development of the assessment tool is outlined.

  15. Development of the Sports Organization Concussion Risk Assessment Tool (SOCRAT).

    Science.gov (United States)

    Yeung, A; Munjal, V; Virji-Babul, N

    2017-01-01

    In this paper, we describe the development of a novel tool-the Sports Organization Concussion Risk Assessment Tool (SOCRAT)-to assist sport organizations in assessing the overall risk of concussion at a team level by identifying key risk factors. We first conducted a literature review to identify risk factors of concussion using ice hockey as a model. We then developed an algorithm by combining the severity and the probability of occurrence of concussions of the identified risk factors by adapting a risk assessment tool commonly used in engineering applications. The following risk factors for ice hockey were identified: age, history of previous concussions, previous body checking experience, allowance of body checking, type of helmet worn and the game or practice environment. These risk factors were incorporated into the algorithm, resulting in an individual risk priority number (RPN) for each risk factor and an overall RPN that provides an estimate of the risk in the given circumstances. The SOCRAT can be used to analyse how different risk factors contribute to the overall risk of concussion. The tool may be tailored to organizations to provide: (1) an RPN for each risk factor and (2) an overall RPN that takes into account all the risk factors. Further work is needed to validate the tool based on real data.
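The severity-times-probability scheme the SOCRAT adapts from engineering risk assessment (an FMEA-style risk priority number) can be illustrated with a minimal sketch. The factor names and the 1-10 scores below are hypothetical placeholders, not values from the tool itself.

```python
# Hypothetical risk factors with (severity, probability-of-occurrence)
# scores on a 1-10 scale; illustrative only, not SOCRAT's actual weights.
RISK_FACTORS = {
    "history_of_concussion": (8, 6),
    "body_checking_allowed": (7, 7),
    "game_vs_practice":      (6, 5),
}

def rpn(severity, probability):
    """Individual risk priority number: severity times probability."""
    return severity * probability

# One RPN per risk factor, plus an overall RPN aggregating all factors.
individual = {name: rpn(s, p) for name, (s, p) in RISK_FACTORS.items()}
overall = sum(individual.values())
```

The per-factor RPNs show which risk factors dominate, while the overall RPN gives the team-level estimate described above.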

  16. ENVIRONMENTAL ACCOUNTING: A MANAGEMENT TOOL FOR SUSTAINABLE DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    Nicolae Virag

    2014-12-01

    Full Text Available The paper aims to analyze the ways in which accounting, as a social science and management information tool, can contribute to sustainable development. The paper highlights the emergence of the environmental accounting concept, the applicability of environmental accounting, its types, and its scope and benefits.

  17. Development of the writing readiness inventory tool in context (WRITIC)

    NARCIS (Netherlands)

    Hartingsveldt, M.J. van; Vries, L. de; Cup, E.H.C.; Groot, I.J.M. de; Nijhuis-Van der Sanden, M.W.G.

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was

  18. Developing an Intranet: Tool Selection and Management Issues.

    Science.gov (United States)

    Chou, David C.

    1998-01-01

    Moving corporate systems onto an intranet will increase the data traffic within the corporate network, which necessitates a high-quality management process to the intranet. Discusses costs and benefits of adopting an intranet, tool availability and selection criteria, and management issues for developing an intranet. (Author/AEF)

  19. Accessing Curriculum Through Technology Tools (ACTTT): A Model Development Project

    Science.gov (United States)

    Daytner, Katrina M.; Johanson, Joyce; Clark, Letha; Robinson, Linda

    2012-01-01

    Accessing Curriculum Through Technology Tools (ACTTT), a project funded by the U.S. Office of Special Education Programs (OSEP), developed and tested a model designed to allow children in early elementary school, including those "at risk" and with disabilities, to better access, participate in, and benefit from the general curriculum.…

  20. 109 Strategizing Drama as Tool for Advocacy and Rural Development

    African Journals Online (AJOL)

    Nekky Umera

    undertones, most of these organizations have failed to discover and employ drama/theatre as a potent tool for the effective .... accept new innovations and changes. The Longman Dictionary of. Contemporary ... Development on the other hand is the end product of the success of advocacy. Citing contemporary paradigm shift, ...

  1. Sharpening a Tool for Teaching: The Zone of Proximal Development

    Science.gov (United States)

    Wass, Rob; Golding, Clinton

    2014-01-01

    Vygotsky's Zone of Proximal Development (ZPD) provides an important understanding of learning, but its implications for teachers are often unclear or limited and could be further explored. We use conceptual analysis to sharpen the ZPD as a teaching tool, illustrated with examples from teaching critical thinking in zoology. Our conclusions are…

  2. Interactive test tool for interoperable C-ITS development

    NARCIS (Netherlands)

    Voronov, A.; Englund, C.; Bengtsson, H.H.; Chen, L.; Ploeg, J.; Jongh, J.F.C.M. de; Sluis, H.J.D. van de

    2015-01-01

    This paper presents the architecture of an Interactive Test Tool (ITT) for interoperability testing of Cooperative Intelligent Transport Systems (C-ITS). Cooperative systems are developed by different manufacturers at different locations, which makes interoperability testing a tedious task. Up until

  3. Development of a Psychotropic PRN Medication Evaluative Tool

    Science.gov (United States)

    Silk, Larry; Watt, Jackie; Pilon, Nancy; Draper, Chad

    2013-01-01

    This article describes a psychotropic PRN Evaluative Tool developed by interprofessional clinicians to address inconsistent reporting and assessment of the effectiveness of PRN medications used for people who are developmentally disabled. Fifty-nine participants (37 males, 22 females), ages 16 to 60 years, were included in the review, all…

  4. The Limitations of Monetary Tools in a Developing Economy like ...

    African Journals Online (AJOL)

    The Limitations of Monetary Tools in a Developing Economy like Nigeria. ... AFRREV IJAH: An International Journal of Arts and Humanities ... and price flexibility, the belief that the economy was self-adjusting and that equilibrium income always tends towards its full employment level when disturbed, especially in the long run.

  5. Economy diversification: a potent tool for tourism development in ...

    African Journals Online (AJOL)

    Economy diversification: a potent tool for tourism development in Nigeria. ... AFRREV STECH: An International Journal of Science and Technology ... In this vein, this work reviewed the current state of some sectors in Nigeria, highlighting the effect of dependence on a mono-product economy and emphasizing tourism potential ...

  6. Crash Attenuator Data Collection and Life Cycle Tool Development

    Science.gov (United States)

    2014-06-14

    This research study was aimed at data collection and development of a decision support tool for life cycle cost assessment of crash attenuators. Assessing attenuator life cycle costs based on in-place expected costs and not just the initial cost enha...

  7. Tool development to understand rural resource users' land use and ...

    African Journals Online (AJOL)

    Tool development to understand rural resource users' land use and impacts on land type changes in Madagascar. ... explore and understand decisions and management strategies. We finally report on first outcomes of the game including land use decisions, reaction to market fluctuation and landscape change. RÉSUMÉ

  8. Method Engineering: Engineering of Information Systems Development Methods and Tools

    NARCIS (Netherlands)

    Brinkkemper, J.N.; Brinkkemper, Sjaak

    1996-01-01

    This paper proposes the term method engineering for the research field of the construction of information systems development methods and tools. Some research issues in method engineering are identified. One major research topic in method engineering is discussed in depth: situational methods, i.e.

  9. budgeting as a strategic tool for development in the arts

    African Journals Online (AJOL)

    Admin

    This paper examines budgeting as a strategic tool for development in the Arts. Budgeting as a fundamental ... controlling the spending of money. It refers to ... executing adequate control over the many units of the organization, inter alia, towards effective planning and control, best described as “a management tool”. Types Of ...

  10. Millennium Development Goals: Tool or token of global social governance?

    NARCIS (Netherlands)

    Al Raee, M.; Amoateng, Elvis; Avenyo, E.K.; Beshay, Youssef; Bierbaum, M.; Keijser, C.; Sinha, R.

    2014-01-01

    In this paper we argue that the Millennium Development Goals (MDGs) experience suggests that Global Social Governance (GSG) exists and that the MDGs have been an effective tool in creating a global accountability framework despite shortcomings mainly arising in the formulation process. The paper

  11. Reflective Journaling: A Tool for Teacher Professional Development

    Science.gov (United States)

    Dreyer, Lorna M.

    2015-01-01

    This qualitative study explores the introduction of postgraduate education students to reflective journaling as a tool for professional development. Students were purposefully selected to keep a weekly journal in which they reflected in and on the activities (methodologies, techniques, strategies) they engaged in while executing a workplace…

  12. Development of the Writing Readiness Inventory Tool in Context (WRITIC)

    NARCIS (Netherlands)

    van Hartingsveldt, Margo J.; de Vries, Liesbeth; Cup, Edith HC; de Groot, Imelda JM; Nijhuis-van der Sanden, Maria WG

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was

  13. Development of Desktop Computing Applications and Engineering Tools on GPUs

    DEFF Research Database (Denmark)

    Sørensen, Hans Henrik Brandenborg; Glimberg, Stefan Lemvig; Hansen, Toke Jansen

    (GPUs) for high-performance computing applications and software tools in science and engineering, inverse problems, visualization, imaging, dynamic optimization. The goals are to contribute to the development of new state-of-the-art mathematical models and algorithms for maximum throughout performance...

  14. Methodology for Developing a Diesel Exhaust After Treatment Simulation Tool

    DEFF Research Database (Denmark)

    Christiansen, Tine; Jensen, Johanne; Åberg, Andreas

    2018-01-01

    A methodology for the development of catalyst models is presented. Also, a methodology of the implementation of such models into a modular simulation tool, which simulates the units in succession, is presented. A case study is presented illustrating how suitable models can be found and used for s...

  15. Do FDA label changes work? Assessment of the 2010 class label change for proton pump inhibitors using the Sentinel System's analytic tools.

    Science.gov (United States)

    Sobel, Rachel E; Bate, Andrew; Marshall, James; Haynes, Kevin; Selvam, Nandini; Nair, Vinit; Daniel, Gregory; Brown, Jeffrey S; Reynolds, Robert F

    2018-03-01

    To pilot use of the US Food and Drug Administration's (FDA's) Sentinel System data and analytic tools by a non-FDA stakeholder through the Innovation in Medical Evidence Development and Surveillance system of the Reagan Udall Foundation. We evaluated the US FDA 2010 proton pump inhibitor (PPI) class label change that warned of increased risk of bone fracture, to use PPIs for the lowest dose and shortest duration, and to manage bone status for those at risk for osteoporosis. The cohort consisted of adults aged 18 years or older prescribed PPIs without fracture risk factors. We evaluated incident and prevalent uses of the 8 PPIs noted in the label change. Outcomes evaluated before and after label change were PPI dispensing patterns, incident fractures, and osteoporosis screening or interventions. Consistent with FDA use of descriptive tools, we did not include direct comparisons or statistical testing. There were 1 488 869 and 2 224 420 incident PPI users in the before [PRE] and after [POST] periods, respectively. Users with 1 year or more of exposure decreased (8.4% vs 7.5%), as did mean days supplied/user (130.4 to 113.7 d among all users and 830.8 to 645.4 d among users with 1 y or more of exposure). Osteoporosis screening and interventions did not appear to increase, but the proportion of patients with fractures decreased (4.4% vs 3.1%). Prevalent user results were similar. This analysis demonstrated the ability to use Sentinel tools to assess the effectiveness of a label change and accompanying communication at the population level and suggests an influence on subsequent dispensing behavior. Copyright © 2018 John Wiley & Sons, Ltd.

  16. Development of the Operational Events Groups Ranking Tool

    International Nuclear Information System (INIS)

    Simic, Zdenko; Banov, Reni

    2014-01-01

    Both because of complexity and ageing, facilities like nuclear power plants require feedback from operating experience in order to further improve safety and operational performance. That is why significant effort is dedicated to operating experience feedback. This paper describes the specification and development of the operating events ranking software tool. A robust and consistent way of selecting the most important events for detailed investigation is important because it is not feasible, or even useful, to investigate all of them. Development of the tool is based on comprehensive event characterisation and methodical prioritization. This includes a rich set of event parameters which allows top-level preliminary analysis, different ways of grouping, and even evaluation of uncertainty propagation to the ranking results. One distinct feature of the implemented method is that the user (i.e., an expert) can determine how important a particular ranking parameter is based on pairwise comparison. For demonstration and usability of the tool, it is crucial that a sample database is also created. For useful analysis, the whole set of events for 5 years was selected and characterised. Based on the preliminary results, this tool seems valuable for a new preliminary perspective on the data as a whole, and especially for the identification of event groups which should have priority in more detailed assessment. The results consist of different informative views on the importance of event groups and related sensitivity and uncertainty results. This presents a valuable tool for improving the overall picture of specific operating experience and also for helping to identify the most important event groups for further assessment. It is clear that completeness and consistency of the input data characterisation is very important to obtain a full and valuable importance ranking. Method and tool development described in this paper is part of a continuous effort of

  17. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    OpenAIRE

    Carmen – Elena DOBROTĂ; Claudia VASILCA

    2015-01-01

    Public administration in Romania and the administrative capacity of the central and local government have undergone significant progress since 2007. The development of administrative capacity deals with a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve enhanced results. Identifying, developing and using management tools for a proper implementation of an operational programme dedicated to consolidat...

  18. Development of the Central Dogma Concept Inventory (CDCI) Assessment Tool

    OpenAIRE

    Newman, Dina L.; Snyder, Christopher W.; Fisk, J. Nick; Wright, L. Kate

    2016-01-01

    Scientific teaching requires scientifically constructed, field-tested instruments to accurately evaluate student thinking and gauge teacher effectiveness. We have developed a 23-question, multiple-select format assessment of student understanding of the essential concepts of the central dogma of molecular biology that is appropriate for all levels of undergraduate biology. Questions for the Central Dogma Concept Inventory (CDCI) tool were developed and iteratively revised based on student lan...

  19. An Analytic Approach to Developing Transport Threshold Models of Neoclassical Tearing Modes in Tokamaks

    International Nuclear Information System (INIS)

    Mikhailovskii, A.B.; Shirokov, M.S.; Konovalov, S.V.; Tsypin, V.S.

    2005-01-01

    Transport threshold models of neoclassical tearing modes in tokamaks are investigated analytically. An analysis is made of the competition between strong transverse heat transport, on the one hand, and longitudinal heat transport, longitudinal heat convection, longitudinal inertial transport, and rotational transport, on the other hand, which leads to the establishment of the perturbed temperature profile in magnetic islands. It is shown that, in all these cases, the temperature profile can be found analytically by using rigorous solutions to the heat conduction equation in the near and far regions of a chain of magnetic islands and then by matching these solutions. Analytic expressions for the temperature profile are used to calculate the contribution of the bootstrap current to the generalized Rutherford equation for the island width evolution with the aim of constructing particular transport threshold models of neoclassical tearing modes. Four transport threshold models, differing in the underlying competing mechanisms, are analyzed: collisional, convective, inertial, and rotational models. The collisional model constructed analytically is shown to coincide exactly with that calculated numerically; the reason is that the analytical temperature profile turns out to be the same as the numerical profile. The results obtained can be useful in developing the next generation of general threshold models. The first steps toward such models have already been made

  20. Analytical quality assurance procedures developed for the IAEA's Reference Asian Man Project (Phase 2)

    International Nuclear Information System (INIS)

    Kawamura, H.; Parr, R.M.; Dang, H.S.; Tian, W.; Barnes, R.M.; Iyengar, G.V.

    2000-01-01

    Analytical quality assurance procedures adopted for use in the IAEA Co-ordinated Research Project on Ingestion and Organ Content of Trace Elements of Importance in Radiological Protection are designed to ensure comparability of the analytical results for Cs, I, Sr, Th, U and other elements in human tissues and diets collected and analysed in nine participating countries. The main analytical techniques are NAA and ICP-MS. For sample preparation, all participants are using identical food blenders which have been centrally supplied after testing for contamination. For quality control of the analyses, six NIST SRMs covering a range of matrices with certified and reference values for the elements of interest have been distributed. A new Japanese reference diet material has also been developed. These quality assurance procedures are summarized here and new data are presented for Cs, I, Sr, Th and U in the NIST SRMs. (author)

  1. Tool for test driven development of JavaScript applications

    OpenAIRE

    Stamać, Gregor

    2015-01-01

    The thesis describes the implementation of a tool for testing JavaScript code. The tool is designed to help in test-driven development of JavaScript-based applications, so it is important to display test results as quickly as possible. The thesis is divided into four parts. The first part describes the JavaScript environment. It contains a brief history of the JavaScript language, its prevalence, strengths and weaknesses. This section also describes the TypeScript programming language, which is a super...

  2. Designing the user experience of game development tools

    CERN Document Server

    Lightbown, David

    2015-01-01

    The Big Green Button; My Story; Who Should Read this Book?; Companion Website and Twitter Account; Before we Begin; Welcome to Designing the User Experience of Game Development Tools; What Will We Learn in This Chapter?; What Is This Book About?; Defining User Experience; The Value of Improving the User Experience of Our Tools; Parallels Between User Experience and Game Design; How Do People Benefit From an Improved User Experience?; Finding the Right Balance; Wrapping Up; The User-Centered Design Process; What Will We

  3. Ongoing development of digital radiotherapy plan review tools

    International Nuclear Information System (INIS)

    Ebert, M.A.; Hatton, J.; Cornes, D.

    2011-01-01

    Full text: To describe ongoing development of software to support the review of radiotherapy treatment planning system (TPS) data. The 'SWAN' software program was conceived in 2000 and initially developed for the RADAR (TROG 03.04) prostate radiotherapy trial. Validation of the SWAN program has been occurring via implementation by TROG in support of multiple clinical trials. Development has continued and the SWAN software program is now supported by modular components which comprise the 'SWAN system'. This provides a comprehensive set of tools for the review, analysis and archive of TPS exports. The SWAN system has now been used in support of over 20 radiotherapy trials and to review the plans of over 2,000 trial participants. The use of the system for the RADAR trial is now culminating in the derivation of dose-outcomes indices for prostate treatment toxicity. Newly developed SWAN tools include enhanced remote data archive/retrieval, display of dose in both relative and absolute modes, and interfacing to a Matlab-based add-on ('VAST') that allows quantitative analysis of delineated volumes including regional overlap statistics for multi-observer studies. Efforts are continuing to develop the SWAN system in the context of international collaboration aimed at harmonising the quality-assurance activities of collaborative trials groups. Tools such as the SWAN system are essential for ensuring the collection of accurate and reliable evidence to guide future radiotherapy treatments. One of the principal challenges of developing such a tool is establishing a development path that will ensure its validity and applicability well into the future.

  4. Preliminary Development of an Object-Oriented Optimization Tool

    Science.gov (United States)

    Pak, Chan-gi

    2011-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center has developed a FORTRAN-based object-oriented optimization (O3) tool that leverages existing tools and practices and allows easy integration and adoption of new state-of-the-art software. The object-oriented framework can integrate the analysis codes for multiple disciplines, as opposed to relying on one code to perform analysis for all disciplines. Optimization can thus take place within each discipline module, or in a loop between the central executive module and the discipline modules, or both. Six sample optimization problems are presented. The first four sample problems are based on simple mathematical equations; the fifth and sixth problems consider a three-bar truss, which is a classical example in structural synthesis. Instructions for preparing input data for the O3 tool are presented.

  5. National Energy Audit Tool for Multifamily Buildings Development Plan

    Energy Technology Data Exchange (ETDEWEB)

    Malhotra, Mini [ORNL; MacDonald, Michael [Sentech, Inc.; Accawi, Gina K [ORNL; New, Joshua Ryan [ORNL; Im, Piljae [ORNL

    2012-03-01

    The U.S. Department of Energy's (DOE's) Weatherization Assistance Program (WAP) enables low-income families to reduce their energy costs by providing funds to make their homes more energy efficient. In addition, the program funds Weatherization Training and Technical Assistance (T and TA) activities to support a range of program operations. These activities include measuring and documenting performance, monitoring programs, promoting advanced techniques and collaborations to further improve program effectiveness, and training, including developing tools and information resources. The T and TA plan outlines the tasks, activities, and milestones to support the weatherization network with the program implementation ramp-up efforts. Weatherization of multifamily buildings has been recognized as an effective way to ramp up weatherization efforts. To support this effort, the 2009 National Weatherization T and TA plan includes the task of expanding the functionality of the Weatherization Assistant, a DOE-sponsored family of energy audit computer programs, to perform audits for large and small multifamily buildings. This report describes the planning effort for a new multifamily energy audit tool for DOE's WAP. The functionality of the Weatherization Assistant is being expanded to also perform energy audits of small multifamily and large multifamily buildings. The process covers an assessment of needs that includes input from national experts during two national Web conferences. The assessment of needs is then translated into capability and performance descriptions for the proposed new multifamily energy audit, with some description of what might or should be provided in the new tool. The assessment of needs is combined with our best judgment to lay out a strategy for development of the multifamily tool that proceeds in stages, with features of an initial tool (version 1) and a more capable version 2 handled with currently available resources. Additional

  6. Development of a site analysis tool for distributed wind projects

    Energy Technology Data Exchange (ETDEWEB)

    Shaw, Shawn [The Cadmus Group, Inc., Waltham MA (United States)

    2012-02-28

    The Cadmus Group, Inc., in collaboration with the National Renewable Energy Laboratory (NREL) and Encraft, was awarded a grant from the Department of Energy (DOE) to develop a site analysis tool for distributed wind technologies. As the principal investigator for this project, Mr. Shawn Shaw was responsible for overall project management, direction, and technical approach. The product resulting from this project is the Distributed Wind Site Analysis Tool (DSAT), a software tool for analyzing proposed sites for distributed wind technology (DWT) systems. This user-friendly tool supports the long-term growth and stability of the DWT market by providing reliable, realistic estimates of site and system energy output and feasibility. DSAT, which is accessible online and requires no purchase or download of software, is available in two account types. Standard: this free account allows the user to analyze a limited number of sites and to produce a system performance report for each. Professional: for a small annual fee, users can analyze an unlimited number of sites, produce system performance reports, and generate other customizable reports containing key information such as visual influence and wind resources. The tool’s interactive maps allow users to create site models that incorporate the obstructions and terrain types present. Users can generate site reports immediately after entering the requisite site information. Ideally, this tool also educates users regarding good site selection and effective evaluation practices.

  7. The Neonatal Eating Assessment Tool: Development and Content Validation.

    Science.gov (United States)

    Pados, Britt F; Estrem, Hayley H; Thoyre, Suzanne M; Park, Jinhee; McComish, Cara

    2017-11-01

    To develop and content validate the Neonatal Eating Assessment Tool (NeoEAT), a parent-report measure of infant feeding. The NeoEAT was developed in three phases. Phase 1: Items were generated from a literature review, available assessment tools, and parents' descriptions of problematic feeding in infants. Phase 2: Professionals rated items for relevance and clarity. Content validity indices were calculated. Phase 3: Parent understanding was explored through cognitive interviews. Phase 1: Descriptions of infant feeding were obtained from 12 parents of children with diagnosed feeding problems and 29 parents of infants younger than seven months. Phase 2: Nine professionals rated items. Phase 3: Sixteen parents of infants younger than seven months completed the cognitive interview. Content validity of the NeoEAT. Three versions were developed: NeoEAT Breastfeeding (72 items), NeoEAT Bottle Feeding (74 items), and NeoEAT Breastfeeding and Bottle Feeding (89 items).
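An item-level content validity index of the kind computed in Phase 2 is simply the proportion of experts who rate an item as relevant (typically 3 or 4 on a 4-point relevance scale). The sketch below is illustrative; the expert ratings are hypothetical, not the study's data.

```python
def item_cvi(ratings):
    """Item-level content validity index: the proportion of experts
    rating the item 3 or 4 on a 4-point relevance scale."""
    return sum(1 for r in ratings if r >= 3) / len(ratings)

# Nine hypothetical expert ratings for one candidate item;
# eight of the nine rate it relevant (>= 3).
ratings = [4, 4, 3, 4, 3, 4, 2, 4, 3]
cvi = item_cvi(ratings)
```

Items whose CVI falls below a preset cutoff (0.78 is a common choice for nine raters) would typically be revised or dropped before the next round.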

  8. Searching for Sentient Design Tools for Game Development

    DEFF Research Database (Denmark)

    Liapis, Antonios

Over the last twenty years, computer games have grown from a niche market targeting young adults to an important player in the global economy, engaging millions of people from different cultural backgrounds. As both the number and the size of computer games continue to rise, game companies handle increasing demand by expanding their cadre, compressing development cycles and reusing code or assets. To limit development time and reduce the cost of content creation, commercial game engines and procedural content generation are popular shortcuts. Content creation tools are means to either generate a large volume of game content or to reduce designer effort by automating the mechanizable aspects of content creation, such as feasibility checking. However elaborate the type of content such tools can create, they remain subservient to their human developers/creators (who have tightly designed all...

  9. Examination of fast reactor fuels, FBR analytical quality assurance standards and methods, and analytical methods development: irradiation tests. Progress report, April 1--June 30, 1976, and FY 1976

    International Nuclear Information System (INIS)

    Baker, R.D.

    1976-08-01

Characterization of unirradiated and irradiated LMFBR fuels by analytical chemistry methods will continue, and additional methods will be modified and mechanized for hot cell application. Macro- and microexaminations will be made on fuel and cladding using the shielded electron microprobe, emission spectrograph, radiochemistry, gamma scanner, mass spectrometers, and other analytical facilities. New capabilities will be developed in gamma scanning, analyses to assess spatial distributions of fuel and fission products, mass spectrometric measurements of burnup and fission gas constituents, and other chemical analyses. Microstructural analyses of unirradiated and irradiated materials will continue using optical and electron microscopy and autoradiographic and x-ray techniques. Analytical quality assurance standards tasks are designed to assure the quality of the chemical characterizations necessary to evaluate reactor components relative to specifications. Tasks include: (1) the preparation and distribution of calibration materials and quality control samples for use in quality assurance surveillance programs, (2) the development of and the guidance in the use of quality assurance programs for sampling and analysis, (3) the development of improved methods of analysis, and (4) the preparation of continuously updated analytical method manuals. Reliable analytical methods development for the measurement of burnup, oxygen-to-metal (O/M) ratio, and various gases in irradiated fuels is described.

  10. DIDACTIC TOOLS FOR THE STUDENTS’ ALGORITHMIC THINKING DEVELOPMENT

    Directory of Open Access Journals (Sweden)

    T. P. Pushkaryeva

    2017-01-01

    Full Text Available Introduction. Modern engineers must possess a high potential of cognitive abilities, in particular, algorithmic thinking (AT). In this regard, the training of future experts (university graduates of technical specialities) has to provide knowledge of the principles and ways of designing various algorithms, and the ability to analyze them and to choose the optimal variants for implementing engineering activity. For the full formation of AT skills it is necessary to consider all channels of psychological perception and cogitative processing of educational information: visual, auditory, and kinesthetic. The aim of the present research is to provide a theoretical basis for the design, development and use of resources for the successful development of AT during training in programming. Methodology and research methods. The methodology of the research involves the basic theses of cognitive psychology and the information approach to organizing the educational process. The research used the following methods: analysis; modeling of cognitive processes; designing training tools that take into account the mentality and peculiarities of information perception; and diagnostics of the efficiency of the didactic tools. Results. A three-level model for training future engineers in programming, aimed at the development of AT skills, was developed. The model includes three components: aesthetic, simulative, and conceptual. Stages of mastering a new discipline are allocated. It is shown that for the development of AT skills when training in programming it is necessary to use kinesthetic tools at the stage of mental algorithmic map formation, and algorithmic animation and algorithmic mental maps at the stage of algorithmic model and conceptual image formation. Kinesthetic tools for the development of students' AT skills when training in algorithmization and programming were designed. The use of kinesthetic training simulators in the educational process provides effective development of an algorithmic style of...

  11. Pilot evaluation of a continuing professional development tool for developing leadership skills.

    Science.gov (United States)

    Patterson, Brandon J; Chang, Elizabeth H; Witry, Matthew J; Garza, Oscar W; Trewet, CoraLynn B

    2013-01-01

    Strategies are needed to assure essential nonclinical competencies, such as leadership, can be gained using a continuing professional development (CPD) framework. The objective of this study was to explore student pharmacists' utilization and perceived effectiveness of a CPD tool for leadership development in an elective course. Students completed 2 CPD cycles during a semester-long leadership elective using a CPD tool. A questionnaire was used to measure students' perceptions of utility, self-efficacy, and satisfaction in completing CPD cycles when using a tool to aid in this process. The CPD tool was completed twice by 7 students. On average, students spent nearly 5 hours per CPD cycle. More than half (57.1%) scored themselves as successful or very successful in achieving their learning plans, and most (71.4%) found the tool somewhat useful in developing their leadership skills. Some perceived that the tool provided a systematic way to engage in leadership development, whereas others found it difficult to use. In this pilot study, most student pharmacists successfully achieved a leadership development plan and found the CPD tool useful. Providing students with more guidance may help facilitate use and effectiveness of CPD tools. There is a need to continue to develop and refine tools that assist in the CPD of pharmacy practitioners at all levels. Copyright © 2013 Elsevier Inc. All rights reserved.

  12. The Development of a Humanitarian Health Ethics Analysis Tool.

    Science.gov (United States)

    Fraser, Veronique; Hunt, Matthew R; de Laat, Sonya; Schwartz, Lisa

    2015-08-01

    Introduction: Health care workers (HCWs) who participate in humanitarian aid work experience a range of ethical challenges in providing care and assistance to communities affected by war, disaster, or extreme poverty. Although there is increasing discussion of ethics in humanitarian health care practice and policy, there are very few resources available for humanitarian workers seeking ethical guidance in the field. To address this knowledge gap, a Humanitarian Health Ethics Analysis Tool (HHEAT) was developed and tested as an action-oriented resource to support humanitarian workers in ethical decision making. While ethical analysis tools have become increasingly prevalent in a variety of practice contexts over the past two decades, very few of these tools have undergone a process of empirical validation to assess their usefulness for practitioners. A qualitative study consisting of a series of six case-analysis sessions with 16 humanitarian HCWs was conducted to evaluate and refine the HHEAT. Participant feedback inspired the creation of a simplified and shortened version of the tool and prompted the development of an accompanying handbook. The study generated preliminary insight into the ethical deliberation processes of humanitarian health workers and highlighted different types of ethics support that humanitarian workers might find helpful in supporting the decision-making process.

  13. Polar Bears or People?: How Framing Can Provide a Useful Analytic Tool to Understand & Improve Climate Change Communication in Classrooms

    Science.gov (United States)

    Busch, K. C.

    2014-12-01

    Not only will young adults bear the brunt of climate change's effects, they are also the ones who will be required to take action, both to mitigate and to adapt. The Next Generation Science Standards include climate change, ensuring the topic will be covered in U.S. science classrooms in the near future. Additionally, school is a primary source of information about climate change for young adults. The larger question, though, is how the teaching of climate change can be done in such a way as to ascribe agency, a willingness to act, to students. Framing, as both a theory and an analytic method, has been used to understand how language in the media can affect the audience's intention to act. Frames function as a two-way filter, affecting both the message sent and the message received. This study adapted both the theory and the analytic methods of framing, applying them to teachers in the classroom to answer the research question: How do teachers frame climate change in the classroom? To answer this question, twenty-five lessons from seven teachers were analyzed using semiotic discourse analysis methods. It was found that the teachers' frames overlapped to form two distinct discourses: a Science Discourse and a Social Discourse. The Science Discourse, which was dominant, can be summarized as: Climate change is a current scientific problem that will have profound global effects on the Earth's physical systems. The Social Discourse, used much less often, can be summarized as: Climate change is a future social issue because it will have negative impacts at the local level on people. While it may not be surprising that the Science Discourse was most often heard in these science classrooms, it is possibly problematic if it were the only discourse used. The research literature on framing indicates that the frames found in the Science Discourse (global scale, scientific statistics and facts, and impact on the Earth's systems) are not likely to inspire action-taking. This...

  14. Development of gas-phase sample-introduction techniques for analytical atomic spectrometry.

    Science.gov (United States)

    Nakahara, Taketoshi

    2005-05-01

    For the last 30 years, several types of gas-phase sample-introduction methods in analytical atomic spectrometry, i.e., atomic absorption spectrometry (AAS), atomic emission spectrometry (AES) and atomic fluorescence spectrometry (AFS), have been investigated and developed in the author's laboratory. Their fundamental results are summarized in this review article. The gas-phase sample-introduction techniques developed in the author's laboratory can be roughly divided into four groups: i) hydride generation, ii) cold-vapor generation of mercury, iii) analyte volatilization reactions and iv) miscellaneous. The analytical figures of merit of the gas-phase sample-introduction methods are described in detail. Hydride generation has been coupled with the AAS of As, Bi, Ge, Pb, Sb, Se, Sn and Te, with the inductively coupled plasma (ICP) AES of As, Bi, Sn, Se and Sb, with the high-power nitrogen microwave-induced plasma (N2-MIP) AES of As, Bi, Pb, Sb, Se, Sn and Te in their single- and multi-element determinations, with the AFS of As, Bi, Pb, Sb, Se, Sn and Te, and with the ICP mass spectrometry (MS) of As and Se. The cold-vapor generation method for Hg has been combined with atmospheric-pressure microwave-induced plasma (He- or Ar-MIP) AES and AFS. Furthermore, analyte volatilization reactions have been employed in the ICP-AES of iodine, in the He-MIP-AES of iodine, bromine, chlorine, sulfur and carbon, and in the ICP-MS of sulfur. As a result, when compared with conventional solution nebulization, a great improvement in sensitivity has been attained in each instance. In addition, the developed techniques coupled with analytical atomic spectrometry have been successfully applied to the determination of trace elements in a variety of practical samples.

  15. Development of tools for automatic generation of PLC code

    CERN Document Server

    Koutli, Maria; Rochez, Jacques

    This Master thesis was performed at CERN, in the EN-ICE-PLC section. The thesis describes the integration of two PLC platforms, based on the CODESYS development tool, into the CERN-defined industrial framework, UNICOS. CODESYS is a development tool for PLC programming, based on the IEC 61131-3 standard, and is adopted by many PLC manufacturers. The two PLC development environments are SoMachine from Schneider and TwinCAT from Beckhoff. The two CODESYS-compatible PLCs should be controlled by the SCADA system of Siemens, WinCC OA. The framework includes a library of Function Blocks (objects) for the PLC programs and a software tool for automatic generation of the PLC code based on this library, called UAB. The integration aimed to provide a solution shared by both PLC platforms and was based on the PLCOpen XML scheme. The developed tools were demonstrated by creating a control application for both PLC environments and testing the behavior of the library code.

  16. Developing electronic cooperation tools: a case from norwegian health care.

    Science.gov (United States)

    Larsen, Eli; Mydske, Per Kristen

    2013-06-19

    Many countries aim to create electronic cooperation tools in health care, but progress is rather slow. The study aimed to uncover how the authorities' financing policies influence the development of electronic cooperation tools within public health care. An interpretative approach was used in this study. We performed 30 semistructured interviews with vendors, policy makers, and public authorities. Additionally, we conducted an extensive documentation study and participated in 18 workshops concerning information and communication technology (ICT) in Norwegian health care. We found that sectors like health care, which have undergone an independent development of their internal information infrastructure, find it difficult to create electronic services that interconnect organizations, because such connections affect all interconnected organizations within the heterogeneous structure. The organizations depend, to a large extent, on new functionality in existing information systems. Electronic patient records play a central role in all parts of the health care sector, and dependence is therefore established on these information systems and their vendors. The Norwegian government authorities, which run more than 80% of Norwegian health care, have not taken extraordinary steps to compensate for this dependency: the government's political philosophy is that each health care institution should pay for further electronic patient record development. However, cooperation tools are complex due to the number of players involved and the way they are intertwined with the overall workflow. Customers are not able to buy new functionality on the drawing table, while electronic patient record vendors are not willing to take the economic risk of developing cooperation tools. Thus, the market mechanisms in the domain are challenged. We also found that public projects that were only financed for the first...

  17. Development of long-term numerical ephemerides of telluric planets to analytical series

    Science.gov (United States)

    Kudryavtsev, S. M.

    2011-10-01

    We develop numerical ephemerides of telluric planets into compact analytical series valid over 3000 BC-3000 AD. The long-term planetary ephemerides DE406 are used as the source; a spectral analysis of the tabulated values of the heliocentric mean longitude of each telluric planet is made. For that purpose we used our modification of the spectral analysis method, which allows one to develop the tabulated values directly into Poisson series where both amplitudes and arguments of the series' terms are high-degree polynomials of time. As a result, the maximum difference between the mean longitudes of the telluric planets given by the numerical ephemerides DE406 and the new analytical series is less than 0.07 arcsec over the total time interval of 6000 years. The number of Poisson terms in each development is less than 950.
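The kind of development the abstract describes, fitting tabulated values to Poisson series whose term amplitudes are polynomials of time, can be sketched as a linear least-squares fit once trial frequencies are known. Everything below (the single frequency, degree-1 amplitudes, and synthetic data) is an illustrative assumption, not the actual DE406 analysis:

```python
import numpy as np

def fit_poisson_series(t, lam, freqs):
    """Least-squares fit of lam(t) to a secular part plus degree-1 Poisson terms:
    c0 + c1*t + sum_k [(A_k + C_k*t)*cos(w_k*t) + (B_k + D_k*t)*sin(w_k*t)]."""
    cols = [np.ones_like(t), t]                # secular (polynomial) part
    for w in freqs:
        c, s = np.cos(w * t), np.sin(w * t)
        cols += [c, s, t * c, t * s]           # Poisson terms: amplitudes linear in t
    design = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(design, lam, rcond=None)
    return coef, design @ coef

# Synthetic check: recover a known one-frequency series.
t = np.linspace(0.0, 10.0, 2000)
lam = 1.5 + 0.3 * t + (2.0 + 0.05 * t) * np.cos(4.0 * t)
coef, model = fit_poisson_series(t, lam, freqs=[4.0])
print(np.max(np.abs(model - lam)) < 1e-6)  # → True
```

Because the synthetic signal lies exactly in the span of the basis functions, the fit recovers it to numerical precision; real ephemeris data would leave a residual that shrinks as more frequencies and higher polynomial degrees are added.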

  18. The teaching portfolio as a professional development tool for anaesthetists.

    Science.gov (United States)

    Sidhu, N S

    2015-05-01

    A teaching portfolio (TP) is a document containing a factual description of a teacher's teaching strengths and accomplishments, allowing clinicians to display them for examination by others. The primary aim of a TP is to improve quality of teaching by providing a structure for self-reflection, which in turn aids professional development in medical education. Contents typically include a personal statement on teaching, an overview of teaching accomplishments and activities, feedback from colleagues and learners, a reflective component and some examples of teaching material. Electronic portfolios are more portable and flexible compared to paper portfolios. Clinicians gain the most benefit from a TP when it is used as a tool for self-reflection of their teaching practice and not merely as a list of activities and achievements. This article explains why and how anaesthetists might use a TP as a tool for professional development in medical education.

  19. Innovation tools of economic development of the enterprise

    Directory of Open Access Journals (Sweden)

    Fedor Pavlovich Zotov

    2012-12-01

    Full Text Available Ways to generate new economic and financial benefits from the practice of rationalization work in an industrial enterprise are considered. An attempt is made to combine the practice of rationalization work with the capabilities of the tools and techniques of modern management technologies. It is proposed that members of four types of cross-functional teams learn these tools and techniques through tutorials, and that the tutorials be distributed among the four stages of the PDCA management cycle. It is shown that the creation of such teams and the development of the tutorials will create internal resources for innovation projects to achieve effective changes in the economic development of the enterprise.

  20. Oral Development for LSP via Open Source Tools

    Directory of Open Access Journals (Sweden)

    Alejandro Curado Fuentes

    2015-11-01

    Full Text Available For the development of oral abilities in LSP, few computer-based teaching and learning resources have focused intensively on web-based listening and speaking; many more focus on reading, writing, vocabulary and grammar activities. Our aim in this paper is to approach oral communication in the online environment of Moodle, striving to make it suitable for a learning project which incorporates oral skills. The paper describes a blended process in which both individual and collaborative learning strategies can be combined and exploited through the implementation of specific tools and resources which may go hand in hand with traditional face-to-face conversational classes. The challenge with this new perspective is, ultimately, to provide effective tools for oral LSP development in an apparently writing-skill-focused medium.

  1. Development and testing of a community stakeholder park audit tool.

    Science.gov (United States)

    Kaczynski, Andrew T; Stanis, Sonja A Wilhelm; Besenyi, Gina M

    2012-03-01

    Parks are valuable community resources, and auditing park environments is important for understanding their influence on physical activity and health. However, few tools exist that engage citizens in this process. The purpose of this study was to develop a user-friendly tool that would enable diverse stakeholders to quickly and reliably audit community parks for their potential to promote physical activity. A secondary aim was to examine community stakeholders' reactions to the process of developing and using the new tool. The study employed a sequential, multiphase process including three workshops and field testing to ensure the new instrument was the product of input and feedback from a variety of potential stakeholders and was psychometrically sound. All study stages, including data collection and analysis, occurred in 2010. Stakeholder recommendations were combined with reviews of existing instruments to create the new Community Park Audit Tool (CPAT). The CPAT contains four sections titled Park Information, Access and Surrounding Neighborhood, Park Activity Areas, and Park Quality and Safety. Inter-rater analyses demonstrated strong reliability for the vast majority of the items in the tool. Further, stakeholders reported a range of positive reactions resulting from their engagement in the project. The CPAT provides a reliable and user-friendly means of auditing parks for their potential to promote physical activity. Future use of the CPAT can facilitate greater engagement of diverse groups in evaluating and advocating for improved parks and overall healthy community design. Copyright © 2012 American Journal of Preventive Medicine. Published by Elsevier Inc. All rights reserved.

  2. Development of the Writing Readiness Inventory Tool in Context (WRITIC)

    OpenAIRE

    van Hartingsveldt, Margo J.; de Vries, Liesbeth; Cup, Edith HC; de Groot, Imelda JM; Nijhuis-van der Sanden, Maria WG

    2014-01-01

    This article describes the development of the Writing Readiness Inventory Tool in Context (WRITIC), a measurement evaluating writing readiness in Dutch kindergarten children (5 and 6 years old). Content validity was established through 10 expert evaluations in three rounds. Construct validity was established with 251 children following regular education. To identify scale constructs, factor analysis was performed. Discriminative validity was established by examining contrast groups with good ...

  3. EUV source development for high-volume chip manufacturing tools

    Science.gov (United States)

    Stamm, Uwe; Yoshioka, Masaki; Kleinschmidt, Jürgen; Ziener, Christian; Schriever, Guido; Schürmann, Max C.; Hergenhan, Guido; Borisov, Vladimir M.

    2007-03-01

    Xenon-fueled gas discharge produced plasma (DPP) sources were integrated into Micro Exposure Tools as early as 2004. Operation of these tools in a research environment gave early learning for the development of EUV sources for Alpha- and Beta-Tools. Further experiments with these sources were performed for a basic understanding of EUV source technology and its limits, especially the achievable power and reliability. The intermediate focus power of the Alpha-Tool sources under development is measured at values above 10 W. Debris mitigation schemes were successfully integrated into the sources, leading to reasonable collector mirror lifetimes, with a target of 10 billion pulses, due to the effective debris flux reduction. Source collector mirrors which withstand the radiation and temperature load of Xenon-fueled sources have been developed in cooperation with MediaLario Technologies to support intermediate focus power well above 10 W. To fulfill the requirements of High Volume chip Manufacturing (HVM) applications, a new concept for HVM EUV sources with higher efficiency has been developed at XTREME technologies. The discharge produced plasma (DPP) source concept combines the use of rotating disk electrodes (RDE) with laser-excited droplet targets. The source concept is called the laser assisted droplet RDE source. The fuel of these sources has been selected to be Tin. The conversion efficiency achieved with the laser assisted droplet RDE source is 2-3x higher compared to Xenon. Very high pulse energies, well above 200 mJ / 2π sr, have been measured with first prototypes of the laser assisted droplet RDE source. If it is possible to maintain these high pulse energies at higher repetition rates, a 10 kHz EUV source could deliver 2000 W / 2π sr. According to the first experimental data, the new concept is expected to be scalable to an intermediate focus power at the 300 W level.
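The final scaling claim is straightforward arithmetic, pulse energy multiplied by repetition rate; a quick sanity check with the figures from the abstract:

```python
# Radiated power into 2*pi sr = pulse energy x repetition rate.
pulse_energy_mj = 200                            # mJ / 2*pi sr per pulse
rep_rate_hz = 10_000                             # 10 kHz
power_w = pulse_energy_mj * rep_rate_hz / 1000   # mJ/s -> W
print(power_w)  # → 2000.0
```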

  4. Developing security tools of WSN and WBAN networks applications

    CERN Document Server

    A M El-Bendary, Mohsen

    2015-01-01

    This book focuses on two of the most rapidly developing areas in wireless technology (WT) applications, namely, wireless sensors networks (WSNs) and wireless body area networks (WBANs). These networks can be considered smart applications of the recent WT revolutions. The book presents various security tools and scenarios for the proposed enhanced-security of WSNs, which are supplemented with numerous computer simulations. In the computer simulation section, WSN modeling is addressed using MATLAB programming language.

  5. MOOCs as a Professional Development Tool for Librarians

    Directory of Open Access Journals (Sweden)

    Meghan Ecclestone

    2013-11-01

    Full Text Available This article explores how reference and instructional librarians taking over new areas of subject responsibility can develop professional expertise using new eLearning tools called MOOCs. MOOCs – Massive Open Online Courses – are a new online learning model that offers free higher education courses to anyone with an Internet connection and a keen interest to learn. As MOOCs proliferate, librarians have the opportunity to leverage this technology to improve their professional skills.

  6. Personnel training and development as a tool for organizational efficiency

    OpenAIRE

    Shodeinde, Olubukunola

    2015-01-01

    This study examined personnel training and development as a tool for organizational efficiency. Employees of the MTN Corporate Head Office in Lagos State served as the study population. The study adopted a qualitative approach, using a questionnaire as the main instrument of primary data collection. A total of 110 questionnaires were administered to 217 employees of MTN Nigeria. Using bar charts to illustrate the degree of response, the results of the findings show that respondents agreed that there...

  7. NIR spectroscopy as a process analytical technology (PAT) tool for monitoring and understanding of a hydrolysis process.

    Science.gov (United States)

    Wu, Zhisheng; Peng, Yanfang; Chen, Wei; Xu, Bing; Ma, Qun; Shi, Xinyuan; Qiao, Yanjiang

    2013-06-01

    The use of near infrared spectroscopy was investigated as a process analytical technology to monitor the amino acid concentration profile during the hydrolysis process of Cornu Bubali. A protocol was followed, including outlier selection using a relationship plot of residuals versus leverage level, and calibration models using interval partial least squares and synergy interval partial least squares (SiPLS). A strategy of four robust root mean square error of prediction (RMSEP) values has been developed to assess calibration models by means of the desirability index. Furthermore, multivariate quantification limit (MQL) values of the optimum model were determined using two types of error. The SiPLS(3) models for L-proline, L-tyrosine, L-valine, L-phenylalanine and L-lysine provided excellent accuracies, with RMSEP values of 0.0915 mg/mL, 0.1605 mg/mL, 0.0515 mg/mL, 0.0586 mg/mL and 0.0613 mg/mL, respectively. The MQL ranged from 90 ppm to 810 ppm, which confirmed that these models are suitable for most applications. Copyright © 2013 Elsevier Ltd. All rights reserved.
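RMSEP, the figure of merit quoted for each amino acid above, is the root mean square difference between predicted and reference concentrations on an external prediction set. A minimal sketch, using invented values rather than the study's data:

```python
import numpy as np

def rmsep(y_ref, y_pred):
    """Root mean square error of prediction over an external test set."""
    y_ref = np.asarray(y_ref, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean((y_pred - y_ref) ** 2)))

# Hypothetical reference vs. NIR-predicted concentrations (mg/mL).
ref = [0.50, 0.62, 0.75, 0.88, 1.01]
pred = [0.48, 0.65, 0.74, 0.90, 0.99]
print(round(rmsep(ref, pred), 4))  # → 0.021
```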

  8. Using an evaluative tool to develop effective mathscasts

    Science.gov (United States)

    Galligan, Linda; Hobohm, Carola; Peake, Katherine

    2017-09-01

    This study is situated in a course designed for both on-campus and online pre-service and in-service teachers, where student-created mathscasts provide a way for university lecturers to assess students' quality of teaching, and understanding of mathematics. Teachers and pre-service teachers, in a university course with 90% online enrolment, were asked to create mathscasts to explain mathematics concepts at middle school level. This paper describes the process of developing and refining a tool for the creation and evaluation of quality student-produced mathscasts. The study then investigates the usefulness of the tool within the context of pedagogy and mathematical understanding. Despite an abundance of mathscasts already available on the web, there is merit in creating mathscasts, not only as a tool for teaching, but also as a means of learning by doing. The premise for creating student-produced mathscasts was to capture the creators' mathematical understanding and pedagogical approach to teaching a mathematical concept, which were then peer-assessed and graded. The analysis included surveys, practice mathscasts with peer- and self-reviews, and students' final assessed mathscasts. The results indicate that the use of the evaluative tool resulted in an improvement in quality of student-created mathscasts and critiques thereof. The paper concludes with a discussion on future directions of student-produced mathscasts.

  9. Electrospray ionization and matrix assisted laser desorption/ionization mass spectrometry: powerful analytical tools in recombinant protein chemistry

    DEFF Research Database (Denmark)

    Andersen, Jens S.; Svensson, B; Roepstorff, P

    1996-01-01

    Electrospray ionization and matrix assisted laser desorption/ionization are effective ionization methods for mass spectrometry of biomolecules. Here we describe the capabilities of these methods for peptide and protein characterization in biotechnology. An integrated analytical strategy is presented...

  10. Development and content validation of the power mobility training tool.

    Science.gov (United States)

    Kenyon, Lisa K; Farris, John P; Cain, Brett; King, Emily; VandenBerg, Ashley

    2018-01-01

    This paper outlines the development and content validation of the power mobility training tool (PMTT), an observational tool designed to assist therapists in developing power mobility training programs for children who have multiple, severe impairments. Initial items on the PMTT were developed based on a literature review and in consultation with therapists experienced in the use of power mobility. Items were trialled in clinical settings, reviewed, and refined. Items were then operationalized, and an administration manual detailing the scoring of each item was created. Qualitative and quantitative methods were used to establish content validity via a 15-member international expert panel. The content validity ratio (CVR) was determined for each possible item. Of the 19 original items, 10 achieved the minimum required CVR values and were included in the final version of the PMTT. Items related to manoeuvring a power mobility device were merged, and an item related to the number of switches used concurrently to operate a power mobility device was added to the PMTT. The PMTT may assist therapists in developing training programs that facilitate the acquisition of beginning power mobility skills in children who have multiple, severe impairments. Implications for Rehabilitation: The Power Mobility Training Tool (PMTT) was developed to help guide the development of power mobility intervention programs for children who have multiple, severe impairments. The PMTT can be used with children who access a power mobility device using either a joystick or a switch. Therapists who have limited experience with power mobility may find the PMTT helpful in setting up and conducting power mobility training interventions as a feasible aspect of a plan of care for children who have multiple, severe impairments.
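The content validity ratio used in expert-panel studies such as this one is commonly computed with Lawshe's formula, CVR = (n_e - N/2) / (N/2), where n_e is the number of panelists rating an item essential and N is the panel size. Only the 15-member panel size below comes from the abstract; the item counts are hypothetical:

```python
def content_validity_ratio(n_essential, n_panel):
    """Lawshe's CVR: (n_e - N/2) / (N/2); ranges from -1 to +1."""
    half = n_panel / 2
    return (n_essential - half) / half

# Hypothetical item rated essential by 13 of 15 panelists.
print(round(content_validity_ratio(13, 15), 3))  # → 0.733
```

Items whose CVR falls below the critical value for the panel size are dropped, which is presumably how 10 of the 19 original items survived.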

  11. Feasibility assessment tool for urban anaerobic digestion in developing countries.

    Science.gov (United States)

    Lohri, Christian Riuji; Rodić, Ljiljana; Zurbrügg, Christian

    2013-09-15

    This paper describes a method developed to support feasibility assessments of urban anaerobic digestion (AD). The method not only uses technical assessment criteria but takes a broader sustainability perspective and integrates technical-operational, environmental, financial-economic, socio-cultural, institutional, policy and legal criteria into the assessment tool developed. Use of the tool can support decision-makers with selecting the most suitable set-up for the given context. The tool consists of a comprehensive set of questions, structured along four distinct yet interrelated dimensions of sustainability factors, which all influence the success of any urban AD project. Each dimension answers a specific question: I) WHY? What are the driving forces and motivations behind the initiation of the AD project? II) WHO? Who are the stakeholders and what are their roles, power, interests and means of intervention? III) WHAT? What are the physical components of the proposed AD chain and the respective mass and resource flows? IV) HOW? What are the key features of the enabling or disabling environment (sustainability aspects) affecting the proposed AD system? Disruptive conditions within these four dimensions are detected. Multi-criteria decision analysis (MCDA) is used to guide the process of translating the answers from six sustainability categories into scores, combining them with the relative importance (weights) attributed by the stakeholders. Risk assessment further evaluates the probability that certain aspects develop differently than originally planned and assesses the data reliability (uncertainty factors). The use of the tool is demonstrated with its application in a case study for Bahir Dar in Ethiopia. Copyright © 2013 The Authors. Published by Elsevier Ltd. All rights reserved.
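
    The scoring-and-weighting step described in this record is, in its simplest form, a weighted-sum aggregation over the sustainability categories. A hedged sketch (the category names, scores, and weights below are illustrative, not taken from the paper):

```python
def weighted_score(scores: dict, weights: dict) -> float:
    """Weighted-sum aggregation: normalise the weights, then sum w_i * s_i."""
    total_w = sum(weights.values())
    return sum(scores[c] * weights[c] / total_w for c in scores)

# Hypothetical category scores (0-10) and stakeholder-assigned weights:
scores = {"technical": 7, "environmental": 8, "financial": 5,
          "socio-cultural": 6, "institutional": 4, "legal": 9}
weights = {"technical": 3, "environmental": 2, "financial": 3,
           "socio-cultural": 1, "institutional": 1, "legal": 2}
overall = weighted_score(scores, weights)  # about 6.67 for these inputs
```

    Changing the weights re-ranks alternative set-ups, which is exactly how stakeholder priorities enter an MCDA-based feasibility assessment.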

  12. DEVELOPMENT OF ANALYTICAL METHODS IN METABOLOMICS FOR THE STUDY OF HEREDITARY AND ACQUIRED GENETIC DISEASE

    OpenAIRE

    Arvonio, Raffaele

    2011-01-01

    METABOLOMICS AND MASS SPECTROMETRY The research project takes place in the field of metabolomics, the systematic study of the metabolites present in a cell. In this area, mass spectrometry (MS), thanks to its capacity for controlled fragmentation experiments, plays a key role in the identification of metabolites. The thesis work focuses on the development of analytical methods for the diagnosis of metabolic diseases and is divided as follows: ...

  13. Selenium isotope studies in plants. Development and validation of a novel geochemical tool and its application to organic samples

    Energy Technology Data Exchange (ETDEWEB)

    Banning, Helena

    2016-03-12

    Selenium (Se), being both an essential nutrient and a toxin, enters the food chain mainly via plants. Selenium isotope signatures have been shown to be an excellent redox tracer, making them a promising tool for exploring the Se cycle in plants. The analytical method is sensitive to organic sample matrices and requires particular preparation methods, which were developed and validated in this study. Plant cultivation setups demonstrated the applicability of these methods for tracing plant-internal processes.

  14. Approaching Drosophila development through proteomic tools and databases: At the hub of the post-genomic era

    OpenAIRE

    Carmena, Ana

    2009-01-01

    The past decade has witnessed an explosion in the growth of proteomics. The completion of numerous genome sequences, the development of powerful protein analytical technologies, as well as the design of innovative bioinformatics tools have marked the beginning of a new post-genomic era. Proteomics, the large-scale analysis of proteins in an organism, organ or organelle encompasses different aspects: (1) the identification, analysis of post-translational modifications and quantification of pro...

  15. The role of customized computational tools in product development.

    Energy Technology Data Exchange (ETDEWEB)

    Heinstein, Martin Wilhelm; Kempka, Steven Norman; Tikare, Veena

    2005-06-01

    Model-based computer simulations have revolutionized product development in the last 10 to 15 years. Technologies that have existed for many decades or even centuries have been improved with the aid of computer simulations. Everything from low-tech consumer goods such as detergents, lubricants and light bulb filaments to the most advanced high-tech products such as airplane wings, wireless communication technologies and pharmaceuticals is engineered with the aid of computer simulations today. In this paper, we present a framework for describing computational tools and their application within the context of product engineering. We examine a few cases of product development that integrate numerical computer simulations into the development stage. We will discuss how the simulations were integrated into the development process, what features made the simulations useful, the level of knowledge and experience that was necessary to run meaningful simulations and other details of the process. Based on this discussion, recommendations for the incorporation of simulations and computational tools into product development will be made.

  16. A free software tool for the development of decision support systems

    Directory of Open Access Journals (Sweden)

    COLONESE, G

    2008-06-01

    This article describes PostGeoOlap, a free, open-source decision-support tool that integrates OLAP (On-Line Analytical Processing) and GIS (Geographical Information Systems). Besides describing the tool, we show how it can be used to achieve effective and low-cost decision support that is adequate for small and medium companies and for small public offices.

  17. Development and testing of a community flood resilience measurement tool

    Science.gov (United States)

    Keating, Adriana; Campbell, Karen; Szoenyi, Michael; McQuistan, Colin; Nash, David; Burer, Meinrad

    2017-01-01

    Given the increased attention on resilience strengthening in international humanitarian and development work, there is a growing need to invest in its measurement and in the overall accountability of resilience-strengthening initiatives. The purpose of this article is to present our framework and tool for measuring community-level resilience to flooding and generating empirical evidence, and to share our experience in the application of the resilience concept. At the time of writing, the tool is being tested in 75 communities across eight countries. Currently 88 potential sources of resilience are measured at the baseline (initial state) and end line (final state) approximately 2 years later. If a flood occurs in the community during the study period, resilience outcome measures are recorded. By comparing pre-flood characteristics to post-flood outcomes, we aim to empirically verify sources of resilience, something which has never been done in this field. There is an urgent need for the continued development of theoretically anchored, empirically verified, and practically applicable disaster resilience measurement frameworks and tools so that the field may (a) deepen understanding of the key components of disaster resilience in order to better target resilience-enhancing initiatives, (b) enhance our ability to benchmark and measure disaster resilience over time, and (c) compare how resilience changes as a result of different capacities, actions and hazards.

  18. The development of a practical tool for risk assessment of manual work – the HAT-tool

    NARCIS (Netherlands)

    Kraker, H. de; Douwes, M.

    2008-01-01

    For the Dutch Ministry of Social Affairs and Employment we developed a tool to assess the risks of developing complaints of the arm, neck or shoulders during manual work. The tool was developed for every type of organization and is easy to use, does not require measurements other than time and can

  19. Developments in the Tools and Methodologies of Synthetic Biology

    Science.gov (United States)

    Kelwick, Richard; MacDonald, James T.; Webb, Alexander J.; Freemont, Paul

    2014-01-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices, or systems. However, biological systems are generally complex and unpredictable, and are therefore, intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a “body of knowledge” from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled, and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled, and its functionality tested. At each stage of the design cycle, an expanding repertoire of tools is being developed. In this review, we highlight several of these tools in terms of their applications and benefits to the synthetic biology community. PMID:25505788

  20. Requirements Document for Development of a Livermore Tomography Tools Interface

    Energy Technology Data Exchange (ETDEWEB)

    Seetho, I. M. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-02-09

    In this document, we outline an exercise performed at LLNL to evaluate the user interface deficits of an LLNL-developed CT reconstruction software package, Livermore Tomography Tools (LTT). We observe that a difficult-to-use command line interface and the lack of support functions compound to generate a bottleneck in the CT reconstruction process when input parameters to key functions are not well known. Through the exercise of systems engineering best practices, we generate key performance parameters for an LTT interface refresh, and specify a combination of back-end (“test-mode” functions) and front-end (graphical user interface visualization and command scripting tools) solutions to LTT’s poor user interface that aim to mitigate issues and lower costs associated with CT reconstruction using LTT. Key functional and non-functional requirements and risk mitigation strategies for the solution are outlined and discussed.

  1. Development Life Cycle and Tools for XML Content Models

    Energy Technology Data Exchange (ETDEWEB)

    Kulvatunyou, Boonserm [ORNL]; Morris, Katherine [National Institute of Standards and Technology (NIST)]; Buhwan, Jeong [POSTECH University, South Korea]; Goyal, Puja [National Institute of Standards and Technology (NIST)]

    2004-11-01

    Many integration projects today rely on shared semantic models based on standards represented using Extensible Markup Language (XML) technologies. Shared semantic models typically evolve and require maintenance. In addition, to promote interoperability and reduce integration costs, the shared semantics should be reused as much as possible. Semantic components must be consistent and valid in terms of agreed upon standards and guidelines. In this paper, we describe an activity model for creation, use, and maintenance of a shared semantic model that is coherent and supports efficient enterprise integration. We then use this activity model to frame our research and the development of tools to support those activities. We provide overviews of these tools primarily in the context of the W3C XML Schema. At present, we focus on the W3C XML Schema as the representation of choice, due to its extensive adoption by industry.
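
    The consistency and validity checks described in this record run against W3C XML Schema. Full XSD validation needs a third-party library (e.g., lxml or xmlschema), but a well-formedness check with Python's standard library already catches structural errors such as mismatched tags. A minimal sketch (not one of the NIST tools described in the paper):

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text: str) -> bool:
    """Check XML well-formedness; XSD validation would additionally
    check the document against its agreed-upon schema."""
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

print(is_well_formed("<order><item qty='2'>bolt</item></order>"))  # True
print(is_well_formed("<order><item>bolt</order>"))                 # False
```

    In a maintenance workflow like the one outlined above, such a check is the cheapest gate to run before schema validation and guideline conformance checks.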

  2. Developments in the tools and methodologies of synthetic biology

    Directory of Open Access Journals (Sweden)

    Richard eKelwick

    2014-11-01

    Synthetic biology is principally concerned with the rational design and engineering of biologically based parts, devices or systems. However, biological systems are generally complex and unpredictable and are therefore intrinsically difficult to engineer. In order to address these fundamental challenges, synthetic biology is aiming to unify a ‘body of knowledge’ from several foundational scientific fields, within the context of a set of engineering principles. This shift in perspective is enabling synthetic biologists to address complexity, such that robust biological systems can be designed, assembled and tested as part of a biological design cycle. The design cycle takes a forward-design approach in which a biological system is specified, modeled, analyzed, assembled and its functionality tested. At each stage of the design cycle an expanding repertoire of tools is being developed. In this review we highlight several of these tools in terms of their applications and benefits to the synthetic biology community.

  3. Development of a new semi-analytical model for cross-borehole flow experiments in fractured media

    Science.gov (United States)

    Roubinet, Delphine; Irving, James; Day-Lewis, Frederick D.

    2015-01-01

    Analysis of borehole flow logs is a valuable technique for identifying the presence of fractures in the subsurface and estimating properties such as fracture connectivity, transmissivity and storativity. However, such estimation requires the development of analytical and/or numerical modeling tools that are well adapted to the complexity of the problem. In this paper, we present a new semi-analytical formulation for cross-borehole flow in fractured media that links transient vertical-flow velocities measured in one or a series of observation wells during hydraulic forcing to the transmissivity and storativity of the fractures intersected by these wells. In comparison with existing models, our approach presents major improvements in terms of computational expense and potential adaptation to a variety of fracture and experimental configurations. After derivation of the formulation, we demonstrate its application in the context of sensitivity analysis for a relatively simple two-fracture synthetic problem, as well as for field-data analysis to investigate fracture connectivity and estimate fracture hydraulic properties. These applications provide important insights regarding (i) the strong sensitivity of fracture property estimates to the overall connectivity of the system; and (ii) the non-uniqueness of the corresponding inverse problem for realistic fracture configurations.

  4. The development of a tool to predict team performance.

    Science.gov (United States)

    Sinclair, M A; Siemieniuch, C E; Haslam, R A; Henshaw, M J D C; Evans, L

    2012-01-01

    The paper describes the development of a tool to predict quantitatively the success of a team when executing a process. The tool was developed for the UK defence industry, though it may be useful in other domains. It is expected to be used by systems engineers in initial stages of systems design, when concepts are still fluid, including the structure of the team(s) which are expected to be operators within the system. It enables answers to be calculated for questions such as "What happens if I reduce team size?" and "Can I reduce the qualifications necessary to execute this process and still achieve the required level of success?". The tool has undergone verification and validation; it predicts fairly well and shows promise. An unexpected finding is that the tool creates a good a priori argument for significant attention to Human Factors Integration in systems projects. The simulations show that if a systems project takes full account of human factors integration (selection, training, process design, interaction design, culture, etc.) then the likelihood of team success will be in excess of 0.95. As the project derogates from this state, the likelihood of team success will drop as low as 0.05. If the team has good internal communications and good individuals in key roles, the likelihood of success rises towards 0.25. Even with a team comprising the best individuals, p(success) will not be greater than 0.35. It is hoped that these results will be useful for human factors professionals involved in systems design. Copyright © 2011 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  5. Demonstration of Decision Support Tools for Sustainable Development

    Energy Technology Data Exchange (ETDEWEB)

    Shropshire, David Earl; Jacobson, Jacob Jordan; Berrett, Sharon; Cobb, D. A.; Worhach, P.

    2000-11-01

    The Demonstration of Decision Support Tools for Sustainable Development project integrated the Bechtel/Nexant Industrial Materials Exchange Planner and the Idaho National Engineering and Environmental Laboratory System Dynamic models, demonstrating their capabilities on alternative fuel (AF) applications in the Greater Yellowstone-Teton Park system. The combined model, called the Dynamic Industrial Material Exchange, was used on selected test cases in the Greater Yellowstone-Teton Parks region to evaluate economic, environmental, and social implications of alternative fuel applications, and identifying primary and secondary industries. The test cases included looking at compressed natural gas applications in Teton National Park and Jackson, Wyoming, and studying ethanol use in Yellowstone National Park and gateway cities in Montana. With further development, the system could be used to assist decision-makers (local government, planners, vehicle purchasers, and fuel suppliers) in selecting alternative fuels, vehicles, and developing AF infrastructures. The system could become a regional AF market assessment tool that could help decision-makers understand the behavior of the AF market and conditions in which the market would grow. Based on this high level market assessment, investors and decision-makers would become more knowledgeable of the AF market opportunity before developing detailed plans and preparing financial analysis.

  6. An innovative approach to the development of a portable unit for analytical flame characterization in a microgravity environment

    Science.gov (United States)

    Dubinskiy, Mark A.; Kamal, Mohammed M.; Misra, Prabhaker

    1995-01-01

    The availability of manned laboratory facilities in space offers wonderful opportunities and challenges in microgravity combustion science and technology. In turn, the fundamentals of microgravity combustion science can be studied via spectroscopic characterization of free radicals generated in flames. The laser-induced fluorescence (LIF) technique is a noninvasive method of considerable utility in combustion physics and chemistry suitable for monitoring not only specific species and their kinetics, but it is also important for imaging of flames. This makes LIF one of the most important tools for microgravity combustion science. Flame characterization under microgravity conditions using LIF is expected to be more informative than other methods aimed at searching for effects like pumping phenomenon that can be modeled via ground level experiments. A primary goal of our work consisted in working out an innovative approach to devising an LIF-based analytical unit suitable for in-space flame characterization. It was decided to follow two approaches in tandem: (1) use the existing laboratory (non-portable) equipment and determine the optimal set of parameters for flames that can be used as analytical criteria for flame characterization under microgravity conditions; and (2) use state-of-the-art developments in laser technology and concentrate some effort in devising a layout for the portable analytical equipment. This paper presents an up-to-date summary of the results of our experiments aimed at the creation of the portable device for combustion studies in a microgravity environment, which is based on a portable UV tunable solid-state laser for excitation of free radicals normally present in flames in detectable amounts. A systematic approach has allowed us to make a convenient choice of species under investigation, as well as the proper tunable laser system, and also enabled us to carry out LIF experiments on free radicals using a solid-state laser tunable in the UV.

  7. Assessment of COTS IR image simulation tools for ATR development

    Science.gov (United States)

    Seidel, Heiko; Stahl, Christoph; Bjerkeli, Frode; Skaaren-Fystro, Paal

    2005-05-01

    Following the tendency of increased use of imaging sensors in military aircraft, future fighter pilots will need onboard artificial intelligence e.g. ATR for aiding them in image interpretation and target designation. The European Aeronautic Defence and Space Company (EADS) in Germany has developed an advanced method for automatic target recognition (ATR) which is based on adaptive neural networks. This ATR method can assist the crew of military aircraft like the Eurofighter in sensor image monitoring and thereby reduce the workload in the cockpit and increase the mission efficiency. The EADS ATR approach can be adapted for imagery of visual, infrared and SAR sensors because of the training-based classifiers of the ATR method. For the optimal adaptation of these classifiers they have to be trained with appropriate and sufficient image data. The training images must show the target objects from different aspect angles, ranges, environmental conditions, etc. Incomplete training sets lead to a degradation of classifier performance. Additionally, ground truth information i.e. scenario conditions like class type and position of targets is necessary for the optimal adaptation of the ATR method. In Summer 2003, EADS started a cooperation with Kongsberg Defence & Aerospace (KDA) from Norway. The EADS/KDA approach is to provide additional image data sets for training-based ATR through IR image simulation. The joint study aims to investigate the benefits of enhancing incomplete training sets for classifier adaptation by simulated synthetic imagery. EADS/KDA identified the requirements of a commercial-off-the-shelf IR simulation tool capable of delivering appropriate synthetic imagery for ATR development. A market study of available IR simulation tools and suppliers was performed. After that the most promising tool was benchmarked according to several criteria e.g. thermal emission model, sensor model, targets model, non-radiometric image features etc., resulting in a

  8. High-Speed/Hypersonic Weapon Development Tool Integration

    National Research Council Canada - National Science Library

    Duchow, Erin M; Munson, Michael J; Alonge, Jr, Frank A

    2006-01-01

    Multiple tools exist to aid in the design and evaluation of high-speed weapons. This paper documents efforts to integrate several existing tools, including the Integrated Hypersonic Aeromechanics Tool (IHAT)1-7...

  9. Formulation and Development of a Validated UV-Spectrophotometric Analytical Method of Rutin Tablet

    Directory of Open Access Journals (Sweden)

    Murad N. Abualhasan

    2017-01-01

    Rutin occurs in some foods, fruits, and vegetables. It has various beneficial medical effects, making it useful in the treatment of various diseases. Rutin is marketed in different oral dosage forms, such as tablets and capsules. Rutin and many herbal medicines lack quality control owing to the unavailability of analytical methods. In this study, we formulated a rutin tablet and studied its stability using a simple analytical method developed for the purpose. The dissolution profile of our formulated tablet was also inspected. The results showed that our developed method was linear (R2 = 0.999), precise (% RSD = 0.026), and accurate (% recovery = 98.55–103.34%). The formulated rutin tablet was stable under accelerated conditions as well as at room temperature for 150 days (% assay > 91.69). The dissolution profile over 45 minutes of our formulated tablet showed better dissolution (26.5%) compared with the internationally marketed Rutin® tablet (18.5%). This study can serve as a guideline for companies that manufacture herbal products to improve their formulations and apply validated analytical methods to check the quality of their products.
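
    The three validation figures reported in this record (linearity as R2, precision as % RSD, accuracy as % recovery) correspond to standard computations. A sketch using hypothetical calibration data (the paper's raw absorbance readings are not given):

```python
import statistics

def percent_rsd(values):
    """Precision: relative standard deviation as a percentage of the mean."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

def percent_recovery(found, added):
    """Accuracy: amount recovered as a percentage of the amount spiked."""
    return 100 * found / added

def r_squared(x, y):
    """Linearity: squared Pearson correlation of the calibration curve."""
    mx, my = statistics.mean(x), statistics.mean(y)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)

# Hypothetical calibration points (concentration vs absorbance):
conc = [2, 4, 6, 8, 10]
absorbance = [0.101, 0.198, 0.305, 0.401, 0.498]
linearity = r_squared(conc, absorbance)
```

    The acceptance thresholds themselves (e.g., R2 close to 1, % RSD below 2, recovery near 100%) come from the validation guideline being followed, not from the arithmetic.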

  10. Development and implementation of information systems for the DOE's National Analytical Management Program (NAMP)

    International Nuclear Information System (INIS)

    Streets, W. E.

    1999-01-01

    The Department of Energy (DOE) faces a challenging environmental management effort, including environmental protection, environmental restoration, waste management, and decommissioning. This effort requires extensive sampling and analysis to determine the type and level of contamination and the appropriate technology for cleanup, and to verify compliance with environmental regulations. Data obtained from these sampling and analysis activities are used to support environmental management decisions. Confidence in the data is critical, having legal, regulatory, and therefore, economic impact. To promote quality in the planning, management, and performance of these sampling and analysis operations, DOE's Office of Environmental Management (EM) has established the National Analytical Management Program (NAMP). With a focus on reducing the estimated costs of over $200M per year for EM's analytical services, NAMP has been charged with developing products that will decrease the costs for DOE complex-wide environmental management while maintaining quality in all aspects of the analytical data generation. As part of this thrust to streamline operations, NAMP is developing centralized information systems that will allow DOE complex personnel to share information about EM contacts at the various sites, pertinent methodologies for environmental restoration and waste management, costs of analyses, and performance of contracted laboratories

  11. Development and implementation of an analytical quality assurance plan at the Hanford site

    International Nuclear Information System (INIS)

    Kuhl-Klinger, K.J.; Taylor, C.D.; Kawabata, K.K.

    1995-08-01

    The Hanford Analytical Services Quality Assurance Plan (HASQAP) provides a uniform standard for onsite and offsite laboratories performing analytical work in support of Hanford Site environmental cleanup initiatives. The Hanford Site is a nuclear site that originated during World War II and has a legacy of environmental cleanup issues. In early 1993, the need for and feasibility of developing a quality assurance plan to direct all analytical activities performed to support environmental cleanup initiatives set forth in the Hanford Federal Facility Agreement and Consent Order were discussed. Several group discussions were held and from them came the HASQAP. This document will become the quality assurance guidance document in a Federal Facility Agreement and Consent Order. This paper presents the mechanics involved in developing a quality assurance plan for this scope of activity, including the approach taken to resolve the variability of quality control requirements driven by numerous regulations. It further describes the consensus building process and how the goal of uniting onsite and offsite laboratories as well as inorganic, organic, and radioanalytic disciplines under a common understanding of basic quality control concepts was achieved.

  12. Recent Development in Big Data Analytics for Business Operations and Risk Management.

    Science.gov (United States)

    Choi, Tsan-Ming; Chan, Hing Kai; Yue, Xiaohang

    2017-01-01

    "Big data" is an emerging topic and has attracted the attention of many researchers and practitioners in industrial systems engineering and cybernetics. Big data analytics would definitely lead to valuable knowledge for many organizations. Business operations and risk management can be a beneficiary as there are many data collection channels in the related industrial systems (e.g., wireless sensor networks, Internet-based systems, etc.). Big data research, however, is still in its infancy. Its focus is rather unclear and related studies are not well amalgamated. This paper aims to present the challenges and opportunities of big data analytics in this unique application domain. Technological development and advances for industrial-based business systems, reliability and security of industrial systems, and their operational risk management are examined. Important areas for future research are also discussed and revealed.

  13. Class-modelling in food analytical chemistry: Development, sampling, optimisation and validation issues - A tutorial.

    Science.gov (United States)

    Oliveri, Paolo

    2017-08-22

    Qualitative data modelling is a fundamental branch of pattern recognition, with many applications in analytical chemistry, and embraces two main families: discriminant and class-modelling methods. The first strategy is appropriate when at least two classes are meaningfully defined in the problem under study, while the second strategy is the right choice when the focus is on a single class. For this reason, class-modelling methods are also referred to as one-class classifiers. Although, in the food analytical field, most of the issues would be properly addressed by class-modelling strategies, the use of such techniques is rather limited and, in many cases, discriminant methods are forcedly used for one-class problems, introducing a bias in the outcomes. Key aspects related to the development, optimisation and validation of suitable class models for the characterisation of food products are critically analysed and discussed. Copyright © 2017 Elsevier B.V. All rights reserved.
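
    Class-modelling as described in this record builds a boundary around a single target class and accepts or rejects new samples against it, rather than discriminating between two classes. A deliberately simplified, pure-Python sketch (an illustrative per-feature envelope model, not SIMCA or UNEQ; the data and the k threshold are hypothetical):

```python
import statistics

class ClassModel:
    """Accept a sample only if every feature falls within
    mean +/- k standard deviations of the target-class training data."""

    def __init__(self, k: float = 3.0):
        self.k = k
        self.bounds = []

    def fit(self, samples):
        # samples: feature vectors drawn from the single target class,
        # e.g. authentic specimens of one food product
        for feature in zip(*samples):
            m = statistics.mean(feature)
            s = statistics.stdev(feature)
            self.bounds.append((m - self.k * s, m + self.k * s))
        return self

    def accept(self, sample) -> bool:
        return all(lo <= v <= hi for v, (lo, hi) in zip(sample, self.bounds))

model = ClassModel(k=3.0).fit([[1.0, 10.0], [1.1, 10.5], [0.9, 9.5], [1.05, 10.2]])
print(model.accept([1.0, 10.1]))   # sample inside the class envelope
print(model.accept([5.0, 10.0]))   # outlier on the first feature
```

    The key contrast with a discriminant method is visible here: the model is fitted on the target class alone, so a sample unlike anything in training is rejected instead of being forced into the nearest of two classes.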

  14. Development of collaborative-creative learning model using virtual laboratory media for instrumental analytical chemistry lectures

    Science.gov (United States)

    Zurweni, Wibawa, Basuki; Erwin, Tuti Nurian

    2017-08-01

    The framework for teaching and learning in the 21st century was prepared with the 4Cs criteria. Learning that provides opportunities for the optimal development of students' creative skills can be achieved by implementing collaborative learning. Learners are challenged to compete, to work independently to bring about individual or group excellence, and to master the learning material. A virtual laboratory, implemented as a computer simulation application, is used as the instructional medium for Instrumental Analytical Chemistry (Vis, UV-Vis, AAS, etc.) lectures and as a substitute for the physical laboratory when equipment and instruments are not available. This research aims to design and develop a collaborative-creative learning model using virtual laboratory media for Instrumental Analytical Chemistry lectures, adapting the Dick & Carey and Hannafin & Peck models, and to determine the effectiveness of the resulting design. The development steps of this model are: needs analysis, design of collaborative-creative learning, development of virtual laboratory media using Macromedia Flash, formative evaluation, and testing of the learning model's effectiveness. The stages of the collaborative-creative learning model itself are: apperception, exploration, collaboration, creation, evaluation, and feedback. The model can be used to improve the quality of learning in the classroom and to overcome the shortage of laboratory instruments for real instrumental analysis. Formative test results show that the collaborative-creative learning model developed meets the requirements. The effectiveness test comparing students' pretest and posttest scores is significant at the 95% confidence level, with the t-test value higher than the t-table value. It can be concluded that this learning model is effective for Instrumental Analytical Chemistry lectures.

  15. VisTool: A user interface and visualization development system

    DEFF Research Database (Denmark)

    Xu, Shangjin

    Although software usability has long been emphasized, much software still has poor usability. In usability engineering, usability professionals prescribe a classical usability approach to improving software usability. It is essential to prototype and usability-test user interfaces before....... However, it is more difficult to follow the classical usability approach for graphical presentation development. These difficulties result from the fact that designers cannot implement user interfaces with interactions and real data. We developed VisTool – a user interface and visualization development...... interface objects and properties. We built visualizations such as Lifelines, Parallel Coordinates, Heatmap, etc. to show that the formula-based approach is powerful enough for building customized visualizations. The evaluation with Cognitive Dimensions shows that the formula-based approach is cognitively...

  16. Development of dosimetry tools for proton therapy research

    International Nuclear Information System (INIS)

    Kim, Jong-Won; Kim, Dogyun

    2010-01-01

    Dosimetry tools for proton therapy research have been developed to measure the properties of a therapeutic proton beam. A CCD camera-scintillation screen system, which can verify the 2D dose distribution of a scanning beam and can be used for proton radiography, was developed. Also developed were a large-area parallel-plate ionization chamber and a multi-layer Faraday cup, to monitor the beam current and to measure the beam energy, respectively. To investigate the feasibility of locating the distal dose falloff in real time during patient treatment, a prompt gamma measuring system composed of multi-layer shielding structures was then devised. The system worked well for a pristine proton beam; however, the correlation between the distal dose falloff and the prompt gamma distribution was blurred by neutron background for a therapy beam formed by the scattering method. We have also worked on the design of a Compton camera to image the 2D distribution of prompt gamma rays.

  17. Medicaid Analytic eXtract (MAX) Chartbooks

    Data.gov (United States)

    U.S. Department of Health & Human Services — The Medicaid Analytic eXtract Chartbooks are research tools and reference guides on Medicaid enrollees and their Medicaid experience in 2002 and 2004. Developed for...

  18. NEEMO 20: Science Training, Operations, and Tool Development

    Science.gov (United States)

    Graff, T.; Miller, M.; Rodriguez-Lanetty, M.; Chappell, S.; Naids, A.; Hood, A.; Coan, D.; Abell, P.; Reagan, M.; Janoiko, B.

    2016-01-01

    The 20th mission of the National Aeronautics and Space Administration (NASA) Extreme Environment Mission Operations (NEEMO) was a highly integrated evaluation of operational protocols and tools designed to enable future exploration beyond low-Earth orbit. NEEMO 20 was conducted from the Aquarius habitat off the coast of Key Largo, FL in July 2015. The habitat and its surroundings provide a convincing analog for space exploration. A crew of six (comprised of astronauts, engineers, and habitat technicians) lived and worked in and around the unique underwater laboratory over a mission duration of 14 days. Incorporated into NEEMO 20 was a diverse Science Team (ST) comprised of geoscientists from the Astromaterials Research and Exploration Science (ARES/XI) Division of the Johnson Space Center (JSC), as well as marine scientists from the Department of Biological Sciences at Florida International University (FIU). This team trained the crew on the science to be conducted, defined sampling techniques and operational procedures, and planned and coordinated the science-focused Extra Vehicular Activities (EVAs). The primary science objective of NEEMO 20 was to study planetary sampling techniques and tools in partial-gravity environments under realistic mission communication time delays and operational pressures. To facilitate this objective, two types of science sites were employed: (1) geoscience sites with available rocks and regolith for testing sampling procedures and tools, and (2) marine science sites dedicated to specific research focused on assessing the photosynthetic capability of corals and their genetic connectivity between deep and shallow reefs. These marine sites and associated research objectives included deployment of handheld instrumentation, context descriptions, imaging, and sampling, and thus acted as a suitable proxy for planetary surface exploration activities. This abstract briefly summarizes the scientific training, scientific operations, and tool

  19. Effective Management Tools in Implementing Operational Programme Administrative Capacity Development

    Directory of Open Access Journals (Sweden)

    Carmen – Elena DOBROTĂ

    2015-12-01

    Public administration in Romania and the administrative capacity of central and local government have undergone significant progress since 2007. Developing administrative capacity involves a set of structural and process changes that allow governments to improve the formulation and implementation of policies in order to achieve better results. Identifying, developing and using management tools for the proper implementation of an operational programme dedicated to consolidating a high-performing public administration was a challenging task, given the types of interventions within Operational Programme Administrative Capacity Development 2007-2013 and the continuous changes in the economic and social environment in Romania and Europe. The aim of this article is to provide a short description of the approach used by the Managing Authority for OPACD in the performance management of structural funds in Romania between 2008 and 2014. The paper offers a broad picture of the way in which evaluations (ad-hoc, intermediate and performance) were used in different stages of OP implementation as a management tool.

  20. Development of a biogas planning tool for project owners

    DEFF Research Database (Denmark)

    Fredenslund, Anders Michael; Kjær, Tyge

    A spreadsheet model was developed that can be used as a tool in the initial phases of planning a centralized biogas plant in Denmark. The model assesses energy production, total plant costs, operational costs and revenues, and the effect on greenhouse gas emissions. Two energy utilization alternatives...... are considered: combined heat and power and natural gas grid injection. The main input to the model is the amount and types of substrates available for anaerobic digestion. By substituting the model's default values with more project-specific information, the model can be used in a biogas project's later phases...
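
    The energy-production step of such a spreadsheet model reduces to multiplying substrate amounts by methane yields and an energy content. A minimal sketch, with hypothetical yield figures rather than the model's actual defaults:

```python
# Minimal sketch of the energy-production step: methane potential per substrate
# times amount, converted to energy. The yields below are hypothetical
# placeholders, not the model's actual default values.
METHANE_LHV_KWH_PER_M3 = 9.97  # lower heating value of methane, kWh/m3

# hypothetical methane yields, m3 CH4 per tonne of substrate
YIELDS = {"cattle_manure": 14.0, "food_waste": 60.0}

def annual_energy_kwh(substrates):
    """substrates: {name: tonnes/year} -> gross energy in kWh/year."""
    m3_ch4 = sum(YIELDS[name] * tonnes for name, tonnes in substrates.items())
    return m3_ch4 * METHANE_LHV_KWH_PER_M3

energy = annual_energy_kwh({"cattle_manure": 10000, "food_waste": 2000})
```

    Costs, revenues and greenhouse gas effects would be computed from the same substrate inputs in further worksheet steps.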

  1. Development of analytical methodologies to assess recalcitrant pesticide bioremediation in biobeds at laboratory scale.

    Science.gov (United States)

    Rivero, Anisleidy; Niell, Silvina; Cerdeiras, M Pía; Heinzen, Horacio; Cesio, María Verónica

    2016-06-01

    To assess recalcitrant pesticide bioremediation it is necessary to gradually increase the complexity of the biological system used in order to design an effective biobed assembly. Each step towards this effective biobed design needs a suitable, validated analytical methodology that allows a correct evaluation of dissipation and bioconversion; methods with low recovery yields could give a false impression of a successful biodegradation process. To address this situation, different methods were developed and validated for the simultaneous determination of endosulfan, its three main metabolites, and chlorpyrifos in increasingly complex matrices in which the bioconvertor basidiomycete Abortiporus biennis could grow. The matrices were culture media, bran, and finally a laboratory biomix composed of bran, peat and soil. The methodology for the analysis of the first matrix has already been reported; the methodologies developed for the other two systems are presented in this work. The targeted analytes were extracted from fungi growing over bran in semisolid YNB (Yeast Nitrogen Base) medium with acetonitrile using shaker-assisted extraction; the salting-out step was performed with MgSO4 and NaCl, and the extracts were analyzed by GC-ECD. The best methodology was fully validated for all the evaluated analytes at 1 and 25 mg kg(-1), yielding recoveries between 72% and 109% with acceptable RSDs; degradation of endosulfan and of 87% of chlorpyrifos was observed after 90 days. Having assessed that A. biennis growing over bran can metabolize the studied pesticides, the next step was the development and validation of an analytical procedure to evaluate the analytes in a laboratory-scale biobed composed of 50% bran, 25% peat and 25% soil together with fungal mycelium. Of the different procedures assayed, only ultrasound-assisted extraction with ethyl acetate allowed recoveries between 80% and 110%, with acceptable RSDs, for the endosulfan isomers (α & β) and their metabolites (endosulfan sulfate, ether and diol) as well as for
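
    Recovery and RSD, the two validation figures quoted above, are computed from replicate spiked samples. A minimal sketch with hypothetical replicate values (not the study's data):

```python
import statistics

def recovery_and_rsd(measured, spiked_level):
    """Percent recovery and RSD (%) from replicate spiked-sample results."""
    mean = statistics.mean(measured)
    recovery = 100.0 * mean / spiked_level
    rsd = 100.0 * statistics.stdev(measured) / mean
    return recovery, rsd

# hypothetical replicate results for a 25 mg/kg spike
replicates = [22.1, 23.4, 21.8, 22.9, 23.0]
recovery, rsd = recovery_and_rsd(replicates, 25.0)
```

    A method is typically accepted when mean recovery falls in the 70-120% window and the RSD stays below the relevant repeatability limit.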

  2. Effect of Micro Electrical Discharge Machining Process Conditions on Tool Wear Characteristics: Results of an Analytic Study

    DEFF Research Database (Denmark)

    Puthumana, Govindan; P., Rajeev

    2016-01-01

    characteristics in the micro-EDM process. A new approach with two novel factors anticipated to directly control the material removal mechanism from the tool electrode is proposed, using a discharge energy factor (DEf) and a dielectric flushing factor (DFf). The results showed that the correlation between the tool wear rate...

  3. Development of Next Generation Multiphase Pipe Flow Prediction Tools

    Energy Technology Data Exchange (ETDEWEB)

    Cem Sarica; Holden Zhang

    2006-05-31

    The development of oil and gas fields in deep waters (5000 ft and more) will become more common in the future. It is inevitable that production systems will operate under multiphase flow conditions (simultaneous flow of gas, oil and water, possibly along with sand, hydrates, and waxes). Multiphase flow prediction tools are essential for every phase of hydrocarbon recovery, from design to operation. Recovery from deep waters poses special challenges and requires accurate multiphase flow predictive tools for several applications, including the design and diagnostics of the production systems, separation of phases in horizontal wells, and multiphase separation (topside, seabed or bottom-hole). It is crucial for any multiphase separation technique, whether at topside, seabed or bottom-hole, to know inlet conditions such as flow rates, flow patterns, and volume fractions of gas, oil and water coming into the separation devices. Therefore, the development of a new generation of multiphase flow predictive tools is needed. The overall objective of the proposed study is to develop a unified model for gas-oil-water three-phase flow in wells, flow lines, and pipelines to predict flow characteristics such as flow patterns, phase distributions, and pressure gradient encountered during petroleum production at different flow conditions (pipe diameter and inclination, fluid properties and flow rates). In the current multiphase modeling approach, the predictions of flow pattern and of flow behavior (pressure gradient and phase fractions) are treated separately. Thus, different models based on different physics are employed, causing inaccuracies and discontinuities. Moreover, oil and water are treated as a pseudo single phase, ignoring the distinct characteristics of both oil and water, and often resulting in inaccurate designs that lead to operational problems. In this study, a new model is being developed through a theoretical and experimental study employing a revolutionary approach. The
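
    The unified model itself couples flow pattern and flow behavior predictions. As a much simpler illustration of the quantities involved, a homogeneous no-slip baseline for the pressure gradient can be sketched as follows (all values hypothetical, acceleration term neglected; this is not the unified model):

```python
import math

def homogeneous_pressure_gradient(rho_l, rho_g, holdup, v_m, D, f, theta_deg):
    """dp/dx (Pa/m) for a homogeneous no-slip mixture:
    frictional term + gravitational term (acceleration neglected)."""
    rho_m = holdup * rho_l + (1 - holdup) * rho_g  # mixture density
    dp_friction = f * rho_m * v_m ** 2 / (2 * D)
    dp_gravity = rho_m * 9.81 * math.sin(math.radians(theta_deg))
    return dp_friction + dp_gravity

# hypothetical conditions: oil/gas in a 0.1 m pipe inclined 10 degrees upward
dpdx = homogeneous_pressure_gradient(
    rho_l=850.0, rho_g=50.0, holdup=0.6, v_m=2.0, D=0.1, f=0.02, theta_deg=10.0)
```

    Mechanistic models replace the crude holdup and friction-factor inputs above with flow-pattern-dependent closures, which is precisely where the unified modeling effort lies.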

  4. Near infra red spectroscopy as a multivariate process analytical tool for predicting pharmaceutical co-crystal concentration.

    Science.gov (United States)

    Wood, Clive; Alwati, Abdolati; Halsey, Sheelagh; Gough, Tim; Brown, Elaine; Kelly, Adrian; Paradkar, Anant

    2016-09-10

    The use of near infra-red spectroscopy to predict the concentration of two pharmaceutical co-crystals, 1:1 ibuprofen-nicotinamide (IBU-NIC) and 1:1 carbamazepine-nicotinamide (CBZ-NIC), has been evaluated. A partial least squares (PLS) regression model was developed for both co-crystal pairs using sets of standard samples to create calibration and validation data sets with which to build and validate the models. Parameters such as the root mean square error of calibration (RMSEC), root mean square error of prediction (RMSEP) and correlation coefficient were used to assess the accuracy and linearity of the models. Accurate PLS regression models were created for both co-crystal pairs and can be used to predict the co-crystal concentration in a powder mixture of the co-crystal and the active pharmaceutical ingredient (API). The IBU-NIC model had smaller errors than the CBZ-NIC model, possibly due to the complex CBZ-NIC spectra, which could reflect a different arrangement of hydrogen bonding in the CBZ-NIC co-crystal compared to the IBU-NIC co-crystal. These results suggest that NIR spectroscopy can be used as a PAT tool during a variety of pharmaceutical co-crystal manufacturing methods, and the presented data will facilitate future offline and in-line NIR studies involving pharmaceutical co-crystals. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
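
    RMSEC and RMSEP are root-mean-square errors over the calibration and validation sets, respectively. A minimal sketch of that bookkeeping, substituting an ordinary least-squares line for the actual multivariate PLS model and using invented response/concentration pairs:

```python
import math

def fit_line(x, y):
    """Ordinary least-squares slope/intercept (simple stand-in for the PLS fit)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

def rmse(y_true, y_pred):
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# hypothetical calibration set: co-crystal concentration (%) and NIR response
cal_conc, cal_resp = [10, 20, 30, 40, 50], [0.11, 0.21, 0.29, 0.41, 0.50]
slope, intercept = fit_line(cal_resp, cal_conc)  # predict concentration from response
rmsec = rmse(cal_conc, [slope * r + intercept for r in cal_resp])

# hypothetical independent validation set
val_conc, val_resp = [15, 35], [0.16, 0.35]
rmsep = rmse(val_conc, [slope * r + intercept for r in val_resp])
```

    In practice the fit is a PLS regression over full NIR spectra, but the RMSEC/RMSEP definitions are exactly these.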

  5. Development of an information retrieval tool for biomedical patents.

    Science.gov (United States)

    Alves, Tiago; Rodrigues, Rúben; Costa, Hugo; Rocha, Miguel

    2018-06-01

    The volume of biomedical literature has been increasing in recent years. Patent documents have followed this trend, being important sources of biomedical knowledge, technical details and curated data, which are put together along the granting process. The field of biomedical text mining (BioTM) has been creating solutions for the problems posed by the unstructured nature of natural language, which makes the search for information a challenging task. Several BioTM techniques can be applied to patents; among them, Information Retrieval (IR) includes processes through which relevant data are obtained from collections of documents. In this work, the main goal was to build a patent pipeline addressing IR tasks over patent repositories to make these documents amenable to BioTM tasks. The pipeline was developed within @Note2, an open-source computational framework for BioTM, adding a number of modules to the core libraries, including patent metadata and full-text retrieval, PDF-to-text conversion and optical character recognition. Also, user interfaces were developed for the main operations, materialized in a new @Note2 plug-in. The integration of these tools in @Note2 opens opportunities to run BioTM tools over patent texts, including tasks from Information Extraction, such as Named Entity Recognition or Relation Extraction. We demonstrated the pipeline's main functions with a case study using an available benchmark dataset from the BioCreative challenges. Also, we show the use of the plug-in with a user query related to the production of vanillin. This work makes all the relevant content from patents available to the scientific community, drastically decreasing the time required for this task, and provides graphical interfaces to ease the use of these tools. Copyright © 2018 Elsevier B.V. All rights reserved.
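
    A minimal stand-in for the IR step of such a pipeline is TF-IDF ranking of documents against a query (@Note2's actual retrieval modules are far more elaborate; the documents below are invented):

```python
import math
from collections import Counter

def tfidf_search(query, docs):
    """Rank documents by summed TF-IDF weight of the query terms.
    A toy stand-in for the retrieval step of a patent IR pipeline."""
    tokenized = [d.lower().split() for d in docs]
    n = len(docs)
    df = Counter(term for toks in tokenized for term in set(toks))
    scores = []
    for i, toks in enumerate(tokenized):
        tf = Counter(toks)
        # idf = log(N/df); terms appearing in every document contribute nothing
        score = sum(tf[t] * math.log(n / df[t])
                    for t in query.lower().split() if t in df and df[t] < n)
        scores.append((score, i))
    return [i for s, i in sorted(scores, reverse=True)]

docs = ["vanillin production from ferulic acid",
        "a new valve design",
        "microbial vanillin synthesis patent"]
ranking = tfidf_search("vanillin production", docs)
```

    Real patent IR adds tokenization tuned to chemical names, field-aware indexing of metadata, and OCR-corrected full text, but the scoring idea is the same.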

  6. Development of diamond coated tool and its performance in ...

    Indian Academy of Sciences (India)

    Unknown

    In recent years, low-pressure synthesis of diamond coatings from the gas phase on a suitable tool substrate has opened up new opportunities to widely expand the applications of diamond tools. In fact, a coated diamond tool combines the strengths of both single-crystal diamond and a PCD compact in one cutting tool and has better.

  7. Developing the role of big data and analytics in health professional education.

    Science.gov (United States)

    Ellaway, Rachel H; Pusic, Martin V; Galbraith, Robert M; Cameron, Terri

    2014-03-01

    As we capture more and more data about learners, their learning, and the organization of their learning, our ability to identify emerging patterns and to extract meaning grows exponentially. The insights gained from the analyses of these large amounts of data are only helpful to the extent that they can be the basis for positive action such as knowledge discovery, improved capacity for prediction, and anomaly detection. Big Data involves the aggregation and melding of large and heterogeneous datasets, while education analytics involves looking for patterns in educational practice or performance in single or aggregate datasets. Although it seems likely that the use of education analytics and Big Data techniques will have a transformative impact on health professional education, there is much yet to be done before they can become part of mainstream health professional education practice. If health professional education is to be accountable for how its programs are run and developed, then health professional educators will need to be ready to deal with the complex and compelling dynamics of analytics and Big Data. This article provides an overview of these emerging techniques in the context of health professional education.

  8. Moxidectin residues in lamb tissues: Development and validation of analytical method by UHPLC-MS/MS.

    Science.gov (United States)

    Del Bianchi A Cruz, Michelle; Fernandes, Maria A M; de C Braga, Patricia A; Monteiro, Alda L G; Daniel, Daniela; Reyes, Felix G R

    2018-01-01

    The development and validation of a high-throughput method for the quantitation of moxidectin residues in lamb target tissues (muscle, kidney, liver and fat) was conducted using ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS). To achieve higher recovery of the analyte from the matrices, a modified QuEChERS method was used for sample preparation. The chromatographic separation was achieved using a Zorbax Eclipse Plus C18 RRHD column with a mobile phase comprising 5 mM ammonium formate solution + 0.1% formic acid (A) and acetonitrile + 0.1% formic acid (B) in a linear gradient program. Method validation was performed based on the Commission Decision 2002/657/EC and VICH GL49. To quantify the analyte, matrix-matched analytical curves were constructed with spiked blank tissues, with a limit of quantitation of 5 ng g(-1) and a limit of detection of 1.5 ng g(-1) for all matrices. The linearity, decision limit, detection capability, accuracy, and inter- and intra-day repeatability of the method are reported. The method was successfully applied to incurred lamb tissue samples (muscle, liver, kidney and fat) in a concentration range from 5 to 200 μg kg(-1), which demonstrated its suitability for monitoring moxidectin residues in lamb tissues in health surveillance programs, as well as for pharmacokinetics and residue depletion studies. Copyright © 2017 Elsevier B.V. All rights reserved.
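
    One common way to estimate detection limits from a matrix-matched calibration curve is the residual-SD approach (LOD = 3.3σ/S, LOQ = 10σ/S). The study's limits were established under the 2002/657/EC framework, so the sketch below, with invented data, is only an illustration of the general idea:

```python
import math

def lod_loq(conc, resp):
    """LOD/LOQ from a calibration line via the residual-SD approach
    (LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope)."""
    n = len(conc)
    mx, my = sum(conc) / n, sum(resp) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(conc, resp))
             / sum((x - mx) ** 2 for x in conc))
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(conc, resp)]
    sigma = math.sqrt(sum(r * r for r in resid) / (n - 2))  # residual SD
    return 3.3 * sigma / slope, 10 * sigma / slope

# hypothetical matrix-matched curve: concentration (ng/g) vs. detector response
conc = [5, 25, 50, 100, 200]
resp = [52, 248, 510, 995, 2010]
lod, loq = lod_loq(conc, resp)
```
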

  9. CRMS vegetation analytical team framework: Methods for collection, development, and use of vegetation response variables

    Science.gov (United States)

    Cretini, Kari F.; Visser, Jenneke M.; Krauss, Ken W.; Steyer, Gregory D.

    2011-01-01

    This document identifies the main objectives of the Coastwide Reference Monitoring System (CRMS) vegetation analytical team, which are to provide (1) collection and development methods for vegetation response variables and (2) the ways in which these response variables will be used to evaluate restoration project effectiveness. The vegetation parameters (that is, response variables) collected in CRMS and other coastal restoration projects funded under the Coastal Wetlands Planning, Protection and Restoration Act (CWPPRA) are identified, and the field collection methods for these parameters are summarized. Existing knowledge on community and plant responses to changes in environmental drivers (for example, flooding and salinity) from published literature and from the CRMS and CWPPRA monitoring dataset is used to develop a suite of indices to assess wetland condition in coastal Louisiana. Two indices, the floristic quality index (FQI) and a productivity index, are described for herbaceous and forested vegetation. The FQI for herbaceous vegetation is tested with a long-term dataset from a CWPPRA marsh creation project; example graphics for this index are provided and discussed. The other indices, an FQI for forest vegetation (that is, trees and shrubs) and productivity indices for herbaceous and forest vegetation, are proposed but not tested. New response variables may be added or current response variables removed as data become available and as our understanding of restoration success indicators develops. Once the indices are fully developed, each will be used by the vegetation analytical team to assess and evaluate CRMS/CWPPRA project and program effectiveness. The vegetation analytical team plans to summarize its results in the form of written reports and/or graphics and present these items to CRMS Federal and State sponsors, restoration project managers, landowners, and other data users for their input.
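
    The classic floristic quality index is the mean coefficient of conservatism of the species present multiplied by the square root of species richness; the CRMS index is a modified form, so the sketch below (with hypothetical C-values) shows only the basic computation:

```python
import math

def floristic_quality_index(c_values):
    """Classic FQI = mean coefficient of conservatism x sqrt(species richness).
    c_values: one coefficient of conservatism per species recorded at the site."""
    n = len(c_values)
    return (sum(c_values) / n) * math.sqrt(n)

# hypothetical plot: C-values for five species recorded at a marsh site
fqi = floristic_quality_index([4, 6, 3, 7, 5])
```

    Higher FQI values indicate a community weighted toward conservative (disturbance-intolerant) species, which is why the index is useful as a restoration-success indicator.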

  10. A Process Analytical Technology (PAT) approach to control a new API manufacturing process: development, validation and implementation.

    Science.gov (United States)

    Schaefer, Cédric; Clicq, David; Lecomte, Clémence; Merschaert, Alain; Norrant, Edith; Fotiadu, Frédéric

    2014-03-01

    Pharmaceutical companies are progressively adopting and introducing Process Analytical Technology (PAT) and Quality-by-Design (QbD) concepts promoted by the regulatory agencies, aiming to build quality directly into the product by combining thorough scientific understanding and quality risk management. An analytical method based on near infrared (NIR) spectroscopy was developed as a PAT tool to control on-line an API (active pharmaceutical ingredient) manufacturing crystallization step, during which the API and residual solvent contents need to be precisely determined to reach the predefined seeding point. An original methodology based on the QbD principles was designed to conduct the development and validation of the NIR method and to ensure that it is fit for its intended use. On this basis, partial least squares (PLS) models were developed and optimized using chemometric methods. The method was fully validated according to the ICH Q2(R1) guideline and using the accuracy profile approach. The dosing ranges were evaluated as 9.0-12.0% w/w for the API and 0.18-1.50% w/w for the residual methanol. As, by nature, the variability of the sampling method and of the reference method is included in the variability obtained for the NIR method during the validation phase, a real-time process monitoring exercise was performed to prove its fitness for purpose. The implementation of this in-process control (IPC) method on the industrial plant from the launch of the new API synthesis process will enable automatic control of the final crystallization step in order to ensure a predefined quality level of the API. In addition, several valuable benefits are expected, including reduction of the process time and suppression of a rather difficult sampling step and of tedious off-line analyses. © 2013 Published by Elsevier B.V.

  11. Simulation of spin dynamics: a tool in MRI system development

    International Nuclear Information System (INIS)

    Stoecker, Tony; Vahedipour, Kaveh; Shah, N Jon

    2011-01-01

    Magnetic Resonance Imaging (MRI) is a routine diagnostic tool in the clinic and the method of choice in soft-tissue-contrast medical imaging. It is an important tool in neuroscience for investigating the structure and function of the living brain at a systemic level. The latter is one of the driving forces behind further development of MRI technology, as neuroscience especially demands higher spatiotemporal resolution, which is to be achieved by increasing the static main magnetic field, B0. Although standard MRI is a mature technology, ultra-high-field (UHF) systems, at B0 ≥ 7 T, offer space for new technical inventions as the physical conditions change dramatically. This work shows that such development benefits strongly from computer simulations of the measurement process on the basis of a semi-classical, nuclear spin-1/2 treatment given by the Bloch equations. Possible applications of such simulations are outlined, suggesting new solutions to the UHF-specific inhomogeneity problems of the static main field as well as of the high-frequency transmit field.
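
    A minimal sketch of what such a simulator integrates: the Bloch equations with relaxation, here stepped with explicit Euler in the rotating frame (the parameters are illustrative, not from the paper; production simulators use far more sophisticated solvers and realistic sequences):

```python
import math

def bloch_step(m, dt, omega, t1, t2, m0=1.0):
    """One explicit-Euler step of the Bloch equations in the rotating frame
    (off-resonance omega in rad/s, relaxation times t1, t2 in s)."""
    mx, my, mz = m
    dmx = omega * my - mx / t2
    dmy = -omega * mx - my / t2
    dmz = (m0 - mz) / t1
    return (mx + dt * dmx, my + dt * dmy, mz + dt * dmz)

# free precession after a 90-degree pulse: magnetization starts along x
m = (1.0, 0.0, 0.0)
dt, steps = 1e-5, 10000  # 0.1 s of evolution
for _ in range(steps):
    m = bloch_step(m, dt, omega=2 * math.pi * 10, t1=1.0, t2=0.1)
transverse = math.hypot(m[0], m[1])  # transverse magnetization, ~exp(-t/T2)
```

    Extending this single-spin integration to millions of spatially varying isochromats, with measured B0 and transmit-field maps, is what turns it into a tool for studying UHF inhomogeneity problems.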

  12. Developer Tools for Evaluating Multi-Objective Algorithms

    Science.gov (United States)

    Giuliano, Mark E.; Johnston, Mark D.

    2011-01-01

    Multi-objective algorithms for scheduling offer many advantages over the more conventional single-objective approach. By keeping user objectives separate instead of combined, more information is available to the end user to make trade-offs between competing objectives. Unlike single-objective algorithms, which produce a single solution, multi-objective algorithms produce a set of solutions, called a Pareto surface, where no solution is strictly dominated by another solution for all objectives. From the end-user perspective, a Pareto surface provides a tool for reasoning about trade-offs between competing objectives. From the perspective of a software developer, multi-objective algorithms provide an additional challenge: how can you tell if one multi-objective algorithm is better than another? This paper presents formal and visual tools for evaluating multi-objective algorithms and shows how the developer's process of selecting an algorithm parallels the end-user's process of selecting a solution for execution out of the Pareto surface.
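
    The defining test for membership in a Pareto surface is non-domination. A minimal sketch (minimizing every objective, with made-up schedule scores; the paper's evaluation tools build on top of this notion):

```python
def pareto_front(solutions):
    """Return the non-dominated subset (assuming every objective is minimized).
    solutions: list of objective tuples."""
    def dominates(a, b):
        # a dominates b: no worse in all objectives, strictly better in one
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [s for s in solutions
            if not any(dominates(o, s) for o in solutions if o is not s)]

# hypothetical schedules scored on (makespan, priority violations)
front = pareto_front([(3, 9), (5, 4), (4, 7), (6, 4), (7, 2)])
```

    Here (6, 4) is dropped because (5, 4) is no worse on both objectives and strictly better on one; every surviving tuple represents a distinct trade-off the end user can choose among.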

  13. Development of tools and models for computational fracture assessment

    International Nuclear Information System (INIS)

    Talja, H.; Santaoja, K.

    1998-01-01

    The aim of the work presented in this paper has been to develop and test new computational tools and theoretically sounder methods for fracture mechanical analysis. The applicability of the engineering integrity assessment system MASI for the evaluation of piping components has been extended. The most important motivation for the theoretical development has been the well-known fundamental limitations in the validity of the J-integral, which restrict its applicability in many important practical safety assessment cases; examples are extensive plastic deformation, multimaterial structures and ascending loading paths (especially warm prestress, WPS). Further, the micromechanical Gurson model has been applied to several reactor pressure vessel materials. Special attention is paid to the transferability of Gurson model parameters from tensile test results to the prediction of the ductile failure behaviour of cracked structures. (author)
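
    For reference, the Gurson model in its widely used Gurson-Tvergaard-Needleman form replaces the von Mises yield surface with a porosity-dependent one (standard notation, not taken from this paper):

```latex
\Phi = \left(\frac{\sigma_{\mathrm{eq}}}{\sigma_y}\right)^{2}
     + 2\, q_1 f^{*} \cosh\!\left(\frac{3\, q_2\, \sigma_m}{2\, \sigma_y}\right)
     - 1 - \left(q_1 f^{*}\right)^{2} = 0
```

    where sigma_eq is the von Mises equivalent stress, sigma_m the mean stress, sigma_y the matrix yield stress, f* the effective void volume fraction, and q1, q2 the Tvergaard parameters; ductile failure is predicted as f* grows, which is why the transferability of these parameters from tensile tests to cracked structures matters.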

  14. Developing Tools and Techniques to Increase Communication Effectiveness

    Science.gov (United States)

    Hayes, Linda A.; Peterson, Doug

    1997-01-01

    The Public Affairs Office (PAO) of the Johnson Space Center (JSC) is responsible for communicating current JSC Space Program activities as well as goals and objectives to the American Public. As part of the 1996 Strategic Communications Plan, a review of PAO' s current communication procedures was conducted. The 1996 Summer Faculty Fellow performed research activities to support this effort by reviewing current research concerning NASA/JSC's customers' perceptions and interests, developing communications tools which enable PAO to more effectively inform JSC customers about the Space Program, and proposing a process for developing and using consistent messages throughout PAO. Note that this research does not attempt to change or influence customer perceptions or interests but, instead, incorporates current customer interests into PAO's communication process.

  15. [Home safety and severe mental disorders: Developing an evaluation tool].

    Science.gov (United States)

    Désormeaux-Moreau, Marjorie; Dumont, Claire; Aubin, Ginette; Larivière, Nadine

    2015-04-01

    Home safety evaluation is an important issue within the context of current perspectives on accommodation for people with a serious mental illness, which favour a more independent way of life. This paper describes the development and content validation of the Évaluation de la sécurité à domicile et de la gestion des risques (ESGR), an occupational therapy assessment tool for people with a serious mental illness. The ESGR was developed from scientific knowledge and clinical experience. Content validity was assessed in two phases involving the consultation of 11 experts. In its current form, the ESGR includes 67 items organized into three categories (environment, occupation, person). According to the experts consulted, there is clinical interest in using the ESGR to support occupational therapists in the assessment of home safety for people with serious mental illness. The statements are clear and representative of the concept and the target audience.

  16. Validation of designing tools as part of nuclear pump development process

    International Nuclear Information System (INIS)

    Klemm, T.; Sehr, F.; Spenner, P.; Fritz, J.

    2010-01-01

    Nuclear pumps are characterized by high safety standards, operational reliability and long life cycles. In the design process it is common to use a scaled-down model pump to qualify operating data and to simulate exceptional operating conditions. When the pump design is modified relative to existing reactor coolant pumps, a model pump is required to develop the methods and tools used to design the full-scale pump. In the presented case it has a geometric scale of 1:2 relative to the full-scale pump. The experimental data from the model pump are the basis for validating the methods and tools applied in the design process of the full-scale pump. In this paper, the selection of qualified tools and the validation process are demonstrated using a cooling circuit as an example; the aim is to predict the resulting flow rate. Tools are chosen for different components depending on the benefit-to-effort ratio. For elementary flow phenomena, such as fluid flow in straight pipes or gaps, analytic or empirical laws can be used; for more complex flow situations, numerical methods are utilized. The main focus is on the validation process of the applied numerical flow simulation. In this case, not only integral data should be compared; it is also necessary to validate the local flow structure of the numerical flow simulation to avoid systematic errors in CFD model generation. Because of the complex design, internal flow measurements are not possible; for that reason, simple comparisons with similar flow test cases are used. The results of this study show that the flow simulation data closely match the measured integral pump and test-case data. With this validation it is now possible to qualify CFD simulations as a design tool for the full-scale pump in a similar cooling circuit. (authors)

  17. Priority Determination of Underwater Tourism Site Development in Gorontalo Province using Analytical Hierarchy Process (AHP)

    Science.gov (United States)

    Rohandi, M.; Tuloli, M. Y.; Jassin, R. T.

    2018-02-01

    This research aims to determine development priorities for underwater tourism in Gorontalo province using the Analytical Hierarchy Process (AHP), a decision support system (DSS) method that applies Multi-Attribute Decision Making (MADM). The method used 5 criteria and 28 alternatives to determine the top-priority underwater tourism site for development in Gorontalo province. The AHP calculation showed that the highest-priority site for development is Pulau Cinta, with a total AHP score of 0.489, or 48.9%. The DSS gave the decision makers a reliable result and a faster, time-saving, low-cost way to identify the best underwater tourism site to develop.
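
    The core of AHP is deriving a priority vector from a reciprocal pairwise-comparison matrix. A minimal sketch using the common column-normalization approximation (the 3x3 matrix below is illustrative, not the study's 5-criterion matrix):

```python
def ahp_priorities(matrix):
    """Priority vector from a pairwise-comparison matrix via column
    normalization and row averaging (the standard AHP approximation)."""
    n = len(matrix)
    col_sums = [sum(matrix[i][j] for i in range(n)) for j in range(n)]
    return [sum(matrix[i][j] / col_sums[j] for j in range(n)) / n
            for i in range(n)]

# hypothetical 3-criterion comparison on Saaty's 1-9 scale (reciprocal matrix)
m = [[1.0,   3.0, 5.0],
     [1 / 3, 1.0, 3.0],
     [1 / 5, 1 / 3, 1.0]]
weights = ahp_priorities(m)  # sums to 1; the first criterion weighs most
```

    In a full AHP study the same procedure is applied to the alternatives under each criterion, the weights are combined into overall scores, and a consistency ratio is checked before accepting the judgments.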

  18. Using Analytic Hierarchy Process to Examine the Success Factors of Autonomous Landscape Development in Rural Communities

    Directory of Open Access Journals (Sweden)

    Ta-Ching Liang

    2017-05-01

    Full Text Available The absence of comprehensive plans has resulted in disordered rural development and construction and a mix of new and old buildings in rural communities. Disorganized and blighted spaces have become obstacles in the rural landscape. Since the Rural Rejuvenation Act was passed, rural construction has been guided by plans, and the government expects to improve surroundings and extend its policies through autonomous community development to create a good rural landscape. Through a literature review, this study establishes the key success factors in the autonomous landscape development of rural communities, covering 8 criteria and 28 sub-criteria. A questionnaire survey was conducted among national rural communities, experts, and scholars. The analytic hierarchy process reveals that manpower input has the highest importance, indicating that the improvement of autonomous community development would be amplified by the guidance of community cadres and the participation of artists and experts.
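    When AHP questionnaires are collected from experts, the judgement matrices are conventionally screened with Saaty's consistency ratio (CR), accepting a matrix only if CR < 0.10. A minimal sketch, using a hypothetical 3x3 judgement matrix rather than the study's 8-criteria survey data:

    ```python
    def consistency_ratio(matrix):
        """Saaty's consistency ratio for a pairwise comparison matrix.
        CR < 0.10 is the conventional acceptance threshold."""
        n = len(matrix)
        # Random index (RI) values tabulated by Saaty for n = 1..10
        ri_table = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12,
                    6: 1.24, 7: 1.32, 8: 1.41, 9: 1.45, 10: 1.49}
        # Priority vector by power iteration
        w = [1.0 / n] * n
        for _ in range(100):
            w_new = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(w_new)
            w = [x / s for x in w_new]
        # lambda_max estimated as the mean of (A w)_i / w_i
        aw = [sum(matrix[i][j] * w[j] for j in range(n)) for i in range(n)]
        lambda_max = sum(aw[i] / w[i] for i in range(n)) / n
        ci = (lambda_max - n) / (n - 1)  # consistency index
        return ci / ri_table[n] if ri_table[n] else 0.0

    # Perfectly consistent hypothetical matrix (ratios 1:2:4), so CR ~ 0
    cr = consistency_ratio([[1, 2, 4], [0.5, 1, 2], [0.25, 0.5, 1]])
    ```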

  19. Tools for tracking progress. Indicators for sustainable energy development

    International Nuclear Information System (INIS)

    Khan, A.; Rogner, H.H.; Aslanian, G.

    2000-01-01

    A project on 'Indicators for Sustainable Energy Development (ISED)' was introduced by the IAEA as part of its work programme on Comparative Assessment of Energy Sources for the 1999-2000 biennium. It is being pursued by the Planning and Economic Studies Section of the Department of Nuclear Energy. The envisaged tasks are to: (1) identify the main components of sustainable energy development and derive a consistent set of appropriate indicators, keeping in view the indicators for Agenda 21; (2) establish the relationship of the ISED to those of Agenda 21; and (3) review the Agency's databases and tools to determine the modifications required to apply the ISED. The first two tasks are being pursued with the help of experts from various international organizations and Member States; in this connection two expert group meetings were held, one in May 1999 and the other in November 1999. The following nine topics were identified as the key issues: social development; economic development; environmental congeniality and waste management; resource depletion; adequate provision of energy and disparities; energy efficiency; energy security; energy supply options; and energy pricing. A new conceptual framework model specifically tuned to the energy sector was developed, drawing upon work by other organizations in the environmental area. Within this conceptual model, two provisional lists of ISED - a full list and a core list - have been prepared. They cover indicators for the following energy-related themes and sub-themes under the economic, social and environmental dimensions of sustainable energy development. Economic dimension: economic activity levels; end-use energy intensities of selected sectors and different manufacturing industries; energy supply efficiency; energy security; and energy pricing. Social dimension: energy accessibility and disparities. Environmental dimension: air pollution (urban air quality; global climate change concern); water

  20. Development of Nylon Based FDM Filament for Rapid Tooling Application

    Science.gov (United States)

    Singh, R.; Singh, S.

    2014-04-01

    There has been a critical need for a cost-effective nylon-based wire to be used as feedstock filament for fused deposition modelling (FDM) machines. Hitherto, however, very little work has been reported on developing an alternative to the acrylonitrile butadiene styrene (ABS) wire presently used in most FDM machines. The present research work focuses on the development of a nylon-based wire as an alternative to ABS wire (to be used as feedstock filament on FDM) without changing any hardware or software of the machine. For this study, aluminium oxide (Al2O3) was used as an additive in different proportions with nylon fibre. A single-screw extruder was used for wire preparation, and the wire thus produced was tested on the FDM machine. The mechanical properties, i.e. tensile strength and percentage elongation, of the finally developed wire were optimized by the Taguchi L9 technique. The work represents a major development in reducing cost and time in rapid tooling applications.
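    Taguchi optimization, as used here for tensile strength, ranks factor settings by a signal-to-noise (S/N) ratio; for a property to be maximized, the "larger-the-better" form applies. The sketch below compares two hypothetical factor settings with made-up tensile-strength replicates; the paper's L9 array and measured values are not reproduced.

    ```python
    import math

    def sn_larger_is_better(values):
        """Taguchi 'larger-the-better' signal-to-noise ratio in dB:
        S/N = -10 * log10( (1/n) * sum(1/y_i**2) )."""
        n = len(values)
        return -10.0 * math.log10(sum(1.0 / y**2 for y in values) / n)

    # Hypothetical tensile-strength replicates (MPa) at two factor settings
    setting_a = [38.2, 39.1, 37.8]
    setting_b = [42.5, 41.9, 43.0]
    sn_a = sn_larger_is_better(setting_a)
    sn_b = sn_larger_is_better(setting_b)
    # The setting with the higher S/N ratio is preferred
    ```

    In a full L9 study, this S/N ratio is computed for each of the nine runs and averaged per factor level to pick the optimal combination of extrusion parameters.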